You know how sometimes you try to do something good, and it gets all misinterpreted and people get all mad and you wonder why you try to do anything nice at all? I imagine that's exactly how Apple feels right now. 

On Thursday, the company announced a series of changes designed to protect children from being victimized, as well as to detect child sexual abuse material (CSAM) uploaded to iCloud Photo Libraries. We should all be really happy, right? Well, no.

Let's just say the response has been less than enthusiastic from privacy experts, iPhone users, and people who write things on the internet (myself included). 

The biggest concern is that the second of those changes, CSAM detection, represents a major shift in the promise Apple makes to its users about privacy. The impression (whether it's right or not) is that Apple is sifting through your photo library looking for illegal images. 

That's not exactly what's happening. Actually, it's not what's happening at all. But it's what it sounds like to people who don't dive into the details, and let's face it, most people don't dive into the details. I write about this kind of thing on a pretty regular basis and even I have a hard time with the details sometimes. As a result, Apple is facing what can only be seen as its biggest controversy in a very long time.

The thing I can't stop thinking about, however, is that Apple could have avoided much of this. In that sense, the controversy is largely of Apple's making, which is a lesson for every business. And yet, Apple clearly did not expect the response it received. 

Except, if you hang billboards that say "What happens on your iPhone, stays on your iPhone," people take that as a promise. It's a pretty simple marketing message, but it means something.  

Apple doesn't monetize your data the way other tech companies do. It has also taken a principled stand against efforts by outside forces to make its devices less secure by weakening encryption with a backdoor. It's even stood its ground on several occasions against the FBI when asked to help access the devices of accused terrorists and mass shooters.

All of that is true, but--in this case--no one will hear anything other than "it sounds like you're scanning my photo library," even if there are complicated technical reasons it isn't. 

That's really the problem. The entire thing is very complicated and technical. Almost no one (including most of us who wrote about it) understands hashing and cryptography and private set intersection--the technique that lets a system check for a match against a database without either side revealing its contents, and without the result ever being visible on the device.
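To make the hashing part a little more concrete, here's a deliberately simplified sketch in Swift. To be clear, this is not Apple's implementation: Apple's system uses a perceptual hash called NeuralHash, a blinded database, private set intersection, and a match threshold, none of which appear here. The function names and the placeholder fingerprint database below are invented for illustration; the only point is that matching operates on fingerprints of images, never on the images themselves.

```swift
import Foundation
import CryptoKit

// Toy sketch only. Apple's real system uses NeuralHash (a perceptual
// hash) plus private set intersection and threshold secret sharing.
// This simplified version just shows the basic idea: compare a
// fingerprint of a photo against a set of known fingerprints, without
// ever looking at what the photo depicts.

/// Compute a hex fingerprint of a photo's raw bytes.
/// (Assumption: a plain cryptographic hash stands in for NeuralHash.)
func fingerprint(of photoData: Data) -> String {
    let digest = SHA256.hash(data: photoData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

/// Hypothetical stand-in for the database of known fingerprints
/// supplied by child-safety organizations.
let knownFingerprints: Set<String> = [
    "0000000000000000000000000000000000000000000000000000000000000000"
]

/// The check yields only "match" or "no match"; the photo's contents
/// are never inspected or transmitted by this step.
func matchesKnownImage(_ photoData: Data) -> Bool {
    knownFingerprints.contains(fingerprint(of: photoData))
}

// Example usage with placeholder bytes standing in for an image file.
let somePhoto = Data([0x01, 0x02, 0x03])
print(matchesKnownImage(somePhoto)) // prints "false" for this sample
```

Private set intersection adds a cryptographic layer on top of that basic idea: the device can't read the database it's matching against, Apple learns nothing about photos that don't match, and, in Apple's design, no one learns anything at all until a threshold number of matches is crossed. That's a lot to expect the average reader to absorb from a news headline.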

All of that means that Apple is taking a much more privacy-centric approach to solving a very real (and horrifying) problem than its competitors, but no one hears anything other than "we're going to detect whether you have illegal photos on your device." 

Look, of any company I follow, Apple is the most disciplined at crafting its message and public image. Its product events are meticulously produced with production values that rival most of what we see on television. Its communications team is generally as good at getting its message out and controlling the narrative as any company. 

In this case, however, it never controlled the narrative. The entire thing got out of hand quickly because what Apple announced looks like a broken promise. The rollout made it look as if Apple preaches privacy but then caved to outside pressure to build a backdoor into our devices.  

Again, I want to be clear that I'm not arguing that's what actually happened, but that's definitely the view from the outside. Mostly it seems to have happened because information about the new features filtered across social media before Apple had officially announced them with a narrative about what it was doing.

Imagine, instead, how it might have been different if Apple had put together one of its highly crafted videos with a script that said something like this:

At Apple, we believe privacy is a fundamental human right. We design our devices and services to protect your data from outside attackers that might want to use it for nefarious purposes or companies that want to monetize it with targeted advertising. In fact, because we design the hardware and software to work together, we're able to protect your privacy in ways that other companies can't.

In most cases, even we can't access your data, and we'll never collect more than the absolute minimum we need to do things like show you the local weather or give you directions. In those cases, all of the processing happens on your device, and the data is never sent to our servers. 

At the same time, we believe that we can, and should, do something--in a way that respects your privacy--to help fight one of the worst problems on the internet: child sexual abuse material. Thanks to the hard work of our privacy team and engineers, we've designed technology that does both. 

In the near future, this technology will help us identify known CSAM that is uploaded to iCloud Photos without ever revealing the contents of your photo library to Apple or anyone else. Only when there's a confirmed match will it be reported to the National Center for Missing and Exploited Children. Because we know how much you care about your privacy, the details on how we're doing it are available at apple.com/child-safety.

Or, say it differently. Say it in whatever way Apple wants to say it, but the point is, there's no way that a company with as much reputational investment in privacy can afford not to be out in front of this. That's true, by the way, for every company. If you want to control the narrative, it helps to have one.