Apple has taken a lot of criticism over its decision to incorporate detection of child sexual abuse material (CSAM) in photos that users upload to iCloud Photo Library. Much of that criticism, as I wrote over the weekend, is a self-inflicted wound caused by Apple's poor job of explaining precisely what it's doing.

Some have called the move a backdoor to your phone's encryption and accused Apple of violating user privacy. I was critical of the move not because eliminating CSAM isn't a noble cause, but because of the precedent it sets for a company that has made "what happens on your iPhone stays on your iPhone" a core value.

Among the loudest voices in the firestorm is Will Cathcart, the CEO of WhatsApp. Cathcart posted to Twitter, calling this change "a wrong approach and a setback for people's privacy all over the world."

There's a lot to unpack in Cathcart's Twitter thread, and it deserves a look. Cathcart is a very smart guy, and he's in charge of the world's largest messaging platform. His view would seem to carry a lot of weight if it weren't such a blatant misrepresentation of what Apple is actually doing. We'll get to all of that in a minute.

First, however, I can't help wondering whether Cathcart knows he works for Facebook.

I mean, he's worked there for a while. He reports to Chris Cox, Facebook's head of product, who reports to CEO Mark Zuckerberg. Previously, Cathcart led "product development for News Feed and Facebook's introduction of advertising into News Feed and on mobile," according to the company's website, and later oversaw the entire Facebook app.

I think it's fair to say he's relatively familiar with Facebook's business model. You know, the one where the company scoops up personal information and monetizes it through what it calls "personalized advertising." He's literally the person who brought advertising to Facebook's News Feed.

Now he's the head of WhatsApp, which Facebook bought seven years ago for $19 billion. I mean, it's true that WhatsApp is a privacy-focused app. Its messages are encrypted end-to-end. But that's true of Apple's Messages as well, and that hasn't changed. In fact, the change that has gotten all of the pushback from privacy advocates has nothing to do with messaging at all.

By the way, do you know which messaging service isn't end-to-end encrypted by default? Facebook Messenger. That's why Facebook is able to detect and report more than 20 million CSAM images sent on its services every year. It obviously can't detect anything sent over WhatsApp, because those messages are encrypted; it only finds out when users report them. Apple doesn't detect CSAM within Messages either.

Instead, the changes Apple announced apply to photos uploaded to iCloud. The last time I checked, WhatsApp doesn't have a cloud photo service, so when Cathcart says that WhatsApp won't be adopting a similar system, that's true, but it's also irrelevant.

Here's the thing: Of all the companies that exist in the world today, none is a bigger threat to your privacy than Facebook. No company does more to capture your online activity and personal information than the world's largest social media platform. It does that because it discovered a long time ago that monetizing its users' personal information is very profitable. And Facebook is working on ways to analyze encrypted messages within WhatsApp--not to detect CSAM, but so that it can show you more ads.

Facebook has been in a year-long battle with Apple over the latter's requirement that developers request permission before tracking users across apps and websites. Facebook knows what has since become obvious to everyone--that most people, given the choice, will opt out of having their personal information tracked. As a result, the company has said it expects a material effect on its revenue and profit.

It does seem a bit hypocritical for Cathcart, who works for the worst privacy offender, to lecture anyone--let alone Apple--about what is or is not "privacy." Facebook suffers from a credibility problem any time it talks about privacy, and that's a real problem for the company. Once you lose credibility, you lose trust, and trust is your most valuable asset.

That isn't to say Apple should be off the hook. I still think there are real problems with introducing a system that scans your photos looking for illegal content, not because I'm in favor of illegal content, but because it breaks the promise Apple has long made to its users.

I just don't think Cathcart--or Facebook, for that matter--is the best critic on this subject. Then again, it's not particularly out of character for a company that often seems to be completely lacking in self-awareness. It's just surprising that Cathcart can't see what a bad look this is.