On Tuesday, ProPublica reported that Facebook employs 1,000 people whose job it is to read WhatsApp messages reported by users. The piece, titled "How Facebook Undermines Privacy Protections for Its 2 Billion WhatsApp Users," would lead you to believe that the publication discovered a new way Facebook is involved in a gross invasion of user privacy.
I think, by now, we've all come to recognize that Facebook's entire business model is, in fact, an invasion of user privacy. That's not something unique to this situation. But, in this case, it's not exactly right. More important, the fact that it seemed entirely possible is the real issue. No one trusts Facebook, even when it's arguably doing the right thing--which I think is true in this case.
Here's what I mean:
When you send messages using WhatsApp, they are encrypted, meaning that only the sender and recipient are able to read them. WhatsApp can't read them, and neither can Facebook. It's one of the reasons WhatsApp is the world's largest messaging service. It earned users' trust by protecting their privacy.
That was one of the reasons it was such a big deal when Facebook bought the company in 2014. People rightly worried that WhatsApp's commitment to encryption and privacy might change. Remarkably, it hasn't. That isn't even really the issue here, even though the headline might lead you to believe it is.
Although your messages are encrypted, when a user reports a message as abusive or harassing, that message can be reviewed by a human. ProPublica's report suggests that this review process amounts to a backdoor in end-to-end encryption. Except, it doesn't.
Think of it this way. I have four young children. On occasion, because they are kids, they do things they shouldn't. Sometimes they say mean things to each other. Most of the time they are savvy enough to say them outside of mom or dad's earshot. In that case, I have no way of knowing what's happening in their conversation. Their "messages" are kept secret, at least from me.
Sometimes, however, one of them will come to me and--as kids do--repeat whatever was said by their big mean brother or sister. That "message," so to speak, is no longer private because it was revealed by the recipient. We'll set aside the fact that kids sometimes embellish, or simply make up the offending statements. In the case of messaging, there is a literal copy of the message on the recipient's device, and they can forward it to anyone they want.
Even though the message is encrypted, and kept private from WhatsApp, there is literally nothing preventing the recipient from revealing its contents to anyone they want--including WhatsApp or Facebook. That's exactly what happens when someone reports a message as harassment.
Part of the problem is that people assume "encryption" means no one but the intended recipient will ever see a message. Or that there is no way to be held accountable for anything you send to anyone else. Except, if you send an offensive message, the recipient isn't bound by encryption. They're at the other end of end-to-end encryption. They can do what they want with it.
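The mechanics are worth a quick sketch. Here's a toy Python illustration--the XOR "cipher" is a stand-in for real end-to-end encryption, not anything WhatsApp actually uses, and all the names are made up--showing that encryption protects a message in transit, while the recipient ends up holding ordinary plaintext they're free to share, including in an abuse report:

```python
from itertools import cycle

def toy_encrypt(plaintext: str, key: bytes) -> bytes:
    """Toy XOR cipher -- a stand-in for real end-to-end encryption."""
    return bytes(b ^ k for b, k in zip(plaintext.encode(), cycle(key)))

def toy_decrypt(ciphertext: bytes, key: bytes) -> str:
    return bytes(b ^ k for b, k in zip(ciphertext, cycle(key))).decode()

shared_key = b"secret"  # known only to the sender and recipient

# In transit, the server sees only ciphertext it cannot read.
ciphertext = toy_encrypt("you're a jerk", shared_key)

# The recipient decrypts it -- and now holds ordinary plaintext.
received = toy_decrypt(ciphertext, shared_key)

# Nothing in the encryption stops the recipient from forwarding that
# plaintext anywhere, including into a report reviewed by a human.
abuse_report = {"reported_message": received}
print(abuse_report["reported_message"])  # the original message, in the clear
```

The point of the sketch is the last step: once decryption happens at the endpoint, the "end" of end-to-end encryption is a person, and the ciphertext no longer constrains what they do with the message.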
That said, Facebook has a much bigger problem here, which is that it didn't take much of a leap to assume the company is, in fact, reading your messages. Facebook has such a trust deficit, especially when it comes to privacy, that no one is willing to give it the benefit of the doubt. That's an obvious problem.
As the ProPublica piece points out, WhatsApp isn't transparent about the fact that it reviews reports. Users are led to assume that their messages are both secure and private. That's true to the extent that end-to-end encryption means no one can peer into your messages and read what you send to your friends or contacts. It doesn't mean, however, that those friends can't share those messages or report them to WhatsApp.
And, the idea that the company is paying 1,000 people to review those reports gives the impression that there are a lot of messages to review. It also paints a picture that Facebook just has people sitting around looking at your private messages.
Personally, I think it's a good thing that Facebook recognizes that it needs a system in place to handle the torrent of abusive content generated by 2 billion users. The fact that users can report such content is an important balance between respecting user privacy and providing a mechanism for people who are being victimized to get help.
On the other hand, the fact that people are readily willing to believe the worst about Facebook is an indictment of the company's reputation. The lesson here should be obvious: Trust is your most valuable asset.
Trust is earned over time, as you consistently uphold your values and demonstrate that you have your users' best interests in mind. When you don't, you break trust, even when it turns out you were trying to do the right thing.