Hate speech. Fake news. Freedom of expression. Election interference.
Facebook has real problems. Those four are easily among the most challenging, but they are nowhere near the biggest problem with the world's largest social media platform. In fact, in many ways, they are merely symptoms.
That doesn't mean Facebook doesn't need to deal with those issues; it absolutely does. Especially as the world tries to find its way out of a pandemic, and Facebook has become a hotspot for misinformation and protest. That is ostensibly what the company is attempting to do with the announcement of the first 20 members of its advisory board.
The board itself is impressive. No one can argue it isn't a collection of extraordinary humans who have well-considered positions on the relevant subject matter. The group includes journalist Tawakkol Karman, a Nobel Peace Prize winner; Andras Sajo, a former judge and vice president at the European Court of Human Rights; and Helle Thorning-Schmidt, the former prime minister of Denmark, among others with equally impressive backgrounds.
You may remember that this is the board that will have the ultimate authority to hear appeals when Facebook removes content. According to an op-ed in The New York Times last week, the board's mandate is to "make final and binding decisions on whether specific content should be allowed or removed from Facebook and Instagram (which Facebook owns)."
The problem is that the decisions it is able to make are hardly the ones the company needs the most help with. The biggest problem with Facebook, as I've written in the past, is that the company's founder, Mark Zuckerberg, is a true believer.
More to the point, Facebook's business interests are in direct conflict with the privacy interests of its users. That's a troubling position for a company with so much direct impact on our daily lives.
I wrote earlier this year about a room full of people who literally laughed when a representative from Facebook described everything the company did as "privacy-protective." I'm not even sure what that means, but the same rep, Erin Egan, the vice president of public policy at Facebook, said the company builds every product with "privacy by design."
The thing is, no one outside Facebook actually believes that's true. That's a problem, and this super-board has no ability to do anything about it. It can't force Facebook to make real changes to protect users. It won't review products to flag features that infringe on user privacy. It has no influence over the company's business decisions, which sit at the core of its conflict over protecting user information.
That--at its heart--is what's wrong with this board. It's almost as if Facebook wants credit for doing something without actually having to do anything. In Facebook's case, that's a very real issue considering the level of scrutiny the company has faced from regulators, privacy advocates, investigators, and, ultimately, its users.
Perhaps the company hopes that if it just regulates itself, everyone else will leave it alone. I'm not a particular fan of the government stepping in--rarely does that solve a problem like this. I am, however, in favor of real changes to how Facebook monetizes its users' personal information.
In any number of ways, Facebook is arguably the most powerful company on Earth. It knows more about most of us than probably any other entity, including the government. And it has more influence over whom we interact with, what information we see, and what we buy (and why) than any company before it.
That's why we need it to do more than simply appoint a board with no authority to change how Facebook operates. Real accountability is, after all, what we actually need as we grow more dependent on technology giants that influence almost every aspect of our daily lives. Of course, it doesn't take a board to figure out how to do the right thing. It would just be nice if this one at least had the power to help Facebook do it.