Facebook may have won a battle in Congress, but the fight over its societal and moral responsibility will be a protracted war. Having spent years in both the tech world and in Washington, I can see where Facebook's current posturing is going to lead it, and the social network will not come out unscathed.

Nearly 100 lawmakers in the House and Senate showed up for Mark Zuckerberg's testimony. They attended to engage in political theater--to appear tech-savvy to Millennials and to constituents who crave innovation. They were unsuccessful on that front. But they also came because the people they serve are concerned.

The country is not entirely clear about what's at stake, and it will take time before we see decisive action. This works to Facebook's advantage, but it won't last forever.

Here's how you can tell Facebook is on shaky ground:

Zuckerberg is playing word games and not addressing the real issue.

Zuckerberg tried to evade the matter of using personal data to earn advertising revenue by saying, "Facebook doesn't sell data." This might be technically true, but the company certainly monetizes that data. 

Uber used a similar argument when it claimed it was not a transportation company but a software platform connecting transportation providers and users. This line of reasoning protects you in the short term, but eventually fails. You can put lipstick on a pig, but it's still a pig. Saying Facebook doesn't "sell" data is a semantic dodge, and it doesn't square with Facebook's "we are taking responsibility" claim.

Facebook is failing to educate stakeholders. 

Facebook is playing a game with enormous social and political implications. The responsibility to understand Facebook doesn't rest with lawmakers alone; Facebook also has a responsibility to educate the elected leaders tasked with our protection. If you don't educate regulators, they'll regulate you anyway, but without the tools to get it right.

Many tech companies have DC offices with political advocacy staff for this very reason. Rather than criticizing Congress's inadequate understanding of Facebook, we should ask, "What have Facebook's public policy folks been doing all day?" It's standard practice for companies to feed lawmakers information, and even specific questions, ahead of these hearings. Congress dropped the ball, but so did Facebook.

Facebook cannot escape its misaligned incentives.

Facebook is not just a social network; it's also a business. In some cases, its incentives as a business conflict with its social and moral responsibility. 

This is well illustrated by the company's attempt to combat fake news. Facebook's solution was to let users rate news stories for accuracy--a solution that directly benefits Facebook by increasing engagement: more clicks, more interaction with content, and more rallying of users by self-interested parties. It is consistent with the company's business interests, but absurdly disconnected from solving the actual problem.

Facebook is relying on loopholes to evade responsibility.

Facebook is using data in ways that violate people's privacy, their sensibilities, and arguably, their safety. Like all CEOs coached by attorneys, Zuckerberg cited the "freedom to change settings" argument to abdicate responsibility.

While this may limit legal liability, it's disingenuous. Anyone who has ever tried to change their privacy settings for any application knows how difficult companies make it. And even when you can find the settings, you rarely know what protections they afford or what dangers changing them might bring. It's easy to argue that the onus should be on the user, but in politics, the user often drives the bus. The onus can easily be shifted onto the companies.

After scandals involving Cambridge Analytica and Russian disinformation, regulation of Facebook is inevitable. Facebook's platform promotes content, whether it created that content or not. Therefore, it must take responsibility for it. 

And while the Valley likes to insist on self-regulation and consumer choice, our society doesn't let people make choices that endanger our privacy and safety, our personal lives, and our democracy.

Europe has already learned these lessons. A German court ruled Facebook's use of personal data illegal, and the European Union's top court ruled against Google in the "right to be forgotten" case. The U.S. may not go as far as Europe, but it won't ignore the problem.

Ultimately, it's reasonable to expect that regulation will lead Facebook to charge for its platform, given the hit it will take on the data and advertising front. 

And Facebook should worry about whether we would pay. But recent testimony suggests the company is either not worried enough or woefully confused about how to act on that worry. Lucky for Facebook, Congress is even more confused. For now.

Facebook weathered the opening shots, but the war has only just begun.
 

Published on: Apr 25, 2018