If the recent presidential election showed us one thing, it's that Americans increasingly live in separate realities. In one reality, global warming is an existential threat, racism is on the rise and Hillary Clinton is a capable if flawed leader. In the other, climate change is a hoax, Clinton is a criminal and the ugly monster rearing its head in America is political correctness. There can be, it often seems, no meaningful debate between these two worldviews; the starting points are too far apart.
In the days since the election, many critics have pointed a finger at Facebook for its role in bringing us to this pass. The social network's sorting algorithms, they say, create "filter bubbles" within which opinions harden and appealing falsehoods spread more easily than inconvenient truths.
Facebook founder Mark Zuckerberg's response to the critics has been a measured one, giving the impression of listening without admitting any fault.
At a conference in Silicon Valley two days after the election, Zuckerberg said it's a "crazy idea" that Facebook had anything to do with Donald Trump winning the election. If there was pro-Trump fake news circulating on the platform, its effect would have been balanced out by pro-Clinton fake news, he said, but the amount of either was small. And, anyway, if Facebook users prefer fake or biased news to the genuine kind, that's their prerogative. "Our real goal is to reflect what our community wants," he said.
Zuckerberg has made similar arguments privately to Facebook employees who expressed concern over the company's distorting effect on public opinion. In a public post on his Facebook page, he affected a tone of concern, but cautioned that efforts to police hoaxes could result in censorship of legitimate news and opinion.
It's easy to quibble with Zuckerberg on the fine points here. Yes, there have been phony anti-Trump stories and memes shared on Facebook, plus plenty of satirical pieces mistaken for news, but there was no equivalent to the pop-up industry of pro-Trump/anti-Clinton content, much of it produced overseas to capitalize on the viral nature of Trump's campaign. BuzzFeed identified more than 100 pro-Trump websites run out of a single town in Macedonia. "People in America prefer to read news about Trump," one of the teenage proprietors told BuzzFeed, which noted "the most successful stories from these sites were nearly all false or misleading." (Facebook says it will discourage the practice by banning publishers of hoax stories from its ad network.)
Contra Zuckerberg's contention, there's simply no obvious reason there would have been equal amounts of bogus content aimed at each side. Internally, Facebook executives have even acknowledged this, reportedly pumping the brakes on a Newsfeed update designed to cull hoaxes because it would have "disproportionately impact[ed] right-wing or conservative-leaning sites." Among those not buying their CEO's spin are the several dozen members of an underground group of employees who have formed a task force to explore the fake-news problem, according to BuzzFeed.
If Zuckerberg is having a tough time coming to grips with Facebook's effect on the electorate, it's because he has long been living in his very own reality. In his dimension, the 1.8 billion humans who use Facebook do so because they find it fun, useful and informative, full stop. They don't do it out of addictive compulsion, or to harass and bully people, or to stalk their exes, or to avoid talking to people at a party, or for any other venial reason. Zuckerberg's Platonic Facebook is a place that brings people together, where they exchange diverse points of view and change each other's minds. In this alternate universe, users make informed decisions about what they share and whom they share it with, how their personal information gets used by marketers and what sorts of ideas they wish to be exposed to.
Maybe this is actually how Zuckerberg himself uses Facebook. Maybe he even knows people who use it this way. It's definitely not how the typical Facebook user uses Facebook. If it were, would we be having this discussion?
A few months ago, a person I know told me about a conversation she had with Zuckerberg. At the time, he was feeling confused and irritated about the backlash that so often greets actions he views as philanthropic, like his promise to devote 99 percent of his wealth to causes like combating poverty and disease and his efforts to spread internet access around the world.
Why, he wondered, was there so much resentment toward Facebook, a free product hundreds of millions of Americans evidently like enough to use every day? Delicately, this person suggested that many avid Facebook users nevertheless have complicated feelings toward it, viewing their Newsfeeds as a trashy but addictive waste of time, a guilty pleasure on par with watching reality television. Zuckerberg dismissed the comparison out of hand. (Facebook's public relations office did not respond to an inquiry seeking comment for this story.)
Zuckerberg has long been careful to avoid the appearance of partisanship, particularly since conservatives threatened to boycott Facebook last summer in the wake of claims that editors employed by the company took a biased approach to news curation.
But there's one ideology he makes no attempt to conceal: his belief in individual agency as the driver of behavior. In his mind, all 1.8 billion Facebook users are free agents, enlightened consumers acting in their own best interests. If they elect to, say, devote upwards of an hour each day to scrolling through posts on Facebook and Instagram, who is to tell them they shouldn't?
This view is common in Silicon Valley, but it's not universal. Tristan Harris, a former Google product designer who advocates for ethical design in technology, says the notion that it's the individual consumer's responsibility to make good choices is a dodge. "That's not acknowledging that there's a thousand people on the other side of the screen whose job is to break down whatever responsibility I can maintain," he tells The Atlantic.
At Facebook, those thousand people are working to nudge users' behavior in very specific directions: toward spending more and more time with the product; toward sharing ever more details about themselves that can be used to target advertising; toward sharing information publicly rather than in small groups; toward sharing information even about other users, by tagging them in photos; and, of course, toward consuming the kinds of content that makes them want to engage and share versus the kind they might simply read and absorb.
Zuckerberg's answer to this dialectic--individual free will versus systemic incentives--is always to give users tools to enact their preferences. If the creeping invasion of privacy on Facebook bothers you, then you should review your privacy settings when Facebook reminds you once a year. If you don't want to live in a filter bubble, go and follow a bunch of news outlets that reflect other points of view. And if you prefer truth to viral bullshit, you can tell Facebook that by flagging hoaxes when you see them.
In other words, you can have any kind of Facebook you want; all you have to do is stay vigilant and put in some extra work. And if you don't make use of the tools Facebook gives you, anything you don't like about your Facebook experience is your own damn fault, so stop whining.
We've heard this kind of logic before--from cigarette makers, from soda companies, from banks that hit customers with a million hidden service fees and then tell them they should have read the fine print more carefully. It's a profitable way to run an enterprise, obviously, but a shoddy way to treat people.
There aren't a lot of technology companies in the world more successful than Facebook, but there are two that it could stand to learn from. For a long time, Apple's slogan was "It just works." Steve Jobs understood that what consumers want more than anything are products that do what they're supposed to out of the box without a lot of set-up or troubleshooting. Google could have used the same motto for its search engine, which eliminated the need to wade through pages of links and boasted an interface that was simplicity itself.
If Mark Zuckerberg thinks what Facebook users want is optionality--the ability to customize their Newsfeeds just so, with greater or lesser amounts of fake news and insularity, greater or lesser degrees of privacy intrusion--then he is kidding himself. Consumers don't want to do that kind of work; they just don't have time for it. If Zuckerberg really thinks his users value diverse viewpoints and authentic knowledge, he should give them a product that gives them that as a default, not an add-on feature.
And if he doesn't think they want that, well, maybe it's time to ask himself if Facebook is really in the business of making the world "more open and connected."