Imagine you're having dinner at your favorite neighborhood restaurant when a man walks in, takes a seat at the bar, and starts playing a ukulele. It's out of tune, and, though he's strumming it quietly, it cuts through the background din just enough to catch your attention, like a dripping faucet in a quiet house.
Attached to his shoulder, it should also be noted, is one of those red safety lights you see bicycle commuters wearing, strobing continuously in the dim restaurant.
It's clear from their faces that your fellow diners are just as bothered by this man as you are, so you discreetly seek out the manager and ask her to have a word with Ukulele Guy. But when you do, she merely points to a copy of the restaurant rules posted above the bar. "I'm sorry," she says, "but as you can see, there's nothing in there about playing a ukulele or wearing a bicycle strobe light, so there's nothing we can do."
This is an absurd example, of course. What kind of business could claim to have its hands tied by rules the business itself wrote and expect its customers to go along?
Only the biggest and most successful companies of the 21st century, of course--the digital platform players like Facebook, Google, and Twitter. For years, they've been contorting themselves into ever more uncomfortable poses trying to reconcile a professed commitment to free expression with the awkward fact of their absolute control over the flow of information. Earlier this week, Apple yanked the yoga mats out from under them when it summarily booted from its platform all the podcasts produced by conspiracy-monger Alex Jones.
From encouraging harassment of parents whose children were murdered in the Sandy Hook school massacre, which Jones says was a hoax, to fanning the flames of anti-Semitism and Islamophobia, Jones has given the platforms no shortage of reasons to restrict his privileges--including the way he has openly flouted their wan attempts to police his behavior by posting through alternative accounts while his was suspended. Apple's decisive action made all the other players' previous responses to Jones seem weak and muddled in comparison. They scrambled to catch up, with YouTube, Facebook and Spotify all enacting their own sweeping purges within hours, saying they had determined that he was guilty of hate speech.
One holdout was Twitter, whose CEO, Jack Dorsey, sought to answer criticisms on Tuesday night. "The reason is simple: He hasn't violated our rules," Dorsey tweeted. "If we succumb and simply react to outside pressure, rather than straightforward principles we enforce (and evolve) impartially regardless of political viewpoints, we become a service that's constructed by our personal views that can swing in any direction."
This line of argument--a few people in Silicon Valley shouldn't be making decisions that govern what billions of people around the world can read or write--is a common one. It's the same idea Mark Zuckerberg was groping for when he tried to explain why Facebook lets users write posts denying the Holocaust happened.
It's a line meant to draw attention away from the other, deeper reasons companies like Facebook don't want to make those kinds of decisions. To name one, the costs of having human editors review content balloon at the scale of a Facebook or a Google; to name another, exercising a heavy editorial hand could jeopardize the "common carrier" status that shields tech platforms from legal responsibility for what their users say and do.
There are also structural reasons that make it easier for Apple to take the bold action it did. While the iTunes and App stores are huge, Apple is primarily a hardware maker that doesn't have to worry about content-moderation costs eating up all its profits. Compared with Google or Facebook, its growth has depended far less on mergers that required government approval, meaning Tim Cook can worry much less about getting hauled in front of Congress to answer questions about bias.
But there's also corporate DNA to consider. Apple has long made it clear that it sees itself not as a neutral platform for others' content, but as a curator that will exercise discretion as it sees fit--with minimal explanation or recourse. That hasn't always been a popular stance. "It's Time to Declare War Against Apple's Censorship," Gizmodo declared in 2010 after Apple blocked two German newspapers from distributing content featuring nudity through their apps. Those decisions, and a number of others around the same time, earned Apple widespread condemnation from within the media world. "The fact is that they forced Stern and Bild to change their editorial content decisions, and anyone or anything could be next," Gizmodo wrote. "Apple is a corporation, and they can do whatever they want, after all."
That's true, of course. Just as it's true that a restaurant with a sign saying "We reserve the right to refuse service to anyone" can kick out customers without providing a reason, as long as it's not discriminating illegally.
But we don't freak out when we see a sign like that because we understand the restaurant has no interest in abusing the policy by turning away paying customers. That's not why the sign is there. It's there so the restaurant can protect its relationship with its other customers by not letting, say, a guy with a ukulele and a bike light ruin their dinners.
"We didn't take a broad enough view of our responsibilities"--that was the mea culpa Mark Zuckerberg offered to lawmakers to explain why Facebook failed to stop Russia's social-media influence operations.
We all say we want the most powerful companies to exercise more responsibility. But there are different types of responsibility. Like a good restaurant manager, Apple has always seen itself as responsible for the quality of its customers' experience, even at the cost of being labeled arbitrary or priggish.
When Facebook, Twitter, and YouTube point to violations of this or that rule as the grounds for this or that temporary ban, they're trying to assert a framework within which responsibility rests with the users who break the rules. But that's a fiction. It pretends the enforcement of rules doesn't always come down to human judgment in the end. There's no great way for any of those companies to deal with a problem like Alex Jones. But of all the bad ways, taking responsibility for your own judgments is a lot better than trying to blame some sign on the wall that you put there.