Why is it so unsettling to find out a Russian company with close ties to the Kremlin targeted American voters with thousands of Facebook ads during the last presidential election? The answer isn't as obvious as it might at first seem.
A typical American might say something like: Because America is a democracy with reasonable levels of freedom and justice, and Russia is an autocracy with considerably less of those things, so letting the government of Russia have a say in who runs America would be bad.
But that's not what Mark Zuckerberg would say. In his official capacity as Facebook's CEO, he doesn't choose sides in national rivalries or make value distinctions between different forms of government. That would risk excluding people. Facebook may be an American company beholden to U.S. laws, but its loyalty is to its 2 billion users, more than three-quarters of whom live in other countries.
So on the rare occasions Zuckerberg deigns to speak out against something as insidious as a foreign government's covert attempt to brainwash American voters, he does so in language that's peculiarly constrained and filtered. In Zuck-speak, the hailstorm of fake news that rained down upon American Facebook users in the summer of 2016 was undesirable not because it could have affected anyone's vote, an idea he called "crazy." It was undesirable because "[o]ur goal is to show people the content they will find most meaningful," and phony articles about the Pope endorsing Donald Trump apparently aren't that. But "meaningful" has a very specific Facebookian definition, one that has more to do with users' behavior than such squishy notions as "truth." "Identifying the 'truth' is complicated," Zuckerberg wrote. "I am confident we can find ways for our community to tell us what content is most meaningful, but I believe we must be extremely cautious about becoming arbiters of truth ourselves."
Facebook's chief security officer, Alex Stamos, echoed his boss's uncanny valley language in his announcement that an internal investigation had uncovered some 470 "inauthentic" accounts responsible for 3,000 ads aimed at American voters between June 2015 and May 2017. The company reportedly told Congressional investigators that it traced the ad buys -- about $100,000 worth -- to a Russian "troll farm" known for spamming social networks with pro-Kremlin propaganda. It's illegal for foreign entities to run campaign ads in the U.S. (Wired reports that the volume of illegal ads was even higher: at least 5,000 ads and $150,000 in spending.)
"We believe in protecting the integrity of civic discourse, and require advertisers on our platform to follow both our policies and all applicable laws," Stamos wrote in a post. "We also care deeply about the authenticity of the connections people make on our platform."
Catch that? The problem with the Russian influence operation isn't that it turned the world's most powerful communications platform into the tool of a murderous regime. The problem is that it resulted in users making inauthentic connections.
The past year has severely tested Zuckerberg's preference for saying nothing controversial and couching everything in terms of what's good for Facebook's "community." Little by little, to his credit, he has made it known that he has opinions about real-world things like whether transgender people should be allowed to serve in the military (yes), whether climate change is real (yes), whether Nazis are bad (yes) and whether undocumented immigrants who came to the U.S. as children belong here (yes). Without naming names, he has made it clear enough to anyone who cares what he thinks of the president his algorithms may have helped elect.
But there's a limit to what you can accomplish when you're willing to go on the record only in response to things that have already happened. If your goal is to prevent terrible things that haven't happened yet, you have to be a little more daring.
As I've written before, the unwillingness of Facebook and other Silicon Valley giants to stand for what they believe in is a gift to the bad actors of the world. From ISIS and Stormfront to Vladimir Putin's Kremlin, the forces of illiberalism know exactly what they want and don't feel compelled to apologize for it.
As he's been dribbling out political stances one by one, Zuckerberg has also been barnstorming around the U.S., milking cows, eating at diners and attending church services. Many observers have offered this as evidence that he intends to take advantage of a clause in Facebook's bylaws and run for president someday.
It would be awkward in the extreme if he did run someday only to be asked by a debate opponent why, when his technology was used by a hostile foreign power to undermine American democracy, he didn't do a thing to prevent it. Or why his company hid behind the law and its own data policy in refusing to show the public the propaganda it profited from helping disseminate. Or why Zuckerberg wouldn't even take the tiny step of calling a thing by its name and declaring it bad -- and not just bad for Facebook users.
Why did it take so long for Facebook to uncover the illegal Russian campaign ads? What measures will it take to prevent it from happening again? Does Zuckerberg continue to see foreign election tampering primarily as a matter of bad user experience? I'm awaiting Facebook's comments on these questions.