In an exchange during a three-hour-long Senate hearing on Thursday, Facebook's global head of safety, Antigone Davis, was asked whether the company would commit to releasing all of its data and research on the effect of Instagram on young children. "We need to look at our privacy obligations," said Davis.
Those eight words are emblematic of the real problem with Facebook, which is the complete disconnect between the way it views itself and the way the rest of the world does. Ironically, they're also true, just not in the way Davis meant.
Most observers took her answer as a no, as did most of the members of the Senate Commerce Committee's consumer protection panel, which held the hearing. The hearing was prompted by a series of reports from The Wall Street Journal, which revealed the negative effects of Instagram on teenagers, especially teenage girls who struggle with their self-image.
Of course, those senators already have much of the information, as it was provided by a whistleblower who is scheduled to appear this weekend on 60 Minutes. On Thursday, the Journal also published six documents on which it based its reporting on the subject.
One senator even compared Facebook to the tobacco companies. "Instagram is that first childhood cigarette meant to get teens hooked early," said Senator Ed Markey. "Facebook is just like Big Tobacco, pushing a product that they know is harmful to the health of young people, pushing it to them early."
It's not the first time I've heard Facebook compared to cigarettes. In 2019, Salesforce CEO Marc Benioff said that Facebook "is the new cigarettes. It should be regulated." Benioff was calling on the government to get involved. Two years later, it finally is.
Facebook's response has been to gesture vaguely in the direction of a commitment to privacy. "At Facebook, we take the privacy, safety, and well-being of all those who use our platform very seriously, especially the youngest people on our services," Davis said.
The thing is, Facebook seems to truly believe that the benefit it brings to society far outweighs any privacy or other concerns. It also seems to have a different definition of privacy from most people. Most people think privacy means that you don't track their activity and use their personal information to make money.
Facebook seems to think it's fine to do just that, as long as it isn't directly sharing their information with others. That's a distinction without a difference in Facebook's case. It's also not a new line. In 2020, Facebook's head of public policy told an audience at CES that Facebook "adds value to users in a privacy-protective way." Everyone in the room laughed.
It's pretty easy to observe that this is not the case. A business model that attracts young teenagers in an effort to get them hooked on a product early, so their personal information can be monetized for years to come, is fundamentally incompatible with any "privacy obligation."
One that does so with full knowledge of the negative impact it has on its users is fundamentally incompatible with morality. That bothers a lot of people, some of whom serve in the United States Senate.
Facebook regularly "chooses the growth of its products over the well-being of our children," said Senator Richard Blumenthal, who chairs the Senate panel that held Thursday's hearing.
As I mentioned, Davis's statement is actually true -- Facebook does need to look at its privacy obligations. Its biggest problem is that it rarely seems to do so, even when its practices cause its users harm.
By the way, that's a lesson for every leader. You have a responsibility to your users to keep their best interests in mind. That can be difficult when you're standing at the top of a business model that, for all practical purposes, prints money. Facebook's information monetization engine is extraordinarily profitable, and that success clouds the judgment of those who should be considering the impact the company is having on real people's lives.
Of course, Facebook keeps telling itself that "protecting privacy" means keeping information safe from bad actors. I'll just finish by pointing out that if you build a platform that encourages the sharing of incendiary content, spreads misinformation, and causes teenage girls to feel depressed -- or even worse, suicidal -- perhaps you're the bad actor.