The social media giant revealed yesterday that "a technical error between May 18 and 27" meant that as many as 14 million users' posts intended to be shared only with family and friends were instead visible to anyone in the world with an internet connection.
So, for example, are you careful about privacy settings when you post pictures of your kids? Facebook says it might have accidentally displayed them to everyone during that time period, because it offered users the wrong default setting.
"We recently found a bug that automatically suggested posting publicly when some people were creating their Facebook posts. We have fixed this issue, and starting today we are letting everyone affected know and asking them to review any posts they made during that time," Erin Egan, Facebook's chief privacy officer, told TechCrunch.
This time it's different
There's some mitigation. Facebook insists the bug potentially affected only new posts that people made after May 18, and that it has since fixed the issue.
(It took the company five days after the fix to reveal the issue to users; you can decide whether that's in line with what it calls a "new proactive and transparent way for the company to handle issues going forward," according to CNN.)
But while this Facebook privacy scandal is arguably less of a big deal than some of the other Facebook privacy scandals of the past few months, it's potentially more dangerous for at least three reasons.
1. It's very easy to understand.
Facebook told you the default privacy setting was one thing; it turns out it made a mistake and set it to something else. That's much simpler to explain than guiding people through how Cambridge Analytica manipulated Facebook to get people's data. And people latch onto things that are simple.
2. Facebook can't blame it on outsiders.
There's no political campaign here trying to dig its way into your private data; no "fake news" purveyors trying to use the platform to change your perception of reality. There's just Facebook--and, I imagine, an extremely red-faced engineer or two somewhere in the company trying to figure out whether they can ever make up for this blunder.
3. It highlights how vulnerable Facebook is to simple human error.
Facebook is run by people. And people make mistakes. When a simple mistake can instantly scale to affect millions and millions of people, the consequences can be disastrous.
By the way, human error doesn't have to be a technical mistake, as we assume this one was--clicking the wrong button on some interface in Menlo Park, or adding an errant piece of code. It can also be a strategic blunder, or a simple lack of awareness.
My best example of this is how Mark Zuckerberg's January 11 Facebook post wiped out $3 billion in the company's market capitalization--and upended an entire media ecosystem with just 549 words. (At least, I hope that was an unintentional side effect. If not, we have bigger problems.)
Which Facebook does Facebook want to be?
Regular readers know I will probably never be able to truly turn on Facebook, for the highly personal reason that after many years, I reconnected with my college girlfriend on the platform in 2012. Now we're married and have a daughter.
But that experience represents to me the pinnacle of what Facebook can be, and how the company would love for all of us to think about it: a place where people share things they care about with people they want to connect with.
The other side, which gets a lot more press these days, and quite rightly so--the side that poses an existential threat to the entire company and frankly the social media industry--is the incredible control this titan now has over aspects of our lives.
Parts of our lives, in fact, that many of us didn't even realize existed.
So which vision will Facebook ultimately wind up being known for?
It's up to all of us who use it to decide. A few more scandals like this one--simple, far-reaching, and damaging--and Facebook probably won't like the answer.