Facebook's algorithm is arguably one of the most powerful technological inventions of the last 20 years, in the sense that it has largely been responsible for monetizing the way almost three billion people use social media. At the same time, and for some of the same reasons, it is one of the most controversial features of any platform.

In a 5,000-word post that was first published to Medium, Nick Clegg, Facebook's vice president of global affairs, defended the algorithm, arguing that what you see in the News Feed is as much your own responsibility as it is Facebook's.

"The personalized 'world' of your News Feed is shaped heavily by your choices and actions," writes Clegg. That's true, but there's a lot more to it than that.

Facebook does allow users to switch to a "most recent" view to see posts in chronological order. The problem is that most people have no idea that's an option, and the setting isn't easy to find.

Now, however, Clegg says that Facebook is rolling out the ability to filter what you see directly from the News Feed:

For some time, it has been possible to view your News Feed chronologically, so that the most recent posts appear highest up. This turns off algorithmic ranking, something that should be of comfort to those who mistrust Facebook's algorithms playing a role in what they see. But this feature hasn't always been easy to find. So Facebook is introducing a new "Feed Filter Bar" to make toggling between this Most Recent feed, the standard News Feed, and the new Favorites feed easier.

The company has updated its Android app with the feature, and will be rolling it out on iOS "in the coming weeks."

While this is a good start, I think Facebook should abandon the algorithm altogether and make the real-time News Feed the default. I suppose it could still give users the option to view "highlights" as determined by the algorithm if they want, but that should be opt-in at best.

The truth is, people rarely change the default option, regardless of whether they think the experience is better. As a result, what a company chooses as the default has incredible power over the user experience. 

I think it's time to abandon the algorithm; it does far more harm than good. Here's why:

Facebook decides what is 'meaningful' to you.

Clegg describes the purpose of the algorithm this way:

The average person has thousands of posts they potentially could see at any given time, so to help you find the content you'll find most meaningful or relevant, we use a process called ranking, which orders the posts in your Feed, putting the things we think you will find most meaningful closest to the top.

The problem is that "things we think you will find most meaningful" is a pretty loaded concept, and Clegg mostly glosses over it. Part of the problem is that if you "like" or comment on certain types of posts, or on content from certain people, you're likely to see more of that kind of content, since Facebook registers that you engaged with it. Engagement, however, doesn't necessarily mean the content was "meaningful."

For example, when our children first started eating solid baby food, the best advice we were given was to start with the veggies, because once you give them pears or apples, they're never going to eat the peas.

But if the veggies are all you feed them, they'll eat the veggies. That doesn't mean they like them better, or that they are more meaningful. It just means they are hungry. They don't even know about the good stuff.

The same is true on social media. You engage with the content in front of you. 
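To make that feedback loop concrete, here's a minimal sketch of how engagement-driven ranking can narrow a feed over time. This is a toy model, not Facebook's actual system: the topics, the starting weights, and the boost-on-engagement rule are all assumptions made up for illustration.

```python
import random

# Toy model of an engagement-driven feed. Every name and number here is
# an illustrative assumption, not a real Facebook ranking signal.
weights = {"news": 1.0, "recipes": 1.0, "sports": 1.0, "memes": 1.0}

def ranked_feed(posts):
    """Order posts by the current weight of their topic, highest first."""
    return sorted(posts, key=lambda topic: weights[topic], reverse=True)

random.seed(42)
for day in range(10):
    # Each day a fresh batch of posts arrives, a few per topic.
    posts = [topic for topic in weights for _ in range(3)]
    random.shuffle(posts)
    feed = ranked_feed(posts)

    # The user engages with whatever sits at the top of the feed --
    # not necessarily what they would call "meaningful."
    engaged_topic = feed[0]

    # The ranker reads that engagement as preference and boosts the topic.
    weights[engaged_topic] *= 1.5

print(weights)
# After a handful of iterations, one topic dominates -- the user never
# expressed a preference; they just ate what was put in front of them.
```

Run it and one topic's weight compounds while the rest stay flat. That's the veggies problem in miniature: the model can't tell the difference between "this is meaningful to me" and "this is what you showed me."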

Facebook says it stands for free expression. 

Facebook's algorithm uses a collection of signals to determine what content you actually see in your News Feed. The effect of ranking is that it amplifies some posts while burying others.
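In the abstract, signal-based ranking looks something like the sketch below. To be clear, the signals, weights, and example posts are hypothetical, invented for illustration; Facebook's real system uses thousands of signals and machine-learned predictions rather than a fixed formula. The point is simply that any weighted scoring scheme necessarily pushes some posts up and buries the rest.

```python
# Hypothetical signal-based ranking. The signals and weights below are
# assumptions for illustration, not Facebook's actual ranking inputs.
WEIGHTS = {
    "recency": 0.2,               # how new the post is
    "affinity": 0.5,              # how often you interact with the poster
    "predicted_engagement": 0.3,  # how likely you are to like or comment
}

def score(signals):
    """Combine a post's signal values into a single ranking score."""
    return sum(WEIGHTS[name] * value for name, value in signals.items())

posts = {
    "calm news update":    {"recency": 0.9, "affinity": 0.3, "predicted_engagement": 0.2},
    "friend's baby photo": {"recency": 0.5, "affinity": 0.9, "predicted_engagement": 0.7},
    "outrage-bait post":   {"recency": 0.7, "affinity": 0.2, "predicted_engagement": 0.95},
}

# Whatever scores highest gets seen; everything else effectively
# disappears, regardless of which post the reader would call "meaningful."
for name in sorted(posts, key=lambda p: score(posts[p]), reverse=True):
    print(f"{score(posts[name]):.2f}  {name}")
```

Whoever picks the signals and the weights is deciding what gets amplified. That's an editorial choice, whether or not Facebook calls it one.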

That kind of amplification seems to go against Facebook's stated goal "to uphold as wide of a definition of freedom of expression as possible." That's how the company's CEO, Mark Zuckerberg, described it during a speech at Georgetown University in 2019, where he defended Facebook's commitment to free expression. At the time, the company was under fire over the way it handled political ads that contained misleading or false information.

I'm not arguing that anyone should be able to post whatever they want on Facebook. It's a private company and is well within its rights to moderate content on its platform in any way it chooses, despite pressure from regulators and lawmakers to do otherwise.

More than that, I'm not arguing that anyone has a right to have whatever they post on Facebook read by anyone at all. My point is that if you say you stand for free expression, making decisions about which content you amplify, and as a result what information people see, gets complicated.

The problem of misinformation and divisive content.

The practice of amplifying content is maybe the biggest cause of the problems facing Facebook right now. The company is under intense scrutiny over the way it handles misinformation on its platform. It also faces allegations that it favors certain political viewpoints and suppresses others, and that it fosters divisiveness through the amplification of incendiary content. 

Maybe even more problematic than the fact that Facebook is ground zero for much of this content is the allegation by many that it actually profits from it. Clegg, of course, attempts to frame the benefits of Facebook differently:

Data-driven personalized services like social media have empowered people with the means to express themselves and to communicate with others on an unprecedented scale.

I just can't help thinking that it's disingenuous to argue that the benefits of its personalization algorithm outweigh the problems--especially when those problems involve a group of armed citizens storming the halls of Congress, as they did on January 6th.

Getting rid of the algorithm would actually benefit Facebook, since it could no longer be accused of profiting from hate and extremist content on its platform. Ultimately, it seems like a no-brainer that such a version of Facebook would be better for everyone, including Facebook.