Have you heard? Facebook is biased! Oh, the outrage! Of course, if you know anything about anything or have ever thought about it for two seconds, you probably knew that. Nevertheless, this revelation is currently being treated in many quarters as news.

And the question of just what qualifies as news is very much the axis around which this particular controversy turns. It started with a pair of reports by Gizmodo, which claimed that editors working for Facebook have been manipulating the content of that "Trending" box in your News Feed: inserting some news topics that aren't actually trending on the social network, withholding others that are, and discriminating against right-wing websites.

After complaints by conservatives and journalists, and even an inquiry by the Republican-controlled Senate Commerce Committee, Facebook yielded to the pressure, publishing the guidelines its employees use to decide what to put in that "Trending" box. The 28-page document doesn't contain any hint of systematic bias, but it does reveal a depressingly half-baked conception of journalism.

Back up a bit: The reason Facebook has ideas about journalism in the first place is that having none wasn't working out. For a long time, Facebook curated news the way it curates everything else: through a mathematical algorithm that weights everything--status updates, pet photos, BuzzFeed quizzes--according to how interested users like you seem to be in it. The problem with that approach was that people generally aren't very interested in news.

Even as their actions betrayed a love of clickbait, however, users were telling Facebook researchers in surveys that they craved more important fare. Initially, the company tried to comply by tinkering with the algorithm's inputs--for instance, looking for signs that users were spending a long time with an article. When that proved insufficient, Facebook turned to the human curators whose actions are now under the microscope.
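To make the mechanics concrete, here's a minimal, entirely hypothetical sketch of how engagement-weighted ranking works in general. The feature names and weights are invented for illustration; none of this comes from anything Facebook has disclosed.

```python
# A toy, purely hypothetical engagement-weighted ranker, loosely in the
# spirit of what's described above. Signal names and weights are made up.

def engagement_score(item):
    # Hypothetical signals: clicks, likes, shares, plus a dwell-time
    # input of the sort the company reportedly added later.
    return (
        1.0 * item["clicks"]
        + 2.0 * item["likes"]
        + 3.0 * item["shares"]
        + 0.1 * item["dwell_seconds"]  # rewards articles people linger on
    )

def rank_feed(items):
    """Order feed items by their toy engagement score, highest first."""
    return sorted(items, key=engagement_score, reverse=True)

if __name__ == "__main__":
    feed = [
        {"title": "Hard news", "clicks": 40, "likes": 5, "shares": 2,
         "dwell_seconds": 180},
        {"title": "Pet photo", "clicks": 90, "likes": 60, "shares": 10,
         "dwell_seconds": 8},
    ]
    for item in rank_feed(feed):
        print(item["title"], round(engagement_score(item), 1))
```

Notice that even with a dwell-time signal in the mix, the pet photo handily outscores the news story in this toy example (240.8 to 74.0). That, roughly, is the bind that pushed the company toward human curators.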

That journalists as a breed tend to have certain biases is no secret, nor is it a conspiracy, nor is it a very interesting observation. (For starters, journalists tend to be college-educated, to live in cities, and to earn middle-class incomes, all distinctions associated with known biases.) More interesting is the fact that Facebook executives apparently are so naive about news that they thought it possible to remove bias from the process by, for instance, steering in-house editors to a list of trusted sources--as though those sources, or the person who compiled them, didn't come with their own biases. Or as though the notion that certain types of stories are meritorious and others frivolous wasn't itself a subjective proposition.

It turns out you can't make value judgments without, you know, making value judgments. Who knew? It's almost funny to watch one of the world's most powerful companies fumbling its way to an epiphany that's eye-rollingly obvious to anyone who's ever spent a year in a newsroom.

Or it would be funny, if it weren't a little scary. If people are genuinely exercised over the idea of Facebook having an editorial bias, it's because the company controls such a shocking amount of Americans' attention. And it has ambitions to control much more of it--despite, as I've written before, having no real use for that time beyond selling it to advertisers.

If I'm getting my hair cut at a barber shop that's showing Fox News instead of MSNBC, or vice versa, I don't care much because I'm only there for half an hour. If I'm staying for a month at a hotel that only carries one of the two, the question of which becomes a matter of some importance. The more Facebook seeks to monopolize the time of its users, the greater the onus on it to use that time responsibly. "The time people spend on our site is a good measure of whether we're delivering value to them," a Facebook spokeswoman recently told The New York Times. How nice it would be if that were true.