When Facebook recently changed the Trending Topics box at the top of users' News Feeds, its hope was to remove the bias that comes from having human editors decide which stories are important. The outcome of those changes shows you can't cover up human bias with technology. If anything, you'll amplify it.

The company confirmed recently that it had shed its team of Trending editors in an effort to make topic selection for the feature, determined in part by artificial intelligence algorithms, more objective.

"Earlier this year, we shared more information about Trending in response to questions about alleged political bias in the product. We looked into these claims and found no evidence of systematic bias. Still, making these changes to the product allows our team to make fewer individual decisions about topics," the company stated in a post announcing a greater reliance on algorithms to pinpoint trending topics.

Instead of the political bias of editors, users would see what supposedly objective computer algorithms determined they wanted to see based on their likes and interests. But the move appears not to be going as planned, in part for a reason that could have been anticipated: AI technology is, like people, prone to bias--a bias as human as the political leanings of an editor, but coming from different sources. And it's a bias that's not isolated to Facebook products.

That bias manifested Sunday as the promotion of hoax news stories about Megyn Kelly getting fired from Fox News for supporting Hillary Clinton. A Trending Topic centered on Kelly led to a page headlined with a story titled, "BREAKING: Fox News Exposes Traitor Megyn Kelly, Kicks Her Out For Backing Hillary" from a website called endingthefed.com.

It's easy to say Facebook is overcompensating following reports from Gizmodo that members of the company's trending team had been suppressing conservative perspectives. But Facebook is grappling with a genuinely thorny problem, one that runs deep into its technology and content-selection processes--and one that extends beyond Facebook into broader questions of bias in AI.

Remember Tay, the charming Twitter bot who mutated into a neo-Nazi? Microsoft thought a chatbot modeled on a teenager would be a fun companion for internet users. Turned out, she had a propensity for repeating the racist rhetoric Twitter users fed her and denying the Holocaust.

Tay was easily programmed by Twitter users aggressively feeding her a diet of prejudice and bigotry. But the problem doesn't stop with what users say--it goes as deep as the programming itself. As Bloomberg reported earlier this summer, a "sea of dudes" climate among AI developers has some concerned about gender bias cropping up in AI programs even before those programs start interacting with users.

As Fortune summarized nicely Monday, "The code that operates Facebook's news feed and trending algorithms...isn't some kind of omniscient or ruthlessly objective engine, as technology analysts continually point out. It's designed and programmed by human beings, and in most cases incorporates the biases of those human programmers."

If there's an issue of inherent gender bias in AI stemming from developer demographics, it follows that there's an inherent issue of racial bias as well. Other amplifications of subjectivity and prejudice are not far behind, and Facebook Trending Topics is just one of the first places where we might see them play out.

Mark Zuckerberg may feel current artificial intelligence technology is suited to control his home, but whether algorithms should be deciding what content is fit for the masses is another story. Facebook faced criticism for human bias. Hiding behind technology isn't going to quell conflict.

Many blamed the layoffs on Facebook's Trending team for the links that gained visibility when "Megyn Kelly" was trending as a topic Sunday. But it's less that the layoffs caused the issue, and more that what Facebook thought was the solution didn't end up working.

For starters, Facebook promoted conspiracy news in its Trending Topics box weeks before these layoffs. When Victor Thorn, an author known for denying the Holocaust and accusing the Clintons of murder, was reported to have died by suicide, the Trending Topics tab identified him as a "Clinton researcher" and promoted links speculating whether he had himself been murdered by the Clintons.

Second, there are still people charged with screening topics and links selected by Facebook's AI algorithms before they go live. The Megyn Kelly-fired-from-Fox hoax apparently gained enough traction on the social media network to pass muster with the remaining humans behind the wheel of Trending Topics.

Facebook declined to comment for this post, directing Inc. to a previous link about the company's changes to its trending team.