On Thursday, Facebook announced a plan to deal with the proliferation of fake news: Third-party fact-checkers will flag what they think are false stories, and then Facebook will decide whether to demote them in people's News Feeds. Facebook CEO Mark Zuckerberg has described the effort as an "early test" -- but the move has already prompted concern from critics.

"Users have all sorts of opinions about what news counts as 'fake,' which is, of course, exactly the problem," reads a Vox explainer about the change.

The first step in the process is for Facebook users to flag questionable stories. Only then do the posts make their way to fact-checkers, who will review them and link to articles explaining why the information is inaccurate.
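
For readers who want to picture that flow, here is a minimal sketch of the flag-then-review pipeline as described above. It is purely illustrative: the names, the flag threshold, and the structure are assumptions, not Facebook's actual implementation.

    # Illustrative model of the flag-then-review flow described above.
    # Everything here (names, threshold) is an assumption, not Facebook's code.
    from dataclasses import dataclass
    from typing import Optional

    FLAG_THRESHOLD = 3  # assumed number of user reports before fact-checkers see a post

    @dataclass
    class Story:
        url: str
        user_flags: int = 0                    # reports from Facebook users
        disputed: bool = False                 # set only after third-party review
        explanation_url: Optional[str] = None  # link explaining why it's inaccurate

    def user_flags_story(story: Story) -> bool:
        """Record a user report; return True once the story should go to fact-checkers."""
        story.user_flags += 1
        return story.user_flags >= FLAG_THRESHOLD

    def fact_checker_disputes(story: Story, explanation_url: str) -> None:
        """A fact-checker marks the story as disputed and attaches an explanatory article."""
        story.disputed = True
        story.explanation_url = explanation_url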

Some have raised questions about whether the fact-checkers themselves may let their bias get the best of them. "My question though is -- how will you know if these fact checkers are not politically motivated or affiliated themselves? Even 'respected' news outlets are biased and misrepresent news as it is. It's a very grey complex area," a Facebook user asked Zuckerberg in the comments section of his announcement.

The Atlantic suggests that the flags won't necessarily have the intended effect. "There's a danger that people who are disinclined to trust traditional sources of information will treat Facebook's warnings as a badge of honor," writes Kaveh Waddell.

Civil liberties organization Electronic Frontier Foundation, meanwhile, has a separate concern: Will there be sufficient transparency in this new process of combating misinformation in Facebook's News Feed?

"Anything that provides users with more information is a good thing," Jillian York, EFF's director for international freedom of expression, tells Inc. of Facebook's move to label stories as disputed and link to a corresponding article explaining why. She adds however, "If Facebook is going to be doing something like this, then it really needs to be transparent about every element."

Facebook responded to questions of transparency by directing Inc. to Zuckerberg's response to the user comment on bias. The CEO said those involved in flagging had been instructed to fight spam and not flag items because of the opinions they espouse.

"For example, we're focused on obvious hoaxes with headlines like 'Michael Phelps just died of a heart attack' designed to get people to click on the stories and see ads. Our goal is to reduce these hoaxes just like we fight other scams on our platform, but I want us to be especially careful about never being arbiters of truth ourselves -- which is why we're working with third-party fact-checkers. We'll keep looking for unbiased third parties to add to our list of reviewers," he said. "This is an early test, and I'm going to keep a close eye on it to make sure we're fighting actual spam and hoaxes, and not limiting people's freedom of expression."

York says EFF started looking at fake news only recently, in response to calls the organization received from reporters. She adds that media coverage seems to draw little distinction between content promoted by Facebook's Trending Topics feature and content shared by users that then appears in News Feeds.

This latest change targets the News Feed rather than the Trending feature, and it's unclear whether Trending will be affected.

York's core concern is how the move will affect the treatment of user-generated content and how demotion will work within the News Feed. She says she would be especially concerned if the effort led to content removal.

Facebook confirmed to Inc. that removal of content is not part of its strategy, as Zuckerberg stated in his Facebook post about the new approach. After users label a story as questionable and fact-checkers dispute and flag it, "You'll still be able to read and share the story, but you'll now have more information about whether fact checkers believe it's accurate," the CEO wrote.

So stories won't disappear, though they might be pushed to more obscure corners of the News Feed if flagged or if Facebook observes that users who read a story don't go on to share it.

"We've found that if reading an article makes people significantly less likely to share it, that may be a sign that a story has misled people in some way. We're going to test incorporating this signal into ranking, specifically for articles that are outliers, where people who read the article are significantly less likely to share it," writes Adam Mosseri, vice president of News Feed, in a post about the change.

But Facebook has been called out before for its opacity around how it makes ranking decisions, and York says existing practices the platform has not publicly explained in detail could affect how these efforts play out. EFF has in the past called for greater transparency from Facebook, Google, and other tech companies around how they treat user data.

"Facebook is already manipulating their algorithms, they're already placing different weight on different topics, and I think this is really, again, an argument in terms of transparency," York says.