Mark Zuckerberg wants to make clear that he's personally leading Facebook's charge against misinformation and election meddling

The billionaire founder wrote a 3,260-word post on Wednesday detailing all the security measures the social network has introduced since it was exploited by Russian operatives to spread misinformation in the run-up to the 2016 U.S. presidential election, and by an overzealous political firm to harvest data from more than 87 million unsuspecting Facebook users. Zuckerberg wrote that his post will be the first in a series outlining his views on the major issues the company faces.

With the U.S. midterm elections less than two months away, here is how Facebook says it is preparing to fight any interference.

1. Identifying and removing fake accounts

Facebook removed more than 1 billion fake accounts in the six months between October 2017 and March 2018, Zuckerberg wrote, "the vast majority within minutes of being created and before they could do any harm." 

2. Investing in security

Facebook has hired more than 20,000 people to work on the site's safety and security, up from 10,000 last year, Zuckerberg wrote, adding that Facebook will continue to invest in people and artificial intelligence to improve its systems.

3. Showing you who paid for that political ad

"We now also require anyone running political or issue ads in the U.S. to verify their identity and location," Zuckerberg wrote, reasoning that the move will prevent bad actors in other countries from buying ads in the U.S. Facebook will also keep an archive of all the political ads that run in its platform. Anyone can access it and see who ran a particular ad, how much they spent on it, and how many people it reached, he said.

4. Demoting viral posts

"In places where viral misinformation may contribute to violence, we now take it down," Zuckerberg wrote. "In other cases, we focus on reducing the distribution of viral misinformation rather than removing it outright."

Why are some posts removed and others demoted? Zuckerberg explained that not all of those posts are part of a coordinated misinformation campaign. When a potentially fake or viral post is flagged, Facebook sends it to a group of independent fact-checkers for review. "Posts that are rated as false are demoted and lose on average 80 percent of their future views," he said.

5. Making it harder to make money off Facebook ads

"We block anyone who has repeatedly spread misinformation from using our ads to make money," Zuckerberg wrote, adding Facebook will also curb the distribution of any page that has been found to routinely spread misinformation and spam. "These measures make it harder for them to stay profitable spamming our community." 

6. Increasing coordination and cooperation with governments and companies

"Preventing election interference is bigger than any single organization," Zuckerberg wrote. "Bad actors don't restrict themselves to one service, so we can't approach the problem in silos either."

Facebook's founder said the company now has "significantly stronger" coordination with governments and other tech businesses than in 2016, and that everyone now has "an incentive to work together." For example, Facebook can identify a bad actor on its platform, remove the account, and take down any other profiles linked to it on Instagram and WhatsApp. If Facebook has information about other social platforms associated with those accounts, it can alert those companies (say, Twitter) so they can remove the fake accounts too.

What you can do

The intense scrutiny over Facebook's role during the 2016 election seems to have had a significant impact on Zuckerberg, who comes off as more cognizant of the impact his company has on the world. "When you build services that connect billions of people across countries and cultures, you're going to see all of the good humanity is capable of," Zuckerberg wrote in his manifesto. "And you're also going to see people try to abuse those services in every way possible."

While Zuckerberg's post focuses mainly on how Facebook is battling misinformation and misuse of its platform, there are several ways you can fight too. For instance, you can limit the number of apps that have access to your information by updating the privacy settings on your Facebook account. You can also request all the data Facebook has on you, so at least you're aware of what information you've already shared. If you come across a viral post, use services like PolitiFact or Snopes to assess its credibility. If the information in the post is false, you can report and flag it so the Facebook team can review it.