Facebook has tightened its research guidelines following uproar over its disclosure this summer that it allowed researchers to manipulate users' feeds to see if their moods could be changed.
At issue was a study in which Facebook allowed researchers to manipulate the content that appeared in the main section, or "news feed," of a small fraction of the social network's users. During the weeklong study in January 2012, data scientists sought evidence for their thesis that people's moods could spread like an "emotional contagion" depending on what they were reading.
"Although this subject matter was important to research, we were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism," Mike Schroepfer, Facebook's chief technology officer, wrote in a blog post Thursday. "It is clear now that there are things we should have done differently."
In the past three months, Schroepfer said, Facebook has given researchers clearer guidelines on research procedures and has created an internal panel to review projects. But there will be no external review process, and Facebook will continue to encourage researchers to study how people use its site.
"We believe in research, because it helps us build a better Facebook," Schroepfer wrote. "Like most companies today, our products are built based on extensive research, experimentation and testing."