Facebook vows more integrity in user research, but full transparency missing

Facebook wasn't prepared for the fallout surrounding the publication of a controversial study on the moods of unwitting users. It now says it has changed the way it conducts research for the better, though it offers little in the way of specifics on what those improvements are.

Mike Schroepfer, chief technology officer, released a blog post to explain the areas in which Facebook's research division has improved. Schroepfer says Facebook has improved its research guidelines, the review process, the training and the availability of reports on its studies.

"If proposed work is focused on studying particular groups or populations -- such as people of a certain age -- or if it relates to content that may be considered deeply personal -- such as emotions -- it will go through an enhanced review process before research can begin," says Schroepfer. "The guidelines also require further review if the work involves a collaboration with someone in the academic community."

Facebook has created a review panel composed of senior staffers from the company's research, engineering, legal, privacy and policy groups, according to Schroepfer. He says the new panel will complement the company's existing privacy review, which scrutinizes products and research.

From now on, all of Facebook's research will be posted to a single site. The company has also added education on its research practices to its six-week training program for new employees.

"Like most companies today, our products are built based on extensive research, experimentation and testing," says Schroepfer. "It's important to engage with the academic community and publish in peer-reviewed journals, to share technology inventions and because online services such as Facebook can help us understand more about how the world works."

Back in June, the Proceedings of the National Academy of Sciences published a study Facebook conducted in 2011 on the moods of its users. Facebook manipulated the moods of approximately 700,000 users by pumping either positive or negative content into their newsfeeds.

Users were outraged upon learning the study had been conducted without their knowledge. Schroepfer says Facebook was encouraged to conduct the research after learning of reports suggesting emotions are contagious, but the social network acknowledges that it handled the process incorrectly.

"Although this subject matter was important to research, we were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism," says Schroepfer. "It is clear now that there are things we should have done differently."

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.