In June, Facebook researchers published the results of a study that manipulated the news feeds of nearly a million users to see how positive or negative posts affected their behavior. The experiment, conducted without the consent of a tiny fraction of Facebook's more than 1.3 billion users, drew intense backlash from people who hadn't been asked whether they wanted to take part in the study.
Today, Facebook CTO Mike Schroepfer finally apologized in a blog post, and outlined plans for more structured research in the future.
Schroepfer's post is quoted in full below:
I want to update you on some changes we’re making to the way we do research at Facebook.
Facebook does research in a variety of fields, from systems infrastructure to user experience to artificial intelligence to social science. We do this work to understand what we should build and how we should build it, with the goal of improving the products and services we make available each day.
We’re committed to doing research to make Facebook better, but we want to do it in the most responsible way.
In 2011, there were studies suggesting that when people saw positive posts from friends on Facebook, it made them feel bad. We thought it was important to look into this, to see if this assertion was valid and to see if there was anything we should change about Facebook. Earlier this year, our own research was published, indicating that people respond positively to positive posts from their friends.
Although this subject matter was important to research, we were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism. It is clear now that there are things we should have done differently. For example, we should have considered other non-experimental ways to do this research. The research would also have benefited from more extensive review by a wider and more senior group of people. Last, in releasing the study, we failed to communicate clearly why and how we did it.
Over the past three months, we’ve taken a close look at the way we do research. Today we’re introducing a new framework that covers both internal work and research that might be published:
- Guidelines: we’ve given researchers clearer guidelines. If proposed work is focused on studying particular groups or populations (such as people of a certain age), or if it relates to content that may be considered deeply personal (such as emotions), it will go through an enhanced review process before research can begin. The guidelines also require further review if the work involves a collaboration with someone in the academic community.
- Review: we’ve created a panel including our most senior subject-area researchers, along with people from our engineering, research, legal, privacy and policy teams, that will review projects falling within these guidelines. This is in addition to our existing privacy cross-functional review for products and research.
- Training: we’ve incorporated education on our research practices into Facebook’s six-week training program, called bootcamp, that new engineers go through, as well as training for others doing research. We’ll also include a section on research in the annual privacy and security training that is required of everyone at Facebook.
- Research website: our published academic research is now available at a single location and will be updated regularly.
We believe in research, because it helps us build a better Facebook. Like most companies today, our products are built based on extensive research, experimentation and testing.
It’s important to engage with the academic community and publish in peer-reviewed journals, to share technology inventions and because online services such as Facebook can help us understand more about how the world works.
We want to do this research in a way that honors the trust you put in us by using Facebook every day. We will continue to learn and improve as we work toward this goal.