Facebook has argued repeatedly that their internal research deserves to remain internal. But when does research move from corporate intellectual property to public health and safety?
Over the past month, Facebook has dominated headlines yet again after whistleblower Frances Haugen released a trove of internal documents to the Wall Street Journal. Allegations against Facebook ran far and wide across some of humanity’s darkest subjects — from enabling human trafficking to fanning the flames of genocide and increasing the risk of suicide in teens. Many journalists and industry experts have compared this reckoning to that of Big Tobacco, signaling a sea change in the way tech is regulated.
Many of the most explosive allegations shared by Ms. Haugen originated through internal qualitative and quantitative research documents that sought to report and understand the problems Facebook faced. To pause for a second, I want to start by saying that I myself am a researcher. Almost every project we do for a client involves some sort of consumer or user research. If we don’t understand what an audience wants, we can’t really help our clients. I love the process of talking to folks and finding an unmet need or everyday frustration that a company can help solve. That personal connection to research is why this news story has captured me so deeply.
As I looked through the documents Ms. Haugen shared, I was struck by the similarities between the internal research Facebook has done and the types of research we frequently do for our clients. To be clear, nothing is similar in scope of findings: we’ve never uncovered evidence that our clients are helping traffickers or harming teen girls. But, in structure, these documents are the same as any good insights report — solid methodology, user quotes and profiles, clear insights, and actionable recommendations. The difference between a typical research report and this leaked report is that Facebook crossed the line from better understanding their user base to undermining international public health and safety.
As a researcher, I'd like to break down why Facebook’s internal research has moved beyond corporate intellectual property into the realm of information the public has a basic right to know. We’ll explore why private research like Facebook's is often more impactful than industry-wide data, how, as researchers, we can maintain anonymity and respondent trust while bringing important research to the forefront, and why a company like Facebook keeping these learnings under wraps can be so dangerous.
- Too Big To Hide: Facebook reaches 3.5 billion people a month across their various platforms (that’s over 40% of the world’s population). It’s difficult to argue with the idea that negative impacts on Facebook users make for negative impacts on the globe. For Facebook to deny the idea that their services impact public health — positively or negatively — is laughable.
- Unlimited Research Dollars: Facebook is not only vast, but it is incredibly rich. For reference, they made 29 billion dollars (yes, billion with a “B”) in Q2 of this year alone. Facebook has the power and wealth to conduct as much research as they’d like, without the help of government grants, outside funding, or any of the hoops that academia and medical researchers must jump through. Yes, as a corporation, Facebook can spend their dollars any way they choose, but they have a responsibility to share their learnings with the broader community when the impact is so large.
- Purpose Over Profit: At Spectacle, we believe strongly that purpose and profit go hand in hand, and that a company thrives when equal weight is placed on both in business. Facebook’s research touches on elements of public health and safety, which means there's an ethical responsibility for Facebook to ensure their product is a safe option for its users. Facebook should have placed purpose and profit on the same pedestal — but instead protected profit over all else.
Facebook has long fallen back on the idea that their product is “no more harmful” than social media industry benchmarks suggest. In congressional hearings and interviews, Facebook and Mark Zuckerberg often pointed to industry data on social media that framed their product and its effects in a much more positive light than their internal findings did.
I want to address why private research is so important, and often much more powerful than publicly available research about an industry.
- Category makers steer public research.
Companies like Facebook — or “category makers” — quite literally control the industry or competitive set they rule. Category makers often fund public-facing research that favors “good data” over realism, which can sometimes lead to misleading public information. This more positive data has left Zuckerberg with plenty of studies to cite on social media’s net positive effects, despite internal research pointing elsewhere.
- Product analytics complete the picture.
Companies doing research on their own products not only understand the desires or drivers of consumer behavior, but they also have access to their own analytics to understand what actions the user takes. Companies like Facebook have the clearest line of sight into what their users do, and they’re rarely required to share that product information with outside sources. Couple real product data with psychographic findings like user emotions or drivers, and corporations have a major upper hand in public research.
- Money makes the world go round.
Private enterprises, especially in tech, have unfathomably deep pockets to talk to large groups of people, often over very long periods of time. Research is expensive, but private companies must keep their users engaged to unlock maximum profit. Companies are willing to spend outrageous amounts of money on research to best accomplish that mission.
One of Adam Mosseri’s (Head of Instagram) key arguments for keeping this research private was a desire to respect the privacy and confidentiality of respondents. To quote the WSJ, “Facebook says there were discussions about releasing their mental health research, but the company decided against it due to confidentiality and privacy concerns.” This is understandable, especially when dealing with respondents under 18, but there are a number of tactics to make important public health information available publicly without compromising the trust and confidentiality of respondents:
- Build anonymity into reporting.
It’s easy to simply blind respondents and remove personal indicators. Rather than list the first and last names of respondents (Sarah Smith, 16), simply refer to the gender and age of the respondent (Female, 16), or the product they use and age (Instagram User, 16).
- Release high-level learnings, not specific data.
If Facebook didn’t want to release specific quotes or data points, they could have shared high-level insights or executive summaries with the public, and only shared specific data points with scientists, health experts, or researchers in the field. Facebook clearly had strong insights to point to that didn’t require additional backup data to understand the meaning, like “Instagram makes 1 in 3 teenage girls feel worse.”
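Both tactics above can be sketched in a few lines of code. The following is a minimal, hypothetical illustration — the record fields, labels, and sample data are my own assumptions for demonstration, not Facebook’s actual research schema — showing how raw respondent records could be blinded to demographic labels (“Female, 16”) and then rolled up into a high-level insight without exposing any individual.

```python
# Hypothetical sketch: blinding respondent records and aggregating them
# into a high-level insight. Field names and sample data are invented
# for illustration only.

def blind_respondent(record):
    """Replace identifying fields with a demographic label like 'Female, 16'."""
    return {
        "label": f"{record['gender']}, {record['age']}",
        "product": record["product"],
        "felt_worse": record["felt_worse"],
    }

def headline_stat(records):
    """Roll blinded records up into a shareable summary (e.g. '1 in 3...')."""
    worse = sum(1 for r in records if r["felt_worse"])
    return f"{worse} in {len(records)} respondents reported feeling worse"

# Fabricated raw data for the sketch — the kind of record that would
# never be published as-is.
raw = [
    {"name": "Sarah Smith", "gender": "Female", "age": 16,
     "product": "Instagram", "felt_worse": True},
    {"name": "Ana Lopez", "gender": "Female", "age": 17,
     "product": "Instagram", "felt_worse": False},
    {"name": "Maya Chen", "gender": "Female", "age": 15,
     "product": "Instagram", "felt_worse": False},
]

blinded = [blind_respondent(r) for r in raw]
print(blinded[0]["label"])     # "Female, 16" — no name attached
print(headline_stat(blinded))  # "1 in 3 respondents reported feeling worse"
```

The point of the sketch is that neither step is technically hard: the names never leave the raw dataset, and the public only sees the demographic label or the aggregate figure.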
The Facebook Files make it clear that Facebook knew their findings represented a clear threat to public health and safety, yet they chose to do nothing with their own recommendations. As a massive corporation, Facebook has the influence, data ownership, and dollars that make publicizing these internal findings necessary.
Mostly… I feel sorry for the poor researcher handling their brand sentiment tracker this month.