Facebook Failed the People Who Tried to Improve It

The “badge posts” of the company’s former researchers offer the parting thoughts of the disillusioned.

“Hi, all,” reads a note on Facebook’s internal Workplace system that was posted on December 9, 2020. “Friday is going to be my last day at Facebook. It makes me sad to leave. I don’t think I’ll ever have a job as good as this one … Unfortunately, I don’t feel I can stay on in good conscience. (1) I think Facebook is probably having a net negative influence on politics in Western countries … (2) I don’t think that leadership is involved in a good-faith effort to fix this … (3) I don’t think I can substantially improve things by staying.”

This is a Facebook “badge post.” The name refers to the laminate badge employees are issued when they join the company, the one that swipes them into Facebook buildings, or did when everyone went into the office. Even more, it confers access to the world of Zuckerberg. It represents membership in a fellowship that was once unreservedly proud but now harbors mixed feelings, shared inside newly circled wagons. When employees leave Facebook, they commonly write a badge post, often accompanied by a photo of the badge itself.

Most badge posts are fond farewells to a company that gave the departing a great work experience and a much fatter bank account. The writers bubble with optimism for their next adventure. But others are tortured missives like the one above. These people were excited to join Facebook, many of them totally on board with its mission to connect the world. Some joining more recently did so with the aim of helping Facebook address its speech and safety problem. But their experience in the trenches left them frustrated. Researchers laid bare the harm Facebook was doing, often to very large swaths of its user base. Many of those problems seemed almost intractable, but employees dutifully offered potential fixes. Some ultimately concluded that their efforts were doomed.

I found the badge post I quoted above among the hundreds of documents dubbed The Facebook Papers, disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by whistleblower Frances Haugen’s legal counsel. A consortium of news organizations, including WIRED, has reviewed the redacted versions received by Congress. Immersion in the corpus provides a ground-level view of the ways Facebook causes harm. Essentially, it’s a report card on the company’s efforts to police itself, replete with failing grades. I was drawn to the people who produced most of those documents, dutifully compiling the research that proves how well the company understood the harm it caused. Several of those researchers have now left. The best window through which to understand their motivations is the badge posts they left behind.

The author of the post I quoted above was one of Facebook’s most respected researchers, specializing in political content and misinformation. (I reached out to him through a mutual friend, but he didn’t respond to my request to speak. His name is redacted on the version of the document I saw, and I am honoring his privacy.) Several Facebook employees commented under the post that they literally gasped when they read it. The researcher further concludes that despite improvements between 2016 and 2018, Facebook had a negative effect on political discourse, did much worse than other media sources, and followed policy decisions “routinely influenced by political considerations.” The post says Facebook could do much better if it had the courage to rate its posts on their objective quality, but that it fears to do so because of public pressure.

At one extreme, Sophie Zhang, who recently emerged as a Facebook whistleblower, wrote that she had “blood on her hands” from her time at the company. (Zhang’s post was previously reported on, but her epic 6,600-word memo is all there in the Papers.) Most of the badge posts I’ve seen, though, come at the end of a long internal struggle. “Researchers face a moral quandary,” one former Facebook employee told me on the phone last week. “I saw the power that the network had to uplift people out of poverty; I talked to transgender people who found communities of other transgender people. So you see the good. But you also see the bad, and I don’t quite know how to reconcile these two things in my mind.” For this Facebooker, reconciliation came through quitting.

And these departures haven’t just begun in recent months. In July 2016, security engineer Alec Muffett wrote in his badge post, which I found among the Facebook documents, “I am leaving because I am highly concerned about our corporate direction, how our pursuit of growth may negatively impact our ethics and mission statement, and how this has become manifest in our codebase. Plus, I am too tired to fight it.”

This week, I talked online with Muffett, who expressed mixed feelings about the company. He shared a story about working late in one of Facebook’s London offices and winding up in line for a late-night bus. A group of Italian tourists engaged him in conversation. He hesitated at first to tell them where he worked—on other occasions when he had mentioned being a Facebook employee, he had been buffeted by questions about how the company handles data—but he owned up to it. The group almost embraced him. Their grandmother had been sick, and Facebook messages from the extended family had brightened her convalescence. They gave him some popcorn. “I won’t lie—I was shocked,” he says. “And I cried a bit.” This is a guy who left the company with a quote from Charlotte Brontë: “Laws and principles are not for the times when there is no temptation: They are for such moments as this … If at my individual convenience I might break them, what would be their worth?”

The researchers behind many of the Facebook Papers care a lot about the platform’s users and love their coworkers, and they think their work is important. But the results have let them down, and they do not feel their leaders have their back. “I think Integrity at Facebook is incredibly important and I have nothing but respect for the people working in that space,” writes one departing safety researcher in a badge post I reviewed. “The truth is, I remain unsure that FB should exist.”

The badge posts are important because their existence refutes a key defense Facebook has been making in the wake of a Wall Street Journal series based on documents provided by former Facebook product manager Frances Haugen, as well as her charges in testimony before a Senate committee. (Her remarks were the ultimate badge post.) In his statements defending Facebook, senior VP of global affairs and communications Nick Clegg claims the documents exposing the company’s failure are themselves proof of its efforts to keep users safe. He cites the $13 billion Facebook has spent on safety and security and correctly notes that there have been improvements. According to his argument, all those papers showing how Facebook and Instagram cause harm or foment divisiveness or reward toxic speech are part of a well-engineered effort to address a microscopic dark side that hardly blemishes the overall wonderfulness of the Facebook family of apps. “Some of the internal discussion papers and internal research that were published over the last two or three weeks were precisely designed so that we could then introduce new changes to our products, to keep people as safe as possible,” Clegg told George Stephanopoulos on ABC on October 10. The system is working!

But that picture is at odds with the departures of people who talk about having blood on their hands.

I spoke last week to a former researcher whose badge post I did not see in the Facebook Papers. She told me she would be in a room and provide examples of users she had spoken to, victims of hate speech or harassment. “And there are no women in those product meetings,” she says. “We as researchers in privacy and safety would present these stories that were pretty shocking, like ‘Here’s just one woman I spoke to, and in the course of one day, she got 40 direct messages from people that she didn’t even know and was being harassed.’ But you have to present it with other data, quantitative data. Sometimes that sort of small story gets lost.”

And all too often the problem doesn’t get solved. “If you’re a ‘lowly product manager’ you could be doing the best work in the world, but if you don’t get X number of new users to sign up, you don’t get your bonus, or you don’t get promoted,” she says. To truly address the problems, “The way that the company incentivizes product teams would radically have to change,” she adds.

Another complication: Facebook is structured to resist such change. Making a product shift to improve safety or reduce misinformation in something like the News Feed involves work from several teams, sometimes in the double digits. As one badge poster noted, making an integrity change that improves safety requires approval from multiple departments. But it only takes one “no” to stop that change from happening.

Even worse is the resistance that comes from higher-ups in Facebook’s food chain. “Integrity teams are facing increasing barriers to building safeguards,” a researcher said in a badge post on August 25, 2020. “In recent months, I’ve seen promising interventions from integrity product teams, with strong research and data support, be prematurely stifled or severely constrained by key decision-makers—often based on fears of public and policy stakeholder responses … Out of fears over potential public and policy stakeholder responses, we are knowingly exposing users to risks of integrity harms.”

I’ve spent hundreds of hours in the past few years talking to Facebook employees, including Mark Zuckerberg, and diving into the way the company operates. Nonetheless, I found the Facebook Papers revelatory—not because they contain major surprises about the weaknesses, conflicts, and unacceptable compromises made by Facebook and its leaders, but because they expose how thoroughly aware those leaders were of the platform’s flaws. Over the past few weeks, comparisons between Facebook and Big Tobacco have gained popularity. But Nick Clegg has pushed back on this analogy, and I actually agree with him. There is no mitigating factor in tobacco: No one’s health is improved by cigarettes, and they will kill you. Instead, when I look through these documents—which prove that so many of the terrible things we heard about Facebook were duly reported and documented by its researchers and presented to company leaders—I think of another corporate crisis, one that happened two years before Mark Zuckerberg was born.

Early one morning in September 1982, the parents of 12-year-old Mary Kellerman of the Chicago suburb of Elk Grove found their daughter dying on the bathroom floor. Hours earlier, she had complained of a cold, and her parents had given her one capsule of Extra-Strength Tylenol, the nation’s most popular remedy for minor discomfort. Hers was among three poisoning deaths reported that day, and each victim had taken Tylenol caps laced with cyanide. The death toll would soon reach seven.

Drugmaker Johnson & Johnson’s response later became the grist for countless business classes. According to one account from a Department of Defense series on Crisis Communication Strategies, Johnson & Johnson chair James Burke immediately formed a strategy team. “The team’s strategy guidance from Burke was first ‘How do we protect the people?’ and second ‘How do we save the product?’” Note the order. Company leaders didn’t defend inaction by saying they weren’t responsible for what had happened to Tylenol after it was shipped to drugstores. They didn’t assert that millions of people relieve their aches and pains by swallowing the company’s over-the-counter remedy while only the tiniest fraction of Tylenol users popped capsules laced with cyanide. The company quickly pulled 30 million Tylenol bottles from shelves, created a national alert system to warn the public not to take the pills, and put teams to work to invent a triple-sealed, tamper-proof bottle. They paid the victims—no arguments that the cyanide had been administered after Johnson & Johnson shipped the product to drugstores—and provided counseling. Burke himself went on 60 Minutes and the Phil Donahue Show, frankly and remorsefully addressing the crisis. When the company shipped bottles to stores six months later, it instituted random checks to further assure safety. A year later, Tylenol sales had fully recovered. The company didn’t even have to change its name.

No, the analogy isn’t perfect. News Feed posts or recommendations to join QAnon will not kill you in a few hours, and they are not packaged as medicine. Yet the Facebook Papers show the company was aware of multiple harms resulting from its product, arguably with greater impact than the Tylenol case. The company’s own research shows that Facebook posts have encouraged rioters in Myanmar and contributed to teens’ mental health struggles. This is not the result of tampering, as with Tylenol—Facebook’s products are working as designed, albeit with unintended consequences. Presumably fixing that design, after dozens of research projects exposed the shortcomings, would be the company’s top priority. But where is the urgency? That is the question badge posters ask. For all the billions Facebook spends on security, it lowballs safety in many countries where it is short on native speakers of the language.

At a minimum, one would expect that top leaders would defend the company they had built. Yet weeks after the Wall Street Journal stories and Haugen’s testimony before Congress, Zuckerberg and second-in-command Sheryl Sandberg have yet to respond beyond Zuckerberg’s single anodyne post. My own request to speak to Zuckerberg was denied, but spokesperson Iska Saric gave me this statement: “Mark knows that some criticism directed at the company is valid and pushes teams to learn from it and improve. But he also believes that technology is one of the best tools to improve lives and strengthen society. That’s why he is focused on how Facebook can build innovative technologies and products that bring utility, opportunity, and joy to people around the world.”

Feel better?

We hear that Facebook might be—guess what—changing its name this week. However long this may have been on the company roadmap, making the shift in the middle of what might be its most serious crisis to date seems like a cynical distraction. We get that billions of people choose to use Facebook and extract genuine value from it. As they did from Tylenol. But the Papers show that Facebook is also fully aware of the damage it does. Fixing the situation is a lot harder than providing tamper-proof bottles, of course, as the massive platform Facebook has built includes the worst of humanity, as well as those eager to exploit its weaknesses. You can’t just shut it down and start over. But Facebook will never get where it needs to go by prioritizing growth over safety, as some defectors charge. Or by continuing to present organizational barriers to safety-based product improvements.

And that is what’s so disheartening about those badge posts: They represent dedicated employees who have concluded that change will not come, or who are at least too burned out to continue fighting for it.

Steven Levy