The Secret Side of Facebook

In 2021, Facebook insiders began to leak internal documents that revealed the company’s executives knew its platform was being widely misused. Pornographers, human traffickers, pedophiles, drug cartels, and other unscrupulous users found a home on Facebook. Yet, time and time again, Facebook executives chose to ignore or minimize these problems.

The leaks, however, became the basis of a series of articles by Wall Street Journal investigative technology reporter Jeff Horwitz. His exposé, tied to the revelations of a whistleblower who eventually went public, prompted other media outlets to begin releasing The Facebook Papers, with damning details of widespread abuse of the platform.

Horwitz’s new book, Broken Code: Inside Facebook and the Fight to Expose Its Harmful Secrets, is a behind-the-scenes look at the company. The author explores Facebook’s failed content quality enforcement systems, its role in promoting political zealotry and violence, the negative and even crippling effects of widespread social media use, and the special treatment afforded celebrities, politicians, and VIPs. Broken Code includes insights from dozens of Facebook employees who spoke both on and off the record about their efforts to rein in the worst abuses, only to see their work ignored, sidetracked, watered down, and marginalized.

The North Star Metric

From its earliest days, Facebook, and later its associated platforms WhatsApp and Instagram, was measured by one overarching metric: how often, on average, people used the platforms. “Daily Active People” (DAP) was the company’s “North Star,” and that oversimplified metric became, the author explains, an insidious trap for corporate decision-makers. “Making decisions based on metrics alone, without carefully studying the effects on humans, was reckless,” Horwitz writes. “But doing it on average metrics was downright stupid. … In the interest of expediency, Facebook’s core metrics were all based on aggregate usage.”
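
Horwitz’s point about aggregate metrics is easy to see in miniature. The toy Python sketch below is illustrative only, with hypothetical numbers and nothing drawn from Facebook’s actual measurement systems: an average across all accounts can look healthy, and even grow, while a small cohort of inauthentic accounts supplies most of the usage.

```python
# Toy illustration, not Facebook's real measurement pipeline: an
# aggregate sessions-per-person average masks who drives the usage.
from statistics import mean

# Hypothetical daily session counts: eight ordinary users, two spam accounts.
ordinary = [2, 1, 3, 2, 1, 2, 3, 2]
inauthentic = [40, 55]
everyone = ordinary + inauthentic

print(f"aggregate average:   {mean(everyone):.1f} sessions/day")   # 11.1
print(f"ordinary users only: {mean(ordinary):.1f} sessions/day")   # 2.0

# Steering by the aggregate alone, growth and abuse are indistinguishable;
# only a per-cohort breakdown reveals who is pushing the number up.
```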

Horwitz makes a point throughout Broken Code of underscoring the difference between how often people use Facebook and how people use Facebook. Trolls, peddlers of misinformation, and spam farms, for example, could easily drive up usage statistics, as could bots and other programs that simply replicated puerile, pernicious, or pornographic content and reposted it to other pages on the platforms. Hype techniques, including clickbait (sensationalist headlines) and engagement bait (appeals to forward content), together with Facebook’s aggressive algorithmic amplification, spread content further and drove up DAP still more.

These practices, Horwitz explains, rewarded publishers whose content was stolen, aggregated, or spun (altered in some trivial way), empowering inauthentic actors to accumulate huge followings. The author claims nearly 40 percent of all posters with significant followings, and 60 percent of those posting videos, used these techniques, and Facebook had no mechanisms to stop them. The result was that “products routinely garnered higher growth rates at the expense of content quality and user safety.”

The content could be forwarded with the click of a mouse to any of Facebook’s three billion users or any of thousands of groups. Advertisers paid Facebook to target these click-worthy users and groups. All of it drove Facebook’s explosive growth and billions of dollars in revenue and profits.

The 2016 Election

Facebook’s watershed moment, according to Horwitz, came in the wake of the 2016 election. “The prospect that Facebook’s errors could have changed the outcome of the election and undermined democracy” shook executives and employees alike, and Broken Code tracks the fallout that roiled the company’s corporate culture in the years that followed.

The author describes a culture heavily invested in the company manifesto that “changing how people communicate will always change the world,” paired with “the conviction that, thanks to the wisdom of crowds, users would simply suss out falsehoods on their own and avoid spreading them. The revelations around the 2016 election had quickly given the lie to that line of thought.” A hugely woke company, the author argues, came face to face with the reality that misinformation and political diatribes spread on Facebook influenced voters’ decisions.

Employees also confronted the even harsher reality, Horwitz explains, that not all Facebook users came to the platform with benign intent. Some content on the company’s platforms was clearly problematic: hate speech, human trafficking, child sexual predation, advocacy for genocide and violence, and encouragement of teen suicide. Employees knew the mechanisms to control this content were flawed, even downright ineffective. Moreover, they knew the publishers of this vile content could reach select, hidden audiences by using code words, and that the users those words triggered spread it onward with the speed of the internet.

Angry Emojis

Horwitz writes that “there had been no question that Facebook was feeding its users overtly false information at a rate that vastly outstripped other media.” As efforts to combat misinformation took hold, the company’s metrics began to nosedive. People stopped posting and reposting the free content that was the lifeblood of Facebook.

The situation was compounded, the author explains, by growing public concern about the effects of social media on mental health. At CEO Mark Zuckerberg’s direction, the company pivoted from simply serving content to promoting “Meaningful Social Interactions” (MSI), one of dozens of vacuous terms the company regularly invented. The new MSI metric would measure how often people engaged with content by tracking the frequency of their comments and reactions. Rushed into use, MSI was badly flawed.

It included no effort at sentiment analysis, meaning it gave equal value to a heartfelt bereavement note and a declaration of intention to piss on the departed’s grave. What mattered was not the content of the message but the fact of the comment itself. The company had already added a host of reaction emojis beyond the basic “like.” … Facebook did not care if you chose a heart or an angry face, as long as you clicked on something.
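
To make that flaw concrete, here is a minimal Python sketch of purely count-based scoring. The formula, field names, and numbers are invented for illustration; the book does not publish Facebook’s actual MSI weighting.

```python
# Toy sketch of count-based "engagement" scoring with no sentiment analysis.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    comments: int
    reactions: int  # hearts and angry faces tallied alike

def msi_style_score(post: Post) -> int:
    # Only the fact of the click matters, never its meaning.
    return post.comments + post.reactions

condolence = Post("heartfelt bereavement note", comments=10, reactions=40)
flame_war = Post("inflammatory political rant", comments=35, reactions=15)

for p in (condolence, flame_war):
    print(f"{p.text}: score {msi_style_score(p)}")

# Both posts score 50: the metric cannot tell grief from outrage, so it
# rewards whichever content generates more clicks.
```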

The company had built its new media platform on the baseless argument that the more users “liked” content, the more likely it was to appeal to others. A mouse click had taken the place of meaningful dialogue or any attempt to explain why content had value worth sharing. People had become mere consumers of content. And now little machine-made emojis could stand in for the emotions at the center of real human social interactions.

The results, writes Horwitz, predictably added “an exponential component to the already-healthy rate at which problem content spread,” as “adoption of MSI turned the rarely used ‘angry’ emoji into the bellwether of political content’s success.” The angry face provoked arguments among users, pushed even more inflammatory content to the fore, and spread it farther and faster with each agitated user’s click.

Whistleblower

Broken Code is the inside story of Facebook and the serious, even dangerous, problems of social media writ large. It’s a compelling story but not an engaging one, because it lacks a well-crafted narrative that draws the reader in. Much of the book lurches from one episode to the next as Horwitz shares pieces of the recollections of dozens of Facebook employees. A human dimension is missing: the author recounts these employees’ complex reminiscences in language loaded with tech jargon, and the truly emotional side of their stories surfaces only in fleeting instances.

Horwitz has an encyclopedic knowledge of Facebook’s executives and employees and their roles, and he is wholly familiar with the company’s balkanized structure of perpetually feuding fiefdoms. But there is no index of names and titles to help the reader through this thicket, nor an organization chart, a list of acronyms, or a glossary of the various Facebook teams, departments, and activities that appear throughout the book.

Broken Code finally gains traction when Horwitz begins a first-person narrative of his experiences with Facebook whistleblower Frances Haugen. While many of the other employees cited and quoted in the book play bit parts, Haugen takes center stage in the last third. The narrative here is crisp, the stakes are clear, and Horwitz’s recounting of the enormous efforts that led to the publication of the Facebook Files is a solid look at the challenges of good investigative journalism.

Horwitz describes how Haugen was disheartened to realize Facebook routinely traded off content safety for platform growth and was unnerved by the scale of what she found. The author recounts the stress, self-doubt, and isolation she experienced as she spent six months collecting thousands of internal Facebook documents. The documents detailed what the company knew about the widespread abuses it failed to check.

Haugen’s findings also became the basis of whistleblower complaints about Facebook’s business practices filed with the Securities and Exchange Commission. Warned that she might be sued by Facebook, and already the target of a carefully orchestrated back-channel smear campaign, Haugen took her story public in a 60 Minutes broadcast. Her career in the tech industry was over.

In the end, Haugen, like many of the other employees who came forward with the grim details that fill the pages of Broken Code, weathered both deep-seated regrets and damage to her professional career to bring to light the problems that plagued Facebook. Theirs is the content that was never posted to Facebook.
