In October 2021, a former Facebook employee, Frances Haugen, publicly revealed that the company's internal research documented harms that its products caused to some of its users. The company's response was sadly predictable. It questioned the reliability of Haugen's testimony, asserted its commitment to doing the right thing, and then diverted the public's attention by changing its name to Meta. The company's deny-and-distract tactics were, by now, all too familiar and provided few answers.
More than any other platform company, Facebook has found itself at the center of controversy. Its advertisement-supported business model relies upon user engagement, which means that its algorithms often, even if unintentionally, promote content that is false, divisive, and harmful to its users. The company profits handsomely from scams that proliferate on its site. Its practices governing data collection and online tracking are dubious at best.
Yet, despite all the hand-wringing and negative commentary about the legality and ethics of its business, Facebook continues to engage in practices with harmful social consequences. According to a Wall Street Journal investigation referred to as the "Facebook Files," the company's own research documented the harms that its products inflict upon its users and society. How, then, has Facebook managed to get away with it for so long?
In the absence of regulation, the public depends upon private citizens to claim their rights and redress their wrongs in a court of law. When companies deploy new technologies and new business models, legislators and regulators are often slow to react. Consequently, the legality of these new practices is often litigated in court, typically in a class action lawsuit brought against the company. The imposition of civil liability is especially critical in Facebook's case because the company has been famously evasive about its internal research and what it knows about its products. A lawsuit could be an important way to compel Facebook to disclose some of that information.
However, platform companies such as Facebook typically escape liability for content on their websites because of the immunity provided by Section 230 of the Communications Decency Act. They claim that they are not publishers of content. Instead, they argue, they merely provide a platform for the distribution of content created by others. Courts have held that, as mere platforms, they are not responsible for defamatory or harmful content on their websites, even if they occasionally exercise removal or moderation power.