Why The History Of Content Moderation Matters

The first few years of the 21st century saw the founding of a number of companies whose model of making user-generated content easy to amplify and distribute continues to resonate today. Facebook was founded in 2004, YouTube began in 2005, and Twitter became an overnight sensation in 2006. In their short histories, countless books (and movies and plays) have been devoted to the rapid rise of these companies; their impact on global commerce, politics, and culture; and their financial structure and corporate governance. But as Eric Goldman points out in his essay for this conference, surprisingly little has been revealed about how these sites manage and moderate the user-generated content that is the foundation of their success.

Transparency around the mechanics of content moderation is one part of understanding what exactly happens when sites decide to keep up or take down certain types of content under their community standards or terms of service. How does material get flagged? What happens to it once it is reported? How is content reviewed, and who reviews it? What does takedown look like? Who supervises the moderators?