Facebook Released Its Content Moderation Rules. Now What?

Publication Title: The New York Times
Tuesday was a huge day for online speech. Facebook finally released the internal rules that its moderators use to decide what kinds of user content to remove from the site, including once-mysterious details on what counts as “graphic violence,” “hate speech” or “child exploitation.” It also announced the introduction of an appeals process for users who want to challenge the removal of their posts.

These developments represent a big step toward due process, which is essential on a site where so much of our speech now takes place. But an entity with such enormous power over online expression should do even more to listen to its users about what kinds of expression are allowed, and its users should be ready to be heard. Ideally, Facebook will eventually create a more robust system to respond to those who believe their posts were taken down in error, and to give its users the opportunity to weigh in on its policies.