The act of reviewing user-generated content to detect, identify or address reports of content or conduct that may violate applicable laws or a digital service’s content policies or terms of service.
See also: Terms of Service, Automated Moderation
Commentary:
- Content moderation systems often rely on some combination of people and machines to review content or other online activity, with automation executing simpler tasks at scale and humans focusing on issues requiring attention to nuance and context (see the illustrative sketch following this commentary).
- The remedies resulting from violation of a service’s policy can include disabling access to content, temporary or permanent account suspension, demotion in search or recommendation results, and other safety interventions such as those identified in Section III below.
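
The people-and-machines combination described above is often implemented as confidence-based triage: an automated classifier scores each item, clear-cut cases are actioned automatically at scale, and ambiguous cases are queued for human review. The sketch below illustrates that routing logic only; the threshold values, function names, and `ModerationDecision` type are illustrative assumptions, not any particular service's implementation.

```python
from dataclasses import dataclass

# Assumed, illustrative thresholds; real services tune these per policy area.
AUTO_ACTION_THRESHOLD = 0.95   # very confident violations are actioned automatically
HUMAN_REVIEW_THRESHOLD = 0.50  # ambiguous cases go to human reviewers

@dataclass
class ModerationDecision:
    action: str  # e.g. "remove", "queue_for_human_review", "no_action"
    reason: str

def triage(classifier_score: float) -> ModerationDecision:
    """Route a piece of content based on an automated classifier's
    violation-probability score (0.0 to 1.0)."""
    if classifier_score >= AUTO_ACTION_THRESHOLD:
        # High-confidence violations are handled automatically at scale.
        return ModerationDecision("remove", "high-confidence automated detection")
    if classifier_score >= HUMAN_REVIEW_THRESHOLD:
        # Borderline cases require human attention to nuance and context.
        return ModerationDecision("queue_for_human_review", "ambiguous automated score")
    return ModerationDecision("no_action", "low likelihood of violation")

# Example: a borderline score is escalated rather than actioned automatically.
print(triage(0.72).action)  # queue_for_human_review
```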