The use of automation technology to perform at least some part of content moderation processes.
See also: Content Moderation, Filtering, Hash Filtering
Commentary:
- Automation may be used to detect potential abuse, through methods like keyword filtering, hash matching, behavioral analysis, machine learning, and artificial intelligence.
- In some cases, a human then evaluates the potential abuse in light of the applicable company policy and determines what action, if any, is appropriate.
- Automated moderation seeks to augment the speed and effectiveness of human teams by surfacing or prioritizing potential problems in advance, sorting issues into categories, suggesting a response, or, in some cases, selecting and applying a rule and triggering an enforcement action.
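The flow described above can be sketched in a few lines: a hash match against known abusive content triggers an automated enforcement action, while a softer keyword signal is only surfaced for human review. This is a minimal illustration, not a production design; the hash set, keyword list, and action names are all invented for the example.

```python
import hashlib

# Hypothetical examples for illustration only.
BLOCKED_HASHES = {hashlib.sha256(b"known-abusive-content").hexdigest()}
FLAGGED_KEYWORDS = {"spamword"}

def triage(content: str) -> str:
    """Return 'remove' (automated action), 'human_review', or 'allow'."""
    digest = hashlib.sha256(content.encode()).hexdigest()
    if digest in BLOCKED_HASHES:
        return "remove"        # exact match to known abuse: auto-enforce
    if any(word in content.lower() for word in FLAGGED_KEYWORDS):
        return "human_review"  # possible abuse: queue for a human decision
    return "allow"

print(triage("known-abusive-content"))  # remove
print(triage("buy SPAMWORD now"))       # human_review
print(triage("hello"))                  # allow
```

Real systems typically add behavioral signals and machine-learned classifiers on top of such rules, but the routing idea is the same: high-confidence matches can be actioned automatically, while ambiguous cases are prioritized for human review.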