Discord is rolling out a new AutoMod tool it claims will help protect communities and allow users to “find belonging within a safeguarded environment.”
The social platform is home to many video game communities, making this notable news for users and developers across the games industry.
Pitched as a “mod team helper,” AutoMod lets moderators of any skill level set up automatic moderation directly from their server settings.
AutoMod comes equipped with a keyword filter that automatically detects, blocks, and alerts moderators of messages containing harmful words or phrases before they’re posted.
The tool can also automatically time out users whose messages have been flagged, giving mod teams breathing room to handle situations as they arise.
“Community moderators currently have to tackle the majority of their daily duties, such as meticulously monitoring chat and removing any harmful content manually as it happens,” wrote Discord in a blog post.
“AutoMod is here to take some of the load off your shoulders, working to keep your community and conversations clean around the clock so you can take time to enjoy yourself without stressing about what’s happening when you’re out and about.”
Discord has compiled three starter lists covering select categories of “not-nice words or phrases” under a default AutoMod rule called “Commonly Flagged Rules,” and moderators can also create up to three additional custom rules of their own.
AutoMod is available right now on Windows, macOS, Linux, iOS, Android, and the web app.