
How Moderation Improves Community Engagement in Discord

Fostering thriving and inclusive online communities is a top priority for most video game organisations.

Author: Leah MacDermid, Trust & Safety Content Writer at Keywords Studios
Date Published: 23/01/2024

Discord has emerged as one of the leading platforms for building fan communities, providing a dynamic environment for players to connect with fellow fans, share gameplay tips, gain access to exclusive content, and interact with game developers. 

Conversations on Discord are fast-paced, happen in real time, and are spread across multiple channels in a single server. Because of the speed and potential virality of these interactions, it is crucial that organisations moderate conversations to prevent hate speech, harassment, and other harmful behaviour from spreading through the community.

Growing a healthy and safe community on any communication platform can be challenging. But on a platform like Discord, where conversations happen in real time, implementing thoughtful moderation practices and engaging community management techniques are key to building and maintaining a successful community.

In fact, according to James Gallagher, Head of Community Management at Keywords Studios, moderation and community engagement are inseparable. “Moderation is a prerequisite to successful community engagement on Discord,” he says. For Gallagher, engagement on Discord begins with a technique you may not expect: It’s all about building a high-quality space.

Discord Servers and Safety by Design Principles

“We approach Discord projects from a Safety by Design mindset,” says Gallagher. 

A proactive approach that puts player safety at the heart of social platform development, Safety by Design seeks to minimise online threats by designing them out before they can cause harm. Gallagher explains how his team brings this approach to their conversations with clients.

“When building new servers for clients, my team always recommends implementing a levelling system that rewards users with XP [experience points] for posting messages,” he says. “As users progress through levels named after characters or worlds from the game, their engagement and motivation increase significantly.”


This levelling system has impressive engagement benefits — unsurprisingly, players love the gamification aspect — but as a Safety by Design feature, it also serves a vital moderation function. At lower levels, certain types of potentially harmful content, such as external links or videos, are restricted to prevent disruptions. As users level up without posting disruptive content, more sharing options, like images, custom stickers, and emojis, become available to them. By implementing these guardrails at the beginning of the experience and providing a gamified system, the need for moderation is effectively lowered.
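The mechanics of such a levelling system can be sketched in a few lines of code. This is an illustrative sketch only, not Discord's actual implementation or any Keywords Studios tooling; the level names, XP thresholds, and privilege sets are hypothetical.

```python
# Sketch of an XP-based levelling system that gates posting privileges
# by level. All thresholds, names, and privilege sets are hypothetical.

LEVELS = [
    # (minimum XP, level name, content types unlocked at this level)
    (0,   "Newcomer",   {"text"}),
    (100, "Adventurer", {"text", "images"}),
    (300, "Veteran",    {"text", "images", "stickers", "emojis"}),
    (600, "Champion",   {"text", "images", "stickers", "emojis", "links", "videos"}),
]

XP_PER_MESSAGE = 5  # XP awarded for each non-disruptive message posted


def level_for(xp: int):
    """Return the (name, privileges) pair for a user's current XP."""
    name, privileges = LEVELS[0][1], LEVELS[0][2]
    for threshold, level_name, level_privileges in LEVELS:
        if xp >= threshold:
            name, privileges = level_name, level_privileges
    return name, privileges


def can_post(xp: int, content_type: str) -> bool:
    """Check whether a user at this XP may post the given content type."""
    _, privileges = level_for(xp)
    return content_type in privileges


# A brand-new user cannot share links; a long-standing contributor can.
print(can_post(0, "links"))    # False
print(can_post(650, "links"))  # True
```

The key design point is the one Gallagher describes: risky content types (links, videos) sit behind the highest threshold, so new accounts have to demonstrate sustained positive participation before they can post them.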

As Gallagher points out, “Some of the most severe cases of trolling and disruptive behaviour often come from users who have just joined the server. By gradually exposing users to more privileges as they prove themselves through positive engagement, the risk of harmful behaviour is mitigated.”

Gallagher also stresses the importance of creating a professional, well-organised server and maintaining it over time. He compares it to the “broken windows” theory, which suggests that visible signs of disorder can lead to an increased likelihood of disruptive behaviour and crime in a community.

Organisations that want to foster community engagement and growth on Discord should start by creating a server with built-in Safety by Design features and maintaining a clean, professional look.

But what’s the next step? 

According to James Gallagher and Jessica Blazquez, Community Manager Lead at Keywords Studios, community managers and content moderators are the second piece of the Discord engagement puzzle. 

Community Managers and Moderators Must Work Together

Unlike other communication platforms, where community management and moderation tasks are often kept separate (with moderators typically remaining “silent” instead of interacting with the community), on Discord both roles play a large part in fostering community engagement.

“Community management on Discord is a huge part of making sure the community is thriving and friendly,” says Blazquez. Community managers are responsible for organising and shepherding a variety of engagement-boosting activities, such as developer AMAs, giveaways, and fan contests. 

But moderators also play a key role in keeping the community engaged through positive reinforcement and active dialogue with users. “In a healthy community, moderators are some of the most active members,” Blazquez points out. 

This collaboration between community managers and moderators is especially important during Stage channel events. Stage channels allow community managers to host audio events, like townhalls, trivia parties, and panel interviews, which are ideal for large audiences when separation is needed between speakers and the audience. These events are valuable community engagement tools, and they only work if community managers and moderators work together closely. 


“Typically, community managers will help run the Stage event itself, while moderators watch over the chat during the event,” explains Blazquez.

Audience members can request to speak during Stage events, another reason that these events are powerful community engagement tools. While community managers ensure that events run on time and that speakers stick to the agenda, moderators can focus on moving audience members between the speaker group and the audience group, ensuring that everyone has an opportunity to participate.

“One of the biggest drivers of toxicity is boredom,” explains Gallagher. “By working together to keep events engaging and safe, community managers and moderators can create a vibrant, welcoming atmosphere that keeps users coming back for more.”

While events play a key role in encouraging community engagement, Blazquez and Gallagher both highlight the impact of passionate fans as one of the top drivers of engagement in a Discord server. Community managers and moderators both contribute to the success of fan engagement by providing a safe environment where users feel comfortable expressing themselves without fear of abuse or harassment. By encouraging high-level players to share their expertise and advice — character builds, weapons, and gameplay strategies — with other fans, community managers and moderators can help boost community engagement.

Active, thriving communities are the goal of every Discord server, but that level of engagement brings its own challenges: community managers and moderators may struggle to keep up with the volume of messages shared on the platform every day.

What’s the solution? 

According to our experts, automation is key. 

Technology and Human Moderators are the Final Key to Success

Moderator bots offer valuable automation and help moderators manage their considerable workload efficiently. “Moderators will burn out without bots,” says Blazquez, “especially in large communities with thousands of active users and multiple channels.”

Discord released AutoMod in 2022, allowing community managers and moderators to set up keyword and spam filters that automatically trigger moderation actions such as blocking messages that contain specific keywords and logging flagged messages for review. 


In addition to AutoMod, there are multiple third-party tools that community managers and moderators can install on their servers to streamline their moderation workflow. These bots can handle a variety of tasks, including shortcut commands for actioning bad actors, automatically generated moderation logs, and more efficient, less clunky reporting tools for users on the server.

The future of Discord community management and moderation lies in the combination of AI and human intelligence, sometimes referred to as “AI + HI”. Gallagher explains the power of AI + HI in addressing scalability challenges presented by large communities.   

“If you have a very active community, it never stops,” he says. "If you want to protect your users and your brand on Discord, you need eyes on it 24/7. That’s impossible without leveraging a blend of AI and human moderation.”

While Discord's built-in AutoMod does not currently use AI, AI-powered moderation tools such as Community Sift are seeing increasing adoption. AI + HI moderation offers a way to handle routine moderation tasks efficiently, freeing community managers and moderators to focus on more complex and meaningful interactions with the community while maintaining a safe and engaging environment.

A Holistic Approach to Community Engagement on Discord

Moderation is not just an optional add-on when it comes to creating thriving and engaged communities on Discord. It is an essential component that directly impacts the safety, functionality, and overall success of any server.

But moderation isn’t the only key to community engagement on Discord – it is one element of a holistic approach that results in a thriving, growing community.

It starts with creating a high-quality server that incorporates safety features in its design. Then, community managers and moderators must work together to foster safe, healthy, and expressive interactions between users. Finally, moderator bots should be leveraged to streamline moderators’ workflows and allow them to focus on time-sensitive content.

Learn more about Community Management at Keywords here.