25th Aug 2021

How to anticipate toxicity in your game’s community before launch


By Sam Leloup, Team Lead, Keywords Player Support
www.linkedin.com/in/samleloup/

As you design and develop a new online video game, chances are that you are already thinking about what your community will look like.

Will the members be naughty or nice? Will there be trolls? What memes will this new community create?

Will they use your platform in creative and unexpected ways? Will they try to break your rules for their own profit?

These are some of the questions you should be considering as a developer or publisher, and they are the kind of questions we ask ourselves here at Keywords Player Support when we set guidelines and best practices for our community managers.

We believe that community management should be an integral part of the development process, rather than an afterthought.

With proper planning, you can access fast, powerful, well-designed, integrated moderation tools that allow your moderators not only to report, silence, suspend and ban players, but also to reward them when they behave well.

This allows you to anticipate toxicity in your game’s online community, prior to launch.

For example, if your game allows user content submissions, you should consider an interface dedicated to reviewing this content.

If one of your users threatens to harm themselves or someone else because they did not like your latest content patch, you should be able to access their personal information quickly and report the incident to the authorities.

With the right tools available to you, you should be ready to face any player support challenge, and your community will be so wonderful and inspiring that other companies will try to replicate your success on their own platforms.

That is how it should work in theory. But it doesn’t always work out like this.


That’s because many developers leave an essential aspect of community management behind: the human component.

This community you are about to create and grow will be comprised of human beings with different backgrounds, tastes and expectations. You can expect to see this variety in your team of community managers and moderators as well.

What could possibly happen if you put them all together in the same room? Will they define the rules by themselves? Will they be able to understand what you had in mind when you designed your platform? Do you have a clear idea of what is good or bad for your community? How would you try to define what the rules are?

Know your community audience’s demographics

The age of your community is a good starting point to decide what players should be able to do, see and talk about on your platform. The ESRB, PEGI and similar age rating bodies are reliable sources of inspiration.

One could assume that player interactions would follow these rating guidelines, and that your community – fully aware that your platform is teen-rated for instance – will police itself and will not engage in cybersex, death threats or virtual drinking games.

However, PEGI, the ESRB and others only rate the content that you, as a developer, decided to add to the platform during its development process. Once your platform is released and hordes of players log in to interact, these ratings will not protect you.

Player interactions will, most likely, not stay within the boundaries neatly defined by an age rating. As a developer, you need to understand that you can, and will, be held responsible, morally and legally, if something goes wrong.

Players will try to game the system, to find ways to reach the highest level before everyone else, and to bypass the automated content filters you might have put in place.
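To make the filter-bypass point concrete, here is a minimal sketch, not any production system, of why a naive blocklist fails and how even a hardened version only raises the bar. The word list and substitution table are invented for illustration:

```python
import re

# A naive blocklist filter: trivial to bypass with spacing or character swaps.
BLOCKED = {"badword"}


def naive_filter(message: str) -> bool:
    """Flag a message only if a blocked word appears verbatim."""
    return any(word in message.lower() for word in BLOCKED)


# Common look-alike substitutions players use to slip past filters.
SUBSTITUTIONS = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a",
                               "@": "a", "$": "s"})


def normalised_filter(message: str) -> bool:
    """Strip separators and normalise look-alikes before matching.

    Still far from bulletproof: determined players will find the next gap.
    """
    collapsed = re.sub(r"[\s.\-_*]+", "", message.lower())
    return any(word in collapsed.translate(SUBSTITUTIONS) for word in BLOCKED)
```

The naive version misses `b a d w 0 r d` entirely; the normalised version catches that case but will in turn be defeated by the next creative spelling, which is why human moderation remains essential.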

It only takes a few players to start drawing inappropriate anatomical diagrams using the content they have available, and it just requires a bit more time for these daring freethinkers to inspire the rest of your community.

And just like that, you’ve lost control.

Common sense will not save you

Ideally, before the gates open and your servers are flooded with user content and interactions, you want to take the time to go beyond ratings and to further define how you want your community to interact.

Sadly, the answer is not to ‘just use common sense’, because common sense varies greatly from one individual to the next and from one brand to the next.

What you personally see as toxic behaviour, someone else could view as freedom of expression, even sometimes as a selling point.

That ‘common-sense approach’ can confuse your team of moderators just as easily. Moderators are human beings with varying opinions on multiple topics. If they have no instructions but are forced to rely on common sense, they will just moderate player content based on their personal, biased perspective.

Don’t expect them to guess what is on your mind; if you do, expect to be disappointed.

If your moderators follow their own instincts, soon enough, players will start noticing that some of your moderators are more permissive than others regarding certain topics. As a result, your community will start questioning moderation decisions (i.e. “I just got banned, my friend said the exact same thing yesterday and nothing happened to him”).

You will start doubting your moderators’ judgement. Your moderators will start doubting your ability to give them proper directions.

This could lead to players doubting your ability to address complaints fairly. The brand you are responsible for will suffer and you will be blamed.

Regaining everyone’s trust will be a long and difficult task.

Embracing complexity

Let’s start with a basic example: alcohol.

Imagine you run the official Facebook page for a specific brand of whiskey. Can people go there to talk about beer and vodka? Can they talk about, or even mention, other brands of whiskey? Can they post photos showing people drinking your products?

Now try applying the same approach to other topics. Politics, religion, bullying, suicide, gun control, drugs, nudity, terrorism, war, justice, racism, abortion, gender, the death penalty, social networks, freedom of speech, pornography and so on.

Carefully approach all these topics, read about them, understand their complexity, imagine scenarios, submit them to your brand managers, find out where they stand, write their explanations down as well as their reasoning.

Try finding inconsistencies in their responses, then submit new scenarios to clarify.

It is even more important if you do not own the brand in question: you want to know how the brand owners feel about all these potential scenarios. And if they try to play the common sense card, make sure they understand why it is not a good idea.

Use all these scenarios and explanations to build a reference document you can share with the moderators. Give them some time to understand the logic behind these responses, or, shall we say, to understand what the brand defines as common sense.

This is an invaluable tool they can use when they come across situations you did not think about during the preparation phase.
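One lightweight way to keep such a reference document consistent and queryable (purely illustrative; the fields and the example ruling below are hypothetical) is to record each scenario as a structured entry, so moderators can pull up every precedent for a topic:

```python
from dataclasses import dataclass


@dataclass
class PolicyEntry:
    topic: str       # e.g. "alcohol", "politics", "nudity"
    scenario: str    # the concrete situation submitted to the brand
    ruling: str      # e.g. "allow", "remove", "escalate"
    reasoning: str   # the brand's explanation, recorded verbatim


GUIDELINES = [
    PolicyEntry(
        topic="alcohol",
        scenario="Player posts a photo of themselves drinking our product",
        ruling="escalate",
        reasoning="Brand wants a review before allowing consumption imagery",
    ),
]


def lookup(topic: str) -> list[PolicyEntry]:
    """Return every recorded scenario for a topic so moderators can compare."""
    return [entry for entry in GUIDELINES if entry.topic == topic]
```

Storing the reasoning alongside each ruling is the key design choice: it lets moderators extrapolate the brand's logic to situations the document never anticipated.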

Now consider how these questions apply to your own project. Have you taken the right approach and given due consideration and planning to community management and moderation?

Conclusion: Planning for long-term success

Even the biggest games in the world are only as successful as their communities.

To help create a positive and thriving online community that acts as an endorsement of your brand, consider a vendor partner that can guide you through the basic considerations and guidelines for the community managers and moderators assigned to your platform.

This is something you want to get right at the outset.

Understanding these challenges and heading off toxicity in your game’s online community, at the scale of millions of users and with the help of well-trained community moderators, is something that Keywords Studios has been doing for years.

With Keywords anticipating key questions and potential issues, then devising bespoke community management strategies and guides on your behalf, you can provide a safe and fun environment for players who connect with the game makers and with fellow players.


Visit our Keywords Player Support page to learn more about our specialised team of experienced community managers and creatives, offering bespoke community and social media services exclusively to gaming clients.
