By Sam Leloup, Team Lead, Keywords Player Support
Player support is integral to the success of any video game.
The scale, variety and complexity of today’s games dwarf those of a decade ago, never mind those from the birth of the industry.
According to WePC, there are more than 2.5bn gamers around the world, and players’ voices are an increasingly integral part of the gaming experience.
In Keywords Player Support, we deal with millions of online messages, comments and posts every year. The vast majority are from well-meaning, courteous gamers seeking solutions to issues or simply sharing their passion.
But what about when that passion spills over into something more sinister?
In 2019, our Player Support agents and community managers reported hundreds of suicide threats, threats of violence or terrorism and suspected paedophiles.
That’s why we believe that it’s essential for anyone working on the frontlines of customer support to have:
- Training on how to handle real-life threats and emergencies
- A fast, efficient way to report cases to the appropriate authorities
And not only because there are people’s lives and wellbeing at stake — but also because we don’t want our staff to suffer from witnessing such cases and feeling powerless to do anything.
Since we started our real-life emergency training and Alert Network, we have been informed that multiple suicide attempts have been averted and that individuals have been arrested and charged with child grooming, as a result of our reports.
Distinguishing Real Threats from Virtual Threats
Gamers are some of the most passionate and active enthusiasts on the web. Competitive gaming itself can trigger a range of emotions: the euphoria of a win, anger in defeat or frustration when a teammate doesn’t pull their weight.
And the language that we use to talk about gaming (where often the main goal is to ‘shoot’ or ‘kill’ another player) could, in another context, be threatening or even illegal.
So how do we distinguish between threats of in-game retribution and real-life violence?
As a company that specializes in services for the video games industry, it helps that all of our agents and community managers are gamers themselves. They have a tacit understanding of the culture of gaming and where some of those blurred lines lie.
And most importantly, we provide mandatory training for identifying and escalating any worrying messages or posts.
We tell our agents to always escalate, even if they aren’t 100% certain that the threat is genuine. We also encourage them not to try to dissuade or negotiate with the poster at this initial stage.
This is why we ensure we have staff who are specially trained to assess and act on any correspondence with someone who is in danger or who is issuing threats.
The Escalation Process
Our Alert Network runs on Zapier, an automation tool. When an agent flags a comment or message as a potential emergency, it triggers the following process:
- We try to establish the country of origin, either via account information (where available) or IP address
- An alert is sent to dedicated, trained members of staff in that country
- The assigned staff member assesses the report and, if required, calls the emergency services
We use dedicated agents in each country because, in some cases, you must be located in that country to contact emergency services.
Thanks to our automation tools, the process takes just a few seconds, and speed can be decisive in cases where a threat is genuine.
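The routing step above can be sketched in a few lines. This is a minimal, hypothetical illustration only: the function and contact names are invented, and in practice this logic would live inside a Zapier workflow step rather than a standalone script.

```python
# Hypothetical sketch of country-based alert routing, as described above.
# Contact addresses and country codes are placeholders, not real endpoints.

ON_CALL = {
    "CA": "montreal-team@example.com",
    "FR": "paris-team@example.com",
}
FALLBACK = "global-escalations@example.com"  # if no country can be established

def country_from_report(report):
    """Prefer the account's country (where available); fall back to a
    country resolved from the poster's IP address."""
    return report.get("account_country") or report.get("ip_country")

def route_alert(report):
    """Return the dedicated, trained staff contact who should assess
    this flagged message."""
    country = country_from_report(report)
    return ON_CALL.get(country, FALLBACK)

report = {"message_id": 42, "ip_country": "FR"}
print(route_alert(report))  # paris-team@example.com
```

The key design point is the fallback: a report with no identifiable country still reaches a trained human rather than being dropped.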
Why Prevention is Better than Cure
One can never truly predict when a real-life emergency will occur. But we believe that fostering a respectful community, with clear guidelines for behaviour, is a good place to start.
These guidelines should include a policy on real-life harassment and threats, such as this example from Eve Online developer CCP Games.
Players should always be provided with a straightforward means to privately report harassment or threats, such as an online form or a Discord bot. You should never encourage or allow community members to call out harassment or threats in public.
And finally, be equipped to take swift action in accordance with your policy. If just one community member is seen to be breaking the rules and getting away with it for some time, it can encourage others to follow suit.
Threats, harassment, child grooming and other dangers to our online safety are a sad reality.
It may be tempting to dismiss such threats as the empty rantings of so-called ‘keyboard warriors’, and while the majority of online threats never escalate into actual violence, the only responsible stance is to treat every case as a real-life emergency.
Real life is too important to risk.
Sam Leloup is Team Lead at Keywords Player Support, based in Montreal, and has 20-plus years’ experience working in customer service for video games.
Visit Keywords Player Support to find out more about our Community Management services and how we can help nurture your community.