How can online gaming allow trash talk while keeping players safe?

In the technology industry, we like to talk a lot about ‘disruption’. Digitalizing something, we sometimes imagine, means that it will play by completely new rules, and moving it online means that we should forget everything we thought we knew about it.

In some ways this is true – but while the technology which underpins a sector might change, it is still ultimately selling to human beings. This means that, to a greater degree than we often acknowledge, the underlying nature of digital businesses is inherited from their pre-internet, analogue reality. The buying motivations for food or clothing, for example, are similar whether the shopper is on the high street or on an app.

Gaming is no different in this regard. Play is at least as old as civilization, and the emotional motivators of competition, cooperation, and self-improvement are as present in esports and online gaming as they are in any earlier form of play. Likewise, while history doesn’t record how the earliest sportspeople spoke to one another, it’s a fair guess that verbal exchange has always been part of the pleasure we take in games: the meta-game of talking to one’s opponent is a fundamental ingredient.

The challenge of words and play

At the same time, online gaming does deliver something new in its ability to match players from across the globe, the speed at which communities develop and evolve around games, and – importantly – the responsibilities that a business therefore has to its game’s userbase.

The ability of online gaming to transcend geography sets it apart from non-digital play, where only elite players are likely to face international opponents. While this adds to games’ appeal, it also, in common with many online platforms, creates the potential for users to engage in damaging behaviors.

The fact that verbal abuse – often targeting things like a player’s age, gender, or ethnicity – is widespread in online gaming and difficult to combat has been widely recognized in academic studies. One recent analysis of how players interact in Dota 2, for example, worryingly found that, while younger players are more likely to be penalized for ‘communications abuse’, older players are more likely to actually take part in verbal abuse, suggesting that systems for managing abuse are not being properly targeted.

This difficulty in moderation is exacerbated by both the ambiguity of acceptable speech and the specific, rapidly changing language used in online games. “git gud”, for example, might be used abusively against a particular person, or simply to bemoan one’s own lack of skill; “get rekt” might be part of a perfectly acceptable victory celebration, or part of a sustained pile-on against an opposing player.

It’s an environment which poses real challenges to both AI and human moderation approaches. For humans, the speed and volume of interactions makes thorough oversight difficult; for AI, the variability of the language and its contextual nuance makes keeping up a tall order.
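To make the problem concrete, a hybrid approach often routes clear-cut cases automatically and escalates context-dependent ones to people. The sketch below is purely illustrative: the phrase list, the labels, and the `directed_at_opponent` signal are assumptions for the example, not any real platform’s system.

```python
# Hypothetical sketch: triaging in-game chat for moderation.
# Phrase lists and labels are illustrative, not a real product's API.

AMBIGUOUS_PHRASES = {"git gud", "get rekt"}

def triage(message: str, directed_at_opponent: bool) -> str:
    """Return 'allow', or 'human_review' for context-dependent cases."""
    text = message.lower()
    if any(phrase in text for phrase in AMBIGUOUS_PHRASES):
        # Same words, different meaning: bemoaning one's own play can
        # pass automatically, while the identical phrase aimed at an
        # opponent needs a human judgment call.
        return "human_review" if directed_at_opponent else "allow"
    return "allow"

print(triage("git gud", directed_at_opponent=True))   # escalated to a person
print(triage("git gud", directed_at_opponent=False))  # self-directed banter
```

The point of the sketch is the routing, not the word list: automation absorbs the volume that overwhelms human teams, while humans handle exactly the contextual nuance that trips up AI.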

The power of speech

Ultimately, however, there is a clear need to carry the verbally interactive aspects of play forward into online gaming. A successful online game is one that not only brings players in, but keeps them engaged, constructing a community around the competition. While the negative potential of communication is a risk to that, the ability to communicate is also a key ingredient for sustainable growth.

Getting it right, therefore, means mirroring the complex nature of these interactions with a nuanced approach to content moderation. That means developing systems which combine the best of human and automated oversight in bespoke ways which are specific to the nature and dynamics of the game’s community.

Rules, after all, are also part of the essence of games. Just as referees in real-world sports keep play within appropriate limits, moderation of speech in online gaming can be seen not just as a way of punishing negative behavior, but as an opportunity to enable positive interactions. Sometimes, a positive interaction will be one in which players have the space and opportunity to taunt, criticize, and motivate one another; content moderation needs to evolve to keep up with that fact.


Find out more about working with us and request a demo today.


By Petter Nylander

CEO
