In the technology industry, we like to talk a lot about disruption. Digitalizing something, we sometimes imagine, means that it will play by completely new rules, and moving it online means that we should forget everything we thought we knew about it.
This is true in some ways, but while the technology that underpins a sector might change, it is still ultimately selling to human beings. This means that to a greater degree than we often acknowledge, the underlying nature of digital businesses is inherited from their pre-internet, analog reality. For example, the buying motivations for food or clothing are similar whether the shopper is on the high street or on an app.
Gaming is no different in this regard. Play is at least as old as civilization. The emotional motivators of competition, cooperation, and self-improvement are as present in esports and online gaming as in any earlier form of competitive play. Likewise, while history doesn’t record how the earliest sportspeople spoke to one another, it’s a fair guess that verbal communication has always been part of the pleasure we take in games: the meta-game of talking to one’s opponent is fundamental.
The challenge of words and play
At the same time, online gaming does deliver something new: the ability to match players from across the globe, the speed at which communities develop and evolve around games, and – importantly – the responsibilities a business therefore has to its game’s user base.
The ability of online gaming to transcend geography sets it apart from non-digital play, where only elite players are likely to face international opponents. While this global reach adds to a game’s appeal, it also, in common with many online platforms, creates the potential for users to engage in damaging behaviors.
The fact that verbal abuse – often targeting a player’s age, gender, or ethnicity – is widespread in online gaming and difficult to combat is well documented in academic studies. One recent analysis of player interactions in Dota 2, for example, worryingly found that while younger players are more likely to be penalized for “communications abuse,” older players are more likely to engage in verbal abuse – suggesting that systems for managing abuse are not being properly targeted.
This difficulty in moderation is exacerbated both by the ambiguity of what counts as acceptable speech and by the specific, rapidly changing language used in online games. “git gud”, for example, might be used abusively against a particular person or to bemoan one’s own lack of skill; “get rekt” might be part of a perfectly acceptable victory celebration or of focused negative attention on an opposing player.
It’s an environment that poses real challenges to both AI and human moderation approaches. For humans, the speed and volume of interactions make thorough oversight difficult; for AI, the variability of the language and its contextual nuance make keeping up a tall order.
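To see concretely why simple automated filtering struggles with context, consider a minimal sketch in Python. The blocklist, phrases, and messages below are hypothetical illustrations, not any real moderation system:

```python
# A naive blocklist flags phrases regardless of who they target or why.
BLOCKLIST = {"git gud", "get rekt"}

def naive_flag(message: str) -> bool:
    """Flag a message if it contains any blocklisted phrase."""
    text = message.lower()
    return any(phrase in text for phrase in BLOCKLIST)

# Both messages contain "get rekt", but only the second is abusive in context.
banter = "ggwp everyone, get rekt was my motto today"
targeted = "you're trash, get rekt and uninstall"

print(naive_flag(banter))    # True – friendly banter flagged anyway
print(naive_flag(targeted))  # True – abuse caught, but so is the banter
```

Because the filter sees only the words and not the relationship between the speakers, it cannot distinguish celebration from harassment – exactly the gap that human judgment or context-aware models must fill.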
The power of speech
Ultimately, however, there is a clear need to carry the verbally interactive aspects of play into online gaming. A successful online game is one that not only brings players in but keeps them engaged, constructing a community around the competition. While the negative potential of communication is a risk, the ability to communicate is also a key ingredient for sustainable growth.
Getting it right, therefore, means mirroring the complex nature of these interactions with a nuanced approach to content moderation. That means developing systems that combine the best human and automated oversight in bespoke ways specific to the nature and dynamics of the game’s community.
Rules, after all, are also part of the essence of games. Just as referees in real-world sports keep play within appropriate limits, moderation of speech in online gaming can be seen not just as a way of punishing negative behavior but as an opportunity to enable positive interactions. Sometimes, a positive interaction will be one in which players have the space and opportunity to taunt, criticize, and motivate one another; content moderation must evolve to keep up with that fact.
This is Besedo
Global, full-service leader in content moderation
We provide automated and manual moderation for online marketplaces, online dating, sharing economy, gaming, communities and social media.