COVID-19 continues to create new challenges for everyone. To stay connected, businesses and consumers are spending an increasing amount of time online, using chat and video conferencing platforms to keep in touch while social distancing and self-isolating.
We’ve also seen the resurgence of interaction via video games during the lockdown as we explore new ways to entertain ourselves and connect with others. However, a sudden influx of gamers also brings a new set of content moderation issues—for platform owners, game developers, and gamers alike.
Loading…
The video game industry was already in good shape before the global pandemic. In 2019, ISFE (the Interactive Software Federation of Europe) reported that the European video game market grew by 15% between 2017 and 2018, turning over a combined €21bn. Another ISFE report shows that over half of the EU’s population played video games in 2018 – some 250 million players, gaming for an average of nearly 9 hours per week, with a fairly even gender split.
Unsurprisingly, the fastest-growing demographic was the 25–34 age group – the generation that grew up alongside Nintendo, Sony, and Microsoft consoles. However, gaming has broader demographic appeal too. A 2019 survey conducted by AARP (the American Association of Retired Persons) revealed that 44% of Americans aged 50 and over played video games at least once a month.
According to GSD (Games Sales Data), in the week commencing 16th March 2020 – right at the start of the lockdown – video game sales increased by 63% on the previous week. Digital sales outstripped physical sales too, and console sales rose by 155% to 259,169 units in the same period.
But stats aside, when you consider the level of engagement possible, it’s clear that gaming is more than just ‘playing’. In April, the popular game Fortnite held a virtual concert with rapper Travis Scott, attended by no fewer than 12.3 million gamers around the world – a record audience for an in-game event.
Clearly, for gaming, the only way is up right now. But given these sharp increases, and the increasingly creative ways gaming platforms are being used as social networks, how can developers ensure every gamer remains safe from bullying, harassment, and unwanted content?
Ready player one?
If all games have one thing in common, it’s rules. The influx of new gamers presents several content moderation challenges. Firstly, uninitiated gamers (often referred to as noob/newbie/nub) will likely be unfamiliar with the established rules of online multiplayer games, or with the accepted social niceties and jargon of different platforms.
From a new user’s perspective, there’s often a tendency to carry offline behaviors over into the online environment – without consideration or a full understanding of the consequences. The Gamer has an extensive list of etiquette guidelines frequently broken by online multiplayer gamers, from common courtesies such as not swearing in front of younger users on voice chat and not spamming chat boxes, to not ‘rage-quitting’ a cooperative game out of frustration.
However, when playing in a global arena, gamers might also encounter subtle cultural differences and behave in ways that other groups find offensive.
Another major concern is the need to “stay ahead of the next creative idea in scams and frauds or outright abuse, bullying and even grooming to protect all users” because “fraudsters, scammers and predators are always evolving.”
Multiplayer online gaming is open to exploitation by individuals with malicious intent – from scammers to groomers – simply because of the potential anonymity and the sheer number of gamers taking part simultaneously around the globe.
While The Gamer’s list spells out that kids in particular should never use someone else’s credit card to pay for in-game items, the openness of gaming interactions means such details can easily be obtained through deception or coercion – and that threat needs to be tackled.
A new challenger has entered
In terms of multiplayer online gaming, cyberbullying and its regulation continue to be a prevalent issue. Some of the potential ways in which users can manipulate gaming environments in order to bully others include:
- Ganging up on other players
- Sending or posting negative or hurtful messages (using in-game chat-boxes for example)
- Swearing at or making negative remarks about other players that escalate into bullying
- Excluding certain players from a particular group
- Anonymously harassing strangers
- Duping more vulnerable gamers into revealing personal information (such as passwords)
- Using peer pressure to push others into acts they wouldn’t normally perform
Whilst cyberbullying amongst children is fairly well researched, negative online interactions between adults are far less well documented. The 2019 report ‘Adult Online Harms’ (commissioned by the UK Council for Internet Safety Evidence Group) investigated internet safety issues amongst UK adults and explicitly acknowledged the lack of research into the effects of cyberbullying on adults.
With so much to watch out for, how can online gaming become a safer space for children, teenagers, and adults alike?
Pause
According to a 2019 report for the UK’s converged communications regulator Ofcom: “The fast-paced, highly-competitive nature of online platforms can drive businesses to prioritize growing an active user base over the moderation of online content.
“Developing and implementing an effective content moderation system takes time, effort and finance, each of which may be a constraint on a rapidly growing platform in a competitive marketplace.”
The stats show that 13% of people have stopped using an online service after witnessing the harassment of others. Clearly, targeted harassment, hate speech, and social bullying need to stop if games manufacturers want to minimize churn and avoid losing gamers to competitors.
So how can effective content moderation help?
Let’s look at a case study cited in the Ofcom report. As an example of effective content moderation, it refers to the online multiplayer game ‘League of Legends’, which has approximately 80 million active players. Its publisher, Riot Games, explored a new way of promoting positive interactions.
Players who logged frequent negative interactions were sanctioned with an interaction ‘budget’ or a ‘limited chat mode’. Players who then modified their behavior and logged positive interactions gained release from the restrictions.
As a result of these sanctions, the developers noted a 7% drop in bad language overall and an increase in positive interactions.
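As a rough illustration, the sanction-and-release loop might look something like the Python sketch below. The thresholds, budget size, and field names are all hypothetical – Riot Games hasn’t published its actual logic – but the shape follows the description above: restrict chat after repeated negative reports, and lift the restriction once enough positive interactions accrue.

```python
from dataclasses import dataclass

# Hypothetical values -- illustrative only, not Riot Games' real parameters.
SANCTION_THRESHOLD = 5   # negative reports before chat is restricted
RELEASE_THRESHOLD = 10   # positive interactions needed to lift the restriction
RESTRICTED_BUDGET = 3    # chat messages allowed per match while restricted

@dataclass
class PlayerRecord:
    negative_reports: int = 0
    positive_interactions: int = 0
    restricted: bool = False

def log_negative_report(player: PlayerRecord) -> None:
    """Record a negative interaction; restrict chat once the threshold is hit."""
    player.negative_reports += 1
    if player.negative_reports >= SANCTION_THRESHOLD:
        player.restricted = True
        player.positive_interactions = 0  # release must be earned from scratch

def log_positive_interaction(player: PlayerRecord) -> None:
    """Record a positive interaction; lift the restriction once enough accrue."""
    player.positive_interactions += 1
    if player.restricted and player.positive_interactions >= RELEASE_THRESHOLD:
        player.restricted = False
        player.negative_reports = 0

def chat_budget(player: PlayerRecord) -> int | None:
    """Messages the player may send this match; None means unlimited."""
    return RESTRICTED_BUDGET if player.restricted else None
```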
Continue
‘League of Legends’ shows how a combination of human and AI (Artificial Intelligence) content moderation can encourage more socially positive content.
For example, a number of social media platforms have recently introduced ways of offering users alternatives to potentially harmful or offensive user-generated content (UGC), giving them a chance to self-regulate and make better choices before posting. In addition, offensive language within a post can be translated into a non-offensive form, with users presented with an optional ‘clean version’.
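A minimal sketch of the ‘clean version’ idea is shown below. The replacement table is a toy stand-in, and `suggest_clean_version` is a hypothetical helper – real systems rely on maintained lexicons and trained classifiers rather than a hard-coded dictionary.

```python
import re

# Toy replacement table -- a production system would use a maintained
# lexicon plus an ML classifier, not a hard-coded dict.
CLEAN_FORMS = {
    "idiot": "player",
    "trash": "unlucky",
    "stupid": "questionable",
}

def suggest_clean_version(message: str) -> str | None:
    """Return a non-offensive rewrite of the message, or None if it's clean."""
    pattern = re.compile(
        r"\b(" + "|".join(map(re.escape, CLEAN_FORMS)) + r")\b",
        re.IGNORECASE,
    )
    cleaned, n_subs = pattern.subn(
        lambda m: CLEAN_FORMS[m.group(0).lower()], message
    )
    return cleaned if n_subs else None

# The client can then offer the rewrite before the post is submitted:
draft = "that was a stupid play, you idiot"
alternative = suggest_clean_version(draft)
if alternative:
    print(f"Optional clean version: {alternative}")
```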
Nudging is another technique that can encourage users to question, and delay, posting something potentially offensive. By creating subtle incentives to make the right choice, nudges help reduce the overall number of negative posts.
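In practice, a nudge can be as simple as scoring a draft message and, above some threshold, adding friction: a short hold plus an explicit confirmation. In the hypothetical sketch below, `score_toxicity` is a toy stand-in for a real classifier, and the threshold and delay are made-up values.

```python
import time

TOXICITY_THRESHOLD = 0.3  # hypothetical score above which we nudge
COOLDOWN_SECONDS = 10     # short, deliberate friction before posting

def score_toxicity(message: str) -> float:
    """Toy stand-in for a real classifier returning P(message is toxic)."""
    flagged = {"noob", "trash", "uninstall"}
    words = message.lower().split()
    return sum(word in flagged for word in words) / max(len(words), 1)

def nudge_before_posting(message: str) -> bool:
    """Ask the user to reconsider a borderline message; return True to post it."""
    if score_toxicity(message) < TOXICITY_THRESHOLD:
        return True  # nothing flagged -- post immediately
    print("This message may come across as hostile.")
    print(f"It will be held for {COOLDOWN_SECONDS}s -- edit it, or confirm to send.")
    time.sleep(COOLDOWN_SECONDS)
    return input("Send anyway? [y/N] ").strip().lower() == "y"
```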
Chatbots, disguised as real users, can also be deployed to intervene in response to specific negative comments, such as challenging racist or homophobic remarks and prompting an improvement in the user’s online behavior.
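A heavily simplified intervention bot might look like the following. The trigger terms are left as placeholders, and a production deployment would use a trained classifier and carefully tested messaging rather than keyword matching – this is a sketch of the pattern, not a recommended implementation.

```python
import random

# Placeholder trigger terms -- deliberately not real slurs. A real system
# would use a trained classifier, not keyword matching.
TRIGGER_TERMS = {"<slur-1>", "<slur-2>"}

# Norm-setting replies, posted as if from an ordinary player.
NORM_MESSAGES = [
    "Hey, most people here just want a fun game. Let's keep it friendly.",
    "That kind of talk drives players away. Not cool.",
]

def maybe_intervene(comment: str) -> str | None:
    """Return a norm-setting reply if the comment contains a trigger term."""
    words = set(comment.lower().split())
    if words & TRIGGER_TERMS:
        return random.choice(NORM_MESSAGES)
    return None
```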
Finally, applying a layer of content moderation that catches inappropriate content before it reaches other gamers will help keep communities positive and healthy – ensuring higher engagement and less user churn.
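One common shape for that final layer – sketched here under assumed score cut-offs, not any particular vendor’s implementation – is to let an automated classifier approve or reject the clear-cut cases and route the grey zone to human moderators:

```python
from enum import Enum, auto
from queue import Queue

class Verdict(Enum):
    APPROVE = auto()
    REJECT = auto()
    REVIEW = auto()   # routed to a human moderator

REJECT_ABOVE = 0.9    # hypothetical classifier-score cut-offs
REVIEW_ABOVE = 0.5

human_review_queue: Queue[str] = Queue()

def classify(message: str) -> float:
    """Stand-in for an ML model returning P(message is harmful)."""
    return 0.0  # replace with a real model call

def moderate_before_delivery(message: str) -> Verdict:
    """Gate every message before other gamers can see it."""
    score = classify(message)
    if score >= REJECT_ABOVE:
        return Verdict.REJECT            # clear-cut: block automatically
    if score >= REVIEW_ABOVE:
        human_review_queue.put(message)  # grey zone: humans decide
        return Verdict.REVIEW
    return Verdict.APPROVE
```

This hybrid split is what makes the approach scale: machines handle volume, while humans handle nuance.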
Game Over: Retry?
Making the best of a bad situation, the current restrictions on social interaction offer the gaming industry a great opportunity to draw in a new audience and broaden the market.
The industry also continues to inspire creative innovation in artistry and immersive storytelling, offering new and exciting forms of entertainment, pushing the boundaries of technological possibility, and generating new business models.
However, the gaming industry also needs to take greater responsibility for the safety of gamers online by incorporating robust content moderation strategies. Doing so at scale, especially when audiences are this large, takes far more than manual player intervention or reactive strategies alone.
We remain committed to this challenge at Besedo: using technology to meet the moderation needs of all digital platforms. By combining machine learning, artificial intelligence, and manual moderation techniques, we can build a bespoke set of solutions that can operate at scale.
Contact our team to learn more about content moderation and gaming or to arrange a product demonstration.