
Game changers: New players present new gaming industry challenges


    COVID-19 continues to create new challenges for all. Businesses and consumers are spending an increasing amount of time online – using chat and video conferencing platforms to stay connected and to combat the effects of social distancing and self-isolation.

    We’ve also seen a resurgence of interaction via video games during lockdown, as we explore new ways to entertain ourselves and connect with others. However, a sudden influx of gamers also brings a new set of content moderation issues – for platform owners, game developers, and gamers alike.

    Let’s take a closer look.


    The video game industry was already in good shape before the global pandemic. In 2019, ISFE (Interactive Software Federation of Europe) reported a 15% rise in revenue between 2017 and 2018, with the European market turning over a combined €21bn. Another ISFE report shows that over half of the EU’s population played video games in 2018 – some 250 million players, gaming for an average of nearly 9 hours per week, with a fairly even gender split.

    It’s not surprising that the fastest-growing demographic was the 25-34 age group – the generation that grew up alongside Nintendo, Sony, and Microsoft consoles. However, gaming has broader demographic appeal too: a 2019 survey conducted by AARP (the American Association of Retired Persons) found that 44% of Americans aged 50 and over played video games at least once a month.

    According to GSD (Games Sales Data), in the week commencing 16th March 2020 – right at the start of lockdown – video game sales increased by 63% on the previous week. Digital sales outstripped physical sales, and console sales rose by 155% to 259,169 units over the same period.

    But stats aside, when you consider the level of engagement possible, it’s clear that gaming is about more than just ‘playing’. In April, the popular game Fortnite held a virtual concert with rapper Travis Scott, attended by no fewer than 12.3 million gamers around the world – a record audience for an in-game event.

    Clearly, for gaming, the only way is up right now. But given these sharp increases – and the increasingly creative and innovative ways gaming platforms are being used as social networks – how can developers ensure every gamer stays safe from bullying, harassment, and unwanted content?

    Ready player one?

    If all games have one thing in common, it’s rules. The influx of new gamers presents several content moderation challenges. Firstly, uninitiated gamers (often referred to as noobs, newbies, or nubs) are likely to be unfamiliar with the established rules of online multiplayer games, or with the accepted social niceties and jargon of different platforms.

    From a new user’s perspective, there’s often a tendency to carry offline behaviors over into the online environment – without consideration or a full understanding of the consequences. The Gamer maintains an extensive list of etiquette guidelines frequently broken by online multiplayer gamers – from common courtesies such as not swearing in front of younger users on voice chat and not spamming chat boxes, to not ‘rage-quitting’ a cooperative game out of frustration.

    However, when playing in a global arena, gamers may also encounter subtle cultural differences and inadvertently behave in ways that other groups find offensive.

    Another major concern is the need to “stay ahead of the next creative idea in scams and frauds or outright abuse, bullying and even grooming to protect all users” because “fraudsters, scammers and predators are always evolving.”

    Multiplayer online gaming is open to exploitation by individuals with malicious intent – including groomers – simply because of the potential for anonymity and the sheer number of gamers taking part simultaneously around the globe.

    The Gamer’s list spells out that kids in particular should never use someone else’s credit card to pay for in-game items. But given just how open gaming can be from an interaction perspective, the fact that these details can be obtained through deception or coercion needs to be tackled.

    A new challenger has entered

    In multiplayer online gaming, cyberbullying and its regulation remain a prevalent issue. Users can manipulate gaming environments to bully others in several ways, including:

    • Ganging up on other players
    • Sending or posting negative or hurtful messages (using in-game chat boxes, for example)
    • Swearing at or making negative remarks about other players that escalate into bullying
    • Excluding particular players from a group
    • Anonymously harassing strangers
    • Duping more vulnerable gamers into revealing personal information (such as passwords)
    • Using peer pressure to push others into performing acts they wouldn’t normally perform

    While cyberbullying among children is fairly well researched, negative online interactions between adults are far less well documented. The 2019 report ‘Adult Online Harms’, commissioned by the UK Council for Internet Safety Evidence Group, investigated internet safety issues among UK adults and explicitly acknowledged the lack of research into the effects of cyberbullying on adults.

    With so much to look out for, how can online gaming become a safer space for children, teenagers, and adults alike?

    Pause

    According to a 2019 report for the UK’s converged communications regulator Ofcom: “The fast-paced, highly-competitive nature of online platforms can drive businesses to prioritize growing an active user base over the moderation of online content.

    “Developing and implementing an effective content moderation system takes time, effort and finance, each of which may be a constraint on a rapidly growing platform in a competitive marketplace.”

    Statistics show that 13% of people have stopped using an online service after witnessing the harassment of others. Clearly, targeted harassment, hate speech, and social bullying need to be stopped if game makers want to minimize churn and avoid losing gamers to competitors.

    So how can effective content moderation help?

    Let’s look at a case study cited in the Ofcom report. As an example of effective content moderation, it refers to the online multiplayer game ‘League of Legends’, which has approximately 80 million active players. The publisher, Riot Games, explored a new way of promoting positive interactions.

    Users who logged frequent negative interactions were sanctioned with an interaction ‘budget’ or ‘limited chat mode’. Players who then modified their behavior and logged positive interactions gained release from the restrictions.

    As a result of these sanctions, the developers noted a 7% drop in bad language overall and a general increase in positive interactions.
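
    To make the mechanism concrete, here is a minimal sketch of how a behavior-driven chat restriction of this kind might work. The class, thresholds, and scoring below are hypothetical illustrations, not Riot Games’ actual system.

    from dataclasses import dataclass

    @dataclass
    class PlayerRecord:
        negative_reports: int = 0  # e.g. chat messages flagged by other players
        positive_marks: int = 0    # e.g. commendations from teammates

        def is_restricted(self) -> bool:
            # Restrict chat once negative reports clearly outweigh positives
            # (the threshold of 3 is an arbitrary illustration).
            return self.negative_reports - self.positive_marks >= 3

        def chat_budget(self) -> int:
            # Restricted players get a small per-match message allowance;
            # everyone else effectively chats freely.
            return 5 if self.is_restricted() else 10_000

    player = PlayerRecord(negative_reports=4)
    print(player.is_restricted(), player.chat_budget())  # True 5

    # Positive behavior works the restriction off again.
    player.positive_marks += 3
    print(player.is_restricted(), player.chat_budget())  # False 10000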

    Continue

    As the ‘League of Legends’ example shows, a combination of human and AI (artificial intelligence) content moderation can encourage more socially positive content.

    For example, a number of social media platforms have recently introduced ways of offering users alternatives to potentially harmful or offensive UGC (user-generated content), giving them a chance to self-regulate and make better choices before posting. In addition, offensive language within a post can be translated into a non-offensive form, and users presented with an optional ‘clean version’.
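
    A deliberately naive sketch of such a ‘clean version’ pass is shown below. The word list and function are invented for illustration; real systems rely on far more sophisticated language models than simple word substitution.

    import re

    # Hypothetical mini-lexicon mapping offensive terms to neutral ones.
    REPLACEMENTS = {
        "idiot": "player",
        "trash": "inexperienced",
    }

    def clean_version(message: str) -> str:
        # Replace each listed term as a whole word, case-insensitively.
        pattern = re.compile(r"\b(" + "|".join(REPLACEMENTS) + r")\b", re.IGNORECASE)
        return pattern.sub(lambda m: REPLACEMENTS[m.group(0).lower()], message)

    original = "You are trash, idiot"
    suggestion = clean_version(original)
    if suggestion != original:
        print(f"Post this instead? -> {suggestion}")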

    Nudging is another technique that can be used to encourage users to question – and delay – posting something potentially offensive, creating subtle incentives to make the right choice and thereby helping to reduce the overall number of negative posts.
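
    In practice, a nudge can be as simple as a confirmation step for messages flagged as risky. The function and threshold below are hypothetical; the toxicity score would come from an upstream classifier.

    def maybe_nudge(message: str, toxicity_score: float, threshold: float = 0.7) -> bool:
        """Return True if the message should be posted."""
        if toxicity_score < threshold:
            return True  # nothing to nudge about
        # Add a moment of friction before a potentially hurtful post goes out.
        answer = input("This message may come across as hurtful. Post it anyway? [y/N] ")
        return answer.strip().lower() == "y"

    if maybe_nudge("uninstall the game, noob", toxicity_score=0.85):
        print("posted")
    else:
        print("discarded")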

    Chatbots disguised as real users can also be deployed to intervene in response to specific negative comments – challenging racist or homophobic remarks, for example – and prompt an improvement in the user’s online behavior.
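
    A minimal sketch of such a rule-triggered intervention bot follows. The category labels, bot name, and canned responses are invented; in a real deployment the category would come from a moderation classifier.

    from typing import Optional

    # Hypothetical canned responses, keyed by the category a classifier assigns.
    INTERVENTIONS = {
        "racist": "Hey, let's keep this match welcoming for everyone.",
        "homophobic": "That kind of comment isn't okay here.",
    }

    def bot_reply(flagged_category: Optional[str]) -> Optional[str]:
        # Return a scripted intervention for known categories, else nothing.
        return INTERVENTIONS.get(flagged_category)

    reply = bot_reply("racist")
    if reply:
        print(f"CommunityHelper: {reply}")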

    Finally, applying a layer of content moderation that catches inappropriate content before it reaches other gamers will help keep communities positive and healthy – ensuring higher engagement and less user churn.

    Game Over: Retry?

    Making the best of a bad situation, the current restrictions on social interaction offer a great opportunity for the gaming industry to draw in a new audience and broaden the market.

    The industry also continues to inspire creative innovations in artistry and immersive storytelling, offering new and exciting forms of entertainment, pushing the boundaries of technological possibility, and generating new business models.

    However, the gaming industry also needs to take greater responsibility for the safety of gamers online by incorporating robust content moderation strategies – even though doing so at scale, with audience numbers this great, takes far more than manual player intervention or reactive strategies alone.

    This is a challenge we remain committed to at Besedo – using technology to meet the moderation needs of all digital platforms. Through a combination of machine learning, artificial intelligence, and manual moderation techniques, we can build a bespoke set of solutions that can operate at scale.

    To find out more about content moderation and gaming, or to arrange a product demonstration, contact our team!
