In the technology industry, we like to talk a lot about ‘disruption’. Digitalizing something, we sometimes imagine, means that it will play by completely new rules, and moving it online means that we should forget everything we thought we knew about it.

In some ways this is true – but while the technology which underpins a sector might change, it is still ultimately selling to human beings. This means that, to a greater degree than we often acknowledge, the underlying nature of digital businesses is inherited from their pre-internet, analogue reality. The buying motivations for food or clothing, for example, are similar whether the shopper is on the high street or on an app.

Gaming is no different in this regard. Play is at least as old as civilization, and the emotional motivators of competition, cooperation, and self-improvement are as present in esports and online gaming as they are in any previous form of gamesmanship. Likewise, while history doesn’t record how the earliest sportspeople spoke to one another, it’s a fair guess that verbal communication has always been a key part of the pleasure we take in play: the meta-game of talking to one’s opponent is a fundamental ingredient.

The challenge of words and play

At the same time, online gaming does deliver something new in its ability to match players from across the globe, the speed at which communities develop and evolve around games, and – importantly – the responsibilities that a business therefore has to its game’s userbase.

The ability of online gaming to transcend geography sets it apart from non-digital play, where only elite players are likely to face international opponents. While this adds to its appeal, it also, in common with many online platforms, creates the potential for users to engage in damaging behaviors.

The fact that verbal abuse – often targeting things like a player’s age, gender, or ethnicity – is widespread in online gaming and difficult to combat has been widely recognized in academic studies. One recent analysis of how players interact in Dota 2, for example, worryingly found that, while younger players are more likely to be penalized for ‘communications abuse’, older players are more likely to actually take part in verbal abuse, suggesting that systems for managing abuse are not being properly targeted.

This difficulty in moderation is exacerbated by both the ambiguity of acceptable speech and the specific, rapidly changing language used in online games. “git gud”, for example, might be used abusively against a particular person, or just to bemoan one’s own lack of skill; “get rekt” might be part of a perfectly acceptable victory celebration, or part of sustained negative attention directed at an opposing player.
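
To make the challenge concrete, here’s a minimal sketch – purely illustrative, not Besedo’s system – of why keyword matching alone falls short: the same phrase appears in friendly banter and in targeted abuse, and a context-blind filter treats the two identically.

```python
# Illustrative only: a naive keyword filter for gaming slang.
FLAGGED_PHRASES = {"get rekt", "git gud"}

def naive_flag(message: str) -> bool:
    """Flag a message if it contains any listed phrase, ignoring context."""
    text = message.lower()
    return any(phrase in text for phrase in FLAGGED_PHRASES)

friendly = "ggwp everyone, get rekt :) same time tomorrow?"
abusive = "get rekt you worthless noob, uninstall and never come back"

# Both print True: the filter cannot tell celebration from targeted abuse.
print(naive_flag(friendly), naive_flag(abusive))
```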

It’s an environment which poses real challenges to both AI and human moderation approaches. For humans, the speed and volume of interactions makes thorough oversight difficult; for AI, the variability of the language and its contextual nuance makes keeping up a tall order.


The power of speech

Ultimately, however, there is a clear need to carry the verbally interactive aspects of play forward into online gaming. A successful online game is one that not only brings players in, but keeps them engaged, constructing a community around the competition. While the negative potential of communication is a risk to that, the ability to communicate is also a key ingredient for sustainable growth.

Getting it right, therefore, means mirroring the complex nature of these interactions with a nuanced approach to content moderation. That means developing systems which combine the best of human and automated oversight in bespoke ways which are specific to the nature and dynamics of the game’s community.

Rules, after all, are also part of the essence of games. Just as referees in real-world sports keep play within appropriate limits, moderation of speech in online gaming can be seen not just as a way of punishing negative behavior, but as an opportunity to enable positive interactions. Sometimes, a positive interaction will be one in which players have the space and opportunity to taunt, criticize, and motivate one another; content moderation needs to evolve to keep up with that fact.


Why creating sustainable growth means looking beyond the digital present

Over the past decade, it has become common to suggest that every company is now a tech company.

The exponential growth in digital usage quickly outgrew what we traditionally think of as the technology sector and, for users, the agility of the internet didn’t stay confined to the online world. Technology has shifted expectations about how everything can or should work. Soon, companies selling everything from furniture to financial services started to look and act more like innovative tech companies, finding new ways to solve old problems through digital channels.

In other words, business leaders seeking to guarantee growth turned to digital technology – to the point that, now, the Chief Technology Officer is a key part of the C-suite.

After a year when we’ve all relied on the internet more than ever, in every aspect of our lives, growth through digital has never been more apparent. For business, digital communication has at times been the only possible way of staying in touch with customers, and there’s no sign that the CEO’s focus on agility and technology is fading. In recent surveys, IBM found that 56% of CEOs are ‘aggressively pursuing operational agility and flexibility’, PwC found that they see cyber threats as the second biggest risk to business, and Deloitte found that 85% think the pandemic accelerated digital transformation.

If the exponential growth of digital has made every company a technology company, though, it has also made terms like ‘technology’ and ‘agility’ less useful. If every CEO is pursuing a digital strategy, that term must encompass a vast range of different ideas. As we look towards the next decade of growth – focused on managing the challenge of achieving more responsible and sustainable business along the way – we will need to think carefully about what comes next once digitalisation is universal.

Supercharged tech growth has skyrocketed user-generated content

Of course, the importance of agile technology has never been the tech itself, but what people do with it. For customers, we’ve seen tech innovation create new ways of communicating, direct access to brands, and major changes in how we consume media and make purchases.

As digital channels take on a greater share of activity than ever, one of the effects of an exponential growth in digital is an exponential growth in user-generated content (UGC).

This user-led interaction, from product reviews to marketplace listings to social interactions, fully embodies the agility that companies have spent the last decade trying to bring to their processes; because it is made by people, UGC is rapid, diverse, and flexible by default. While it may be too soon to say that every business will become a content business, it’s clear that this will become an increasingly important part of how businesses operate. Certainly, it’s already a major driving force for sectors as diverse as marketplaces, gaming, and dating.

A UGC business must be protected to maximise opportunity

In the move towards UGC, a business’s user interaction and user experience will have consequences across the organisation – from profit margin, to brand positioning, to reputational risk, to technological infrastructure. Across all of these, there will be a need to uphold users’ trust that content is being employed responsibly, that they are being protected from malign actors, and that their input is being used for their benefit. Turning content into sustainable growth, then, is a task that needs to be addressed across the company, not confined to any one business function.

Marketers, for instance, have benefited from digitalisation’s capacity to make the customer experience richer and more useful – but it has also introduced an element of unpredictability in user interactions. When communities are managed and shaped, marketers need to ensure that those efforts produce a public face in line with the company’s ethos and objectives.

While tech teams need to enable richer user interaction, their rapid ascent to become a core business function has left them under pressure to do everything, everywhere. Their innovation in how content is managed, therefore, needs a middle path between the unsustainable workload of in-house development and the unsustainable compromises of off-the-shelf tooling.

With the ultimate outcomes of building user trust being measured in terms of things like brand loyalty and lifetime user value, finance departments will also need to adapt to this form of customer relationship. The creation of long-term financial health needs investments and partnerships which truly understand how the relationship between businesses and customers is changing.


UGC as a vital asset for sustainable business growth

Bringing this all together will be the task needed to create sustainable growth – growth which is fit for and competitive in the emerging context of UGC, sensitive to the increasing caution that users will have around trusting businesses, and transparent about the organisation’s ethos, purpose, and direction. It will require not just investing in technology, but understanding how tech is leading us to a more interactive economy at every scale.

As digitalisation continues to widen and deepen, we may find UGC, and the trust it requires, becoming just as vital an asset for businesses as product stock or intellectual property. To prepare for that future and maximise their business growth from their UGC, businesses need to start thinking and planning today.

By Petter Nylander

CEO Besedo Global Services


The lockdown’s lifting – at least in some parts of the world. But it’ll be some time yet before the wheels of global commerce begin to turn with any degree of regularity once again.

While it’s easy to assume that the shift to remote working, online shopping, and video-socializing has positively impacted most digital businesses and online marketplaces, that’s not necessarily the case.

However, while many digital services are undoubtedly thriving, this surge in demand continues to highlight different issues for many others – from a security, capacity, and scalability perspective.

In a similar way, companies that use technology to facilitate offline services – such as socializing, dating, or the exchange of services – are having to pivot to find new ways to stay relevant and active.

Let’s take a closer look at how different digitally-driven companies across sectors are addressing and overcoming the challenges they face.

Loud & Clear

One area that’s seen huge expansion during the lockdown is videoconferencing. It’s easy to see why.

Prior to the pandemic, one particular platform called Zoom was growing steadily, mostly among business customers. With 10 million active daily users back in December 2019, expectations were moderately ambitious. But fast-forward to April 2020, and user numbers had grown to an astonishing 300 million.

We all know what happened there. But then something else became apparent – Zoom wasn’t as secure as many users first thought. Cue an onslaught of privacy issues, such as ‘Zoom bombers’ and other uninvited video chat guests intent on password and identity theft.

To counter the issues the platform faced, the team has now rolled out end-to-end encryption – though only for its paid users. But despite these issues, Zoom continues to make massive profits, making $27m between February and April 2020: a sharp increase compared with its $198,000 profit 12 months earlier.

So what’s Zoom’s secret? People need it right now. Not just businesses intent on maintaining contact between usually office-based staff, but everyone else too – from those looking to connect with families and friends, to the global events industry, which has moved talks, seminars, and other discussion-based happenings to the digital realm (as exemplified by the recent Global Online Classifieds Summit).

But is its success sustainable? While it’s clear that ‘encryption-for-some’ must become ‘encryption-for-all’ in the long-term, right now it seems need outweighs any particular risk.

In short, it’s become an essential utility for many.

Eking Out A Living From eCommerce

In a similar way, the lockdown has sparked a massive upturn in online shopping. Given that over a third of shoppers are apparently unwilling to return to bricks-and-mortar stores until a COVID-19 vaccine is available, it’s not surprising that many large online retailers, fulfillment services, and manufacturers are reporting demand outstripping anything they could have prepared for.

Of course, Amazon, the global eCommerce giant, is leading the way, as we’d expect – with Q1 2020 results up 26% year-on-year. In fact, given the increased demand for its services, Amazon has recruited an additional 175,000 people during the COVID-19 crisis.

Before its financial announcements, the company was reportedly making $11,000 per second back in April. However, it transpires that Amazon is actually making a loss right now: all of the extra revenue is being used to pay workers and increase capacity.

Looking ahead, the mighty online retailer is unlikely to be toppled anytime soon – though this clearly demonstrates that even Amazon has had to prioritize meeting demand over doubling down on profitability.

But while Amazon’s offline order fulfillment service may be suffering, it’s not hard to believe that losses are being offset by its purely digital services – TV, music, eBooks, and cloud computing. Diversification has presumably been its saving grace.

Beyond The Ban

However, many other businesses that essentially use digital services to enhance the customer experience – and automate backend processes such as data collation, CRM functionality, and order processing – are facing tough times.

Take the travel sector, for instance, which has probably taken the hardest hit of all, given the restrictions put in place to stop the spread of coronavirus.

In the absence of being able to guarantee immediate bookings, many companies are asking customers to book for 2021 already, in an attempt to maintain cash flow and remain operational.

However, companies that would usually generate smaller profits from multiple bookings and casual stays could lose out if things don’t recover quickly. In cases like these, it really is a case of the strongest surviving.

But that said, some well-placed creativity and innovation can go a long way.

Take Airbnb, for example, which has recently rolled out its new Online Experiences initiative – not only to boost revenues, but to give customers a taste of what everyone’s missing out on and to help bring people closer together, in a way that picks up where Airbnb’s popular in-person experiences left off.

Using the service, customers can learn from and interact with experts and enthusiasts from all over the world, doing everything from family baking sessions to history quizzes – for both fun and educational purposes.

Could it be that the service that started life as a couch-surfing app becomes a bona fide education platform? Only time will tell. But Airbnb’s well-timed pivot certainly plays to its strengths.

In Sweden, travel company Apollo Tours has started focusing on the domestic market rather than far-flung destinations. Anticipating that international travel will take a while to become fully operational again, Apollo is offering and organizing local activities and training sessions – for everything from mountain biking to yoga – to give customers something proactive to do during the summer vacation, either alone or in small groups.


Love In A COVID-19 Climate

Interaction is just as important as stimulation; we’re social creatures, after all. And while many of us have learned to deal with being distanced from our loved ones, what about those looking for love – the countless singletons and lonely hearts out there unable to meet prospective partners in person?

Well, dating apps and platforms open doors to new matches. They provide a safe space to interact, message, and meet new people who share the same interests and outlook.

While Tinder’s going all out encouraging users to go on virtual dates – co-watching Netflix shows and movies, ordering takeout from the same place and dining by FaceTime – the stark advice to maintain distance and avoid sneaky visits to your intended’s sleeping quarters remains in place.

Up-and-coming app Hinge is attempting to bridge the lockdown divide with its own bespoke ‘date from home’ feature – connecting matched users who are ready to video chat there and then.

Admirable as these efforts to capture spontaneity, meaningful connection, and quality time may be, in effect they haven’t deviated too far from the platforms’ original offerings.

These features might actually be kept long-term for those keen to maintain physical distance before meeting someone new in person, or where busy schedules don’t allow. But let’s be honest: there’s no substitute for face-to-face meetings where affairs of the heart are concerned.

Focus On Users: Nothing Else

Ultimately, what can you do when the very nature of your business model is under threat? You find ways to give your customers what they want.

As the companies mentioned are realizing, supporting users is what counts – offering real value, in the most authentic, meaningful way possible.

It’s about putting them first – not just to keep them engaged and subscribed to your service or platform, but to genuinely offer help and support during a difficult time.

This was a sentiment echoed when we recently spoke with Geir Petter Gjefsen, fraud manager at online marketplace FINN.no. By focusing on its users, and actively encouraging them to ask for help or to help others during the crisis, FINN.no not only recovered from an initial dip in traffic but also formed deeper customer bonds.

Similarly, eBayK (eBay Kleinanzeigen), a free online classifieds marketplace committed to sustainable trade, created a ‘Neighborhood Help’ category where customers could offer their services – from dog walking to tuition – as the world faced COVID-19 uncertainty. The result? A spike in traffic and 40 million live ads.

All things considered, to stay afloat, maintain customer loyalty, and come out the other side of the crisis intact, digital businesses need to be agile. They need to adapt by focusing on their own strengths and tailoring their offering even more closely to what their customers need. And as teams focus on delivering the best possible service, machine learning, AI, and outsourced agents can play a part in helping moderate the content itself.

Now is the time for action and innovation. After all, what have you got to lose?


COVID-19 continues to create new challenges for all. We’re seeing businesses and consumers spend an increasing amount of time online – using chat and video conferencing platforms to stay connected and combat social distancing and self-isolation.

We’ve also seen the resurgence of interaction via video games during the lockdown, as we explore new ways to entertain ourselves and connect with others. However, a sudden influx of gamers also brings a new set of content moderation issues – for platform owners, games developers, and gamers alike.

Let’s take a closer look.

Loading…

The video game industry was already in good shape before the global pandemic. In 2019, ISFE (Interactive Software Federation of Europe) reported a 15% rise between 2017 and 2018, with the industry turning over a combined €21bn. Another ISFE report shows that over half of the EU’s population played video games in 2018 – some 250 million players, gaming for an average of nearly 9 hours per week, with a fairly even gender split.

It’s not surprising that the fastest-growing demographic was the 25-34 age group – the generation who grew up alongside Nintendo, Sony, and Microsoft consoles. However, gaming has broader demographic appeal too. A 2019 survey conducted by AARP (American Association of Retired Persons) revealed that 44% of Americans aged 50+ enjoyed video games at least once a month.

According to GSD (Games Sales Data), in the week commencing 16th March 2020 – right at the start of the lockdown – video game sales increased by 63% on the previous week. Digital sales have outstripped physical sales too, and console sales rose by 155% to 259,169 units in the same period.

But stats aside, when you consider the level of engagement possible, it’s clear that gaming is more than just ‘playing’. In April, the popular game Fortnite held a virtual concert with rapper Travis Scott, attended by no fewer than 12.3 million gamers around the world – a record audience for an in-game event.

Clearly, for gaming the only way is up right now. But given the sharp increases – and the increasingly creative and innovative ways gaming platforms are being used as social networks – how can developers ensure every gamer remains safe from bullying, harassment, and unwanted content?

Ready Player One?

If all games have one thing in common, it’s rules. The influx of new gamers presents new content moderation challenges in a number of ways. Firstly, uninitiated gamers (often referred to as noob/newbie/nub) are likely to be unfamiliar with the established, pre-existing rules of online multiplayer games, or with the accepted social niceties and jargon of different platforms.

From a new user’s perspective, there’s often a tendency to carry offline behaviours over into the online environment – without consideration or a full understanding of the consequences. The Gamer has an extensive list of etiquette guidelines that online multiplayer gamers frequently break, from common courtesies such as not swearing in front of younger users on voice chat and not spamming chat boxes, to not ‘rage-quitting’ a co-operative game out of frustration.

However, when playing in a global arena, gamers might also encounter subtle cultural differences and behave in ways that other groups of people consider offensive.

Another major concern, which affects all online platforms, was outlined by Otis Burris, Besedo’s Vice President of Partnerships, in a recent interview: the need to “stay ahead of the next creative idea in scams and frauds or outright abuse, bullying and even grooming to protect all users” because “fraudsters, scammers and predators are always evolving.”

Multiplayer online gaming is open to exploitation by individuals with malicious intent – including grooming – simply because of the potential anonymity and the sheer number of gamers taking part simultaneously around the globe.

While The Gamer’s list spells out that kids (in particular) should never use someone else’s credit card to pay for in-game items, when you consider just how open gaming can be from an interaction perspective, the fact that these details could easily be obtained by deception or coercion needs to be tackled.

A New Challenger Has Entered

In terms of multiplayer online gaming, cyberbullying and its regulation continue to be a prevalent issue. Users can manipulate gaming environments to bully others in a number of ways.

Whilst cyberbullying amongst children is fairly well researched, negative online interactions between adults are less well documented and studied. The 2019 report ‘Adult Online Harms’, commissioned by the UK Council for Internet Safety Evidence Group, investigated internet safety issues amongst UK adults and acknowledges the lack of research into the effects of cyberbullying on adults.

With so much to be on the lookout for, how can online gaming become a safer space to play in for children, teenagers, and adults alike?


Pause

According to a 2019 report for the UK’s converged communications regulator Ofcom: “The fast-paced, highly-competitive nature of online platforms can drive businesses to prioritize growing an active user base over the moderation of online content.

“Developing and implementing an effective content moderation system takes time, effort and finance, each of which may be a constraint on a rapidly growing platform in a competitive marketplace.”

The stats show that 13% of people have stopped using an online service after observing the harassment of others. Clearly, targeted harassment, hate speech, and social bullying need to stop if games manufacturers want to minimize churn and avoid losing gamers to competitors.

So how can effective content moderation help?

Let’s look at a case study cited in the Ofcom report. As an example of effective content moderation, they refer to the online multiplayer game ‘League Of Legends’ which has approximately 80 million active players. The publishers, Riot Games, explored a new way of promoting positive interactions.

Players whose interactions were frequently logged as negative were sanctioned with an interaction ‘budget’, or ‘limited chat mode’. Players who then modified their behavior and logged positive interactions were released from the restrictions.

As a result of these sanctions, the developers noted a 7% drop in bad language in general and an overall increase in positive interactions.
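
For readers who like to see the mechanic spelled out, here’s a minimal sketch of how such a behaviour-based restriction could work. The thresholds, field names, and release rule are illustrative assumptions, not Riot’s actual implementation.

```python
from dataclasses import dataclass

@dataclass
class PlayerChat:
    restricted: bool = False   # 'limited chat mode'
    messages_left: int = 3     # assumed per-match allowance while restricted
    positive_streak: int = 0   # consecutive positively-rated matches

    def log_negative_report(self) -> None:
        # Negative interactions place the player under restriction.
        self.restricted = True
        self.positive_streak = 0

    def log_positive_match(self, release_after: int = 5) -> None:
        # Sustained good behaviour earns release from the restriction.
        if self.restricted:
            self.positive_streak += 1
            if self.positive_streak >= release_after:
                self.restricted = False

    def may_send_message(self) -> bool:
        return not self.restricted or self.messages_left > 0

player = PlayerChat()
player.log_negative_report()
for _ in range(5):             # five clean matches in a row
    player.log_positive_match()
print(player.restricted)       # False – the restriction has been lifted
```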

Continue

Taking ‘League Of Legends’ as an example, a combination of human and AI (Artificial Intelligence) content moderation can encourage more socially positive content.

For example, a number of social media platforms have recently introduced ways of offering users alternatives to potentially harmful or offensive UGC (user-generated content), giving users a chance to self-regulate and make better choices before posting. In addition, offensive language within a post can be translated into non-offensive forms, with users presented with an optional ‘clean version’.
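
As a rough illustration of the ‘clean version’ idea, the sketch below simply masks flagged terms and offers the result back to the user. Real systems attempt genuine rewording with trained models; the word list here is a stand-in.

```python
import re

# Stand-in lexicon; a production system would use a trained model and
# attempt a genuine rewording rather than simple masking.
OFFENSIVE_TERMS = ["trash", "idiot", "loser"]

def clean_version(post: str) -> str:
    """Return the post with flagged terms masked out."""
    pattern = re.compile("|".join(map(re.escape, OFFENSIVE_TERMS)), re.IGNORECASE)
    return pattern.sub("****", post)

original = "You played like trash, you idiot"
suggestion = clean_version(original)
if suggestion != original:
    # The user can accept the clean version or post their original text.
    print(f"Post this instead? {suggestion!r}")
```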

Nudging is another technique which can be employed to encourage users to question and delay posting something potentially offensive, creating subtle incentives to make the right choice and thereby helping to reduce the overall number of negative posts.
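
A nudge can be as simple as a short delay plus a confirmation prompt. The following sketch is a hypothetical illustration – the classifier is a stand-in for a real toxicity model.

```python
import time

def looks_offensive(message: str) -> bool:
    # Stand-in heuristic for a real toxicity classifier.
    return any(word in message.lower() for word in ("idiot", "trash", "loser"))

def submit_with_nudge(message: str, confirm) -> bool:
    """Return True if the message ends up being posted."""
    if looks_offensive(message):
        time.sleep(2)  # a brief pause: friction alone reduces impulsive posts
        return confirm("Do you really want to say this?")
    return True

# Simulate a user who thinks better of it and cancels at the prompt.
posted = submit_with_nudge("you absolute idiot", confirm=lambda prompt: False)
print(posted)  # False – the nudge worked
```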

Chatbots, disguised as real users, can also be deployed to make interventions in response to specific negative comments posted by users, such as challenging racist or homophobic remarks and prompting an improvement in the user’s online behavior.

Finally, applying a layer of content moderation to ensure that inappropriate content is caught before it reaches other gamers will help keep communities positive and healthy – ensuring higher engagement and less user churn.

Game Over: Retry?

Making the best of a bad situation, the current restrictions on social interaction offer a great opportunity for the gaming industry to draw in a new audience and broaden the market.

It also continues to inspire creative innovations in artistry and immersive storytelling, offering new and exciting forms of entertainment, pushing the boundaries of technological possibility, and generating new business models.

But the gaming industry also needs to take greater responsibility for the safety of gamers online by incorporating robust content management strategies – even though doing so at scale, especially when audience numbers are so great, takes a lot more than manual player intervention or reactive strategies alone.

This is a challenge we remain committed to at Besedo – using technology to meet the moderation needs of all digital platforms. Through a combination of machine learning, artificial intelligence, and manual moderation techniques we can build a bespoke set of solutions that can operate at scale.

To find out more about content moderation and gaming, or to arrange a product demonstration, contact our team!


User safety is key for all online platforms, particularly when you’re dealing with vulnerable youngsters. Moderating can be challenging and getting the balance between censorship and safety right can be hard.

We sat down with industry veteran and founder of Friendbase, Deborah Lygonis, to discuss the experience she’s gained from developing and running a virtual world for teens.

Deborah Lygonis - Founder of Friendbase

Interviewer: Hi Deborah. Could you please give us a short introduction to yourself?

Deborah: My name is Deborah Lygonis and I am a serial entrepreneur. I have started and run several businesses over the years, mainly within the software and gaming sector, but also e-health and other tech. I love tech and I’m passionate about startups and entrepreneurship. I also work as a coach and mentor for entrepreneurs within the European Space Agency Business Incubator (the ESA BIC) and for a foundation called Entrepreneurs Without Borders.

So, we put together a mockup of Android, iOS, and web versions and put it out there to see if it was something that today’s young people would like.

Interviewer: Definitely! What’s in the future for Friendbase? Where are you in two years?

Deborah: Where are we now? We’re now raising funds, because what we’ve seen is that we have a very, very loyal member base, and they want to invite more of their friends. And I think that with very, very little work, we can get the platform on a really interesting growth path.

Friendbase is a virtual world for teens where they can chat, play games, and design their looks and spaces. Now we’re also moving towards edtech, in that we’ll be introducing quizzes that are both fun and have learning elements in them.

Interviewer: That sounds awesome. What would you say is the main challenge when it comes to running a cross-platform online community, specifically one that caters to teens?

Deborah: There are a lot of challenges with startups in general, but also, of course, with running an online community. One challenge is when you have people that meet each other in the form of avatars and written chat: they have different personalities and different backgrounds, and that can cause them to clash. The thing is that when you write in a chat, the nuances in the language don’t come through as they do when you have a conversation face to face. It’s really very hard to judge the small subtleties in language, and that can lead to misunderstandings.

Add to that that there are lots of different nationalities online. That in itself can lead to misunderstandings, because people don’t speak the same language.

What starts off as a friendly conversation can actually rapidly deteriorate and end up in a conflict just because of these misunderstandings. That is a challenge, but that’s a general challenge, I think, with written social interactions.

Interviewer: Just so we understand how Friendbase works – do you have one-to-one chat, one-to-many chat, or group chats? How does it work?

Deborah: The setup is that we can have up to 20 avatars in one space – no more, because then it would get too cluttered on small phone screens. So, you can have group chats. I mean, you see the avatars, and then they have a text bubble as they write, so there can be several people in one conversation.

Interviewer: Do you have the opportunity for groups of friends to form and join the same kind of space together?

Deborah: Yes. Each member has their own space. They can also invite other friends and open up their space to them.

Interviewer: And in that regard – what you often see in the real world with team dynamics is that there is a group of friends, and there are the popular people in that group, and then one person who is maybe a little bit of an outsider, who will at times be bullied by the rest of the group. Do you see people ganging up on each other sometimes?

Deborah: I haven’t seen groups of people ganging up on one individual. It’s more the other way around. There are individuals that are out to cause havoc and who are just online to be toxic.

Interviewer: So in general, you have a really nice and good user base, but then there are the rotten fruits that come in from time to time.

Deborah: That is what it is like today. We are still at a fairly early stage, though, when it comes to the number of users, so I would expect this to change over time. And this is something that we’re prepared for. We added safety tools at a really early stage to be able to learn how to handle issues like this, and also how to moderate the platform when incidents occur. So, I think that even though we don’t have that type of ganging up at the moment, I would expect it to happen in the future.

Interviewer: But it sounds like you’re prepared for it. Now, you’ve made a really nice segue into my next question: what are the main moderation challenges you’ve experienced running Friendbase? What are the main challenges right now, and what do you expect to have to handle later on?

Deborah: I think that a challenge in itself for all social platforms is to set the bar on what is acceptable and not.
Our target group is mid-teens and up, so we don’t expect young children to be on Friendbase. We feel that if we made a social world for young children, we’d need a completely different, more controlled set of regulations than for teenagers and upwards.
However, that demographic is also very vulnerable. So, of course, there have to be some sorts of measures in place. The challenge is to determine at what level you want to put the safety bar, and also how you can tell the difference between banter between friends and the point where it flips over into actually being toxic or bullying. That’s something that is really, really hard to differentiate between. And I think that if you work with chat filters, then you have to have some sort of additional reporting system for when the filters don’t manage this challenge. The filter is only a filter and can’t distinguish between the two. So that’s one challenge. It’s also complex to enforce the rules that are in place to protect the users without being perceived as controlling or patronizing.
At the moment, we also have a challenge in that we have users that come back solely to cause havoc and create a toxic environment. We track them down and we ban their accounts, but it’s a continuous process.
Should that escalate over time, it will become increasingly time consuming. That’s why it’s really, really important for us to have tools in place so that it doesn’t all have to be moderated manually. That would just take too much resource and time.
Of course, you have the even darker side of the internet; sexual predators that are out to groom vulnerable youngsters and to get them to maybe move over to a different platform where they can be used in a way that is extremely negative.
That’s something that is difficult to handle. But today, thanks to artificial intelligence, there are again amazing toolsets out there: there are attempts to look at speech patterns and try to identify that sort of behavior. And it’s also really great to have your own toolsets where the user can actually report someone if they feel threatened or if they feel that someone’s really creepy.

Interviewer: When you have returning users who have made it their goal to attack the platform in a malicious way, do you see that it’s the same people returning, based on their IP or the way that they talk?

Deborah: It’s not always possible to see it based on their IP, because they use different ways of logging in. However, given their behavior, we can quickly identify them. And we have a group of ambassadors online on Friendbase that help us. On top of that, we have a chat filter which can red-flag certain behavior, so that helps as well.

There is a group that comes back over and over again, and for some mysterious reason they always use the same username, so they’re not that hard to identify. That group is actually easier to control than a group with a different motive for being online and targeting youngsters. The toxic ones that are just there because they think it’s fun to behave badly – it’s easy to find them and close down their accounts.

Interviewer: We already touched upon this, but what would you say is the hardest moderation challenge to solve for you right now?

Deborah: The hardest moderation challenge to solve is, of course, finding the people who are deliberately out to target lonely youngsters that hunger for social contact. The whole grooming issue online is a problem. We are constantly trying to find new toolsets and encourage our users to contact us if there’s something that doesn’t feel right. So grooming is something that we’re very, very much aware of. If we happen to shut down someone’s account by mistake for a couple of hours, they’re most welcome to come to us and ask why. But we’d rather be safe than sorry when it comes to this kind of behavior. However, it is hard to track because it can be so very, very subtle in the beginning.

Interviewer: Friendbase has been around for a while now. Are there any challenges that have changed or increased in occurrence over the years? And if so, how?

Deborah: Actually, not really. I think the difference is in our own behavior as we are so much more aware of how we can solve different problems.

Bullying has been around for years – since before the internet. Sexual harassment, of youngsters and between adults, has of course also been around for years. It’s nothing new. I mean, the Internet is a fantastic place to be. It democratizes learning. You have access to the world and knowledge and entertainment.
It’s amazing.
But there is a dark side to it. From a bullying perspective you have the fact that previously, if you were bullied at school, you could go home or you could go to your social group somewhere else and you would have somewhere where you would feel safe.

When it’s online, it’s 24/7.

And it is relentless. As for the whole child abuse part – of course, it existed before as well. But now, with the Internet, perpetrators can find groups that have the same desires as themselves, and somehow, together, they can convince themselves as a group that it’s more acceptable. Which is awful. So that is the bad part of the net.

So, when you ask: have the challenges changed or increased since we started Friendbase? No, not really. But what has changed is the attitude towards how important it is to actually address these issues. When we started the company in 2013, we didn’t really talk that much about safety tools. I mean, we talked about whether we should have a whitelist or a blacklist of words – it was more on that level. But today, most social platforms have moderation, they have toolsets, they have guidelines and policies and so forth.

So, I think that we who work with online communities as a whole have evolved a lot over the past years.

Interviewer: Yeah, I would say that today, in 2020, you probably wouldn’t be able to launch a social community or platform without some sort of moderation tools and well-defined guidelines.

Deborah: I think you’re right. Several years ago, I did a pitch where we talked about online safety and moderation tools, and we were completely slaughtered. What we were told was that being good online – this whole ‘be cool, be kind’ idea – was going to stop our growth: it’s much better to let it all run rampant, and then it will grow much faster. I don’t think anyone would say something like that today. So that’s a huge shift in mindset. Which is great. We welcome it.

Interviewer: That’s a fantastic story. You’ve been in this industry so long; you’ve seen this change. I find it fascinating that just seven years ago, when you said ‘I want to protect my users’, people laughed at you. And now people would laugh at you if you said ‘I’m going to go live without it’.

Deborah: I know. Can you imagine going on stage today saying that I don’t care about safety? I mean, people would be so shocked.

Interviewer: You said before, when we talked about the main challenges, that if you experienced growth you’d need to change your approach to moderation and automate more, just to keep up?

Deborah: Yes, definitely. We try and stay on top of what toolsets are out there.

We build in our own functionality, such as muting users. So, if someone is harassing you, you can mute them so that you don’t see what they’re writing. Small changes like that we can do ourselves, which is helpful.

Something I’d like to see more of – and that we’ve actually designed a research project around – is to not only detect and ban bad behavior, but to encourage good behavior, because that in itself will also create a more positive environment.

That’s something that we’re really excited about: working with people who are experts in gamification and natural language processing to see how we can create toolsets that encourage good behavior, and to see what we can do. Maybe we can start deflecting a conversation that is obviously on its way to going seriously wrong. It could be as simple as a small time delay when somebody writes something really toxic, with a pop-up saying: “Do you really want to say this?” – just to make someone think once more.

This is something that we’re looking into. It’s super interesting. And I hear there are a couple of companies, just in the last few months, that are also talking about creating toolsets for something like this. So, I think it’s going to be a really, really interesting development over the coming years.


Interviewer: It sounds like safety is very important to Friendbase. Why is that?

Deborah: Why is that? Quite early on, we who work in the company discussed what our core values should be. And one of the core values we decided upon is inclusion: everybody is welcome. And for everyone to feel welcome, you have to have a welcoming atmosphere.

When you continue along that line of thought, then obviously you come to the point where, OK, if everyone’s going to be welcome and you want it to be a friendly space, then somewhere you’re going to have to stop toxic behavior. So, for us, safety is just part of our core values.

And also, I have a teenage daughter who loves gaming. I’ve seen how platforms behave. She’s part of groups that interact with each other online. I just feel that there must be a way of doing things better – it’s as simple as that. We can do better than letting it be super toxic. And there are some amazing people out there working with fantastic toolsets, and some fantastic platforms and social games that work in the same direction as we do. It’s really great.

And you know what? To be quite honest, there have been several case studies proving, from a business perspective as well, that you get longer retention and higher profitability when you can keep your users online for longer. So, in a business sense, it also makes perfect sense to work in a way that keeps your users as long as possible.

Interviewer: You have tons and tons of experience with startups and social platforms. If you were to give a piece of advice to someone who is running a similar service to Friendbase, or even thinking about starting one, what would it be?

Deborah: It would be, first of all, to determine what level of safety you want to have, depending on your user group. Obviously, the younger your demographic, the more safety tools you must ensure you have in place. Also, don’t build everything yourself. Especially if you’re working in an international market with many languages, just being able to filter many languages in a decent way is a huge undertaking. If you think you’re going to be able to hack together something yourself – it’s not that easy. It’s better to work with a tool or a company that has this as their core business, because they will constantly be working with state-of-the-art solutions.

So it’s better to liaise with switched-on companies that already work with this as their main reason for being. I think that’s important. And then, of course, add your own easy-to-use reporting system and easy ways for users to communicate with you, so that you have a sort of double layer.

I mean, I’ve seen several different companies that now work with different moderation tools and chat filters and so forth. Many of them do stellar work. And it’s important, because at the end of the day, if anything really, really bad were to happen, then you’re just finished as a business. It’s as simple as that. The last thing you would want is to have someone knock on your door and shut you down because of something that’s happened on your platform.

Deborah: So, yeah, our aim is to become one of the big global players. It’s exciting times ahead.

Interviewer: For sure. Any closing remarks? Any statements you want to get out there from a personal point of view or from Friendbase?

Deborah: The Internet is a great place to be, because there’s so much you can learn and so many interesting people you can meet. But there is a dark side as well, and you have to be aware of it. Just by being a little bit street smart online, people can keep themselves safe. And we’re getting there. People are learning. Schools have it in their curriculum, and social platforms try to teach users how to behave. So slowly but surely, we’re getting there.

Friendbase is currently looking for more investors. If you are interested, reach out to Deborah Lygonis.

If you need help with content moderation, get in touch with Besedo.


Users’ expectations are at an all-time high and losing your customers to your competition is, of course, out of the question. Platforms need to do everything in their power to ensure a seamless and safe experience on their site. That’s why content moderation has never been more vital to gain and retain customers.

Browsing the web for content moderation statistics? Look no further. We have compiled a list of 65 statistics about the content moderation landscape, from user experience and customer service to stats relating to your specific industry.

  1. User Experience
  2. Reviews
  3. Dating
  4. Sharing economy
  5. Online marketplaces
  6. Customer service
  7. Scams
  8. Online harassment

User Experience

Online shoppers have no time to waste. They expect to find what they’re looking for instantly. Competing for users’ attention is a tricky business: just one negative experience can send your users away, seeking a better place to shop. Proper categorization, smooth navigation, good searchability, and no duplicates all play a key role in creating a seamless experience that wins customers and keeps them coming back.

Reviews

Reviews can make or break your business. With customers relying more and more on reviews to buy products or services (and even trusting fellow online reviewers as much as their friends and family), genuine user reviews are an excellent way to build trust in your platform.

However, fake reviews are multiplying quickly online, and this could erode the trust needed to convert buyers. So, how can you prevent fake reviews on your site? Setting up a reliable content moderation process is your best bet to protect your platform. Find out more about tackling fake reviews here.

Dating

According to a recent study, heterosexual couples are now more likely to meet a romantic partner online than through personal contacts and connections. The dating industry is booming, yet it still faces countless challenges: rude messages, inappropriate images and, in the worst of cases, sexual harassment.

To succeed in the business, you need to handle these threats with an effective content moderation strategy. The following online dating stats will give you a better idea of the challenges to be faced head-on.


Sharing economy

The sharing economy is forging its way into all types of industries – from the gig economy to transportation and housing, no sector will be left untouched in the future. Yet the sharing economy comes with its own set of challenges, privacy and safety being the two leading causes of concern.

Online marketplaces

With conscious consumerism on the rise, online marketplaces are trendier by the day. But in this competitive environment, online marketplaces need to set themselves apart. Optimizing your platform’s experience is a must if you wish to stay in the race.

Customer service

Customer service has become increasingly important to customers over the past few years. Have a look at the following statistics to help you improve your customer service and become their preferred platform.

Scams

Scams can be found everywhere and, because of their level of sophistication, can be hard to detect or get rid of. Still, scams hurt businesses and drive user trust away. Check out our blog post on the 5 common online marketplace scams to see how you can fight back.

Online harassment

Online harassment is a plague with dire consequences. Get to know the following stats to improve your content moderation and fight back against online harassment.


Here at Besedo, we are continuously working to improve Implio’s technical capabilities to detect scams, improve the user experience, and maintain secure online marketplaces.

We are now introducing AI-powered language detection. This gives Implio users another tool to help ensure that content uploaded by your suppliers aligns with the rules on your site.

Why language detection is important to online marketplaces

Imagine that you are running a marketplace catering to a market like Belgium, where there are three official languages. To be relevant to all your users, you likely split the marketplace into three language sections – but your suppliers may still accidentally upload content to the wrong section.

With a rule setup where you define the expected language of a listing, our new language detection feature will help you catch listings that are not in the correct language.

The overall user experience is improved as German speakers don’t need to scroll through irrelevant items in French and vice versa.

In Implio, the filter compares the detected language of a listing against the expected language for the section. You then set up the rule to approve the content, reject it, or send it to manual moderation depending on whether it matches.
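
Expressed as plain logic – this is an illustration of the decision flow, not Implio’s actual rule syntax – the rule amounts to the following:

```python
def route_listing(detected: str, expected: str) -> str:
    if detected == expected:
        return "approve"        # content matches the section's language
    if detected == "unknown":
        return "manual_review"  # unreadable content gets human eyes
    return "reject"             # wrong-language content stays off the site

# A French listing accidentally uploaded to the Dutch section of a
# Belgian marketplace would be caught before going live.
print(route_listing(detected="fr", expected="nl"))  # -> "reject"
```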

What if my site only accepts one language? 

Language detection is a valuable feature for sites with one language as well. Many scammers will write their ads in English rather than the native language of your site. By detecting content language, you’re able to catch fraudulent listings and keep your users safe.

If the detected language doesn’t match the expected language, or the text is unreadable and classified as ‘unknown’, you can choose to reject the item or send it for manual review. This way, only content that matches the expected language is automatically approved to go live on your site.

A setup like this keeps the content on your site high-quality, relevant, and safe for your users – which ultimately results in a good user experience.


How does the AI-powered language detection feature actually work? 

As you’ve probably already figured out and as the name implies, our newest Implio feature detects the language of content uploaded by users.

The feature looks at both the title and body text of an item to determine the language. It then returns an output that can be used in Implio’s rule builder. This way you can create rules that are suitable for your specific needs. The rules can be as complex or simple as you need them to be.
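
As a rough illustration of what happens under the hood – using the open-source langdetect package rather than Implio’s own proprietary detector – combining title and body into one prediction might look like this:

```python
# pip install langdetect
from langdetect import detect, LangDetectException

def detect_listing_language(title: str, body: str) -> str:
    """Combine title and body into a single language prediction."""
    text = f"{title}\n{body}".strip()
    try:
        return detect(text)   # returns an ISO 639-1 code such as 'en' or 'fr'
    except LangDetectException:
        return "unknown"      # too short or garbled to classify reliably

print(detect_listing_language("Vélo de course", "Très bon état, peu utilisé."))  # 'fr'
```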

Are you generally okay with listings in different languages, but experiencing a lot of English-speaking ‘missionaries’ selling non-existent pedigree puppies? Use language detection and lists to build a rule that catches these specific frauds. Creating accurate filters is about research, creativity, and the right tools – and with language detection, Implio now offers one more element to tackle content challenges.

Our language detection feature currently supports 123 different languages. Among these, you’ll find some of the most common such as English, Spanish, French, and Hindi, but also smaller languages like Javanese or Swedish. 

Want to know if we cover the language you’re looking for? Have a look at the full list of languages available for automated detection.

For more information about our newest feature, take a look at the knowledge base or try it out for free.
