COVID-19 continues to create new challenges for everyone. Businesses and consumers are spending an increasing amount of time online, using chat and video conferencing platforms to stay connected and to combat the isolation of social distancing.

We’ve also seen a resurgence of interaction via video games during lockdown, as we explore new ways to entertain ourselves and connect with others. However, a sudden influx of gamers also brings a new set of content moderation issues – for platform owners, game developers, and gamers alike.

Let’s take a closer look.

The video game industry was already in good shape before the global pandemic. In 2019, ISFE (Interactive Software Federation of Europe) reported a 15% rise in revenue between 2017 and 2018, with the European industry turning over a combined €21bn. Another ISFE report shows that over half of the EU’s population played video games in 2018 – some 250 million players, gaming for an average of nearly 9 hours per week, with a pretty even gender split.

It’s not surprising that the fastest-growing demographic was the 25-34 age group – the generation who grew up alongside Nintendo, Sony, and Microsoft consoles. However, gaming has broader demographic appeal too. A 2019 survey conducted by AARP (American Association of Retired Persons) revealed that 44% of Americans aged 50+ enjoyed video games at least once a month.

According to GSD (Games Sales Data), in the week commencing 16th March 2020 – right at the start of lockdown – video game sales increased by 63% on the previous week. Digital sales outstripped physical sales too, and console sales rose by 155% to 259,169 units in the same period.

But stats aside, when you consider the level of engagement possible, it’s clear that gaming is about more than just ‘playing’. In April, the popular game Fortnite held a virtual concert with rapper Travis Scott, attended by no fewer than 12.3 million gamers around the world – a record audience for an in-game event.

Clearly, for gaming the only way is up right now. But given these sharp increases – and the increasingly creative and innovative ways gaming platforms are being used as social networks – how can developers ensure every gamer remains safe from bullying, harassment, and unwanted content?

Ready Player One?

If all games have one thing in common, it’s rules. Where content moderation is concerned, the influx of new gamers presents challenges in a number of ways. Firstly, uninitiated gamers (often referred to as noobs or newbies) are likely to be unfamiliar with the established rules of online multiplayer games, or with the accepted social niceties and jargon of different platforms.

From a new user’s perspective, there’s often a tendency to carry offline behaviours over into the online environment, without consideration or a full understanding of the consequences. The Gamer has an extensive list of etiquette guidelines that online multiplayer gamers frequently break, from common courtesies such as not swearing in front of younger users on voice chat and not spamming chat boxes, to not ‘rage-quitting’ a co-operative game out of frustration.

However, when playing in a global arena, gamers might also encounter subtle cultural differences and behave in ways that other groups of people consider offensive.

Another major concern, one that affects all online platforms, was outlined by Otis Burris, Besedo’s Vice President of Partnerships, in a recent interview: the need to “stay ahead of the next creative idea in scams and frauds or outright abuse, bullying and even grooming to protect all users”, because “fraudsters, scammers and predators are always evolving.”

Multiplayer online gaming is open to exploitation by individuals with malicious intent, including grooming, simply because of the potential anonymity and the sheer number of gamers taking part simultaneously around the globe.

The Gamer’s list spells out that kids in particular should never use someone else’s credit card to pay for in-game items. But when you consider just how open gaming can be from an interaction perspective, the fact that these details could easily be obtained by deception or coercion also needs to be tackled.

A New Challenger Has Entered

In multiplayer online gaming, cyberbullying and its regulation remain a prevalent issue, and users can manipulate gaming environments in numerous ways in order to bully others.

Whilst cyberbullying amongst children is fairly well researched, negative online interactions between adults are less well documented and studied. The 2019 report ‘Adult Online Harms’, commissioned by the UK Council for Internet Safety Evidence Group, investigated internet safety issues amongst UK adults and acknowledges the lack of research into the effects of cyberbullying on adults.

With so much to be on the lookout for, how can online gaming become a safer space to play in for children, teenagers, and adults alike?

According to a 2019 report for the UK’s converged communications regulator Ofcom: “The fast-paced, highly-competitive nature of online platforms can drive businesses to prioritize growing an active user base over the moderation of online content.

“Developing and implementing an effective content moderation system takes time, effort and finance, each of which may be a constraint on a rapidly growing platform in a competitive marketplace.”

The stats show that 13% of people have stopped using an online service after observing harassment of others. Clearly, targeted harassment, hate speech, and social bullying need to stop if games manufacturers want to minimize churn and avoid losing gamers to competitors.

So how can effective content moderation help?

Let’s look at a case study cited in the Ofcom report. As an example of effective content moderation, it refers to the online multiplayer game ‘League of Legends’, which has approximately 80 million active players. Its publisher, Riot Games, explored a new way of promoting positive interactions.

Players who logged frequent negative interactions were sanctioned with an interaction ‘budget’ or ‘limited chat mode’. Those who then modified their behavior and logged positive interactions were released from the restrictions.

As a result of these sanctions, the developers noted a 7% drop in bad language in general and an overall increase in positive interactions.
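
For illustration, here’s a minimal Python sketch of how a sanction-and-release loop of this kind could work. The thresholds, the Player class, and the method names are our own assumptions for the example – not Riot’s actual implementation.

```python
# Illustrative sketch of a sanction-and-release loop. All thresholds and
# names are invented for the example; this is not Riot's implementation.
from dataclasses import dataclass

NEGATIVE_LIMIT = 5        # verified reports before chat is restricted
POSITIVE_TO_RELEASE = 10  # positive interactions needed to lift it

@dataclass
class Player:
    name: str
    negative_reports: int = 0
    positive_logs: int = 0
    chat_restricted: bool = False

    def log_negative(self) -> None:
        """Record a verified negative interaction (e.g. abusive chat)."""
        self.negative_reports += 1
        if self.negative_reports >= NEGATIVE_LIMIT:
            self.chat_restricted = True  # enter 'limited chat mode'
            self.positive_logs = 0       # must rebuild a positive record

    def log_positive(self) -> None:
        """Record a positive interaction (e.g. praise from a teammate)."""
        self.positive_logs += 1
        if self.chat_restricted and self.positive_logs >= POSITIVE_TO_RELEASE:
            self.chat_restricted = False  # restrictions lifted
            self.negative_reports = 0

player = Player("newbie42")
for _ in range(NEGATIVE_LIMIT):
    player.log_negative()
assert player.chat_restricted  # sanctioned after repeated negative reports
```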

Taking ‘League of Legends’ as an example, a combination of human and AI (artificial intelligence) content moderation can encourage more socially positive content.

For example, a number of social media platforms have recently introduced ways of offering users alternatives to potentially harmful or offensive UGC (user-generated content), giving them a chance to self-regulate and make better choices before posting. Offensive language within a post can be translated into a non-offensive form, and the user is presented with an optional ‘clean version’.
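
As a rough sketch of what that ‘clean version’ step might look like under the hood, the snippet below masks terms from a placeholder word list and offers the result back to the user before posting. A production system would use a far richer, context-aware model than a hard-coded list.

```python
# Illustrative 'clean version' suggestion. The word list and masking rule
# are placeholders, not any platform's real filter.
import re
from typing import Optional

OFFENSIVE_TERMS = ["noob", "trash", "loser"]  # placeholder list

def suggest_clean_version(message: str) -> Optional[str]:
    """Return a masked alternative if the message contains flagged terms,
    or None if the message is already clean."""
    pattern = re.compile(r"\b(" + "|".join(OFFENSIVE_TERMS) + r")\b",
                         re.IGNORECASE)
    if not pattern.search(message):
        return None
    # Keep the first letter, mask the rest: "noob" -> "n***"
    return pattern.sub(lambda m: m.group(0)[0] + "*" * (len(m.group(0)) - 1),
                       message)

draft = "gg, you total noob"
clean = suggest_clean_version(draft)
if clean:
    print(f"Before you post, would you like to send this instead? {clean!r}")
```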

Nudging is another technique that encourages users to question and delay posting something potentially offensive. By creating subtle incentives to make the right choice, it helps reduce the overall number of negative posts.

Chatbots, disguised as real users, can also be deployed to intervene in response to specific negative comments – challenging racist or homophobic remarks, for example, and prompting an improvement in the user’s online behavior.

Finally, applying a layer of content moderation that catches inappropriate content before it reaches other gamers will help keep communities positive and healthy, ensuring higher engagement and less user churn.
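
A pre-moderation gate like this can be sketched in a few lines: automated scoring rejects clear-cut abuse, approves clearly harmless messages, and holds anything ambiguous for human review before it reaches other players. The scoring heuristic and thresholds below are invented for the example.

```python
# Minimal pre-moderation gate: clear cases are decided automatically,
# ambiguous ones are queued for a human. Scores/thresholds are illustrative.
from queue import Queue

review_queue = Queue()  # messages awaiting manual review

def toxicity_score(message: str) -> float:
    """Stand-in for a trained classifier; here, a crude keyword heuristic."""
    hits = sum(word in message.lower() for word in ("idiot", "stupid", "hate"))
    return min(1.0, hits / 2)

def moderate(message: str) -> str:
    score = toxicity_score(message)
    if score >= 0.9:
        return "rejected"          # clearly abusive: never shown
    if score >= 0.4:
        review_queue.put(message)  # ambiguous: held for manual review
        return "pending"
    return "approved"              # clearly fine: published immediately

print(moderate("nice shot!"))       # approved
print(moderate("I hate this map"))  # pending - context needed
```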

Game Over: Retry?

Making the best of a bad situation, the current restrictions on social interaction offer a great opportunity for the gaming industry to draw in a new audience and broaden the market.

The industry also continues to inspire creative innovations in artistry and immersive storytelling, offering new and exciting forms of entertainment, pushing the boundaries of technological possibility, and generating new business models.

But the gaming industry also needs to take greater responsibility for the safety of gamers online by incorporating robust content moderation strategies – even though doing so at scale, when audience numbers are so great, takes a lot more than manual player intervention or reactive strategies alone.

This is a challenge we remain committed to at Besedo – using technology to meet the moderation needs of all digital platforms. Through a combination of machine learning, artificial intelligence, and manual moderation techniques we can build a bespoke set of solutions that can operate at scale.

To find out more about content moderation and gaming, or to arrange a product demonstration, contact our team!

Users’ expectations are at an all-time high, and losing your customers to the competition is, of course, out of the question. Platforms need to do everything in their power to ensure a seamless and safe experience on their site. That’s why content moderation has never been more vital to gaining and retaining customers.

Browsing the web for content moderation statistics? Look no further. We have compiled a list of 65 statistics about the content moderation landscape, covering everything from user experience and customer service to stats relating to your specific industry.

  1. User Experience
  2. Reviews
  3. Dating
  4. Sharing economy
  5. Online marketplaces
  6. Customer service
  7. Scams
  8. Online harassment

User Experience

Online shoppers have no time to waste: they expect to find what they’re looking for instantly. Competing for users’ attention is a tricky business, and a single negative experience can send your users away, seeking a better place to shop. Proper categorization, smooth navigation, good searchability, and the absence of duplicates all play a key role in creating a seamless experience that wins customers and keeps them coming back.

Reviews

Reviews can make or break your business. With customers relying more and more on reviews to buy products or services – and even trusting fellow online reviewers as much as their friends and family – genuine user reviews are an excellent way to build trust in your platform.

However, fake reviews are multiplying quickly online, and they could erode the trust needed to convert buyers. So, how can you prevent fake reviews on your site? Setting up a reliable content moderation process is your best bet to protect it. Find out more about tackling fake reviews here.

Dating

According to a recent study, heterosexual couples are now more likely to meet a romantic partner online than through personal contacts and connections. The dating industry is booming, yet it still faces countless challenges: rude messages, inappropriate images and, in the worst cases, sexual harassment.

To succeed in this business, you need to handle these threats with an effective content moderation strategy. The following online dating stats will give you a better idea of the challenges to be faced head-on.

Sharing economy

The sharing economy is forging its way into all types of industries – from the gig economy to transportation and housing, no sector will be left untouched in the future. Yet the sharing industry comes with its own set of challenges, privacy and safety being the two leading causes for concern.

Online marketplaces

With conscious consumerism on the rise, online marketplaces are trendier by the day. But in this competitive environment, online marketplaces need to set themselves apart. Optimizing your platform’s experience is a must if you wish to stay in the race.

Customer service

Customer service has become progressively more important to customers over the past few years. Have a look at the following statistics to help you improve your customer service and become their preferred platform.

Scams

Scams can be found everywhere, and their level of sophistication can make them hard to detect or get rid of. Scams hurt businesses and drive user trust away. Check out our blog post on the 5 common online marketplace scams to see how you can fight back.

Online harassment

Online harassment is a plague with dire consequences. Get to know the following stats to improve your content moderation and fight back against online harassment.

Tackling inappropriate sexual behavior on dating sites and apps means having some difficult conversations about online conduct. But awareness is the first step on the road to prevention. Besedo spoke with online dating safety expert, Chris Dietzel, about some of the challenges and behaviors society needs to address.

Sexual harassment isn’t just found lurking in the dark corners of human society. It’s very much out in the open. The number of brave women who expressed a single #MeToo during the recent social media campaign was alarming to say the least.

But it also highlighted uncertainty around the definition of sexual harassment. While lots of conversations are being had about what physically constitutes sexual harassment, there’s been little examination of the topic in digital environments. Online dating safety expert Chris Dietzel hopes to change that.

However, he believes that one of the biggest problems we face is that many people simply aren’t aware of how damaging unwanted behaviors can be – because acceptance of inappropriateness is deeply ingrained in modern culture.

Knowing the limits

A Ph.D. student at McGill University in Montreal, Canada, Chris turned his attention to online dating after conducting research into people’s experiences on certain sites and apps.

“A lot of the inappropriate behaviors and sexual advances many condemn in offline environments – such as in the workplace or in social settings – aren’t always deemed as serious or damaging or problematic in a digital context,” he explains.

The definition of sexual harassment is essentially anything unwanted that’s sexual in nature. Online, it could be sexual communication that’s intimidating, predatory, or humiliating. It could be an image or inappropriate texts; unsolicited or insistent messages; even jokes. And while we can argue that between consenting adults these could be acceptable, online there can be a high degree of uncertainty that consent is truly mutual.

“Context and permission are the keywords here,” says Chris. “Problems arise if there’s no open understanding or agreement of consent between the individuals. In online dating, the parameters of permission are too often based on assumptions about what one individual thinks that the other wants. For example, two people may flirt on a dating app. While one person might assume that a conversation like this is a prelude to sex, the other may simply be enjoying the lighthearted conversation.”

While it’s clear that communication can easily break down, what isn’t immediately apparent is how more extreme online behaviors creep in. It’s understandable (though definitely not excusable) that someone might lash out after being rejected, but responding by publicly sharing an intimate photo is an extreme reaction. Actions like this fall under the banner of a word that many would be shocked to hear associated with online activity: rape.

Defining ‘Rape Culture’

Putting something as extreme as ‘rape’ in a cultural context is admittedly uncomfortable. While the word unequivocally refers to sexual violence, the term ‘rape culture’ requires definition, as it takes into account a complex set of behaviors that many may not even consider problematic.

Chris works as a research assistant on IMPACTS: Collaborations to Address Sexual Violence on Campus, a seven-year project that addresses sexual violence on university campuses across Canada and internationally. The IMPACTS Project, which is housed at McGill, defines rape culture as: “The way in which sexist societal attitudes, misogyny and language tacitly condone, minimize and/or normalize sexual violence – mostly against women, but also against other genders.”

“Online, these behaviors are evident in the way some people communicate about sex and violence. In fact, in some cases, people might not even be aware that they’re condoning it because certain terms, phrases, and behaviors have become normalized in mainstream cultures,” Chris explains.

“Admittedly some are glaringly obvious — such as the infamous ‘grab them by the pussy’ slur — but other misogynistic terms, or even just the way in which sex and violence are referred to casually, illustrate widespread acceptance of inappropriateness as a cultural norm.”

For example, many women are simply resigned to the fact that they’re likely to experience some form of sexual harassment when they join an online dating service. They feel it’s a given that at some point they’ll be sent inappropriate images and messages.

“The reason very little comes of these situations, and why so much goes unreported, is that people don’t actually know how to handle these behaviors,” says Chris. “They accept it as part of the dating app experience; that they have to deal with problematic individuals to find someone decent.”

Of course, those receiving unwanted messages can delete the conversation and block the sender. They can even delete the app. But the damage has been done, and measures like these do nothing to stigmatize the sender, because the reaction is carried out by the recipient. In this situation, the ‘crime’ goes unpunished.

“Under these circumstances, many see it as difficult to assign blame,” Chris says. “Who’s at fault? The other person? The app? Yourself? Did your profile pic look too provocative? People often blame themselves when they feel victimized. And that makes them increasingly vulnerable.”

Rape myths, such as an individual feeling responsible for the sexual harassment they endured, are so ingrained in culture that victims of sexual violence may not know how to address the problems they encounter, particularly in online spaces.

Power & social capital

Wherever there’s a visible distinction between a majority and a minority, there’s an unbalanced power dynamic at work. When a ‘norm’ is perceived, those that adhere to it – the majority – wield more social capital than those who don’t – the minority.

“For example, if person A has more social capital than person B, they’re more likely to abuse their power and try to manipulate person B,” explains Chris. “On the other side, if person B accepts that person A has more social capital than them, they’re more likely to tolerate abusive behavior from person A. This is what puts marginalized people at greater risk of being victimized.”

In a recent social experiment, a What’s the Flip? video highlighted the difference in social capital on a gay-oriented dating app when a White male and an Asian male swapped profiles. In the clip, the White male’s profile receives tons of messages, while the Asian male’s receives very few. As a socially desired individual who has his choice of guys, the White male holds more social capital than the Asian male.

We also see that marginalized individuals, like the Asian male in this example, may be more willing to engage in less desirable situations, or with unfavorable people, out of a desire for some – or any – social interaction. The marginalized individual feels lucky when someone approaches them, since they don’t receive as much attention as privileged folks, which might mean they lower their standards and go along with things they normally wouldn’t. This is not to suggest that marginalized people or those with less social capital are powerless; rather, it suggests that there is greater opportunity for abuse and manipulation when there are differences in social capital.

These dynamics don’t just manifest in ‘real life’. Online, social capital counts too – and nowhere is its dominance more visible: in the number of followers someone has on a social network, the number of comments, views, clicks, and so on. One’s social media presence can inflate their social capital and give them more influence. Compared to other types of social power, influence in an online context is measurable, and the potential to abuse that power can be very dangerous where sexual harassment is concerned.

The impact of technology

The sheer number of ways in which sexual harassment can happen online is troubling. It can be very public, taking place on a social network or public forum, or it can happen in a private email, direct message, or in-app chat. It can also move easily and quickly from one place to another, and as technology evolves, so will the way people interact in online dating.

“It’s easy to take for granted just how quickly things can spread online,” says Chris. “A comment, image, or video can be shared with thousands of people in seconds, which can have a tremendously negative emotional impact on an individual.”

“As the lines between real and virtual worlds converge, the environments in which dating and the associated conversations take place will shift too. But we’re still going to see harassment-related issues defined by context and environment – the platform used, and the conversations being had. Ultimately, wherever there’s a system, people will abuse it. This is why the only real solution is education and awareness – to normalize discussion of sexual harassment, in conjunction with other proactive, rather than prohibitive, measures.”

Normalizing awareness

To change things, we need to be able to have honest and open discussions about sexual harassment and make it clear that it does exist online as much as anywhere else. Education is a key component to making this happen, but the onus shouldn’t just be on public service and charity campaigns, according to Chris. Technology companies have a role to play too.

“App and site developers have an incredible opportunity to push progress forward, and provide their users with information on acceptable behaviors, videos, links, and insights,” explains Chris. “But it’s about being proactive too; setting standards and expectations.”

Chris also points out that most apps’ Terms & Conditions only cover behaviors between the app and its users – not between users themselves – which is why community standards are important.

“I think that it’s important to have standards that users abide by too. Facebook does this. So does the dating app, Chappy. Additionally, awareness of difference is important. Grindr has just included options to allow users to define their preferred personal pronouns. To help educate those who are curious, but don’t fully understand these issues, on the same page there’s an info button that explains what this all means and why people would specify that information.”

“At the end of the day, self-respect and respect for others is crucial in combating discrimination and harassment of any kind. By having honest dialogues with individuals about a range of issues – everything from identity to inappropriateness – we can raise more awareness about sexual harassment and better prevent it,” says Chris. “Shaming one person won’t necessarily change behavior, but getting a group of people to reflect on their actions will.”

“We are all responsible for what we tolerate as individuals. However, organizations, as well as companies – and society as a whole – need to step up to the plate and model the message of zero tolerance against sexual harassment. That’s the only way it will truly take effect.”

Christopher Dietzel

Christopher Dietzel is a doctoral student at McGill University, studying rape culture in LGBTQ+ communities and on dating apps. Chris recently completed his master’s degree on the intersections of undergraduate student leadership development and global citizenship themes.

Chris spent three years teaching English in France at the high school and university levels; he also spent four years working in Singapore, facilitating an international, multi-cultural program on student leadership development.

Chris is pursuing a career in university education to create meaningful experiences for students and to advocate for safe and healthy relationships, particularly for members of LGBTQ+ communities.

Visit IMPACTS to learn more about Chris’ research.

Or contact him directly from his profile page.

Few things can damage your user trust and site reputation more than online sexual harassment. Disturbing images, profanities, and unprovoked harassment are sadly becoming the norm in the online dating world, and many users experience personal violation in their very first encounter on a dating site.

A survey by Pew Research Center shows that 41% of Americans have been personally subjected to harassing behavior online, and 66% have witnessed such behavior directed at others. 79% of users believe it’s the site’s responsibility to step in and protect users when online harassment takes place on its platform.

Tweet this: “41% of Americans have been personally subjected to harassing behavior online”. Enough is enough, now it’s time to stand together. #wetoo

Negative user experiences caused by sexual harassment have created opportunities for differentiation within the online dating industry. Bumble, for example, has taken the market by storm with one simple, clear strategy: protect its female users from online harassment. By giving women the sole power to initiate first contact, it managed to steal market share almost overnight. This leaves room for thought.

What would it mean for your churn rates if you could make online harassment a non-issue?

Watch our CCO, Shane Correa, address the issue of online sexual harassment and share actionable insights on how you can protect your users and avoid sexual harassment-induced churn.

Are you a victim of online sexual harassment? Read our interview with internet safety advocate Sue Scheff.

If you’re ready to embrace your social responsibility and want to fight online sexual harassment, reach out to us and we’ll help you get started protecting your users.

Engage with our hashtag: #wetoo, on social media.

Sue Scheff is an author, parent advocate, and cyber advocate promoting awareness of cyberbullying and other online issues. She is the author of three books: Wit’s End, Shame Nation, and Google Bomb.

We had the opportunity to interview her about victims’ experiences of online sexual harassment and online shaming, and about what she believes sites can do to help fight the problem.

Interviewer: Hi Sue, thanks a lot for taking the time to share your knowledge – I know you are extremely busy! You’re the author of Shame Nation and Google Bomb. What were you hoping to achieve by releasing them?

Sue Scheff: Awareness. Most importantly, giving a voice to the voiceless.

After I wrote Google Bomb, I was stunned by the outpouring of people from all walks of life – from all over the world – who contacted me with their stories of internet defamation, shaming, and harassment. People were silently suffering from cyber-bullets, like myself, on the verge of financial ruin, and all were emotionally struggling.

Google Bomb was the roadmap to help people understand that there are legal ramifications and consequences of online behavior.

By 2012, I was taken aback by the constant headlines about bullycide. Names like Tyler Clementi, Amanda Todd, Rebecca Sedwick, Audrie Pott – I knew how they felt, like there was no escaping this dark hole of cyber-humiliation. At 40 years old, when this happened to me, I had the maturity to know it would eventually get better. These young people don’t.

Google Bomb was the book to help people understand their legal rights, but with the rise of incivility online, Shame Nation needed to be written to help people know they can survive digital embarrassment, revenge porn, sextortion, and other forms of online hate. I packed the book with over 25 contributors and experts from around the world, sharing their first-hand stories to help readers know they can overcome digital disaster. I also include digital wisdom for online safety and survival.

Interviewer: You were a victim of online harassment and won a landmark case for internet defamation and invasion of privacy. Can you tell us about your experience?

Sue Scheff: In 2003, I was attacked online by what I refer to as a disgruntled client – definitely a woman who didn’t like me. Once she started her attack, a gang-like mob of trolls joined in. These trolls and this woman created a smear campaign that took an evil twist: calling me a child abuser, saying I kidnapped kids and exploited families, branding me a crook, and more. Things turned sexual when they claimed to be auctioning my panties (of course, they had never met me – or had anything of mine). But to anyone reading this, how do you explain that these are malicious trolls out to destroy me?

As an educational consultant, I help families with at-risk teens find residential treatment centers. These online insults nearly destroyed me. I ended up having to close my office, hire an attorney and fight.

By 2006, I was both emotionally and financially crippled. In September 2006, I won the landmark case in Florida for internet defamation and invasion of privacy, with an $11.3M jury verdict. Lady Justice cleared my name, but the internet never forgets. Fortunately for me, the first online reputation management company opened its doors that summer, and I was one of its first clients. To this day, I say my lawyer vindicated me – but it’s ORM that gave me my life back.

Interviewer: You’ve also met many other victims of online harassment, online shaming, revenge porn, and so on. How are victims affected, both in the short and the long term?

Sue Scheff: Trust and resilience.

I’ve spoken to many victims of online hate. The most common theme I hear is the lack of trust we (they) initially have in others, both online and offline. In my case, I became very isolated and reserved; my circle of trusted friends became extremely small. The fact is, no one understands this pain unless they have walked in your shoes. When researching Shame Nation, others expressed feeling the same way.

The good news is that, with time, we learn to rebuild our trust in humanity through our own resilience. This doesn’t happen overnight. It’s about acceptance – understanding that the shame doesn’t define you and that it’s your opportunity to redefine yourself.

The survivors you’ll read about in Shame Nation have inspiring stories of hope. They all learned to redefine themselves out of negative experiences. It’s what I did – and I realized that many others have done the same.

Tweet this: “no one understands this pain unless they have walked in your shoes.”- Sue Scheff, about victims of online hate. #wetoo

Interviewer: Where do you see the biggest risk of being exposed to online sexual harassment?

Sue Scheff: Online reputation and emotional distress.

Today, the majority of businesses and universities will use the internet to search your name prior to “interviewing” you. How your name survives that Google rinse cycle will dictate your financial future, career- or job-wise.

Just because you have a job doesn’t mean you’re out of hot water. More than 80% of companies have social media policies in place. If your name is involved in sexual misconduct (scandal) online, you could risk losing your job. Colleges are also implementing these social media policies.

Pew Research says the most common way for adults to meet is online. If you’re a victim of cyber-shame, online sexual harassment, revenge porn, or sextortion, this content could hinder your chances of meeting your soul mate.

The emotional distress is overwhelming. You feel powerless and hopeless. Thankfully today there are resources you can turn to for help.

Interviewer: Do you think this issue is growing or are we any closer to solving it?

Sue Scheff: Yes… and no.

In a 2017 Pew survey, over 80% of researchers predicted that online harassment will get worse over the next decade – and this includes revenge porn and sexual harassment. This is a man-made disaster, and it can only be remedied by each of us taking responsibility for our actions online and educating others. Education is the key to prevention. I believe the #MeToo and Time’s Up movements have brought more awareness to this topic, but I fear not enough is being done about it in the online world. It’s too easy to use a keypad as a legal lethal weapon.

The good news is that we’re seeing stronger revenge porn laws being put in place, and more social platforms are responding by removing content when it’s flagged as abusive. Years ago, we didn’t have this – and though progress may be slow, it’s moving in the right direction.

Tweet this: More than 80% of researchers predict that online harassment will get worse over the next decade. The time to act is now! #wetoo

Interviewer: What would be your advice to internet users today on how to avoid, prevent and fight harassment?

Sue Scheff: Digital wisdom.

I’m frequently asked, “How can I safely sext my partner?” I give the same answer every time: the internet and social media were not, and are not, intended for privacy. We only have to think of the Sony email hack or the Ashley Madison leak to know that no one is immune to having their private habits exposed to the world wide web. You should have zero expectation of privacy if you send any sexual message, via text or otherwise. Several studies concur: a majority of adults will share personal and private messages and images of their partner without their partner’s consent.

Your friend today could quickly turn into a foe tomorrow. Divorce rates are climbing, and what used to be offline revenge – charging up your ex’s credit cards – now has longer-term consequences when your nudes or other compromising images or content can go viral. E-venge (such as revenge porn) is how exes take out their anger. Don’t give them that power.

If you find you are a victim of online harassment or online hate, report it and flag it to the social platform. Be sure to fill out a form outlining how the content violates their code of conduct, and email them professionally (never use profanity or a harsh tone).

I encourage victims not to engage with the harasser. Be sure to screenshot the content, then block them. If you feel the situation may get worse and needs to be monitored, you can ask a friend to monitor it for you, so you don’t have to be emotionally drained by it. I also tell the friend not to engage – and to let you know if it reaches the point of needing legal attention, such as your life being in danger or your business suffering.

Interviewer: What is your opinion on what sites can do to help fight this problem?

Sue Scheff: In a perfect world, we’d have stricter consequences offline for the perpetrators, which would deter them from doing this online in the first place.

Strengthen the gatekeepers: user-friendlier reporting and speedier response times.

Although sites such as Facebook, Twitter, and Instagram are stepping up and want to alleviate online harassment, many people still struggle to figure out the reporting methods. Where are the forms? And, from what victims have shared with me, the response times can be troubling. When you’re a victim of sexual harassment, these posts are extremely distressing – every minute feels like a year.

I personally had a good experience with Facebook when I wrote about a cyber-stalker on my public page: it was addressed and handled within 48 hours.

Systems should be in place so that if a comment or image is flagged as abusive (harassment) by more than 3-5 unique visitors, it is taken down until the social platform’s team can investigate. We can all appreciate that the volume of online abuse reported daily is likely overwhelming for social media platforms – but I believe they should give us the benefit of the doubt until they can investigate our complaints.
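
As an illustration of the mechanism Sue describes, here is a minimal sketch of a ‘hide after N unique flags’ rule. The threshold and data structures are assumptions made for the example, not any platform’s actual system.

```python
# Illustrative 'hide after N unique flags' rule. The threshold and storage
# are invented for the example.
from collections import defaultdict

FLAG_THRESHOLD = 3  # unique reporters before content is hidden for review

flags = defaultdict(set)       # content_id -> set of reporter ids
hidden_pending_review = set()  # content awaiting a human decision

def flag_content(content_id: str, reporter_id: str) -> None:
    """Record a unique abuse report; hide content once the threshold is met."""
    flags[content_id].add(reporter_id)  # a set ignores duplicate reports
    if len(flags[content_id]) >= FLAG_THRESHOLD:
        hidden_pending_review.add(content_id)

for reporter in ("user_a", "user_b", "user_a", "user_c"):
    flag_content("post_123", reporter)

assert "post_123" in hidden_pending_review  # 3 unique reporters reached
```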

Interviewer: What do you think about the idea of using computer vision (AI) to spot and block nude pictures before they are submitted on a dating site?

Sue Scheff: If dating sites were able to implement AI to spot suspicious content, it would be a great start to cutting back on sexual harassment and keeping users safer.
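
In code, such a pre-submission screen might look like the sketch below. The nsfw_probability function is a stand-in for a real computer-vision model, and the thresholds and review routing are our own assumptions.

```python
# Illustrative pre-submission image screen. `nsfw_probability` is a stub
# for a trained computer-vision model; thresholds are invented.

def nsfw_probability(image_bytes: bytes) -> float:
    """Placeholder: a real deployment would call an image classifier here."""
    return 0.0  # stub value so the sketch runs end to end

def screen_profile_photo(image_bytes: bytes,
                         block_at: float = 0.85,
                         review_at: float = 0.5) -> str:
    """Block likely-nude images before publication; send borderline cases
    to a human moderator."""
    p = nsfw_probability(image_bytes)
    if p >= block_at:
        return "blocked"        # never reaches other users
    if p >= review_at:
        return "manual_review"  # a human decides
    return "published"

print(screen_profile_photo(b"\x89PNG..."))  # 'published' with the stub model
```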

Interviewer: Where can victims turn for support?

Sue Scheff:

Cyber Civil Rights Initiative

Without My Consent

Online SOS Network

Are you a victim of online sexual harassment or cyberbullying?

Please heed Sue’s advice and reach out for support.

Are you a site looking to help in the fight?

Contact us to see how AI and content moderation can help keep your users safe.

Sue Scheff

Sue Scheff is a nationally recognized author, parent advocate, and internet safety advocate. She founded Parents Universal Resources Experts, Inc. in 2001.

Sexual harassment has featured heavily in the media of late, as scores of women who’ve remained quiet about their experiences have bravely spoken out with a simple yet meaningful hashtag: #MeToo.

While the inexcusable exploits of men in positions of power like Harvey Weinstein (among many others) may now be well documented, undesirable activity doesn’t have to be anywhere near as extreme to qualify as sexual harassment – particularly in digital environments like dating websites and messaging apps.

According to one Australian study, the harassment of women online has become a ‘digital norm’, with nearly half of all women experiencing abuse or harassment online – including 76% of those under 30. These worrying statistics are just the tip of the iceberg. While much is being done to raise awareness of online harassment, it’s often unclear what exactly constitutes it, and many dating sites still struggle with how to deal with it.

Defining online sexual harassment

According to Childnet International, an organisation that promotes internet safety for young people, there are four types of online sexual harassment: non-consensual sharing of intimate images and videos; exploitation, coercion and threats; sexualised bullying; and unwanted sexualisation.

While some of these criteria are obvious, others could be seen as more subjective – particularly in the ‘unwanted sexualisation’ category. Why? Because what one person may find appropriate may in fact cause harm to another. Since the Weinstein allegations, much has been made of ways to tackle individual behaviour from both female and male perspectives – but what are dating sites doing to tackle sexual harassment?

Education and empowerment

Organisations such as the Online Dating Association in the UK place a strong focus on educating consumers and online dating businesses about best practices, including ways to keep users safe from sexual predators.

However, while more needs to be done to tackle extreme cases, there also needs to be a greater focus on prevention – which means taking a stance on inappropriate messaging. You only have to look at Bye Felipe on Instagram to see prime examples of just how casual obscenity has become.

And then there’s Bumble: the first dating app specifically designed for women. Its core value is advancing, empowering, and helping women. Like other dating services, contact is only possible when there’s a mutual match – but unlike other services, women make the first move. And it’s now the fastest growing dating app in the world.

Making a stand

As more women take a stand on harassment, inappropriate comments are going to be called out more frequently. That’s why it’s vital that women in the public eye continue to speak out against sexual harassment – as Oprah Winfrey did at this year’s Golden Globes – in order to give others hope, encouragement, and courage.

But the issue cannot be solved by individuals alone. Companies have a huge social responsibility and must play their part too. Speaking out is one thing, but more can be done: dating and classified sites can help protect their users through content moderation – an effective way of monitoring, flagging, and removing inappropriate images and messages. Not only does it counter sexual harassment, it reduces user churn too.

There’s a clear difference between malicious behaviour and accidental offence. And while it’s relatively straightforward to create content moderation filters that flag specific words and phrases, understanding context is much harder to achieve. But it is possible: through a combination of machine learning and manual moderation.
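
To make that distinction concrete, here’s a small sketch of a hybrid triage step: an exact-phrase filter catches known-bad wording, a stubbed machine-learning score stands in for a context-aware model, and anything uncertain is routed to a human moderator. The phrase list, score stub, and thresholds are all illustrative assumptions.

```python
# Illustrative hybrid triage: phrase filter + (stubbed) ML score + human
# escalation. All names and thresholds are invented for the example.

FLAGGED_PHRASES = ("send nudes", "send me pics")  # naive exact-phrase filter

def harassment_score(message: str) -> float:
    """Stand-in for a model trained on labelled conversations; a real model
    would weigh tone, history, and consent cues - i.e. context."""
    return 0.0  # stub value

def triage(message: str) -> str:
    text = message.lower()
    if any(phrase in text for phrase in FLAGGED_PHRASES):
        return "manual_review"  # a phrase hit alone can't prove intent
    score = harassment_score(message)
    if score >= 0.9:
        return "removed"        # model is confident: block outright
    if score >= 0.5:
        return "manual_review"  # ambiguous: a human judges the context
    return "allowed"

print(triage("had a lovely evening, thanks!"))  # allowed
```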

No one should have to endure fear or humiliation of any kind, at any time, in any place – on- or offline. As an increasing number of online marketplaces, classifieds, and dating sites put more stringent measures in place to prevent harassment, perhaps those who’ve been guilty of sexual harassment in the past will think twice before sending an inappropriate message.

In the meantime, the tide is turning against offenders, and the issues affecting so many are firmly in the public spotlight. Change is coming, but we can’t rest until it arrives. Here at Besedo, we’re trying to raise awareness through our #WeToo social media campaign. Why not join us?

Tweet this: Companies have a social responsibility to help fight online sexual harassment. Learn how. 
