Sexual harassment has featured heavily in the media of late, as scores of women who’ve remained quiet about their experiences have bravely spoken out with a simple yet meaningful hashtag: #MeToo.
While the inexcusable exploits of men in positions of power, like Harvey Weinstein (among many others), may now be well documented, behaviour doesn’t have to be anywhere near as extreme to qualify as sexual harassment – particularly in digital environments like dating websites and messaging apps.
According to one study in Australia, the harassment of women online has become a ‘digital norm’, with nearly half of all women experiencing abuse or harassment online – including 76% of those under 30. These worrying statistics are just the tip of the iceberg. While much is being done to raise awareness of online harassment, it remains unclear to many what exactly constitutes it, and many dating sites still struggle with how to deal with it.
Defining online sexual harassment
According to Childnet International, an organization that promotes internet safety for young people, there are four types of online sexual harassment:
- Non-consensual sharing of intimate images and videos — for example, revenge porn.
- Exploitation, coercion and threats — such as blackmailing someone with compromising images of themselves.
- Sexualised bullying — includes so-called ‘slut-shaming’: demonising women for dressing provocatively.
- Unwanted sexualisation — this covers a wide range of behaviours from unwanted and even unprompted messages to making inappropriate comments about someone’s appearance.
It appears that while some criteria for sexual harassment are obvious, others could be seen as subjective – particularly in the ‘unwanted sexualisation’ category. Why? Because what one person may find appropriate may in fact cause harm to another. Since the Weinstein allegations, much has been made of ways to tackle individual behaviour from both a female and male perspective, but what are dating sites doing to tackle sexual harassment?
Education and empowerment
Organizations such as the Online Dating Association in the UK place a strong focus on educating consumers and online dating businesses about best practices, including ways to keep users safe from sexual predators.
However, while the extreme cases rightly demand attention, there also needs to be a greater focus on everyday prevention, which means taking a stance on inappropriate messaging. You only have to look at Bye Felipe on Instagram to see some prime examples of just how casual obscenity has become.
And then there’s Bumble: the first dating app specifically designed for women. Its core value is advancing, empowering, and helping women. Like other dating services, it only allows contact when there’s a mutual match, but unlike other services, women make the first move. And it’s now the fastest-growing dating app in the world.
Making a stand
As more women take a stand on harassment, inappropriate comments are going to be called out more frequently. That’s why women in the public eye must continue to speak out against sexual harassment – as Oprah Winfrey did at this year’s Golden Globes – in order to give others hope, encouragement, and courage.
But the issue cannot be solved by individuals alone. Companies have a huge social responsibility, and popular platforms must play their part too. Speaking out is one thing, but more can be done: dating and classified sites can help protect their users through content moderation, an effective way of monitoring, flagging, and removing inappropriate images and messages. Not only does it counter sexual harassment, it reduces user churn too.
There’s a clear difference between malicious behaviour and accidental offence. And while creating content moderation filters that flag specific words and phrases is relatively straightforward, what’s less easy to achieve is an understanding of context. But it is possible: through a combination of machine-learning and manual moderation.
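The combination described above can be sketched in a few lines. This is a minimal, hypothetical illustration – the blocklist, the scoring function, and the thresholds are invented for the example and are not any real moderation system – but it shows how easy cases can be automated while ambiguous, context-dependent messages are routed to human moderators.

```python
# Hypothetical hybrid moderation sketch: keyword filter + ML score + human review.
# BLOCKLIST and the threshold values are illustrative placeholders only.

BLOCKLIST = {"obscene_word"}  # a real filter would maintain a curated term list

def keyword_flag(message: str) -> bool:
    """The straightforward part: flag messages containing blocklisted words."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    return not BLOCKLIST.isdisjoint(words)

def classify(message: str, ml_score: float) -> str:
    """Route a message using a (hypothetical) model's harassment score in [0, 1].

    High-confidence cases are handled automatically; the ambiguous middle band,
    where context matters most, goes to a human moderator.
    """
    if keyword_flag(message) or ml_score >= 0.9:
        return "remove"        # clear violation: auto-remove
    if ml_score >= 0.5:
        return "human_review"  # context unclear: manual moderation
    return "approve"           # likely harmless
```

In a setup like this, the two thresholds set the trade-off between automation and moderator workload: widening the middle band catches more contextual harassment at the cost of more manual review.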
No-one should have to endure fear or humiliation of any kind, at any time, anywhere: on or offline. As an increasing number of online marketplaces, classifieds, and dating sites put more stringent measures in place to prevent harassment, perhaps those who’ve been guilty of sexual harassment in the past will think twice before sending an inappropriate message.
In the meantime, the tide is turning against offenders, and the issues affecting so many are firmly in the public spotlight. Change is coming, but we can’t rest until it arrives. Here at Besedo, we’re trying to raise awareness through our #WeToo social media campaign. Why not join us?