One of the surprising and amazing things about the internet, especially when we’re talking about things involving user-generated content, is how many ideas become well-established parts of society before we even fully understand them.

Online dating is a perfect example. While the first big dating site was launched more than 25 years ago, Tinder – which broke new ground for dating apps – only just turned 10. And yet, it’s already perhaps the main way of meeting a partner, with a Stanford study in 2019 finding that 39% of heterosexual couples met online.

Despite its massive popularity, though, we’re not necessarily wise to everything that dating apps have to throw at us or aware of all the ways they’re changing society or us. Unusual and potentially dangerous trends are emerging from online dating all the time, and never fail to claim headlines in the media: Stylist, for example, recently described voicefishing as “the freaky new dating trend we’re trying to get our heads around.”

The Challenge For Apps

On the one hand, then, there’s frequent discussion of the dangers that online dating can pose, but, on the other hand, people clearly keep coming back anyway to enjoy the benefits of digital matchmaking. Dating apps clearly have a job to do to make sure that people are empowered to be flirty, but not dirty; daring, but not dangerous.

That work all comes down to steering user interactions in the right direction, and that’s not always easy. The difficulty is illustrated by a study that recorded how dating app users speak about their experiences. Direct quotes from study participants show how dating apps – like many online venues – develop their own language, codes, and norms.

If you don’t know what some of these phrases mean, well, don’t worry: that’s exactly the point! We know that dating app users successfully find partners and have fun doing it – but while they’re doing so, they also have to navigate a fast-changing environment of language and interaction.

The moderation response

What is a challenge for users, here, is just as much of a challenge for content moderation strategies: both human moderators and automated tools need to constantly learn, adapt, and evolve in order to keep users safe. At the same time, though, the freedom to be playful and inventive in how you speak and interact with others is an important part of how dating apps work for people. While the medium might be digital, it’s still (hopefully) a real person on the other side of the screen – and, just as with bumping into someone on the street or meeting them in a bar, the element of surprise and fun is essential.

There’s a fine line to tread, then, when moderating dating apps. We need to react quickly to new threats, but intervene gently; we need to be strong in safeguarding users, but permissive about how they act; and we need to listen and learn attentively about how people interact. Needless to say, it’s a job that takes real expertise.

What do you think? Do dating apps get it right when protecting and helping their users? How can we respond to the ways language is evolving online?

And what is a “buyer”, anyway?

If you want to talk about it, reach out today.

Axel Banér

Sales Director – EMEA

Dating apps are once again preparing to be abuzz with activity for Valentine’s Day. Attitudes toward dating apps have become increasingly positive over the past few years, with platforms gaining in both popularity and users – and yet, throughout their short existence, they have continued to attract a great deal of attention for the personal safety risks they pose to users.

Any dating app user will be familiar with the anxiety involved in moving from digital to in-person interactions, and unfortunately, that anxiety has a legitimate source. According to the Pew Research Center, one in two online dating users in the US believes that people setting up fake accounts to scam others is very common.

The financial details back them up, too: the FTC recently highlighted that, with $1.3b in losses over the last five years, romance scams are now the biggest fraud category they track.

And people who strike up online relationships between Christmas and Valentine’s Day might be at particular risk of romance fraud. Last March, for example, the UK’s National Fraud Intelligence Bureau recorded a spike in romance fraud reports. It’s little wonder, then, that Netflix chose the start of February to release its true-crime documentary The Tinder Swindler.

Online dating apps are now entirely mainstream – one of the default ways of meeting people, with over 300m active users – so it is more important than ever that the businesses running them take strong steps to protect user safety. This is a moral imperative, of course, in terms of working in users’ best interests – but, as the market matures, it’s also quickly becoming a potentially existential problem for dating platforms.

Challenges faced by those looking for love

When it comes to managing a company’s online reputation, user experience and business outcomes are often one and the same thing, which makes moderation an important measure to consider. Disgruntled customers, for instance, often use social media to publicly criticize companies, leading to a backlash that can rapidly spiral out of control.

It’s not easy, however: online dating is, understandably, a highly sensitive and personal area. Users who might otherwise be highly cautious online are more likely to let their guard down when it comes to looking for love. Platforms have a duty of care to put a stop to fraudulent behavior, supporting and protecting their users in a way that does not feel ‘intrusive’.

Effective moderation in this space demands a range of approaches. A well-moderated dating app has less spam content and fewer unhappy users, which in turn generates a more seamless and convenient user experience. Keeping users safe, creating the right brand experience, and building loyalty and growth go hand in hand.

How it works in practice

As we enter a peak season for online dating, a moderation strategy that brings users closer to the people they want to connect with, with less spam and a clearer sense of safety, will be a real competitive differentiator. Ensuring a safe and positive user experience should be at the heart of dating sites’ content moderation strategy.

AI-enabled content moderation processes are essential to catch and remove fraudulent profiles before they target vulnerable end-users. The online dating app Meetic, for example, improved its moderation quality and speed with 90% automation at 99% accuracy through an automated moderation platform.

With dating apps relying so heavily on user trust, it is essential that platforms are able to detect and remove scammers, whilst maintaining a low false-positive rate to ensure minimal impact on genuine users. Content moderation teams must also be continuously trained and updated on the ever-evolving tricks of romance scammers.

A content moderation partner can be a great way to combine high accuracy with the automated moderation needed to maintain a smooth customer experience. Only with a team of highly trained experts, coupled with precise filters and customized AI models, will online dating sites be truly efficient at keeping end-users safe.
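
To make this concrete, here’s a minimal sketch of how a hybrid moderation pipeline might route content – combining simple filters, an AI score, and escalation to human moderators. It is illustrative only: the rule list, thresholds, and the scam-scoring model are assumptions made for the example, not a description of any specific platform’s system.

    from dataclasses import dataclass

    @dataclass
    class Decision:
        action: str  # "approve", "reject", or "human_review"
        reason: str

    # Illustrative rule list; real filters are curated and updated continuously.
    BLOCKED_TERMS = {"western union", "wire transfer"}

    def moderate(text: str, scam_score: float) -> Decision:
        """Route one piece of content using rules plus a model score.

        scam_score is assumed to come from a trained classifier
        (0.0 = clearly genuine, 1.0 = clearly fraudulent).
        """
        lowered = text.lower()
        # 1. Hard rules catch unambiguous violations instantly.
        if any(term in lowered for term in BLOCKED_TERMS):
            return Decision("reject", "matched blocked term")
        # 2. Confident scores are automated in both directions,
        #    keeping the false-positive rate low for genuine users.
        if scam_score >= 0.95:
            return Decision("reject", "high scam probability")
        if scam_score <= 0.20:
            return Decision("approve", "low scam probability")
        # 3. Ambiguous content goes to trained human moderators.
        return Decision("human_review", "uncertain model score")

The design point is the middle band: automation handles the confident cases at speed, while anything ambiguous is escalated to people – which is broadly how high automation rates and high accuracy can coexist.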

Platforms cannot afford to treat this as a ‘non-issue’ – even if users do not experience it themselves, many will see others being harassed online and develop negative feelings towards the brand and platform. For platforms, everything is at stake: their reputation and, ultimately, the wellbeing of their users.

Martin Wahlstrand

Regional Sales Director Americas

Martin is Besedo’s Regional Sales Director Americas. While you can’t swipe right on anyone here at Besedo, Martin and his team would love to give you a demo of how content moderation can help your users be safer and have a great user experience.

Tiwa York, CEO of Thailand’s largest C2C marketplace, Kaidee, joined us for a webinar to share the moderation challenges they faced back in 2016 – and the incredible journey they’ve taken to meet their users’ expectations of quick publishing times by moving from an all-manual moderation approach to 85% automation.


Kaidee is Thailand’s largest C2C marketplace, with 30 million users last year.

Three years ago, in 2016, the company faced the tough challenge of slow time-to-site for newly submitted listings, resulting in a degraded user experience on their platform and the risk of losing users.

In this webinar, we will explore the options Kaidee considered to help improve their time-to-site and UX. Kaidee’s CEO, Tiwa York, will share their remarkable journey moving from manual moderation to 85% automation and their main takeaways from the transition.

In this webinar, you’ll learn about the options Kaidee weighed, how the move to 85% automation played out, and the team’s main takeaways.

Written by

Tiwa York

Head Coach (CEO) at Kaidee

Tiwa York is the Head Coach of Kaidee (a.k.a. CEO). He is passionate about changing lives through trading second-hand goods, startups, and building great teams.

In 2011, along with a team of five awesome people, Tiwa founded the online C2C marketplace now known as Kaidee. In 2018, Kaidee had 1 million sellers listing 8.7 million items and reaching 30 million people in Thailand.

Tiwa has been recognized for his passion, his leadership, and his efforts to build the team and culture at Kaidee. Prior to his current role, he worked in digital advertising with leadership roles in Admax Network and Omnicom Media Group.

Emil Andersson

Marketing Manager at Besedo

Emil joined the Besedo team in 2017 and has since brought his can-do spirit, work ethic and marketing expertise to the team.

Emil’s education and expertise lie within the marketing field, but his true passion is contributing to great user experiences, whether through online or offline interactions – making him the perfect fit for the Besedo team.

Prior to his current role, Emil worked in the architecture and design industry with B2B marketing and business development at Eurasia Architectural Products Ltd.


How will you keep your users safe and free from harassment? Here’s what you’ll learn:

  • Where users may encounter harassment on your site.
  • Common types of harassing behavior online.
  • The severity of online harassment.
  • How harassment affects your overall growth.
  • How you can solve the issue and keep your users safe.

This is Besedo

Global, full-service leader in content moderation

We provide automated and manual moderation for online marketplaces, online dating, sharing economy, gaming, communities and social media.


A snapshot of scam levels on online marketplaces – Valentine’s Day 2021

The infographic shows the results of a content audit performed on 6 popular online marketplaces in the lead-up to Valentine’s Day 2021.

See how many scams were found on popular online marketplaces in:

  • Electronics
  • Pets
  • Perfumes

And get tips on how to keep users safe during the Valentine’s Day period.


For many, online dating is now the default way to meet new people. As we become increasingly time-poor, digital devices act as a way for us to navigate our day-to-day lives and how we interact with others, including our relationships.

Despite attitudes towards dating apps becoming more positive and platforms gaining popularity in recent years, they have, throughout their short history, attracted a great deal of attention for the risks they pose to users. While dating apps are an incredibly convenient way to maintain our love lives, they come with their own threats.

Risk vs risqué

Like any form of dating, connecting with strangers doesn’t come without risk. This is also the case when using an online dating platform. The exchange of information – be it a phone number, address, or other personal details – can be exploited if placed in the wrong hands. Dating scams, catfishing, and abuse attract headlines – and for platforms, advertising, misuse, and nudity also threaten to damage the user experience and brand reputation.

Finding the right balance between restricting content to protect users and allowing organic interactions to flourish is crucial if platforms are to grow and realize their true potential. The power of online dating is its ability to make connections virtually – and the same freedom that makes negative interactions possible is also what makes genuine, authentic, and meaningful relationships possible.

Growing a dating platform means harnessing the opportunities in the content it creates. Platforms cannot be seen to ‘scaremonger’ users, but it’s imperative that they provide substantial safety features and guidelines to protect users and brand reputation, whilst using technology to enhance the user experience and focusing on retention to grow their platforms.


Creating a safe space, without killing the mood

The recent context of lockdowns demonstrated the power of online dating; even without in-person interaction, it functioned as a place to make human connections. It works best, therefore, when it delivers the same surprise, joy, and meaningfulness of speaking to someone new in real life.

With online dating, it is tempting to see shutting down opportunities to interact as the only way to remove risk. But this isn’t what users want. They want to feel protected by, and trusting of, the systems in place so they can interact in confidence. It is now an expectation, not a “nice to have”, for platforms to filter out all harmful content, from fake profiles to indecent imagery. Providing a sophisticated app that allows users to interact with whom they choose is likely to result in increased brand loyalty, as opposed to blocking every connection that could be deemed harmful.

An engaging and reliable messaging experience is the foundation of retention on a successful dating platform. Creating a positive space to connect, however, relies on really understanding how people use the platform and what works for them. With many users engaging in conversations to meet new partners, it’s important that technology doesn’t get in the way and ‘kill the mood’ with an unstable or over-censored chat platform.

Content moderation can help strike the right balance. As well as blocking the most objectionable – or illegal – content, it delivers insight that enables dating sites to encourage sincere, positive behaviours. Online dating is a space of rapid innovation and as brands create new ways to help people connect more effectively, platforms need to ensure interactions remain safe, with custom moderation approaches.

Ultimately, stopping deceitful users from harming the user experience and removing unwanted content to keep people safe will protect brand reputations. With content moderation, your dating site can become the brand you want it to be.  

Find out more about working with us and request a demo today.


By Edmond Vassallo

Head of Customer Success


How Do You Keep Singles Safe Online?

When it comes to affairs of the heart – at a time when physical contact is off-limits – it’s time to get creative. And that’s exactly what online dating platforms are doing: using video interaction as the ‘date’ itself.

While it’s clear that dating is innovative, exciting, and evolving at a rapid pace, how can dating site owners ensure they keep users safe?

Necessity Breeds Dating Invention

Video dating is nothing new. Well, in its current, interactive form it’s brand new, but the use of video as a way of introducing yourself to potential dating partners took off in the 1980s and 90s. But back then, agencies were involved. And like recruiters, it was their job to vet, interview, and engineer love matches based on compatibility and common likes and dislikes.

However, fast forward 35 years, and the ways in which we interact have shifted significantly. And platforms just keep innovating. Services like eHarmony, Tinder, and Bumble each offer their own unique approach to self-service matchmaking. And while social media platforms (Facebook Dating, anyone?) have been dipping their toes into the dating pool for a little while now, nothing groundbreaking has taken the sector by storm.

Most industry insiders saw the use of video as an ‘add-on’ to dating platforms, but no one was entirely sure how this would play out. And then, in March 2020, the COVID-19 pandemic hit. Lockdowns ensued internationally. Suddenly, video took on a whole new role.

Communication evolved in one major direction – online video calls and meetings. Replacing face-to-face with face-to-screen encounters in a time of social distancing represents a huge cultural shift, unimaginable back in 2019.

Whether we’re learning at home or working remotely, how we stay connected has changed significantly. Substituting in-person conversations with video meetings is now par for the course.

Despite the ensuing Zoom fatigue, being advised to stay at home has undoubtedly led to a spike in online dating. And with traditional dating venues no longer a COVID-safe option, video dating has organically risen to the forefront.

Why Video Dating?

While not every dating site or user is engaging with video dating yet, many are trying it out. But what are the benefits of video dating? If your online dating platform is not already providing that service, are your users missing out?

Compared with traditional online dating, video dating has some great benefits. The most obvious reason to choose video dating is that it enables participants to experience the social presence that’s lacking in written communication. As a result, it can feel much more real and authentic than just exchanging messages or swiping photos.

With a video date, users have that experience of getting to know someone more slowly, finding out if they’re a good match in terms of personality, sense of humour, and other qualities. This means if you don’t click with someone, you’re more likely to find out sooner. Particularly at a time when in-person meetings are restricted, this is a huge advantage in terms of making the leap to meeting in person.

But swapping a bar or restaurant for a video meeting carries a different set of risks for participants. And for online dating platforms, video dating poses tough new challenges for content moderation. Especially when it comes to livestream dating with an interactive audience.

Dating Live & In Public

Dating in front of a live audience is nothing new. In the 1980s, television dating shows like ‘Blind Date’ in the UK experienced huge popularity. Contestants performed in front of a live studio audience and put themselves at the mercy of the general public – and the tabloid press(!).

In the 2010s, the television dating game show-style format was revived – though it followed a wider trend for ‘reality TV’, with dating shows such as ‘Love Island’ emerging and growing in popularity. However, the legacies of these shows have been tainted by poor vetting – a small number of contestants even had previous convictions for sex offences – and by participants suffering serious mental-health conditions as a result of their appearance on the show.

Despite these warning signs, it was perhaps inevitable that interactive online technologies would adopt the trend for dating-related entertainment – enter livestream dating. Often described as ‘speed dating in a public forum’, the trend for watching and participating in live video dating seems a logical extension of platforms like Twitch and TikTok.

But sites like MeetMe, Skout, and Tagged aren’t just a way of making connections – they’re also an opportunity for daters to generate revenue. Some platforms even provide users with the functionality to purchase virtual gifts which have real monetary value.

Needless to say, these kinds of activities continue to raise questions about users’ authenticity – are they really dating in pursuit of love? This is why, over the last decade, many industries have made a conscious move towards authenticity in order to build better brand trust. The dating industry is no different, especially since – despite exponential growth – there are still major retention and engagement issues.

Video offers that sense of authenticity, particularly as we’re now so accustomed to communicating with trusted friends and family via live video.

Dating also has universal appeal, even to people already in committed relationships. There is an undeniable voyeuristic aspect to watching a dating show or watching livestreamed daters. And of course, there are inherent safety risks in that.

Like other interactive social technologies, the livestream dating trend carries its own intrinsic dangers in terms of mental health and user experience. And just like any other interactive social media, there are always going to be users who are there to make inappropriate comments and harass people.

That’s where content moderation comes into play.


So How Can Content Moderation Support Safer Dating?

One-to-one video dating and livestream dating are happening right now. Who knows where they will evolve next?

Setting your brand apart in an already crowded dating industry is becoming more complicated in a time when social media technologies are rapidly evolving. How will you stay ahead of the curve and keep your users safe?

Of course, video moderation is not the only challenge you’re going to face. The unwanted user-generated content that goes with running an online dating platform also includes fake profiles, scam messages, and inappropriate imagery.

After all, brand trust means a better user experience. And a better user experience increases user lifetime value – and revenue.

On average, 1 in 10 dating profiles created is fake. Scammers and inappropriate content hurt your platform’s reliability. Left unaddressed, undesirable content undermines user trust and can take a heavy toll on your acquisition and retention.

Tackling this doesn’t mean rebuilding your platform from scratch – but it does mean taking a leap in terms of your overall digital transformation strategy, and adding AI and machine learning to your service.

With an all-in-one package from Besedo, you can get your content moderation in order across multiple areas. It’s built on over 20 years’ experience and now has manual video moderation capabilities.

This means you can now review videos with play, pause, timestamp, and volume controls. More importantly, you can delete videos which don’t meet your site’s standards for user-generated content. Take a look at our short video guide to discover more.

Make dating online safer.


Why creating sustainable growth means looking beyond the digital present

Over the past decade, it has become common to suggest that every company is now a tech company.

The exponential growth in digital usage quickly outgrew what we traditionally think of as the technology sector and, for users, the agility of the internet didn’t stay confined to the online world. Technology has shifted expectations about how everything can or should work. Soon, companies selling everything from furniture to financial services started to look and act more like innovative tech companies, finding new ways to solve old problems through digital channels.

In other words, business leaders seeking to guarantee growth turned to digital technology – to the point that, now, the Chief Technology Officer is a key part of the C-suite.

After a year when we’ve all relied on the internet more than ever, in every aspect of our lives, growth through digital has never been more apparent. For business, digital communication has at times been the only possible way of staying in touch with customers, and there’s no sign that the CEO’s focus on agility and technology is fading. In recent surveys, IBM found that 56% of CEOs are ‘aggressively pursuing operational agility and flexibility’, PwC found that they see cyber threats as the second biggest risk to business, and Deloitte found that 85% think the pandemic accelerated digital transformation.

If the exponential growth of digital has made every company a technology company, though, it has also made terms like ‘technology’ and ‘agility’ less useful. If every CEO is pursuing a digital strategy, that term must encompass a vast range of different ideas. As we look towards the next decade of growth – focused on managing the challenge of achieving more responsible and sustainable business along the way – we will need to think carefully about what comes next once digitalisation is universal.

Supercharged tech growth has sent user-generated content skyrocketing

Of course, the importance of agile technology has never been the tech itself, but what people do with it. For customers, we’ve seen tech innovation create new ways of talking, direct access to brands, and large changes in how we consume media and make purchases.

As digital channels take on a greater share of activity than ever, one of the effects of an exponential growth in digital is an exponential growth in user-generated content (UGC).

This user-led interaction, from product reviews to marketplace listings to social interactions, fully embodies the agility that companies have spent the last decade trying to bring to their processes; because it is made by people, UGC is rapid, diverse, and flexible by default. While it may be too soon to say that every business will become a content business, it’s clear that this will become an increasingly important part of how businesses operate. Certainly, it’s already a major driving force for sectors as diverse as marketplaces, gaming, and dating.

A UGC business must be protected to maximise opportunity

In the move towards UGC, a business’s user interaction and user experience will have consequences across the organisation – from profit margin, to brand positioning, to reputational risk, to technological infrastructure. Across all of these, there will be a need to uphold users’ trust that content is being employed responsibly, that they are being protected from malign actors, and that their input is being used for their benefit. Turning content into sustainable growth, then, is a task that needs to be addressed across the company, not confined to any one business function.

Marketers, for instance, have benefited from digitalisation’s capacity to make the customer experience richer and more useful – but it has also introduced an element of unpredictability in user interactions. When communities are managed and shaped, marketers need to ensure that those efforts produce a public face in line with the company’s ethos and objectives.

While tech teams need to enable richer user interaction, their rapid ascent to become a core business function has left them under pressure to do everything, everywhere. Their innovation in how content is managed, therefore, needs a middle path between the unsustainable workload of in-house development and the unsustainable compromises of off-the-shelf tooling.

With the ultimate outcomes of building user trust being measured in terms of things like brand loyalty and lifetime user value, finance departments will also need to adapt to this form of customer relationship. The creation of long-term financial health needs investments and partnerships which truly understand how the relationship between businesses and customers is changing.


UGC as a vital asset for sustainable business growth

Bringing this all together will be the task needed to create sustainable growth – growth which is fit for and competitive in the emerging context of UGC, sensitive to the increasing caution that users will have around trusting businesses, and transparent about the organisation’s ethos, purpose, and direction. It will require not just investing in technology, but understanding how tech is leading us to a more interactive economy at every scale.

As digitalisation continues to widen and deepen, we may find UGC, and the trust it requires, becoming just as vital an asset for businesses as product stock or intellectual property. To prepare for that future and maximise their business growth from their UGC, businesses need to start thinking and planning today.

By Petter Nylander

CEO Besedo Global Services


Are my customers getting scammed? Is my site being used for illegal sales? Are offensive language and disturbing images flourishing without my knowledge?

These have been the primary concerns since the dawn of classifieds, frequently discussed at events, in forums, in the media, and among peers. And it makes sense: a site where illegal drugs are being traded, where fraudsters are thriving, and where visitors get offended will most likely develop a pretty bad reputation and face a slow but steady death. Right?

Sure, we are not denying this, but times are changing, and just avoiding negative buzz is no longer enough to build a thriving site. In 2021, marketplaces need to actively build on their brands, and we are seeing increasing amounts of money being thrown at brand-strengthening campaigns and commercials. But are companies living up to their brand promises? How does content quality affect the perception of a site, and how is quality perceived by users? This is something we wanted to find out. And this is why we initiated a survey.

What Content Affects Users?

We asked 1,000 people in the UK and US how they perceive listings with different types of content and what actions they would take based on the ads. And this is what we found:


Is It Relevant?

Consumers clearly don’t have any patience with irrelevant or missing information. When shown an ad lacking relevant content, nearly 80% said that they would not return to the site where it was posted, nor would they recommend it to others. “I would never buy a phone from an ad or site like this” was a common response to the ad.


Inappropriate Content

People also found racism and nudity disturbing, but interestingly, this content wasn’t perceived to be as bad as the ad missing relevant information.


Scam Detection Failure

Scams should continue to be a priority for marketplaces, as almost half of the respondents failed to identify an ad that was an obvious scam, containing a too-low-to-be-true price, a Western Union payment request, and/or typos. We would continue to maintain that it’s your responsibility to protect your visitors, and that clearly remains true, as many of them are otherwise sitting ducks that scammers will prey on.
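
As a rough illustration of how signals like these can be checked automatically, here’s a minimal sketch of a rule-based scam check. The price threshold and payment-term list are assumptions made for the example; production filters are far more extensive.

    RISKY_PAYMENT_TERMS = ("western union", "moneygram", "wire transfer")

    def scam_signals(title: str, description: str,
                     price: float, market_price: float) -> list[str]:
        """Return the scam signals present in a listing, if any."""
        signals = []
        text = f"{title} {description}".lower()
        # A "too low to be true" price relative to comparable listings.
        if market_price > 0 and price < 0.4 * market_price:
            signals.append("price far below market value")
        # Payment methods that give the buyer no protection.
        if any(term in text for term in RISKY_PAYMENT_TERMS):
            signals.append("unprotected payment method requested")
        return signals

    # Example: a phone listed at under a fifth of its going rate.
    print(scam_signals("iPhone cheap!!", "pay via western union",
                       price=150, market_price=800))
    # -> ['price far below market value', 'unprotected payment method requested']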


Counterfeit

Counterfeits are another area where users have problems. Less than 50% spotted the fake phone, even though the ad clearly stated it was fake.


Lack of Trust

We could spot a general lack of trust. The genuinely good listing was also the best perceived one, but even here many people expressed concerns. This only points to the importance of brand building. A strong brand will likely generate higher trust, so take care of your reputation and keep a clear and consistent focus on building confidence in your site.

European Skepticism

Talking about trust… Brits appear to be more skeptical than Americans across the board, as they regularly chose not to take the desired actions on the ads shown to them. The reason? Lack of trust. So if you need to prioritize moderation efforts across different sites, this might give you some clues on where to start in order to break into the market rather than falling flat on your face.


From dating sites and online marketplaces to social media and video games – content moderation has a huge remit of responsibility.

It’s the job of both AI and human content moderators to ensure the material being shared is not illegal or inappropriate – always acting in the best interests of end-users.

And if you’re getting the content right for your end-users, they’re going to want to return and hopefully bring others with them. But is content moderation actually a form of censorship?

If every piece of content added to a platform is checked and scrutinized – isn’t ‘moderation’ essentially just ‘policing’? Surely, it’s the enemy of free speech?

Well actually, no. Let’s consider the evidence.

Moderating content vs censoring citizens

Content moderation is not a synonym for censorship. In fact, they’re two different concepts.

Back in 2016, we looked at this in-depth in our Is Moderation Censorship? article – which explains the relationship between content moderation and censorship. It also gives some great advice on empowering end-users so that they don’t feel censored.

But is it really that important in the wider scheme of things?

Well, content moderation continues to make headline news due to the actions taken by high-profile social media platforms, like Twitter and Facebook, against specific users – including, but not limited to, the former US President.

There’s a common misconception that the actions taken by these privately-owned platforms constitute censorship. In the US, this can be read as a violation of First Amendment rights in relation to free speech. However, the key thing to remember here is that the First Amendment protects citizens against government censorship.

That’s not to say privately-owned platforms have an inalienable right to censorship, but it does mean that they’re not obliged to host material deemed unsuitable for their community and end-users.

The content moderation being enacted by these companies is based on their established community standards and typically involves removing content that breaks those standards and suspending or banning the accounts responsible.

These actions have invariably impacted individual users because that’s the intent – to mitigate content which breaks the platform’s community standards. In fact, when you think about it, making a community a safe place to communicate actually increases the opportunity for free speech.

“Another way to think about content moderation is to imagine an online platform as a real world community – like a school or church. The question to ask is always: would this way of behaving be acceptable within my community?”

It’s the same with online platforms. Each one has its own community standards. And that’s okay.

Content curators – Still culpable?

Putting it another way, social media platforms are in fact curators of content – as are online marketplaces and classified sites. When you consider the volume of content being created, uploaded, and shared, monitoring it is no easy feat. Take, for example, YouTube. As of May 2019, Statista reported that in excess of 500 hours of video were uploaded to YouTube every minute. That’s just over three weeks of content per minute!

These content sharing platforms actually have a lot in common with art galleries and museums. The items and artworks in these public spaces are not created by the museum owners themselves – they’re curated for the viewing public and given contextual information.

That means the museums and galleries share the content but they’re not liable for it.

However, an important point to consider is that if you’re sharing someone else’s content, there’s an element of responsibility. As a gallery owner, you’ll want to ensure it doesn’t violate your values as an organization and community. And like online platforms, art curators should have the right to take down material deemed objectionable. They’re not saying you can’t see this painting; they’re saying that if you want to see this painting, you’ll need to go to a different gallery.

What’s the benefit of content moderation to my business?

To understand the benefits of content moderation, let’s look at the wider context and some of the reasons why online platforms use content moderation to help maintain and generate growth.

Firstly, we need to consider the main reason for employing content moderation. Content moderation exists to protect users from harm. Each website or platform will have its own community of users and its own priorities in terms of community guidelines.

Content moderation can help to build that trust and safety by checking posts and flagging inappropriate content. Our survey of UK and US users showed that even on a good classified listing site, one-third of users still felt some degree of mistrust.

Secondly, ensuring users see the right content at the right time is essential for keeping them on a site. Again, in relation to the content of classified ads, our survey revealed that almost 80% of users would not return to the site where an ad lacking relevant content was posted – nor would they recommend it to others. In effect, this lack of relevant information was the biggest reason for users clicking away from a website. Content moderation can help with this too.

Say you run an online marketplace for second-hand cars – you don’t want it to suddenly be flooded with pictures of cats. In a recent example from the social media site Reddit, the subreddit r/worldpolitics started getting flooded with inappropriate pictures because the community was tired of it being dominated by posts about American politics and of moderators frequently ignoring posts that were deliberately designed to farm upvotes. Moderating and removing the inappropriate pictures isn’t censorship; it’s directing the conversation back to what the community was originally about.
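
To show how lightweight that kind of on-topic steering can be, here’s a minimal sketch of a relevance check for the hypothetical second-hand car marketplace above. A real platform would more likely use a trained topic classifier; the keyword list and threshold are assumptions for the example.

    import re

    # Domain terms a genuine listing on a car marketplace is likely to use.
    ON_TOPIC_TERMS = {"car", "sedan", "hatchback", "mileage", "engine", "gearbox"}

    def is_on_topic(listing_text: str, min_hits: int = 1) -> bool:
        """Accept a listing only if it mentions enough domain terms."""
        words = set(re.findall(r"[a-z]+", listing_text.lower()))
        return len(words & ON_TOPIC_TERMS) >= min_hits

    print(is_on_topic("2014 sedan, low mileage, full service history"))  # True
    print(is_on_topic("adorable kitten photos, so fluffy"))              # False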

Thirdly, content moderation can help to mitigate against scams and other illegal content. Our survey also found that 72% of users who saw inappropriate behavior on a site did not return.

A prime example of inappropriate behavior is hate speech. Catching it can be a tricky business due to coded language and imagery. However, our blog about identifying hate speech on dating sites gives three great tips for dealing with it.

Three ways to regulate content

A good way to imagine content moderation is to view it as one of three forms of regulation. This is a model that’s gained a lot of currency recently and it really helps to explain the role of content moderation.

Firstly, let’s start with discretion. In face-to-face interactions, most people will tend to pick up on social cues and social contexts, which causes them to self-regulate – for example, not swearing in front of young children. This is personal discretion.

When a user posts or shares content, they’re making a personal choice to do so. Hopefully, for many users discretion will also come into play: will what I’m about to post cause offense or harm to others? Do I want others to feel offended?

Discretion tells you not to do or say certain things in certain contexts. We all get it wrong sometimes, but self-regulation is the first step in content moderation.

Secondly, at the other end of the scale, we have censorship. By definition, censorship is the suppression or prohibition of speech or materials deemed obscene, politically unacceptable, or a threat to security.

Censorship has government-imposed law behind it and carries the message that the censored material is unacceptable in any context because the government and law deem it to be so.

Thirdly, in the middle of both of these, we have content moderation.


This might include things like flagging harmful misinformation, eliminating obscenity, removing hate speech, and protecting public safety. Content moderation is discretion at an organizational level – not a personal one.

Content moderation is about saying what you can and can’t do in a particular online social context.
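
As a toy illustration of discretion at the organizational level, here’s a minimal sketch in which a community encodes its standards as a policy table. The categories and actions are assumptions for the example, not any real platform’s rules.

    # One community's standards; another platform could legitimately choose
    # a different table - that difference is discretion, not censorship.
    COMMUNITY_STANDARDS = {
        "hate_speech":    "remove_and_suspend",
        "misinformation": "flag_with_warning",
        "obscenity":      "remove",
        "off_topic":      "remove",
    }

    def enforce(violation_type: str) -> str:
        """Look up the action this community has chosen for a violation."""
        # Content that breaks no rule is simply left alone.
        return COMMUNITY_STANDARDS.get(violation_type, "allow")

    print(enforce("misinformation"))  # flag_with_warning
    print(enforce("friendly_chat"))   # allow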

So what can Besedo do to help moderate your content?

All things considered, content moderation is a safeguard. It upholds the ‘trust contract’ that users and site owners enter into. It’s about protecting users and businesses, and maintaining relevance.

The internet’s a big place and there’s room for everyone.

To find out more about what we can do for your online business contact our team today.

If you want to learn more about content moderation, take a look at our handy guide. In the time it takes to read, another 4,000 YouTube videos will have been uploaded!
