With the rise of online dating comes the inevitable problem of fake profiles. These are profiles created by people who are not interested in dating but are looking to scam others out of their money or boost their own ego. Learn to spot a fake profile and protect yourself from being scammed, fooled, or harassed.


The problem with fake profiles is bigger than you think

The problem with fake profiles is more widespread than you think. Some sites estimate that as many as 10% of dating profiles are fake. That means that for every 10 people you see on a dating site, one of them is likely not even a real person.

So why do people create fake profiles?

There are a few reasons. Some people do it to scam others out of money. They create a profile, build up a relationship with someone, and then ask for money. Others do it to boost their ego. They create a profile with photos that make them look much more attractive than they really are and then message people they know they will never meet.

Whatever the reason, fake profiles are a problem because they ruin the user experience for everyone else. That is just as bad for customers as it is for businesses.

There are a few things you can do to spot a fake profile, but the best thing you can do is to be aware of the problem and be cautious when you’re interacting with people online. If something seems too good to be true, it probably is. Trust your gut and be careful out there!

5 ways to detect fake profiles

How do you detect fake profiles? It’s actually not as difficult as you might think. There are a few key things to look for that can help you spot a fake profile pretty easily.

  1. The first thing to look for is a lack of personal information. If a profile doesn’t have much information about the person, it’s likely fake.
  2. Look for inconsistencies in the information that is provided. If a person’s age, location, or other details don’t seem to match up, it’s probably because they’re not being truthful.
  3. Ask to see their social media accounts. If they can’t provide you with their Instagram account in this day and age, well, we don’t know what to tell you.
  4. Have a look at the follower/following ratio. A fake profile typically has zero or fewer than ten followers. Come on, how many real people do you know with fewer than five friends on Instagram or Facebook?
  5. Take a look at the photos that are posted on the profile. Fake profiles will often use stock photos or photos clearly not of the person claiming to be behind the profile. If something looks too good to be true, it probably is!

Also, be wary if someone you match with seems to get you all too quickly. You have the same interests, they mention music or a TV show you like. That alone isn’t a red flag, but if you look at the big picture, it might be someone you know who is catfishing you.
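To make the checklist above more concrete, here is a minimal sketch in Python of how those signals could be combined into a rough risk score. The field names, weights, and thresholds are all illustrative assumptions, not any platform’s actual schema or rules.

```python
# Hypothetical sketch: combine the fake-profile signals above into a rough risk score.
# Field names, weights, and thresholds are illustrative assumptions only.

def fake_profile_risk(profile: dict) -> float:
    """Return a score between 0 and 1; higher means more suspicious."""
    score = 0.0

    # 1. Lack of personal information (empty or very short bio).
    if not profile.get("bio") or len(profile.get("bio", "")) < 20:
        score += 0.25

    # 2. Inconsistent details (e.g., stated age contradicts the age mentioned in the bio).
    if profile.get("age") and profile.get("bio_age") and profile["age"] != profile["bio_age"]:
        score += 0.2

    # 3. No linked social media accounts.
    if not profile.get("linked_social_accounts"):
        score += 0.2

    # 4. Suspicious follower count on linked accounts.
    if profile.get("follower_count", 0) < 10:
        score += 0.15

    # 5. Photos flagged as stock images or reused elsewhere (e.g., by reverse image search).
    if profile.get("photo_reuse_detected"):
        score += 0.2

    return min(score, 1.0)


if __name__ == "__main__":
    suspicious = {"bio": "", "age": 29, "bio_age": 24,
                  "linked_social_accounts": [], "follower_count": 3,
                  "photo_reuse_detected": True}
    print(f"Risk score: {fake_profile_risk(suspicious):.2f}")
```

A real platform would feed scores like this into a review queue rather than blocking profiles outright, since each signal on its own is weak.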

How do we stop harassment in dating apps?

We’re all too familiar with the scenario. You’re minding your own business, chatting away on your favorite dating app, when suddenly the conversation takes a turn for the worse and you’re bombarded with messages from someone you don’t know. It’s annoying, it’s intrusive, and it can even be scary.

Well, this is where content moderators play an important role in keeping chat apps safe for everyone. We help to stop harassment and bullying by enforcing the policies set by the company behind the platform. Most dating apps have a chat functionality in place that you can use once you have matched with someone. That’s all great.

Say what you want, but be nice and obey the house rules.

Besedo offers chat moderation in real-time

When someone doesn’t obey the house rules, technology like Besedo’s will ensure you don’t get a chance to see the offensive messages sent to you. Real-time filtering for profanity or nudity will ensure that we can keep chats clean and civilized.

That way Besedo also works to keep the app user-friendly and respectful by enforcing the app’s terms of use.
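For illustration only, here is a generic sketch of what pre-delivery message filtering can look like. The word list, patterns, and actions are placeholder assumptions, not Besedo’s actual rules or technology.

```python
import re

# Generic sketch of checking a chat message before delivery.
# Patterns and policy below are placeholders, not a real rule set.
BLOCKED_PATTERNS = [
    re.compile(r"\bdamn\b", re.IGNORECASE),            # placeholder profanity
    re.compile(r"send\s+me\s+money", re.IGNORECASE),   # crude scam phrase
]

def moderate_message(text: str) -> dict:
    """Decide whether a message is delivered as-is or with the offending phrase masked."""
    for pattern in BLOCKED_PATTERNS:
        if pattern.search(text):
            cleaned = pattern.sub("***", text)
            return {"action": "mask", "delivered_text": cleaned}
    return {"action": "deliver", "delivered_text": text}

print(moderate_message("Hey, can you send me money for a ticket?"))
```

In practice the matching would be done by trained models and multilingual filters rather than a handful of regular expressions, but the flow – check before delivery, mask or hold what breaks the house rules – is the same.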

I have been harassed on a dating app

You can do a few things to protect yourself from harassment in chat apps.

  1. Make sure you have the latest version of the app installed. Many chat apps include features that allow you to block or report users who are being abusive.
  2. If someone is harassing you, use these reporting tools to protect yourself. Always report the user and block them.
  3. Take screenshots to provide proof. It can also help protect others.
  4. Be careful about who you add to your chat app contacts list. If you don’t know someone well or if they seem sketchy, it’s best not to add them. This will help reduce the chances of being harassed by someone you don’t know.
  5. Finally, remember that you can always leave a chat app if it’s making you uncomfortable. There’s no shame in doing so, and it’s often the best way to protect yourself from abuse. If someone makes you feel unsafe or uncomfortable, just hit the exit button and move on.

At a time when people are looking for love and connection more than ever, it’s important to be aware of the risks of fake dating profiles. While most dating platforms do their best to keep users safe, there are always going to be some bad actors who slip through the cracks.

That’s why it’s important to be vigilant in online dating. If something seems too good to be true, it probably is. Be wary of anyone who asks for money early on, or who wants to move too fast without getting to know you first.

Scammers will hurt your brand with poor reviews

Dating apps have been accused of promoting a hook-up culture and fostering an environment where users are more likely to endlessly swipe in search of someone better. But another side to dating apps that can be just as problematic is the proliferation of fake profiles.

Fake profiles are not only a problem for users but also pose a risk to the app itself. If users come across too many fake profiles, they may start to question the legitimacy of the app and its users. This can lead to them deleting the app and leaving negative reviews, which will damage the app’s reputation.

If you’re running a dating app, it’s important to ensure that you’re taking steps to prevent fake profiles from being created. This includes things like requiring verification for new users, using artificial intelligence to identify suspicious activity, and monitoring user reviews for feedback about fake profiles. 
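As a hypothetical example of how those measures might hang together, the sketch below wires signup verification and a couple of simple suspicious-activity checks into one review step. Every field name and limit is an illustrative assumption.

```python
from dataclasses import dataclass

# Hypothetical signup review combining the measures mentioned above:
# verification for new users plus simple suspicious-activity heuristics.
# All field names and limits are illustrative.

@dataclass
class Signup:
    email_verified: bool
    phone_verified: bool
    signups_from_ip_last_hour: int
    profile_photo_count: int

def review_signup(s: Signup) -> str:
    if not (s.email_verified or s.phone_verified):
        return "reject: verification required"
    if s.signups_from_ip_last_hour > 5:
        return "hold: unusual signup volume from this IP"
    if s.profile_photo_count == 0:
        return "hold: manual review before profile goes live"
    return "approve"

print(review_signup(Signup(email_verified=True, phone_verified=False,
                           signups_from_ip_last_hour=12, profile_photo_count=1)))
```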

Hi, we’re Besedo; we should talk! Nudge, nudge 😉

Taking these measures can help protect your app from being tainted by scams and poor reviews.

Don’t waste your time with dishonest people

With the prevalence of dating apps and websites, it’s no surprise that there are fake profiles out there. The important thing to remember is to be vigilant and do your research before meeting anyone in person. If you suspect that someone you’re talking to is a fake, report them to the site or app so that they can be removed. And most importantly, don’t waste your time on someone who isn’t being honest with you. 

There are plenty of fish in the sea, so keep swimming!


William Singam

Sales Director APAC

William is the Besedo Sales Director for APAC and you can meet him at the GDI Singapore Conference in July 2022 where he is one of the speakers. When he is not on stage, he’ll be happy to share his wealth of experience about 1-2-1 chat moderation, user experience, app reviews, and just about anything content related.

One of the surprising and amazing things about the internet, especially when we’re talking about things involving user-generated content, is how many ideas become well-established parts of society before we even fully understand them.

Online dating is a perfect example. While the first big dating site was launched more than 25 years ago, Tinder – which broke new ground for dating apps – only just turned 10. And yet, it’s already perhaps the main way of meeting a partner, with a Stanford study in 2019 finding that 39% of heterosexual couples met online.

Despite its massive popularity, though, we’re not necessarily wise to everything that dating apps have to throw at us or aware of all the ways they’re changing society or us. Unusual and potentially dangerous trends are emerging from online dating all the time, and never fail to claim headlines in the media: Stylist, for example, recently described voicefishing as “the freaky new dating trend we’re trying to get our heads around.”

The Challenge For Apps

On the one hand, then, there’s frequent discussion of the dangers that online dating can pose, but, on the other hand, people clearly keep coming back anyway to enjoy the benefits of digital matchmaking. Dating apps clearly have a job to do to make sure that people are empowered to be flirty, but not dirty; daring, but not dangerous.

That work all comes down to steering user interactions in the right direction, and that’s not always easy. The difficulty is illustrated by a study that recorded how dating app users speak about their experiences. Direct quotes from study participants show how dating apps – like many online venues – develop their own language, codes, and norms:

  • “I get him a different picture and I make his profile his “buyer” – he didn’t have a buyer. I made his profile a buyer, and said “You can always go back” and it blew up!”
  • “…sometimes they’ll write “say orange if you’ve read this.” And so you’re expected if you match, the first thing you say to them is orange to show that you’ve actually read through it.”

If you don’t know what some of these phrases mean, well, don’t worry: that’s exactly the point! We know that dating app users successfully find partners and have fun doing it – but while they’re doing so, they also have to navigate a fast-changing environment of language and interaction.

The moderation response

What is a challenge for users, here, is just as much of a challenge for content moderation strategies: both human moderators and automated tools need to constantly learn, adapt, and evolve in order to keep users safe. At the same time, though, the freedom to be playful and inventive in how you speak and interact with others is an important part of how dating apps work for people. While the medium might be digital, it’s still (hopefully) a real person on the other side of the screen – and, just as with bumping into someone on the street or meeting them in a bar, the element of surprise and fun is essential.

There’s a fine line to tread, then, when moderating dating apps. We need to react quickly to new threats, but intervene gently; we need to be strong in safeguarding users, but permissive about how they act; and we need to listen and learn attentively about how people interact. Needless to say, it’s a job that takes real expertise.

What do you think? Do dating apps get it right when protecting and helping their users? How can we respond to the ways language is evolving online?

And what is a “buyer”, anyway?

If you want to talk about it, reach out today.

Axel Banér

Sales Director – EMEA

Dating apps are once again preparing to be abuzz with activity for Valentine’s Day. Even though outlooks toward dating apps have become increasingly positive over the past few years, with platforms gaining in both popularity and users, throughout their short existence they have continued to attract a great deal of attention to the risks they pose to users’ personal safety.

Any dating app user will be familiar with the anxiety involved in moving from digital to in-person interactions, and unfortunately, that anxiety has a legitimate source. According to the Pew Research Center, one in two online dating users in the US believes that people setting up fake accounts to scam others is very common.

The financial details back them up, too: the FTC recently highlighted that, with $1.3b in losses over the last five years, romance scams are now the biggest fraud category they track.

And people who strike up online relationships between Christmas and Valentine’s Day might be at particular risk of romance fraud. Last March, for example, the UK’s National Fraud Intelligence Bureau saw a spike in romance fraud reports. It’s little wonder, then, that Netflix chose the start of February to release its true-crime documentary The Tinder Swindler.

With online dating apps now entirely mainstream – one of the default ways of meeting people, with over 300m active users – it is more important than ever that the businesses running them take strong steps to protect user safety. This is a moral imperative, of course, in terms of working for users’ best interests – but, as the market matures, it’s also quickly becoming a potentially existential problem for dating platforms.

Challenges faced by those looking for love

When it comes to managing a company’s online reputation, user experience and business outcomes are often one and the same, which makes moderation an important measure to consider. Disgruntled customers, for instance, often take to social media to publicly criticize companies, leading to a backlash that can rapidly spiral out of control.

It’s not easy, however: online dating is, understandably, a highly sensitive and personal area. Users who might otherwise be highly cautious online are more likely to let their guard down when it comes to looking for love. Platforms have a duty of care to their users to put a stop to fraudulent behavior in order to support and protect their users in a way that does not feel ‘intrusive’.

Effective moderation in this space demands a range of approaches. A well-moderated dating app generates a more seamless and convenient user experience which in turn reduces spam content and unhappy user feedback. Keeping users safe, creating the right brand experience, and building loyalty and growth go hand in hand.

How it works in practice

As we enter a peak season for online dating, a moderation strategy that brings users closer to the people they want to connect with, with less spam and a clearer sense of safety, will be a real competitive differentiator. Ensuring a safe and positive user experience should be at the heart of dating sites’ content moderation strategy.

AI-enabled content moderation processes are essential to catch and remove these fraudulent profiles before they target vulnerable end-users. The online dating app Meetic improved its moderation quality and speed, reaching 90% automation at 99% accuracy through an automated moderation platform.

With dating apps relying so heavily on user trust, it is essential that platforms are able to detect and remove scammers, whilst maintaining a low false-positive rate to ensure minimal impact on genuine users. Content moderation teams must also be continuously trained and updated on the ever-evolving tricks of romance scammers.
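One common way to balance high automation against a low false-positive rate is threshold routing: act automatically only when a model is confident, and send everything else to human moderators. The sketch below is a generic illustration of that idea, not the specific approach Meetic or Besedo uses, and the thresholds are made up.

```python
# Generic threshold-routing sketch: automate only high-confidence decisions,
# send the rest to a human review queue. Thresholds are illustrative.

APPROVE_THRESHOLD = 0.05   # scam probability at or below this -> auto-approve
REJECT_THRESHOLD = 0.95    # scam probability at or above this -> auto-reject

def route_profile(scam_probability: float) -> str:
    if scam_probability <= APPROVE_THRESHOLD:
        return "auto-approve"
    if scam_probability >= REJECT_THRESHOLD:
        return "auto-reject"
    return "manual-review"

for p in (0.01, 0.50, 0.99):
    print(p, "->", route_profile(p))
```

Tightening the thresholds improves accuracy at the cost of automation rate, which is exactly the trade-off described above.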

A content moderation partner can be a great way to ensure high accuracy and automated moderation to maintain a smooth customer experience. Only with a team of highly trained experts coupled with precise filters and customized AI models will online dating sites be truly efficient at keeping end-users safe.

Platforms cannot afford to make this a ‘non-issue’ – even if users do not experience it themselves, many will see others being harassed online and experience negative feelings towards the brand and platform. For platforms, everything is at stake for both their reputation and ultimately, the wellness of their users.

Update October 31, 2022: Thank you to Bedbible for reaching out. We have updated our link reference to their site. You should check them out, they are amazing.

Martin Wahlstrand

Regional Sales Director Americas

Martin is Besedo’s Regional Sales Director Americas. While you can’t swipe right on anyone here at Besedo, Martin and his team would love to give you a demo of how content moderation can help your users be safer and have a great user experience.

Tiwa York, CEO of Thailand’s largest C2C marketplace, Kaidee, joined us for a webinar to share the moderation challenges they faced back in 2016 and the incredible journey they’ve taken to meet their users’ expectations on quick publishing times by moving from an all-manual moderation approach to 85% automation.

Download Webinar

Fill out your email below to get your free copy of the webinar.


Kaidee is Thailand’s largest c2c marketplace, with 30 million users last year.

Three years ago, in 2016, the company faced tough challenges of slow time-to-site for newly submitted listings, resulting in decreased user experience on their platform and the risk of losing users.

In this webinar, we will explore the options Kaidee considered to help improve their time-to-site and UX. Kaidee’s CEO, Tiwa York, will share their remarkable journey moving from manual moderation to 85% automation and their main takeaways from the transition.

In this webinar, you’ll learn:

  • How Kaidee improved their time-to-site from less than 20% of listings moderated within 5 minutes to a staggering 93%.
  • The challenges Kaidee faced trying to re-build and manage their in-house moderation tool.
  • How Kaidee managed to reduce their moderation team by 55%, without losing productivity.
  • How a classifieds site can transition from pure manual moderation to 85% automation.

Written by

Tiwa York

Head Coach (CEO) at Kaidee

Tiwa York is the Head Coach of Kaidee (a.k.a CEO). He is passionate about changing lives through trading second-hand goods, startups, and building great teams.

Along with a team of 5 awesome people, Tiwa founded the online C2C marketplace now known as Kaidee in 2011. In 2018, Kaidee had 1 million sellers listing 8.7 million items and reaching 30 million people in Thailand.

Tiwa has been recognized for his passion, his leadership, and his efforts to build the team and culture at Kaidee. Prior to his current role, he worked in digital advertising with leadership roles in Admax Network and Omnicom Media Group.

Emil Andersson

Marketing Manager at Besedo

Emil joined the Besedo team in 2017 and has since brought his can-do spirit, work ethic, and marketing expertise to the team.

Emil’s education and expertise lie within the marketing field, but his true passion is contributing to great user experiences, whether through online or offline interactions – making him the perfect fit for the Besedo team.

Prior to his current role, Emil was working in the architecture and design industry with B2B marketing and Business Development at Eurasia Architectural Products Ltd.

Download Webinar

Fill out your email below to get your free copy of the webinar.


How will you keep your users safe and free from harassment? Here’s what you’ll learn:

  • Where users may encounter harassment on your site.
  • Common types of harassing behavior online.
  • The severity of online harassment.
  • How harassment affects your overall growth.
  • How you can solve the issue and keep your users safe.

This is Besedo

Global, full-service leader in content moderation

We provide automated and manual moderation for online marketplaces, online dating, sharing economy, gaming, communities and social media.


Download Infographic

Fill out your email below to get your free copy of the infographic.

A snapshot of scam levels on online marketplaces – Valentine’s Day 2021

The infographic shows the results of a content audit performed on 6 popular online marketplaces in the lead up to Valentine’s Day 2021.

See how many scams were found on popular online marketplaces in:

  • Electronics
  • Pets
  • Perfumes

And get tips on how to keep users safe during the Valentine’s Day period.


For many, online dating is now the default way to meet new people. As we become increasingly time-poor, digital devices act as a way for us to navigate our day-to-day lives and how we interact with others, including our relationships.

Despite attitudes towards dating apps becoming more positive and platforms gaining popularity in recent years, throughout their short history, they have attracted a great deal of attention on the risks they pose to users. While dating apps are an incredibly convenient way to maintain our love lives, they come with their own threats.

Risk vs risqué

Like any form of dating, connecting with strangers doesn’t come without risk. This is also the case when using an online dating platform. The exchange of information, be it a phone number, address, or other personal details, can be exploited if placed in the wrong hands. Dating scams, catfishing, and abuse attract headlines – and for platforms, advertising, misuse, and nudity also threaten to damage the user experience and brand reputation.

Finding the right balance between restricting content to protect users and allowing organic interactions to flourish is crucial to enabling platforms to grow and realize true potential. The power of online dating is its ability to make connections virtually, while the freedom which makes it possible to engage in negative interactions is also what makes it possible to have genuine, authentic, and meaningful relationships.

Growing a dating platform means harnessing the opportunities in the content it creates. Platforms cannot be seen to ‘scaremonger’ users, but it’s imperative they provide substantial safety features and guidelines to protect users and brand reputation, whilst using technology to enhance user experience and focus on retention to grow their platforms.

Learn how to moderate without censoring

Why moderating content without censoring users demands consistent, transparent policies.


Creating a safe space, without killing the mood

The recent context of lockdowns demonstrated the power of online dating; even without in-person interaction, it functioned as a place to make human connections. It works best, therefore, when it delivers the same surprise, joy, and meaningfulness of speaking to someone new in real life.

With online dating, it is tempting to see shutting down opportunities to interact as the only way to remove risk. But this isn’t what users want. They want to feel protected and trusting of the systems in place to be able to interact in confidence. It is now an expectation, not a “nice to have”, for platforms to filter out all harmful content from fake profiles to indecent imagery. Providing a sophisticated app to allow users to interact with who they choose is likely to result in increased brand loyalty as opposed to blocking all connections which could be deemed as harmful.

An engaging and reliable messaging experience is the foundation of retention on a successful dating platform. Creating a positive space to connect, however, relies on really understanding how people use the platform and what works for them. With many users engaging in conversations to meet new partners, it’s important that technology doesn’t get in the way and ‘kill the mood’ with an unstable or over-censored chat platform.

Content moderation can help strike the right balance. As well as blocking the most objectionable – or illegal – content, it delivers insight that enables dating sites to encourage sincere, positive behaviours. Online dating is a space of rapid innovation and as brands create new ways to help people connect more effectively, platforms need to ensure interactions remain safe, with custom moderation approaches.

Ultimately, stopping deceitful users from harming the user experience and removing unwanted content to keep people safe will protect brand reputations. With content moderation, your dating site can become the brand you want it to be.  

Find out more about working with us and request a demo today.


By Edmond Vassallo

Head of Customer Success


How Do You Keep Singles Safe Online?

When it comes to affairs of the heart – at a time when physical contact is off-limits – it’s time to get creative. And that’s exactly what online dating platforms are doing: using video interaction as the ‘date’ itself.

While it’s clear that dating is innovative, exciting, and evolving at a rapid pace, how can dating site owners ensure they keep users safe?

Necessity Breeds Dating Invention

Video dating is nothing new. Well, in its current, interactive form it’s brand new, but use of video as a way of introducing yourself to potential dating partners took off in the 1980s and 90s. But back then, agencies were involved. And like recruiters, it was their job to vet, interview, and engineer love matches based on compatibility and common likes and dislikes.

However, fast forward 35 years, and the ways in which we interact have shifted significantly. And they just keep innovating. Services like eHarmony, Tinder, and Bumble each offer their own unique approach to self-service matchmaking. And while social media platforms (Facebook Dating, anyone?) have been dipping their toes into the dating pool for a little while now, nothing groundbreaking has taken the sector by storm.

Most industry insiders saw the use of video as an ‘add-on’ to dating platforms but no-one was entirely sure how this would play out. And then, in March 2020, the COVID-19 pandemic hit. Lockdowns ensued internationally. Suddenly video took on a whole new role.

Communication evolved in one major direction – online video calls and meetings. Replacing face-to-face with face-to-screen encounters in a time of social distancing represents a huge cultural shift, unimaginable back in 2019.

Whether we’re learning at home or working remotely, how we stay connected has changed significantly. Substituting in-person conversations with video meetings is now par for the course.

Despite the ensuing Zoom fatigue, being advised to stay at home has undoubtedly led to a spike in online dating. And with traditional dating venues no longer a COVID-safe option, video dating has organically risen to the forefront.

Why Video Dating?

While not every dating site or user is engaging with video dating yet, many are trying it out.  But what are the benefits of video dating? If your online dating platform is not already providing that service, are your users missing out?

Compared with traditional online dating, video dating has some great benefits. The most obvious reason to choose video dating is that it enables participants to experience the social presence that’s lacking in written communication. As a result, it can feel much more real and authentic than just exchanging messages or swiping photos.

With a video date, users have that experience of getting to know someone more slowly, finding out if they’re a good match in terms of personality, sense of humour, and other qualities. This means if you don’t click with someone, you’re more likely to find out sooner. Particularly at a time when in-person meetings are restricted, this is a huge advantage in terms of making the leap to meeting in person.

But swapping a bar or restaurant for a video meeting carries a different set of risks for participants. And for online dating platforms, video dating poses tough new challenges for content moderation. Especially when it comes to livestream dating with an interactive audience.

Dating Live & In Public

Dating in front of a live audience is nothing new. In the 1980s, television dating shows like ‘Blind Date’ in the UK experienced huge popularity. Contestants performed in front of a live studio audience and put themselves at the mercy of the general public – and the tabloid press(!).

In the 2010s, the television dating game show-style format was revived – though it followed a wider trend for ‘reality TV’, with dating shows such as ‘Love Island’ emerging and growing in popularity. However, the legacies of these shows have been tainted by a small number of poorly-vetted contestants – some even had previous convictions for sex offences – and by contestants suffering serious mental-health problems as a result of their appearance on the show.

Despite these warning signs, it was perhaps inevitable that the trend for dating-related entertainment would be adopted by interactive online technologies in the form of livestream dating. Often described as ‘speed dating in a public forum’, the trend for watching and participating in live video dating seems a logical extension of platforms like Twitch and TikTok.

But sites like MeetMe, Skout, and Tagged aren’t just a way of making connections – they’re also an opportunity for daters to generate revenue. Some platforms even provide users with the functionality to purchase virtual gifts which have real monetary value.

Needless to say, these kinds of activities continue to raise questions about users’ authenticity: in terms of dating in pursuit of love. This is why, over the last decade, many industries have made a conscious move towards authenticity – in order to build better brand trust. The dating industry is no different, especially since – despite exponential growth – there are still major retention and engagement issues.

Video offers that sense of authenticity, particularly as we’re now so accustomed to communicating with trusted friends and family via live video.

Dating also has universal appeal, even to people already in committed relationships. There is an undeniable voyeuristic aspect to watching a dating show or watching live streamed daters. And of course there are inherent safety risks in that.

Like other interactive social technologies, the livestream dating trend carries its own intrinsic dangers in terms of mental health and user experience. And just like any other interactive social media, there are always going to be users who are there to make inappropriate comments and harass people.

That’s where content moderation comes into play.


So How Can Content Moderation Support Safer Dating?

One-to-one video dating and livestream dating are happening right now. Who knows how they will evolve?

Setting your brand apart in an already crowded dating industry is becoming more complicated in a time when social media technologies are rapidly evolving. How will you stay ahead of the curve and keep your users safe?

Of course, video moderation is not the only challenge you’re going to face. The associated unwanted user-generated content that goes with running an online dating platform includes:

  • Romance scams
  • Prostitution
  • Online harassment
  • Catfishing
  • Profanity
  • Nudity
  • Image quality
  • Underage users
  • Escort promotion

After all, brand trust means a better user experience. And a better user experience increases user lifetime value – and revenue.

On average, 1 in 10 dating profiles created is fake. Scammers and inappropriate content hurt your platform’s reliability. Left unanswered, undesirable content undermines user trust and can take a heavy toll on your acquisition and retention.

Improving this doesn’t have to mean rebuilding your platform from scratch – but it does mean taking a leap in terms of your overall digital transformation strategy, and adding AI and machine learning to your service.

With an all-in-one package from Besedo, you can get your content moderation in order across multiple areas. It’s built on over 20 years’ experience and now has manual video moderation capabilities.

This means you can now review videos with play, pause, timestamp, and volume controls. More importantly, you can delete videos that don’t meet your site’s standards for user-generated content. Take a look at our short video guide to discover more.

Make dating online safer.


Why creating sustainable growth means looking beyond the digital present

Over the past decade, it has become common to suggest that every company is now a tech company.

The exponential growth in digital usage quickly outgrew what we traditionally think of as the technology sector and, for users, the agility of the internet didn’t stay confined to the online world. Technology has shifted expectations about how everything can or should work. Soon, companies selling everything from furniture to financial services started to look and act more like innovative tech companies. They find new ways to solve old problems through digital channels.

In other words, business leaders seeking to guarantee growth turned to digital technology – to the point that, now, the Chief Technology Officer is a key part of the C-suite.

After a year when we’ve all relied on the internet more than ever, in every aspect of our lives, growth through digital has never been more apparent. For business, digital communication has at times been the only possible way of staying in touch with customers, and there’s no sign that the CEO’s focus on agility and technology is fading. In recent surveys, IBM found that 56% of CEOs are ‘aggressively pursuing operational agility and flexibility’, PwC found that they see cyber threats as the second biggest risk to business, and Deloitte found that 85% think the pandemic accelerated digital transformation.

If the exponential growth of digital has made every company a technology company, though, it has also made terms like ‘technology’ and ‘agility’ less useful. If every CEO is pursuing a digital strategy, that term must be encompassing a vast range of different ideas. As we look towards the next decade of growth – focused on managing the challenge of achieving more responsible and sustainable business along the way – we will need to think carefully about what comes next once digitalisation is universal.

Supercharged tech growth has skyrocketed user-generated content

Of course, the importance of agile technology has never been the tech itself, but what people do with it. For customers we’ve seen tech innovation create new ways of talking, direct access to brands, and large changes in how we consume media and make purchases.

As digital channels take on a greater share of activity than ever, one of the effects of an exponential growth in digital is an exponential growth in user-generated content (UGC).

This user-led interaction, from product reviews to marketplace listings to social interactions, fully embodies the agility that companies have spent the last decade trying to bring to their processes; because it is made by people, UGC is rapid, diverse, and flexible by default. While it may be too soon to say that every business will become a content business, it’s clear that this will become an increasingly important part of how businesses operate. Certainly, it’s already a major driving force for sectors as diverse as marketplaces, gaming, and dating.

A UGC business must be protected to maximise opportunity

In the move towards UGC, a business’s user interaction and user experience will have consequences across the organisation – from profit margin, to brand positioning, to reputational risk, to technological infrastructure. Across all of these, there will be a need to uphold users’ trust that content is being employed responsibly, that they are being protected from malign actors, and that their input is being used for their benefit. Turning content into sustainable growth, then, is a task that needs to be addressed across the company, not confined to any one business function.

Marketers, for instance, have benefited from digitalisation’s capacity to make the customer experience richer and more useful – but it has also introduced an element of unpredictability in user interactions. When communities are managed and shaped, marketers need to ensure that those efforts produce a public face in line with the company’s ethos and objectives.

While tech teams need to enable richer user interaction, their rapid ascent to become a core business function has left them under pressure to do everything, everywhere. Their innovation in how content is managed, therefore, needs a middle path between the unsustainable workload of in-house development and the unsustainable compromises of off-the-shelf tooling.

With the ultimate outcomes of building user trust being measured in terms of things like brand loyalty and lifetime user value, finance departments will also need to adapt to this form of customer relationship. The creation of long-term financial health needs investments and partnerships which truly understand how the relationship between businesses and customers is changing.


UGC as a vital asset for sustainable business growth

Bringing this all together will be the task needed to create sustainable growth – growth which is fit for and competitive in the emerging context of UGC, sensitive to the increasing caution that users will have around trusting businesses, and transparent about the organisation’s ethos, purpose, and direction. It will require not just investing in technology, but understanding how tech is leading us to a more interactive economy at every scale.

As digitalisation continues to widen and deepen, we may find UGC, and the trust it requires, becoming just as vital an asset for businesses as product stock or intellectual property. To prepare for that future and maximise their business growth from their UGC, businesses need to start thinking and planning today.

By Petter Nylander

CEO Besedo Global Services


Are my customers getting scammed? Is my site being used for illegal sales? Are offensive language and disturbing images flourishing without my knowledge?

These have been the primary concerns since the dawn of classifieds, frequently discussed at events, in forums, in the media, and among peers. And it makes sense: a site where illegal drugs are being traded, where fraudsters are thriving, and where visitors get offended will most likely develop a pretty bad reputation and face a slow but steady death. Right?

Sure, we are not denying this, but times are changing, and just avoiding negative buzz is no longer enough to build a thriving site. In 2021, marketplaces need to actively build their brands, and we are seeing increasing amounts of money being thrown at brand-strengthening campaigns and commercials. But are companies living up to their brand promises? How does content quality affect the perception of a site, and how is quality perceived by users? This is something we wanted to find out. And this is why we initiated a survey.

What Content Affects Users?

We asked 1,000 people in the UK and US how they perceive listings with different types of content and what actions they would take based on the ads. And this is what we found:


Is It Relevant?

Consumers clearly don’t have any patience with irrelevant or missing information. When shown an ad lacking relevant content, nearly 80% said that they would not return to the site where it was posted, nor would they recommend it to others. “I would never buy a phone from an ad or site like this” was a common response to the ad.


Inappropriate Content

People also found racism and nudity disturbing, but interestingly, this content wasn’t perceived to be as bad as the ad missing relevant information.


Scam Detection Failure

Scams should continue to be a priority for marketplaces, as almost half of the respondents failed to identify an ad that was an obvious scam containing a too-low-to-be-true price, a Western Union payment, and/or typos. We would continue to maintain that it’s your responsibility to protect your visitors, and that clearly remains true, as many of them are otherwise sitting ducks that scammers will prey on.
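Since almost half of respondents missed an obvious scam, automated checks for those same signals can catch what users don’t. The sketch below is purely illustrative: the baseline prices, keywords, and thresholds are assumptions, not figures from the survey.

```python
# Hypothetical listing scam check based on the signals named above:
# a too-low-to-be-true price, wire-transfer payment requests, and sloppy text.

TYPICAL_PRICE = {"phone": 600}          # illustrative baseline prices by category
WIRE_KEYWORDS = ("western union", "moneygram", "wire transfer")

def scam_signals(listing: dict) -> list[str]:
    """Return a list of human-readable scam signals found in a listing."""
    signals = []
    baseline = TYPICAL_PRICE.get(listing.get("category", ""))
    if baseline and listing.get("price", baseline) < 0.3 * baseline:
        signals.append("price far below market value")
    text = listing.get("description", "").lower()
    if any(keyword in text for keyword in WIRE_KEYWORDS):
        signals.append("requests untraceable payment")
    if text.count("!!") > 2:
        signals.append("suspicious formatting")
    return signals

print(scam_signals({"category": "phone", "price": 99,
                    "description": "Brand new!! Pay via Western Union!!"}))
```

Any listing that trips one or more signals would typically be held for manual review rather than removed automatically.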


Counterfeit

Counterfeits are another area where users struggle. Fewer than 50% spotted the fake phone, even though it was clearly stated in the ad.


Lack of Trust

We could spot a general lack of trust. The listing that was genuinely good was also the best-perceived one, but even here many people expressed concerns. This only points to the importance of brand building. A strong brand will likely generate higher trust, so take care of your reputation and keep a clear and consistent focus on building confidence in your site.

European Skepticism

Talking about trust… Brits appear to be more skeptical than Americans across the board, as they regularly chose not to take the desired actions on the ads shown to them. The reason? Lack of trust. So if you need to prioritize moderation efforts across different sites, this might give you some clues on where to start in order to break into the market rather than falling flat on your face.
