The term ‘sharing economy’ is famously difficult to define. For some, it refers to any digital platform that connects people more directly than traditional business models. For others, a business only truly belongs to the sharing economy if it enables people to make money from things they would buy and own anyway.

What all forms of the sharing economy share, though, is a reliance on trust. Whether you are hailing a ride, staying in someone’s spare room, borrowing a lawn mower, or paying someone to do a small one-off job, you’re entering into a transaction that starts with a decision to trust a stranger.

The difficulty of encouraging that decision is exacerbated by the fact that, from the user’s perspective, getting this wrong is potentially a high-stakes issue: while sometimes it might mean merely getting a dissatisfying product, interactions like borrowing a car or renting a room can pose serious risks to health and wellbeing.

Content’s double-edged sword

The question for platforms, then, is what kinds of structures and tools best encourage both positive outcomes and – almost as importantly – a sense of trust amongst the userbase.

Alongside approaches like strict rules on what can be listed and physical checks of users’ offerings, many sharing economy platforms turn to user-generated content (UGC) for this purpose. User reviews, photos of what is on offer, communication options, and even selfies can all help to humanize the platform, validate that users are real people, and generate a sense of trustworthiness.

At the same time, however, allowing UGC can open the door to specific risks. Low-quality images, for example, can worry people and erode trust, while giving users more control over how listings are presented creates greater potential for scams, fraud, and fake profiles. A permissive approach to content can also lead to users conducting business off-platform, side-stepping both safety and monetization systems.

This is why there is such a variety of approaches to UGC in the sharing economy. Where some platforms, like Airbnb, encourage users to share as much about themselves and their property as possible, others, like Uber, allow only a small selfie and the ability to rate riders and drivers out of five stars. In between these open and defensive approaches, there are any number of combinations of content and sharing permissions a business might choose – but what delivers the best outcome?

Learn how to moderate without censoring

Why moderating content without censoring users demands consistent, transparent policies.


Using the carrot, not just the stick

Intuitively, many might assume that the platforms which feel safest will be those with the strictest rules, only allowing interaction between users when absolutely necessary and banning those who engage in damaging behavior. In recent research, a group of organizational psychologists described this as ‘harsh’ regulation, as opposed to the ‘soft’ regulation of supporting users, encouraging interaction, and influencing them to engage in positive behavior.

Perhaps surprisingly, the research found that soft regulation has a stronger positive impact than harsh regulation. The sharing economy, after all, digitalizes something humans have always done in the physical world: try to help one another in mutually beneficial ways. Just as we take our cues on how to behave in everyday life from the people around us, seeing positive engagements on platforms sets a standard for how we treat each other – and trust each other – in digital spaces. Being able to talk, share, and humanize helps people to engage, commit, and trust.

This suggests that we may need to shift how we think about managing content in order to make the most of its potential to drive long-term growth. Content moderation is seen, first and foremost, as a way of blocking unwanted content – and that’s certainly something it achieves. At the same time, though, having clear insight into and control over how, when, and where content is presented gives us a route towards lifting the best of a platform into the spotlight and giving users clear social models of how to behave. In essence, it’s an opportunity to align the individual’s experience with the best a platform has to offer.

Ultimately, creating high-trust sharing economy communities is in everyone’s best interest: users are empowered to pursue new ways of managing their daily lives, and businesses create communities where people want to stay, and promote to their friends and family, for the long term. To get there, we need to focus on tools and approaches which enable and promote positive interactions.

This is Besedo

Global, full-service leader in content moderation

We provide automated and manual moderation for online marketplaces, online dating, sharing economy, gaming, communities and social media.


If you keep your eye on content moderation as we do, you’ll be aware that the EU’s Digital Services Act (DSA) is on the road to being passed, after the European Commission submitted its proposals for legislation last December.

You’ll also know, of course, that the last year has been a tumultuous time for online content. Between governments trying to communicate accurately about the pandemic, a divisive US election cycle, and a number of protest movements moving from social media to the streets, it’s felt like a week hasn’t passed without online content – and how to moderate it – hitting the headlines.

All of which makes the DSA (though at least partly by accident) extremely well-timed. With expectations that it will overhaul the rules and responsibilities for online businesses around user-generated content, EU member states will be keen to ensure that it offers an effective response to what many are coming to see as the dangers of unmanaged online discourse, without hindering the benefits of a digitalized society that we’ve all come to rely on.

There’s a lot we still don’t know about the DSA. As it is reviewed and debated by the European Council and the European Parliament, changes might be made to everything from its definition of illegal content to the breadth of companies that are affected by each of its various new obligations. It’s absolutely clear, though, that businesses will be affected by the DSA – and not only the ‘Very Large Platforms’ like Google and Facebook which are expected to be most heavily targeted.

Many people looking at the DSA will instinctively think back to the last time the EU made significant new laws around online business: the GDPR. The impact of that regulation is still growing, with larger fines being levied year-on-year, but it’s perhaps more important that the GDPR has shifted internet users’ sense of what companies can or should do with data. Likewise, the DSA will alter the terrain for all online businesses, and many industries will have to do some big thinking over the coming years as the act moves towards being agreed upon.

Content moderation, of course, is our expertise here at Besedo, and making improvements to how content is managed will be a big part of how businesses adapt to the DSA. That’s why we decided to help get this conversation started by finding out how businesses are currently thinking about it. Surveying UK-based businesses with operations in the EU across the retail, IT, and media sectors, we wanted to take the temperature of firms that will be at the forefront of the upcoming changes.

We found that, while the act is clearly on everyone’s radar, there is a lot of progress to be made if businesses are to get fully prepared. Nearly two-thirds of our respondents, for example, knew that the DSA is a wide-ranging set of rules which applies beyond social media or big tech. However, a similar proportion stated that they understand what will be defined as ‘illegal content’ under the act – despite the fact that that definition is yet to be finalized.

Encouragingly, we also found that 88% of respondents are confident that they will be ready for the DSA when it comes into force. For most, that will mean changing their approach to moderation: 92% told us that achieving compliance will involve upgrading their moderation systems, their processes, or both.

As the DSA is discussed, debated, and decided, we’ll continue to look at numbers like these and invite companies together to talk about how we can all make the internet a safer, fairer place for all its users. If you’d like to get involved or want insight on what’s coming down the road, our new research report, ‘Are you ready for the Digital Services Act?’, is the perfect place to start.


Why is the quality of content important?

Every marketplace professional knows that quality content is the key to success. Without good user generated content, you won’t drive traffic to your platform.

The quality of your content helps build trust (a vital element to marketplace success) and directly impacts conversions, engagement and retention. In fact, visitors are more than twice as likely to return to your site if they encounter high quality content on their first visit.

Now, what constitutes good content can differ wildly depending on your audience, the services your platform facilitates, and the overall competition in your specific niche.

However, we’ve collected some general rules of thumb that rarely fail.

What’s a good online marketplace profile?

A good online marketplace profile helps build trust in the owner, makes it easy to identify and re-identify them and provides the reader with enough relevant information for them to decide whether it makes sense to engage or not.

Polina Yelkina, Manager Russia & Ukraine, Community Relations at Blablacar breaks down their view on a good profile to the following:

  • Maximized useful info (bio, preferences, photo, certified phone/email, car).
  • Real information.
  • Relevant content (no gibberish).
  • No surname mentioned.
  • No contact details publicly visible.
  • No advertising/spam.
  • No offensive/insulting content in the profile.
  • Content which complies with the rules of the platform.

Expanding on this and based on our experience, it’s clear that there are some universal criteria quality profiles meet:


Good profiles contain relevant information that helps the reader understand whether the profile matches their current needs. Which information is relevant will depend on the platform and the service exchanged between the profile owner and the person viewing it.

For a profile on a car sharing platform, for instance, it might be relevant to include reviews or information such as whether the driver accepts smokers.

Spam and advertising are of course completely unacceptable and links that lead off the site are inadvisable due to the risk of fraud and platform leakage.
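Rules like these can be enforced automatically at submission time. As an illustrative sketch (the patterns and function name here are hypothetical, and real platforms use far broader, localized rules alongside machine-learned detection), a simple filter for off-platform contact details might look like this:

```python
import re

# Hypothetical patterns for illustration only; production moderation
# needs localized formats and more robust detection.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE = re.compile(r"\+?\d[\d\s().-]{7,}\d")  # loose: 9+ digit-ish characters
URL = re.compile(r"https?://\S+|www\.\S+", re.IGNORECASE)

def flag_off_platform_contact(text):
    """Return the names of the contact rules a piece of profile text violates."""
    flags = []
    if EMAIL.search(text):
        flags.append("email")
    if PHONE.search(text):
        flags.append("phone")
    if URL.search(text):
        flags.append("link")
    return flags

print(flag_off_platform_contact("Great driver! Call me on +46 70 123 45 67"))  # ['phone']
print(flag_off_platform_contact("Friendly non-smoker, loves road trips"))      # []
```

Flagged items can then be routed to manual review rather than rejected outright, which keeps false positives from frustrating genuine users.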


Profiles should be streamlined to communicate the exact message needed for the reader to understand who they are about to engage with. It’s important that it’s clear who the profile owner is, so pictures of multiple people are usually a no-go. While the profile should be informative, it also shouldn’t contain irrelevant information. Short and to the point is better than rambling and incoherent.


We live in a visual age. Regardless of whether the profile is for a classifieds site, a gamer account or a car sharing platform, having a visual representation of the user is important. In general, this should be in the form of a picture of the user but could also be an avatar for platforms that are more lenient on anonymity (like community sites for kids or gaming platforms). The image should however be recognizable, so the representation is always easy to identify as the profile owner.

We’ll get further into this in the next section, but make sure the image is clear, sharp, properly formatted and inviting.

What’s a good image for an online marketplace?

What constitutes a good image is highly dependent on context. However, the following is usually true:

Blurry images communicate low quality and don’t instill trust in the person looking at them. Make sure users upload sharp images where it’s easy to see the subject.
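Blur is one of the few quality signals that can be estimated automatically. A minimal sketch using the common variance-of-Laplacian heuristic, assuming 2D grayscale pixel data (any threshold would be an assumption, tuned per platform and image size):

```python
def laplacian_variance(pixels):
    """Sharpness heuristic: variance of a 3x3 Laplacian over a grayscale image.
    Blurry images have few strong edges, so the variance comes out low."""
    h, w = len(pixels), len(pixels[0])
    responses = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Discrete Laplacian: sum of the 4 neighbours minus 4x the centre pixel.
            lap = (pixels[y - 1][x] + pixels[y + 1][x] +
                   pixels[y][x - 1] + pixels[y][x + 1] - 4 * pixels[y][x])
            responses.append(lap)
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)

flat = [[128] * 5 for _ in range(5)]           # featureless patch, no edges
edged = [[0, 0, 255, 0, 0] for _ in range(5)]  # a hard vertical edge

print(laplacian_variance(flat))   # 0.0
print(laplacian_variance(edged))  # much larger
```

A cutoff on this score (flagging low-variance uploads for manual review rather than auto-rejecting them) makes for a cheap first pass before a human moderator ever sees the image.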

Images that are formatted wrong, or are too dark or too bright, signal low value and should generally be avoided.

Photos that look genuine are preferable so try to get users to stay away from the stock photo look. There is such a thing as too perfect. In general, if your users are trying to sell an item it’s good to have multiple images. A couple that show the item, some that show details of the item in closeup and one or two that show the item in action (preferably in a natural, yet non-home-video-style fashion).

For profile pictures, a good image ensures that the face is easily distinguishable. After all, you’re trying to simulate a face to face encounter. Sunglasses, shots from behind and other images that remove information should be discouraged.

Depending on your site policies you may also want to remove inappropriate images where people have gone in the opposite direction and reveal too much.

In a qualitative survey on user search experience, 100% of participants highlighted good images as a reason to click on an advert, underlining that great pictures drive engagement and conversions.

“The pictures are showing that it actually belongs to a real person. It looks less blurry, and like the seller really cares to sell the item” – Louise

Product Support Lead Ana Castro and Anti-Fraud Specialist Jelena Moncilli reveal that their internal guidelines echo this call for genuine and clear images:

“A good quality image is an authentic image captured by the user, directly related to the listing, without contacts or web links, and where the product or service offered can be clearly visualized, without being blurred.”

The same message of clarity is repeated by Polina Yelkina, Manager Russia & Ukraine, Community Relations at Blablacar.

“In pictures of cars we want the following to be true:

  • The whole car should be visible.
  • The car from the picture should be easily recognized by passengers.
  • No people in the picture.
  • No contact details/advertising visible in the picture or on the car itself.”

What’s a good marketplace listing?

What is the purpose of listing text? It’s to provide the potential buyer or user with enough information for them to decide whether to engage with the seller.

If it takes too much effort on the buyer’s side to decode the item or service specifications, it’s likely the buyer gives up and moves on to the next listing or, worse, to the next platform.

As such, the goal with listing text should be to clearly and concisely describe the service or item for sale. Misinformation and spammy, scammy, or inappropriate content should obviously be disallowed and immediately removed.

Detailed descriptions aren’t just good for the browsing experience; they also help with SEO, driving traffic to the listing and the platform as a whole.

Listing text is also very important in building long-term trust in the platform brand. Since the buyer can’t physically examine the item, they have to rely on the description. If this turns out to be dishonest or lacking, trust will quickly deteriorate. As such, an honest and detailed description is the hallmark of good listing text.

Ana Castro and Jelena Moncilli back the need for detailed descriptions, stating:

“A good quality listing is a listing in compliance with our insertion rules, offering only one service or product, placed in the right category at a fair price, with a clear title with good key words which help to find the service or product offered by going through a list of results. Description should contain all necessary information like the size, the color etc. A good listing is completed by a picture illustrating the item or the service which is sold. All necessary contact information should be correctly filled in the specific field.
For the three points above, all of them must respect our insertion rules as well as the Swiss law.”

The perils of low-quality content

We’ve listed a lot of benefits to high quality content, but what issues does low-quality content cause?

Imagine walking by your local clothing store and seeing dirty, tattered dresses tossed around in the window. You’d be unlikely to enter the store and even less likely to purchase from there.

The page a visitor lands on when first visiting your site is your storefront. It should be inviting and beautiful. You want the user generated content to show your platform in the best light possible. Blurry images, lacking descriptions and user profiles that don’t give insight into the owner will deter visitors. Guide users to create and submit better content and take care of the bad pieces that inevitably slip through. That way you’ll ensure a more successful platform with more engagement, traffic and higher retention.


Manual moderation isn’t really rocket science. Take away the bad stuff and leave the good. Right?

For very basic moderation, sure. But the truth is that as soon as you reach any significant volume on the site, moderation becomes a lot more complex. And to handle that complexity professionally, you will need a very well-organized team of moderators. To build that, you will need to know the best practices for running an efficient moderation team.

We have dedicated other articles to KPIs and moderation methods, but once you have decided on your goals and methods, you need to look at your delivery framework to ensure that your team has the optimal conditions to consistently and efficiently carry out quality work.

Communicate, Communicate, Communicate!

Set up a communication procedure to make sure new policies and site decisions are communicated to your moderation agents as fast as possible. When agents are in doubt, they will lose precious time debating or speculating about what to do with new issues, and mistakes will be made.

Put in place a process for communicating new policies and ensure that someone is in charge of collecting, answering and communicating answers to questions from the moderation team.

Also make sure someone in your organization is on top of current events that might pose new challenges. We have covered such an example in a previous blog post The Summer of Big Events. And Big Ticket Scams.

Setting up a structure for a communication flow between the moderation team and the rest of your organization is key to enabling your moderators to work at their top speed and for them to feel equipped to do their job properly.

When we, at Besedo, provide a client with a team of moderators, we start out by setting up a clear framework for how questions from the agents on one side, and answers and new directions from the client on the other, are communicated.

Usually the structure will consist of the following:

  • A quarterly meeting where any adjustments to current guidelines or new focuses for the client business strategy are discussed. This allows the moderation team to give input on where and how their efforts are best applied to accommodate the client’s long-term vision.
  • A monthly meeting where our client informs about upcoming policy changes and new features.
  • A weekly meeting where current issues and challenges are raised by both parties. This is a great place to discuss any errors that have been made and request clarification on any policies that seem to cause a lot of grey areas.
  • Daily contact to touch base. This is usually not in the form of a meeting, but rather an ongoing conversation between a point of contact on the client side and one on Besedo’s side. This allows the moderation team to quickly receive answers and communicate new challenges that may pop up during the day. The key to success here is to have ONE clear point of contact on each side, through which all communication can be channeled.

After each meeting, the communication will be emailed out and also cascaded to the team through team leaders or veteran agents, ensuring that all moderators regardless of shift are made aware.

Moderation superstars are like athletes. They need ongoing training to stay on top.

One hour spent training can save many more in the long term. It’s easy to think that moderation is straightforward, but it takes time, knowledge and experience to spot bad content in seconds when reviewing hundreds of ads an hour.

While it can be tempting to throw people headfirst into the moderation job (especially if you are short on staff), it almost always pays to spend time equipping your moderators for the job. You will have fewer mistakes, better speed and a happier new colleague.

When we employ new moderators at Besedo, we put them through a very in-depth onboarding program. Not only do we introduce them to the client’s rules and platform, we also spend time teaching them about content moderation in general: the trends, the tools of the trade and the reasons behind moderating.

But we don’t stop there. We have ongoing training, workshops and knowledge sharing forums. The industry is not static, laws change and scams are always evolving. This means our moderation team needs to constantly improve and keep up with current trends. And so should yours!

You want ROI on moderation? You have to work for it!

When we speak to owners of sites and apps that deal with user generated content, one of the concerns we face is that they have not seen the expected ROI from moderation in the past.

Digging into their previous experience we often see that while they have had moderation efforts in place, they have not dedicated time and resources to really structure and maintain it.

We cannot stress enough how important it is to set up processes, training and retraining for your moderation team. Without them, your moderators will be working in the dark, catching some things but leaving too much else untouched. An approach like this can almost be more harmful than no moderation at all, as your customers won’t know what to expect or whether site policies are backed up.

If you want to see proper ROI from moderation, it will require a lot of work, resources and attention. Sit down, plan, structure, implement and keep iterating. It isn’t going to happen by itself!


Users’ expectations are at an all-time high, and losing your customers to your competition is out of the question. Platforms need to do everything they can to ensure a seamless and safe experience on their site. That’s why content moderation has never been more vital to gaining and retaining customers.

Browsing the web for content moderation statistics? Look no further. We have compiled a list of 65 statistics about the landscape of content moderation, from user experience to customer service, along with stats relating to your specific industry.

  • User Experience
  • Reviews
  • Dating
  • Sharing economy
  • Online marketplaces
  • Customer service
  • Scams
  • Online harassment

User Experience

Online shoppers have no time to waste. They are expecting to find what they are looking for instantly. Competing for users’ attention is a tricky business. Only one negative experience can send your users away, seeking a better place to shop from.

Categorization, smooth navigation, good searchability, and no duplicates play a key role in creating a seamless experience to win customers and keep them coming back.

  • One in three consumers says they will walk away from a brand they love after just one bad experience. – PWC
  • First impressions are 94% design-related. – Research Gate
  • A study shows that 60% of consumers start their research on a search engine before visiting a specific website. – Adweek
  • When they visited a mobile-friendly site, 74% of people say they’re more likely to return to that site in the future. – Think with Google
  • 42% of shoppers abandon an online purchase because of limited product information. – Biz report
  • Around 87% of online shoppers abandon their carts during the checkout if the process is too long or complicated. – Splitit
  • The average cart abandonment rate was 69.5 percent in 2019. – Statista
  • 55% of website visitors spend less than 15 seconds actively reading. – FreelancingGig
  • 53% of mobile site visits leave a page that takes longer than three seconds to load. – Think with Google
  • Slow-loading sites increase the abandonment rate by 79.17%. – SaleCycle
  • 30% of shoppers say that the loading time of a website is the most important feature. – Magento Commerce report
  • 100% of the participants found irrelevant items in their search results. – Besedo user search study


Reviews

Reviews can make or break your business. With customers relying more and more on reviews to buy products or services (and even trusting fellow online reviewers as much as their friends and family), genuine user reviews are an excellent way for users to gain trust in your platform.

However, fake reviews multiply quickly online, which could erode the trust needed to convert buyers. So, how can you prevent fake reviews on your site? Setting up a reliable content moderation process is your best bet to protect your site. Find out more about tackling fake reviews here.

  • 82% of consumers have read a fake review in the last year. – Bright Local
  • 85% of consumers are willing to leave reviews. – Bright Local
  • More than 8 in 10 say user-generated content from people they don’t know influences what they buy and indicates brand quality. – Hubspot
  • 76% trust online reviews as much as recommendations from family and friends. – Bright Local
  • 88% of consumers trust online reviews as much as personal recommendations. – Bright Local
  • 73% of Millennials say that consumers care more about customer opinions than companies themselves do. – Hubspot
  • 93% of consumers say that user-generated content can help them make purchasing decisions. – Adweek
  • Around 30 percent of online consumers said they posted product feedback online, and, in Asia, consumers were nearly 50 percent more likely than average to post a review. – KPMG
  • 90% of customers are influenced by positive reviews when buying a product. – Zendesk


Dating

Heterosexual couples are more likely to meet a romantic partner online than through personal contacts and connections, according to a recent study. The dating industry is booming, yet it still faces countless challenges: rude messages, inappropriate images, and in the worst of cases, sexual harassment.

You need to handle these threats with an effective content moderation strategy to succeed in the business. The following online dating stats will give you a better idea of the challenges to be faced head-on.

  • 41% of participants in a study said they are afraid of dating scams. – MeasuringU
  • 51% of women say safety was a concern when meeting with people resulting from a match on a dating app. – Once survey
  • 42% of women reported being “contacted by someone through an online dating site or app in a way that made them feel harassed or uncomfortable.” – Pew Research
  • Nearly 40% of people surveyed have “swipe fatigue” because these apps are superficial, geared towards casual relationships, and don’t have adequate safety features. – Once survey
  • 34% of participants have been contacted in a way that made them uncomfortable. – Statista
  • Around the world, 600 million singles have access to the Internet. 400 million of these have never used a dating app. – TechCrunch
  • 82% say they would feel more secure using a dating app with public ratings. – Once survey

Sharing economy

The sharing economy is forging its way into all types of industries, from the gig economy to transportation or housing, and no sector will be left untouched in the future. Yet, the sharing industry comes with its own set of challenges, privacy and safety being the two leading causes of concern.

  • The number of sharing economy users is set to reach 86.5 million in the U.S. in 2021. – Statista
  • Consumers’ fears also addressed issues of value and quality, articulated as concerns about “sharing not being worth the effort” (12 percent), “goods/services being of poor quality” (12 percent) and “other factors” (9 percent). – Campbell Mithun and Carbonview
  • 67% of consumers’ fears about the sharing economy are related to trust. – Campbell Mithun and Carbonview
  • In a 2017 survey, 36 percent of the respondents said privacy concerns were a reason not to use Airbnb, and 13 percent stated safety concerns as a reason not to use the app. – Statista
  • Over the past five years, the Car Sharing Providers industry has grown by 8.9% to reach revenue of $1bn in 2019. – Ibis World
  • McKinsey estimates that in the U.S. and Europe alone, 162 million people, or 20-30 percent of the workforce, are providers of sharing platforms. – McKinsey

Online marketplaces

With conscious consumerism on the rise, online marketplaces are becoming trendier by the day. But in this competitive environment, online marketplaces need to set themselves apart. Optimizing your platform’s experience is necessary if you wish to stay in the race.

  • Marketplaces generate significant revenue, two-thirds generating more than $50 million annually and one-third generating $100 million or more. – Altimeter
  • 21x, that’s how much faster resale has grown over retail over the past three years. – ThredUp
  • Lack of sellers who meet their needs (53%) is the single biggest reason buyers leave marketplaces. – Altimeter
  • By 2021, 53.9% of all U.S. retail eCommerce is expected to be generated through mobile commerce. – Statista
  • 56% of consumers would buy more from off-price retailers if they offered secondhand apparel, and one-third said the same was true of fast-fashion retailers. – ThredUp
  • The digital classifieds ad revenue in the U.S. grew by 8.3 percent in 2018 compared to the previous year. – Statista
  • Nearly half (48 percent) of online shoppers head straight to a large eCommerce marketplace. – Big-commerce and Square study
  • Only 20% would buy the product in an ad with a poor description, and 73% were unlikely to return to the site, compared to 56% and 37% for a good listing. – Besedo user search study

Customer service

Customer service has become progressively more important for customers in the past few years. Have a look at the following statistics to help you improve your customer service and become their preferred platform.

  • According to a survey by Customer Care Measurement and Consulting and the Carey School of Business at Arizona State University, over three-quarters of complaining consumers were less than satisfied with their experience with the given company’s customer service department. – Customer Care Measurement and Consulting
  • 63% of customers are happy to be served by a chatbot if there is an option to escalate the conversation to a human. – Forrester
  • 73% say that valuing their time is the most important thing a company can do to provide them with good online customer service. – Forrester
  • 95% of customers tell others about a bad experience, and 87% share good experiences. – Zendesk
  • 90% of customers rate an “immediate” response as important or very important when they have a customer service question. 60% of customers define “immediate” as 10 minutes or less. – HubSpot Research
  • Investing in new customers is between 5 and 25 times more expensive than retaining existing ones. – Invesp


Scams

Scams can be found everywhere, and their growing sophistication can make them hard to detect or root out. Scams hurt businesses and erode user trust. Check out our blog post on the 5 common online marketplace scams to see how you can fight back.

  • Fifty-five percent of businesses surveyed reported increased online fraud-related losses over the past 12 months. – Experian
  • 75% of users who saw a scam on a site would not return. – Besedo user search study
  • Only half of companies believe that they have a high level of understanding about how fraud affects their business. – Experian
  • 27% abandoned a transaction due to a lack of visible security. – Experian
  • Nearly 1.5 million phishing sites are created each month. – Dashlane blog
  • The median individual loss to a romance scam reported in 2018 was $2,600, about seven times higher than the median loss across all other fraud types. – FTC
  • 74% of consumers see security as the most important element of their online experience, followed by convenience. – Experian
  • 43.19% of the first-page results for “Gucci bag” were counterfeits. – Besedo mystery shopping study
  • Nearly 60 percent of consumers worldwide have experienced online fraud at some point. – Experian
  • In Australia, the share of scam reports involving financial loss reached 16.9% in December 2019, up from 7.8% in December 2018. – Scamwatch
  • In Australia, investment scams cause the largest losses, followed by dating and romance scams. – Scamwatch
  • In our study, 50% of participants encountered something they thought was a scam. – Besedo user search study

Online harassment

Online harassment is a plague with dire consequences. Get to know the following stats to better your content moderation and fight back on online harassment.

  • 1 in 2 young people reported having experienced online bullying before the age of 25. – Universities U.K.
  • 41% of American adults have experienced online harassment, and 66% of adults have witnessed at least one harassing behavior online. – Pew Research
  • 62% of U.K. youth participants reported they had received nasty private messages via social media, of which 42% were hate-based comments on race, sexuality, or gender identity. – Universities U.K.
  • More than a tenth of Americans have experienced mental or emotional stress (and 8% have experienced problems with friends and family) due to online harassment. – Pew Research
  • 81 percent of women and 43 percent of men had experienced some form of sexual harassment during their lifetime. – GfK

This is Besedo

Global, full-service leader in content moderation

We provide automated and manual moderation for online marketplaces, online dating, sharing economy, gaming, communities and social media.


The fraud landscape is continually shifting, with fraudsters’ methods growing more sophisticated by the day. With cybercrime profits rising to $1.5 trillion last year, scammers have every incentive to keep refining their deceptive and lucrative methods as the Internet itself evolves.
Let’s take a closer look at some of the most sophisticated scams currently surfacing on the Internet.

Fake website scams

Scammers are getting more elaborate in their techniques, and they take more precautions than ever before to avoid being exposed and caught by the authorities.

A current trend making the headlines involves scammers setting up fake websites to replicate reputable online stores or marketplaces while carefully executing their scams.

A news story that broke in Spain earlier this year described what is considered the largest cyber-scam in the country’s history. A 23-year-old, along with his accomplices, managed more than 30 fraudulent websites generating an estimated income of €300,000 per month.

The fraudsters sold mainly consumer electronics by directing potential buyers to their own fraudulent pages which looked like replicas of reputable websites using companies’ logos and names to dupe users.

To avoid suspicion from the authorities, these sites would only stay live for a few days while being intensely advertised on search engines and social media before disappearing into thin air. The aim was to attract as many potential customers as possible in the shortest amount of time.

Victims were asked to send bank transfers for their orders, but of course, the goods would never arrive. The money received would be deposited into bank accounts set up by mules recruited to transfer money illegally on behalf of others. The scammers then only needed to withdraw the money at cashpoints.

Another technique used by fraudsters plays on the fact that most of us have been warned to check for secure websites when browsing the web. The presence of “https” and the lock icon are supposed to indicate that the connection is secure and that users can share their data safely. However, this can easily be replicated: by displaying those security symbols, scammers make it easier to fool potential victims.

Fake website scams, which are also popping up on sharing economy websites such as Airbnb, show that as people have become warier of online scams, fraudsters have taken their methods to the next level of sophistication.

Romance mule scams

Online dating sites have been one of romance scammers’ preferred hunting grounds to find their next victim.

The FBI’s online crime division recently flagged an alarmingly growing trend in the romance scam department. Online dating scammers have expanded their romance scam strategies by adding a new dark twist to it and taking advantage of their victims in a whole new way. This dating scam turns love-seekers into unwittingly recruited criminals, also known as ‘money mules’.

Fraudsters, under the guise of a relationship, are using online dating sites to recruit their next victim and dupe them into laundering stolen money.

Here’s how it works. The scammers, pretending to be American or European citizens living or working abroad, spend several months grooming their victims, establishing a seemingly trustworthy relationship. They then have the victim act as a financial middleperson in a variety of fraudulent activities. There are many ways victims can unwittingly help fraudsters. For instance, they may print and post packages or letters that often contain counterfeit checks, or pick up money at Western Union and forward it elsewhere. They may even open a bank account under the pretense of sending and receiving payments directly; unknown to them, these accounts are then used to aid criminal activities.

A study estimates that 30% of romance scam victims in 2018 were used as money mules.

Indeed, romance scam victims are the perfect mules for these sorts of frauds. Blindly in love, victims do not doubt their lovers’ intentions and, unlike professional fraudsters, will likely never steal the money or ask for anything in exchange for their services.

This romance mule scam makes it complicated for local authorities to follow the money, and even if the victims get caught, they do not know the actual identity and location of the scammers. Tracking the fraudsters’ movements becomes an intricate or impossible task.

Romance mule victims often do not know they are a part of these fraud schemes or criminal activities until it’s too late. Despite the victims not losing any money, the FBI warns that they might face legal or financial consequences for participating in such a scheme.

The Telegraph reported a story about a 61-year-old British man who was tricked by fraudsters into funding terrorists in exactly this way. He believed he was corresponding with a wealthy businesswoman he had met on an online dating site. The woman claimed she needed her money in the UK so she could pay her European employees. The man would then send checks on her behalf, inadvertently and unknowingly becoming part of a fraud scheme.

Learn how to moderate without censoring

Why moderating content without censoring users demands consistent, transparent policies.


Impersonation scams

Phishing scams are very well known; however, a variation, the ‘impersonation scam’, has boomed over the past few years, impacting both users’ online safety and companies’ reputations.

These phishing emails might seem completely genuine as they look nearly identical to those of reputable and reliable websites, including Google or Apple, and often end up bypassing spam filters. Like many fraud schemes, impersonation scams are based on trust.

Last year, the popular streaming service Netflix was the target of an email phishing scam in Ireland, sent to thousands of subscribers on the pretext of a maintenance and verification issue. In a similar way to the fake website scams mentioned earlier, these malicious emails, looking like perfect replicas of Netflix emails, featured a link to update credit card information or login credentials. However, the link did not direct users to the Netflix website but to a site managed by scammers.

Scams often target popular brands with a large user-base to lure subscribers into giving out personal information. Not only is this dangerous for the customer, but it also threatens brands’ reputation.

Staying ahead of the scammers

With fraudsters persistently refining their scamming techniques, companies must always be one step ahead in the prevention of these scams to protect their users and keep them safe on their site.

Marketplace leakage, which we wrote about in a previous article, refers to losing users from your platform when they take their conversations beyond your site’s security measures. This technique of luring users away from your site appears in the scam examples mentioned in this article, and it dramatically increases the risk of your users being scammed. To ensure your users’ safety, online marketplaces need to keep interactions on their platforms by preventing the display of personal details and hidden URLs.

Marketplace leakage can be curbed by improving your content moderation efforts. Accurate automated moderation lets you instantly spot and block content aimed at drawing users away from your platform.
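As a rough illustration of what an automated leakage filter looks for, here is a minimal sketch in Python. The patterns and the `detect_leakage` helper are hypothetical simplifications; production filters handle far more obfuscation tricks (spelled-out digits, “dot com” spellings, unicode lookalikes).

```python
import re

# Hypothetical patterns for a minimal leakage filter.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,}\d")
URL_RE = re.compile(r"(?:https?://|www\.)\S+", re.IGNORECASE)

def detect_leakage(text: str) -> list:
    """Return which kinds of off-platform contact details a listing contains."""
    findings = []
    if EMAIL_RE.search(text):
        findings.append("email")
    if PHONE_RE.search(text):
        findings.append("phone")
    if URL_RE.search(text):
        findings.append("url")
    return findings

print(detect_leakage("Great flat! Call +46 70 123 45 67 or mail me@example.com"))
# → ['email', 'phone']
```

A moderation pipeline would typically route any listing with a non-empty result to editing (masking the contact details) or manual review rather than rejecting it outright.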

To learn more about setting up efficient automated moderation filters to protect your users, check out our Filter Creation Masterclass.


Over the years of helping online marketplaces moderate their real estate sections, we’ve gathered a long list of best practices for content moderation rules and guidelines specific to this industry.

If you work with a real estate site or platform, you know you need to keep a close eye on the user-generated content your users publish. If you don’t, your site can quickly turn from a helpful tool for smooth interactions between renters and landlords into a scam-infested spam forum.

Today we share some actionable guidelines on what to direct your content moderation efforts at. You can use them to increase the quality of listings, protect your users and hopefully increase the conversion rates of your real estate site.

Edit or reject ads with contact details

While the whole industry should slowly be moving towards sites that monetize and provide value through value-added services, most sites are not there yet.

Unless your site already has a unique and strong offering of additional value, it likely still relies on sellers who use it as a lead generator. If that’s the case, you should remove all mentions of phone numbers, names, email addresses, and physical addresses to prevent platform leakage.

Unconventional payment methods

Unless it’s your USP, all ads that mention unconventional payment methods should be removed. This includes swap and exchange offers, such as a car or cellphone in exchange for accommodation.

The same rule applies to unorthodox payment methods such as Bitcoin and other cryptocurrencies.

We advise against allowing such reimbursement options as the value comparison can be hard to get right and there’s a risk one of the two parties will end up dissatisfied. You don’t want those negative feelings associated with your platform and you definitely do not want to get involved in disputes concerning unconventional payment methods.

Finally, there’s the additional risk that some of the commodities offered in exchange have been acquired illegally, and you don’t want your platform involved in what could essentially be seen as fencing activity.

Remove listings with multiple items

Whether you monetize your real estate platform by charging listing fees or not, you should remove listings that contain more than one item.

If you charge a listing fee, sellers who post multiple items in one go are circumventing the fee and cutting into your revenue. If you don’t, listings with many different offerings still hurt: they make it harder for users to find relevant results when searching for accommodation, degrading the user experience.

Links or references to competitor sites

It goes without saying that it’s content moderation best practice to remove or edit any mention of competitors immediately, particularly if it includes outbound links.

Bouncing visitors are bad enough; it’s even worse if content on your site is actively referring them to rivals in your space.

Pay attention to listing prices

We have an entire article focused on things to look out for to prevent scams on real estate sites, but one of the things we haven’t discussed in depth is listing prices.

Most areas have a fairly predictable baseline price range for similarly sized accommodations. For cities that are especially prone to being targeted by scammers, it’s a good idea to familiarize yourself with this range. Scammers often offer apartments for rent at too-good-to-be-true prices; if you know the realistic range, it’s easier to catch them before they reach your customers.
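One simple way to operationalize this is to compare each listing against the median price for its area and flag outliers. The sketch below is an illustrative assumption, not a recommendation: the function name, the `(area, price)` input shape, and the 50%-of-median threshold are all hypothetical.

```python
from statistics import median

def flag_suspicious_prices(listings, threshold=0.5):
    """Flag listings priced below `threshold` * the median for their area.

    `listings` is a list of (area, price) pairs. The 50% default threshold
    is a placeholder; a real system would tune it per market.
    """
    by_area = {}
    for area, price in listings:
        by_area.setdefault(area, []).append(price)
    medians = {area: median(prices) for area, prices in by_area.items()}
    return [(area, price) for area, price in listings
            if price < threshold * medians[area]]

listings = [("soho", 2400), ("soho", 2600), ("soho", 2500), ("soho", 900)]
print(flag_suspicious_prices(listings))  # → [('soho', 900)]
```

Flagged listings would go to manual review rather than automatic rejection, since legitimate bargains do exist.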

We are currently working on building out a pricing database for some of the bigger cities in the world. If this project sounds interesting, be sure to subscribe to our blog and get informed when we have more information available.

Take a hard stance against discrimination

You’re probably already aware of the multiple lawsuits Airbnb has faced over instances of discrimination that occurred through its platform.

To avoid the same legal trouble and the ensuing PR storm, and to give all your users the best possible experience on your site, we advise taking a hard stance against discriminatory listings. Reject any listing that singles out people of a specific race, religion, sexual orientation, and so on.

Prohibit rentals that enable prostitution

For anyone who has followed the case in which a classifieds site’s owners were indicted for earning over $500 million in prostitution-related revenue from the site, it should be second nature to have processes in place for finding and removing any content that facilitates prostitution.

Apart from the moral implications, facilitating prostitution is illegal in many countries and could land your company (and you) in both legal and PR trouble.

If your platform isn’t offering hotel rooms or vacation homes, it’s often good and safe practice to reject room-for-the-night listings, as they often advertise accommodations used for indecent interactions.

Remove duplicate items

Users will sometimes submit the same listing multiple times. Their reasons vary, but the most common is to try to bump up their ranking on your site. Users circumventing your rules is never good: it usually hurts the user experience, violates legal commitments, or, as with duplicates, can get your site penalized in Google rankings.

The best course of action is to remove duplicates before they get published; this way you ensure the quality of your site and spare other users a messy search experience.
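A common first line of defense catches exact resubmissions by fingerprinting normalized listing text. The sketch below is a simplified illustration (`listing_fingerprint` and `is_duplicate` are hypothetical helpers); near-duplicates with minor rewording need fuzzier techniques such as shingling or MinHash.

```python
import hashlib
import re

def listing_fingerprint(title: str, body: str) -> str:
    """Fingerprint a listing by lowercasing and collapsing whitespace,
    then hashing the result. Catches trivial resubmissions only."""
    normalized = re.sub(r"\s+", " ", (title + " " + body).lower()).strip()
    return hashlib.sha256(normalized.encode()).hexdigest()

seen = set()

def is_duplicate(title: str, body: str) -> bool:
    fp = listing_fingerprint(title, body)
    if fp in seen:
        return True
    seen.add(fp)
    return False

print(is_duplicate("2BR flat", "Sunny flat, central."))   # → False
print(is_duplicate("2BR  Flat", "Sunny flat,  central.")) # → True
```

Running the check at submission time, before publication, is what keeps the duplicate out of search results and out of Google’s index.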

Re-categorize listings placed in the wrong category

Vacation homes in the permanent-residency category or for-sale houses in the for-rent section all contribute to irrelevant search results and a negative user experience.

It’s important to remove or re-categorize misplaced listings quickly to ensure a good experience for users.

Reject listings with poor descriptions

Depending on the category, the required details for the property for rent or sale may differ. What’s always true, though, is that the description needs to be accurate and informative. Details like location, price, and minimum rental period should be a given. But sellers are often in a hurry and don’t want to spend much time on the admin work that goes into writing a good listing that converts.

Make sure you educate your users to write proper descriptions and titles for their listings; otherwise both bounce rate and conversion rate may suffer. In a study we conducted on user behavior, we found that irrelevant content leads 73% of users to never return to the site.

Following the guidelines outlined above will help you eliminate fraudulent listings and improve both content quality and the overall user experience of your site.



4 most common refusal reasons on real estate sites in 2019

Now that we have gone through the best practices for content moderation let’s also quickly disclose where we find the biggest content moderation challenges lie for real estate sites.

These were the top 4 rejection reasons for the real estate sites we help.

  • Multiple products
  • Duplicates
  • Wrong category
  • Poor description

As you can see, most rejected items affect either the user experience (and as a result conversion rate) or your revenue more directly, as with multiple products, where users circumvent the listing fee.

Curious about which other listings we reject for real estate sites? Check out our presentation, 9 listings to remove from your real estate site to optimize for monetization, from Property Portal Watch Bangkok 2018, where we go into more detail.

Want to know more about best practices for content moderation or expert advice on how to set up and optimize the content moderation on your site? Get in touch and hear how we can help out.


Without content moderation, your online marketplace or classifieds site could end up a little like the Wild West: lawless. While user-generated content (UGC) is, for the most part, contributed by genuine customers, a system that’s open to the public is open to abuse, which is why content moderation is a must.

However, despite its importance, there are a number of massive misconceptions about content moderation.

Let’s bust some of those myths right now.

Myth #1: Content moderation is censorship

Some people see content moderation as a form of censorship; a way for organizations to exercise control and block comments, posts, reviews, and other types of undesirable content.

The truth is, content moderation is about providing a healthy and safe environment where users can comfortably engage with others and upload their own products, posts or comments.

Flags and report buttons allow users to notify site owners when something’s out of place; human moderators ensure that all users comply with community standards; and well-trained AI moderation solutions use filters to screen for inappropriate words, phrases, and images. Together, these help weed out the trolls, bullies, and spammers, keeping your online space a great place to be.

In short, content moderation isn’t censorship. It’s a tool to improve the user experience, ensure you adhere to local and global laws, and let your users interact through your services without fear of getting scammed.

Myth #2: Content moderation gives no ROI

Hmm… where to start? The notion that content moderation is simply another time-consuming and resource-heavy task that provides little ROI is a common one.

Marketers need to remember that all their hard work on SEO, branding, and marketing is for nothing if a damaging image gets uploaded, someone is bullied, or links to spam or NSFW material get posted.

Speaking of SEO, content moderation helps here too. By removing duplicates, re-categorizing misplaced items, and rejecting low-quality ads, you can improve your Google ranking.

Content moderation also helps build and maintain trust. A qualitative study we conducted showed that 75% of users who saw a scam on a site would not return, and in a quantitative study we found that 50% of participants encountered something they thought was a scam.

That means you could potentially see around 38% user churn from scams alone: 75% of the 50% who encounter one.
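The 38% churn estimate is simply the product of the two study figures, rounded to the nearest percent; a quick sanity check:

```python
scam_encounter_rate = 0.50  # quantitative study: users who encountered a suspected scam
leave_after_scam = 0.75     # qualitative study: users who saw a scam and would not return

# Expected churn = probability of encountering a scam * probability of leaving after one
expected_churn = scam_encounter_rate * leave_after_scam
print(f"{expected_churn:.0%}")  # → 38%
```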

And finally, a great content moderation strategy not only protects your brand value, it also helps increase engagement. Besedo customers have seen significant improvements in bounce rate, in some cases a drop from 35% to 14%, increasing the chance of conversion and return visits.

So, while you may not be able to directly translate money into money out when it comes to content moderation, you will definitely feel it on your bottom line if you neglect to set up a strong strategy for content review and management.

Myth #3: AI is not accurate enough

People just don’t seem to trust robots (thanks Terminator!). And while Skynet is still a way off (waaay off, we hope!) current forms of AI are tailor-made for content moderation – and are actually in many cases as accurate as human moderators.

Case in point: our own experience working with Meetic, an online dating platform with 6.5 million monthly users. We were able to automate 74.8% of their content from day one with 99% accuracy, and over time we have raised the automation level to 90% without negatively impacting accuracy.

You can read more about this in our Meetic case study >>

Myth #4: Building your own content moderation solution is cheaper than using a SaaS solution

Building your own content moderation platform is a huge task, especially when you want to use machine learning to support your human team. Setting your developers to work on building something will cost a lot of money and take a lot of time, time that could be spent creating unique features that give you a competitive edge.

Segment CEO Peter Reinhardt states: “You should only be building your own tool if you’ve tried lots of others on the market and find that none of them will solve your problem. Only build if you’re left without a choice. And still take care: I’d estimate that 50% of the startups that I see build tools can’t maintain them.”

So if you haven’t yet tested our content moderation tool Implio, put your in-house development on hold. Implio is a proven SaaS solution with built-in AI, customizable filters, and an efficient manual interface, developed specifically for online marketplaces, sharing economy sites, and dating apps. And it’s free for up to 10,000 items a month.

Most importantly, it’s religiously maintained, and new features are added regularly, without impacting your product roadmap.

Consider those myths busted!

There’s a lot of misinformation regarding content moderation. But fail to get it right and spam will harm your SEO, trolls will harass your customers, and irrelevant content will ruin your site’s user experience.

Want to get it right first time? Talk to our solutions designer team here at Besedo.




Getting your users’ content live fast is becoming ever more important, and ensuring quality content is the competitive edge you need to win the market.

At Besedo we are constantly working on improving our all-in-one moderation tool Implio for added efficiency without losing moderation accuracy.

To that end, we are now releasing an updated manual moderation interface. We have tested it both internally, with some of our most experienced moderators, and externally, with highly successful clients, to make sure it is chock-full of usability improvements. Our new manual moderation UI will allow your moderators to make good decisions faster!

macbook for sale example page

Let’s have a look at the new features.

Innocent until Proven Guilty

It is now possible to make “Approve” the default action, which means moderators only have to click when they want to refuse an item. You will soon be able to enable and disable pre-approval within the UI, but for now you can reach out to your contact person and have it set up for you.

feedback and approval button screenshot

A Stop to Redundant Reviews

With the new interface, all revisions of an item are grouped, and only the latest version is entered for moderation. You can of course still access the older versions, but you only have to reject or approve the latest one.

content moderation progression screenshot
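Conceptually, the grouping works like this sketch, where `item_id` and `revision` are hypothetical field names (the actual Implio data model may differ):

```python
def latest_revisions(revisions):
    """Keep only the newest revision of each item for moderation.

    `revisions` is a list of dicts with assumed `item_id` and `revision`
    fields; older revisions stay accessible but are not re-queued.
    """
    latest = {}
    for rev in revisions:
        current = latest.get(rev["item_id"])
        if current is None or rev["revision"] > current["revision"]:
            latest[rev["item_id"]] = rev
    return list(latest.values())

queue = [
    {"item_id": "a1", "revision": 1, "title": "MacBook for sale"},
    {"item_id": "a1", "revision": 2, "title": "MacBook Pro for sale"},
    {"item_id": "b2", "revision": 1, "title": "Bike"},
]
print([r["title"] for r in latest_revisions(queue)])
# → ['MacBook Pro for sale', 'Bike']
```

The payoff is that an item edited three times while waiting in the queue costs one moderation decision instead of three.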

Direct Edits

You can now edit items as you review them, directly on the page. Category and item type can be changed through a drop-down menu making re-categorization a lot faster.

Are We There Yet?

At the bottom of the new UI we have included a progress bar. This will help moderators keep track of how far they are with their current batch of items and also shows them which decisions they have taken so far.

moderation progress bar example

And Even More Efficiency Gaining Features

On top of the above improvements we have also:

  • Organized content to follow the structure defined in the API
  • Added the option to change the number of items displayed on a page (for now you will need to reach out to your contact person to change this, but soon a UI within Implio will let you do it yourself)

With all of these new features, moderators should be able to work significantly faster, and with the more logical information structure and increased data visibility, our new UI should prove a great quality-of-life improvement in their day-to-day work.

If your needs lean mainly toward image moderation and you feel the new UI does not cater to this, fear not: Implio has multiple UI layout templates to choose from. Have a word with your point of contact if you wish to swap layouts.


Congratulations on getting your marketplace up and running: your audience is growing, and content is now steadily flowing in. To keep growing, you know you need a certain level of quality on your site or app, and you have read everything there is to read about moderation methods.

However, you are already starting to feel that the very basic moderation you have in place will probably not scale well, and you need to make a decision about how to move forward, one that will affect the future of your company: develop in-house or outsource? We have seen many examples of this dilemma, and here are some things we have learned.

Effective moderation is a mix of manual work and automation operating together seamlessly. Needless to say, achieving this requires both a team and a tool, but how to obtain them is up for discussion. There are a few different combinations and alternatives to consider, each with pros and cons. We will go through the most common approaches and share our opinions, based on the thoughts UGC-dependent companies around the world share with us every day.

The Moderation SWAT Team

Content moderation is a lot more complex than you may think at first, and it takes time to build up the knowledge needed to see through sophisticated scams, code-named illegal content, and advanced counterfeits. But that’s not enough: cyber-criminals are extremely good at spotting trends and exploiting them in more innovative ways than you can imagine. This is why you need a “moderation SWAT team” of analysts and industry experts skilled at analyzing patterns to detect new ways of getting past your eagle-eyed moderators and automation filters. For example, Carla the car expert will instantly recognize the too-good-to-be-true deals, and Tim the ticket pro will know if the UEFA Cup final tickets look suspicious. Without these experts, you will have a very slow team that needs to look up every single car model before making a decision.

Your Professional Moderation Partner

A professional moderation outsourcing partner can solve this problem immediately, as they will have the experience and structures to quickly set up a team that operates according to your site policies. A good moderation partner will also give you insights into current fraud patterns and data on things like what type of content is trending on your particular site. An expert partner can also scale up quickly, as they are used to utilizing resources efficiently and, if needed, can hire and train new staff faster than you probably would yourself.

Extra Hands and Eyes

A general BPO without extensive experience and expertise in UGC moderation will give you some extra hands and eyes, often at a very low price. This can be the short-term solution you need if your team is getting overwhelmed, but you will soon need a better way to guarantee a safe and clean marketplace, as trust and quality are vital for continuous growth, which you can read more about in Why Moderation is Vital for Marketplace and Dating Startups.



Doing It Yourself

If you choose to build up your own moderation team, this article on How we Build Moderation Teams to Deliver Success will give you some valuable tips on how to get started. Another option is to engage a moderation partner for your first year or two, to get a flying start and gather the data and insights needed to build up an internal team at a later stage. This will also allow you to focus on growth in a critical period of your site’s or app’s life.

Collaborating with Your Partner

If you decide to hire a moderation partner, choose it with care and make sure they understand the importance of communication. You will need a dedicated contact person with whom you communicate regularly. Set up processes not only for reporting KPIs but also for continuous learning in both directions. Your partner needs to be extremely responsive to what is going on in your company and flexible enough to make quick changes. But they should also report back on the patterns they see and educate you on how to adjust your policies accordingly. Last but not least are the cultural aspects: the partner needs to understand the specific needs and cultural traits of your particular market, and for them to become a natural extension of your team, time zone, language, and culture all have to be taken into consideration. If you cannot communicate in your time zone, in your language, and in the manner you are used to, the collaboration risks feeling forced and unnatural, which is a sure way to fail.

The Automation Tool

How hard can it be to hack together a profanity filter that catches bad language and illegal content? Not that hard. How hard is it to maintain and keep up to date in an ever-changing digital environment? A little tougher, and much more time-consuming. An automation tool is only as good as its filters, and, just as with manual moderation, you always need to be one step ahead. Moving into the future, you will also need more advanced technology, like machine learning.
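To make that maintenance problem concrete, here is a minimal word-list filter of the kind described above, sketched in Python. The blocked terms are placeholders invented for the sketch, not a real moderation list.

```python
import re

# Hypothetical blocked terms; a real deployment would use a curated,
# per-market list that is updated continuously.
BLOCKED = ["scamword", "badword"]

# Word-boundary matching (\b) avoids flagging banned substrings that
# happen to appear inside innocent words.
pattern = re.compile(
    r"\b(" + "|".join(map(re.escape, BLOCKED)) + r")\b",
    re.IGNORECASE,
)

def is_flagged(text: str) -> bool:
    """Return True if the text contains a blocked term."""
    return bool(pattern.search(text))

print(is_flagged("This listing is a BADWORD deal"))  # True
print(is_flagged("Perfectly clean listing"))         # False
# Trivial obfuscation already slips through:
print(is_flagged("b4dword bargain"))                 # False
```

The last line shows exactly why such filters decay: the moment users swap a letter for a digit, the rule misses, and someone has to notice the pattern and update the list.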


A ready-made tool will allow you to focus on your top priorities: growing your audience, building a platform, or fine-tuning conversion and retention. More importantly, you get a shortcut to the latest technology, which is developing very rapidly right now. Machine learning is the word on everyone’s lips in 2016, and things are happening extremely fast in this area. Even if your smartest developers spend significant time on research and development, it will be a true challenge to compete with teams whose only priority is to fine-tune algorithms and learn from the bigger user base that comes with serving several customers.
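As a rough illustration of why the machine-learning route is more than a filter upgrade, here is a toy Naive Bayes text classifier in plain Python. It learns which words signal bad content from labeled examples instead of relying on a hand-maintained word list. The training samples are entirely made up for this sketch; a real system would learn from large volumes of moderated content.

```python
import math
from collections import Counter

# Invented training data: "spam" marks unwanted content, "ham" is fine.
TRAIN = [
    ("cheap watches wire money now", "spam"),
    ("send payment outside the platform", "spam"),
    ("lovely apartment near the city center", "ham"),
    ("friendly driver with a clean car", "ham"),
]

def train(samples):
    """Count word occurrences per label."""
    counts = {"spam": Counter(), "ham": Counter()}
    totals = Counter()
    for text, label in samples:
        words = text.lower().split()
        counts[label].update(words)
        totals[label] += len(words)
    vocab = {w for c in counts.values() for w in c}
    return counts, totals, vocab

def score(text, label, counts, totals, vocab):
    """Laplace-smoothed log-likelihood of the text under the label."""
    return sum(
        math.log((counts[label][w] + 1) / (totals[label] + len(vocab)))
        for w in text.lower().split()
    )

counts, totals, vocab = train(TRAIN)

def classify(text):
    return max(("spam", "ham"),
               key=lambda l: score(text, l, counts, totals, vocab))

print(classify("wire money for cheap watches"))  # spam
print(classify("lovely clean apartment"))        # ham
```

Note that the classifier flags "wire money for cheap watches" even though no single banned word was ever written down; the signal comes from the data. That is also the catch: the quality of the model depends on the quantity and quality of labeled content, which is where vendors serving many customers have an edge.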


Building your own tool allows you to fine-tune it even more closely to your particular needs, and it may be easier to integrate with other systems you have built yourself. Just keep in mind that the more you integrate, the harder it will be to maintain. Most of our customers who have had their own moderation tools found that they kept deprioritizing maintenance because of more urgent fires and growth opportunities closer to their core business, and all of a sudden they found themselves with an outdated automation solution that simply doesn’t do the job. So if you choose this option, make sure you give your developers the time they need for maintenance, and that someone keeps an eye on the filters and fine-tunes them according to user patterns, policies, and trends.

Where Should You Put Your Focus?

The challenges of succeeding in a saturated market are plentiful, and you should put your focus into what you do best: developing that kick-ass marketplace and getting buyers and sellers to try it out. It might not surprise you that we promote an external product and team, but we have seen too many marketplaces get stuck with old technology, and too many startups slowed down while trying to handle overwhelming amounts of content that turned out to be more complex than originally anticipated. Whichever way you choose, our main piece of advice is to plan for the long run: you will save a lot of resources if you make the right decision on moderation strategy from the start.

This is Besedo

Global, full-service leader in content moderation

We provide automated and manual moderation for online marketplaces, online dating, sharing economy, gaming, communities and social media.
