A good website loads fast, boasts a beautiful design, is search engine friendly, and offers a brilliant user experience. In fact, a website with a poor design can make users feel like your brand is low quality or untrustworthy.

*record scratch*

But if you peel off that top layer of design elements – what is a user experience, really? 

Nielsen Norman Group probably says it best: “user experience encompasses all aspects of the end-user’s interaction with the company, its services, and its products.”

All your design efforts will come up short if your website or app is not supporting your users’ goals. To most business owners, these goals are so fundamental that they risk being forgotten when you’re focused on every other aspect of your business. And with user-generated content platforms such as dating apps, marketplaces, and video streaming services, you’re essentially handing over a massive chunk of your user experience to your community.

Consider this: you are interested in buying a bike, so you hop on your favorite marketplace app and search for bikes. The search results show hundreds of postings near you. Great! The only catch: first, you must wade through four pages of inappropriate images, scams, and harassment.

Two apps showing content with and without content moderation – moderated content is a big part of creating a great user experience.

To quote Donald Miller, “a caveman should be able to glance at it and immediately grunt back what you offer.” This is referred to as the Grunt Test; it’s a real thing.

Many marketing reports point to poor design decisions as a key reason customers leave your site. That’s a given. One report says that 88% of online consumers are unlikely to return to a website after a poor experience.


Those numbers are most likely closer to 99% if we remove content moderation from the user experience equation.

The User Experience Honeycomb

At the core of UX is ensuring that users find value in what you provide. Peter Morville presents this magnificently through his User Experience Honeycomb.

The user experience honeycomb as presented by semanticstudios.com

One of the 7 facets of his honeycomb is “credible.” Morville notes that for there to be a meaningful and valuable user experience, information must be:

Credible: Users must trust and believe what you tell them.

So what if your information and content are user-generated? Then you aren’t the one providing the credibility.

User Experience in user-generated content

We would argue that Credible (or Trust) serves best as the base for your user experience when it comes to user-generated content apps and websites. After all, the user experience is more than just something intuitive to use.

When User Experience Fails Despite Good Design

Few things will hurt your users’ confidence in your app faster than harassment or irrelevant content. In-game chats and, to some extent, dating apps are breeding grounds for trolling. Flame wars can create an unfriendly online environment, making other users feel compelled to respond to abusers or leave your platform entirely. 

Harassment still happens, and no one is immune, despite your platform’s fantastic design.

The emphasis on trust and credibility cannot be overstated when your platform relies on user-generated content.

Online reviews and comments on social media are the new word-of-mouth advertising. With a growing pool of information available online to more consumers, this form of content can either become an effective branding tool or the undoing of a brand.

Trust in user reviews, images, and videos

If handing over a big part of your customers’ user experience to largely unknown users feels like a scary ordeal, you’re in for a rollercoaster when it comes to reviews.

Fake online reviews are more prevalent than you might think and could lead you to purchase a product you would not have otherwise. Fake customer reviews are usually glowing, even over-the-top, reading more like infomercials than reviews. One MIT study found that fake reviews typically contained more exclamation points than genuine reviews. Fake reviewers believe that by adding these marks, they’ll emphasize the emotions behind their feedback.
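
To make that concrete, here is a minimal sketch of how such a stylistic signal could be scored. It assumes nothing beyond the exclamation-point finding above; the threshold and helper names are illustrative, and a real detector would combine many signals like this.

```python
# Minimal sketch: flag reviews whose exclamation density is suspiciously high,
# one stylistic signal (per the MIT finding above) among many a real system
# would combine. The threshold is illustrative, not taken from the study.

def exclamation_density(text: str) -> float:
    """Exclamation marks per 100 characters of review text."""
    if not text:
        return 0.0
    return 100 * text.count("!") / len(text)

def looks_overexcited(text: str, threshold: float = 2.0) -> bool:
    return exclamation_density(text) >= threshold

reviews = [
    "Great bike, arrived on time. Small scratch on the frame.",
    "AMAZING!!! Best purchase EVER!!! Buy it NOW!!!",
]
for review in reviews:
    print(looks_overexcited(review), "-", review[:40])
```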

Conversely, it is not uncommon for sellers to purchase fake, one-star reviews to flood competitors’ pages.

According to research, 91% of people read online reviews regularly or occasionally, and 84% trust online reviews as much as personal recommendations.

Building trust into the user journey

Your online business aims to attract, retain, and engage users; creating an experience that turns them off is definitely not a smart step in that direction. Keep in mind that users should have an accessible and user-friendly experience when going on this journey with you. We even published a webinar about building trust into a user journey, if you’re interested.

Find out which online marketplaces are the biggest in various countries, categories, and much more in our definitive list of marketplaces worldwide.

There are obvious advantages to selling products on the best-rated marketplaces. With so many visitors, Amazon offers significant benefits when you choose to sell online. One of the biggest reasons to trade on online marketplaces is the established audience. If you are a seller who is just getting started online, marketplaces can be a fantastic way to earn some income and establish your brand while working on driving traffic to a new e-commerce site.

Just how big are the biggest marketplaces in the world?

What is an online marketplace?

First of all, we need to understand what defines an online marketplace. It boils down to two key features:

  1. Sellers and buyers are trading through the same website (or app).
  2. The buyer can complete their purchase on the website (or app).

This excludes price comparison sites like PriceRunner or Google Shopping. They are essentially advertising channels rather than online marketplaces.

The buyers are mainly consumers, not businesses. The marketplace sells physical products, not just downloads, streaming, or other services.

We start with approximately 200 marketplaces with more than one million visits per month. Then we look at the most popular product categories and the break-out stats for a few countries.

In short, we are looking at actual online marketplaces where you can sell physical products to consumers.
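
As a rough illustration, those inclusion criteria can be expressed as a simple filter predicate. This is a sketch under the article’s working definition; the `Site` record and the sample data are assumptions made for the example.

```python
# Illustrative sketch: the inclusion criteria above expressed as a filter
# predicate. The criteria mirror the article's definition; the Site record
# and the sample data are assumptions made for the example.
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    on_site_trading: bool    # 1. sellers and buyers trade on the same site/app
    on_site_checkout: bool   # 2. the purchase completes on the site/app
    consumer_buyers: bool    # buyers are mainly consumers, not businesses
    physical_products: bool  # physical goods, not just downloads or streaming
    monthly_visits: int

def is_marketplace(site: Site, min_visits: int = 1_000_000) -> bool:
    """True if the site meets the article's working definition."""
    return (site.on_site_trading
            and site.on_site_checkout
            and site.consumer_buyers
            and site.physical_products
            and site.monthly_visits >= min_visits)

sites = [
    Site("ExampleMarket", True, True, True, True, 4_000_000),
    Site("PriceComparer", True, False, True, True, 9_000_000),  # an ad channel
]
print([s.name for s in sites if is_marketplace(s)])  # ['ExampleMarket']
```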

The world’s top online marketplaces

| # | Name | Category | Visits/month |
|---|------|----------|--------------|
| 1 | Amazon | General | 4.81B |
| 2 | eBay | General | 1.18B |
| 3 | Rakuten | General | 542.7M |
| 4 | Mercado Libre | General | 511.8M |
| 5 | Zalando | Fashion | 420.0M |
| 6 | Shopee | General | 415.7M |
| 7 | AliExpress | General | 390.9M |
| 8 | Walmart | General | 387.3M |
| 9 | Etsy | Arts, Crafts & Gifts | 373.2M |
| 10 | Taobao | General | 277.9M |
| 11 | Wildberries | General | 232.7M |
| 12 | Trendyol | General | 222.1M |
| 13 | Allegro | General | 189.5M |
| 14 | Flipkart | General | 186.9M |
| 15 | Pinduoduo | General | 183.8M |
| 16 | Target | General | 165.9M |
| 17 | JD | General | 164.7M |
| 18 | Ozon | General | 164.5M |
| 19 | Tokopedia | General | 158.8M |
| 20 | Mercari | General | 132.6M |
| 21 | Olx | General | 102.1M |
| 22 | Tmall | General | 113M |
| 23 | Wayfair | Homewares | 98.98M |
| 24 | Americanas | General | 97.67M |
| 25 | Alibaba | General | 90.76M |
Estimated monthly visits for April 2022, from SimilarWeb. Traffic to different domains (e.g., amazon.com, amazon.co.uk, amazon.de, etc.) is combined.

All of these marketplaces sell general goods except Zalando (fashion), Etsy (arts, crafts & gifts), and Wayfair (homewares). Most are pure marketplaces without retail operations of their own, with Amazon the most prominent exception.

“Wayfair and Amazon account for 63% of furniture sales online”

Only Amazon and eBay break the one billion visits mark. However, Rakuten and Mercado Libre aren’t too far behind, with over 500 million per month. And Zalando is hot on their heels with 420 million visits per month.

The best-known retailer to run its own marketplace is Amazon, with more than 50% of sales now made through Marketplace sellers. In addition, the biggest players in online furniture, Wayfair and Amazon, account for 63% of furniture sales online.

US-based e-commerce giant Amazon, the top-ranked e-commerce company worldwide by market capitalization, is the third-largest online furniture market. Amazon.com is the most visited e-commerce marketplace, with an average of over 2.3 billion visits per month.

If you merge the visits to the biggest Amazon domains, the visits are almost 5 billion per month.

It is not surprising to see Amazon and eBay in the top three; eBay receives 1.2 billion visits per month. Add up Amazon, eBay, and Etsy and you are looking at more than 6 billion monthly visits: an enormous amount of real estate on the internet.

How did the pandemic change our online adoption rate?

According to a study by McKinsey, COVID-19 has pushed companies over the technology tipping point and transformed business forever. Digital adoption has taken a quantum leap at both the organizational and industry levels. Of course, this affects our consumer behavior on marketplaces all over the world.

Graph illustrating the leap in digitization.

Marketplaces in Europe

The most popular marketplace in Europe is Amazon, which gets 1.6B visits per month. At the same time, eBay receives less than half that traffic, at 634M visits.

eBay, another US-based, general-purpose global marketplace, received 255 million monthly visits in the United Kingdom.

With these impressive numbers, eBay is the only marketplace that comes anywhere near matching Amazon’s numbers in the UK for visitors. 

Amazon is also the largest marketplace in the US, with over 300 million customers, 100 million of whom are Prime members. It is the best-known online marketplace too, thanks to its strong delivery and fulfillment capabilities and its seamless shopping experience.

Walmart offers various categories of products that draw large volumes of visitors every month, making it one of the leading online marketplaces in 2022.

Since the rise of the titans such as Amazon, eBay, and Alibaba, brands have been racing to thrive, compete, or get beat in the online markets. Companies like Walmart added marketplaces to their existing retailers’ websites, giving shoppers more choice in products while creating price competition among sellers. Of these major online markets, three are in China, and two are based in the U.S., the two biggest drivers of e-commerce sales growth.

The same is true for sales from online retailers, which are also expected to increase significantly over the next few years.

There are also niche online markets such as Bonanza, Fruugo, and Hollar, fashion-focused markets like Zalando and Fullbeauty, and deal-focused markets such as Tophatter and Tanga – the list goes on.

European consumers are using the greatest number of different marketplaces – 63 have more than one million visits per month, generating more than 3.6 billion visits in total.

Let’s have a look in-depth at some categories.

Fashion online marketplaces

| # | Name | Country/Region | Visits/month |
|---|------|----------------|--------------|
| 1 | Zalando | Europe | 420M |
| 2 | Shein | Global | 148.3M |
| 3 | ASOS | Global | 64.64M |
| 4 | Myntra | India | 53.46M |
| 5 | Zozo | Japan | 48.95M |
| 6 | Ajio | India | 31.55M |
| 7 | StockX | Global | 30.51M |
| 8 | Vinted | France | 29.81M |
| 9 | DSW | USA | 24.97M |
| 10 | Farfetch | Global | 23.06M |

Clothing and fashion is one of the most popular online marketplace niches. Popularized by many influencers, the fashion and clothing industry has really found its market on apps like TikTok and Instagram.

Fashion marketplaces are spread out worldwide, with Europe, India, and Japan represented in the top five spots.

Electronics online marketplaces

| # | Name | Country/Region | Visits/month |
|---|------|----------------|--------------|
| 1 | Bestbuy.com | USA, Canada | 46.46M |
| 2 | Gearbest.com | USA, Canada | 52.33M |
| 3 | Offerup.com | USA | 20.15M |
| 4 | Newegg.com | USA, Canada | 13.25M |
| 5 | Bhphotovideo.com | Global | 12.56M |
| 6 | G2A.com | USA, Canada | 10.18M |
| 7 | Digitec.ch | Switzerland | 9.34M |
| 8 | Shutterfly.com | USA, Canada | 6.65M |
| 9 | CDON | Europe | 5.78M |
| 10 | Game.co.uk | UK | 2.12M |

A surprising inclusion on this list, for anyone outside of the USA, is probably Offerup in 3rd place.

Electronics are typically commodities – easily available and extremely price-sensitive.

Travel and Tourism

Travel and tourism are irrelevant when we only look at physical goods. But just for comparison, we thought we’d have a look at the top 10. Booking.com is three times bigger than the runner-up Tripadvisor. Booking is also bigger than Tripadvisor, Airbnb, Expedia, Uber, and Jalan (positions 2–6) combined.

| # | Name | Country/Region | Visits/month |
|---|------|----------------|--------------|
| 1 | Booking.com | Global | 490.5M |
| 2 | Tripadvisor.com | Global | 148.1M |
| 3 | Airbnb.com | Global | 89.20M |
| 4 | Expedia.com | Global | 88.19M |
| 5 | Uber.com | Global | 75.85M |
| 6 | Jalan.net | Japan | 52.38M |
| 7 | Hotels.com | Global | 51.17M |
| 8 | Agoda.com | India | 48.68M |
| 9 | Travelersdream.com | USA | 47.36M |
| 10 | Vrbo.com | USA | 44.89M |

“Booking is bigger than Tripadvisor, Airbnb, Expedia, Uber, and Jalan combined.”

Top online marketplaces by country and region

| # | Region | Marketplaces* | Visits/month |
|---|--------|---------------|--------------|
| 1 | North America | 55 | 4.5B |
| 2 | Europe | 69 | 4.1B |
| 3 | East Asia | 19 | 2.7B |
| 4 | Latin America | 19 | 1.5B |
| 5 | Southeast Asia | 15 | 820M |

* Includes only marketplaces with more than one million visits per month.

North American consumers generate the most traffic to online marketplaces, with a little more than 4.5 billion visits per month and more than 50 different marketplaces drawing one million or more visits each.

Europe is the runner-up in traffic but has the highest number of marketplaces with over one million monthly visits.

Third is East Asia, primarily China and Japan, with an estimated 2.7 billion visits.

Online marketplaces by country

Breaking down the top online marketplaces by country is a little trickier because reliable data isn’t available for many countries, so we’ll stick to the United States and the United Kingdom.

United States

| # | Name | Category | Visits/month |
|---|------|----------|--------------|
| 1 | Amazon | General | 1.46B |
| 2 | eBay | General | 665.5M |
| 3 | Etsy | Arts, Crafts & Gifts | 371.3M |
| 4 | Walmart | General | 363.1M |
| 5 | Target.com | General | 147.5M |
| 6 | Wayfair | Homewares | 98M |
| 7 | Poshmark | Fashion | 42.3M |
| 8 | Bestbuy | Electronics | 41.4M |
| 9 | Samsclub.com | General | 35.8M |
| 10 | Overstock | General | 23.8M |

United Kingdom

| # | Name | Category | Visits/month |
|---|------|----------|--------------|
| 1 | Amazon | General | 350.1M |
| 2 | eBay | General | 238.1M |
| 3 | Etsy | Arts, Crafts & Gifts | 33.39M |
| 4 | ASOS | Fashion | 19.6M |
| 5 | JohnLewis.com | General | 16.74M |
| 6 | Very | Fashion | 11.1M |
| 7 | Discogs | Music | 5.1M |
| 8 | ManoMano | Homewares | 4.6M |
| 9 | Depop | Fashion | 3.1M |
| 10 | Homebargains | Homewares | 2.3M |

About the data

The lists are ranked by estimated website visits, based on SimilarWeb and Statista data for April 2022. Note that traffic to different domains for the same marketplace (amazon.com, amazon.de, amazon.jp, etc.) has been combined. Gross merchandise value would arguably be a better measure of size, but that data is not available for most marketplaces.

Due to the lack of reliable traffic data from the sources, we have not included app-only marketplaces.
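
For illustration, here is a minimal sketch of that domain-combining step: per-domain visit estimates are rolled up into one total per marketplace. The figures are rough, partly invented numbers, and the grouping rule is an assumption; a real pipeline would use a curated domain-to-marketplace mapping over SimilarWeb exports.

```python
# Minimal sketch of the domain-combining step described above: per-domain
# visit estimates are rolled up to one total per marketplace. The figures
# here are illustrative, not real SimilarWeb data.
from collections import defaultdict

domain_visits = {
    "amazon.com": 2_300_000_000,
    "amazon.co.uk": 350_000_000,
    "amazon.de": 500_000_000,
    "ebay.com": 700_000_000,
}

def marketplace_of(domain: str) -> str:
    # Naive grouping by the leftmost label; a real pipeline would use a
    # curated domain-to-marketplace mapping instead.
    return domain.split(".")[0]

totals: dict[str, int] = defaultdict(int)
for domain, visits in domain_visits.items():
    totals[marketplace_of(domain)] += visits

print(dict(totals))  # {'amazon': 3150000000, 'ebay': 700000000}
```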

One of the surprising and amazing things about the internet, especially when we’re talking about things involving user-generated content, is how many ideas become well-established parts of society before we even fully understand them.

Online dating is a perfect example. While the first big dating site was launched more than 25 years ago, Tinder – which broke new ground for dating apps – only just turned 10. And yet, it’s already perhaps the main way of meeting a partner, with a Stanford study in 2019 finding that 39% of heterosexual couples met online.

Despite its massive popularity, though, we’re not necessarily wise to everything that dating apps have to throw at us or aware of all the ways they’re changing society or us. Unusual and potentially dangerous trends are emerging from online dating all the time, and never fail to claim headlines in the media: Stylist, for example, recently described voicefishing as “the freaky new dating trend we’re trying to get our heads around.”

The Challenge For Apps

On the one hand, then, there’s frequent discussion of the dangers that online dating can pose, but, on the other hand, people clearly keep coming back anyway to enjoy the benefits of digital matchmaking. Dating apps clearly have a job to do to make sure that people are empowered to be flirty, but not dirty; daring, but not dangerous.

That work all comes down to steering user interactions in the right direction, and that’s not always easy. The difficulty is illustrated by a study that recorded how dating app users speak about their experiences. Direct quotes from study participants show how dating apps – like many online venues – develop their own language, codes, and norms.

If you don’t know what some of the phrases dating app users trade with each other mean, well, don’t worry: that’s exactly the point! We know that dating app users successfully find partners and have fun doing it – but while they’re doing so, they also have to navigate a fast-changing environment of language and interaction.

The moderation response

What is a challenge for users, here, is just as much of a challenge for content moderation strategies: both human moderators and automated tools need to constantly learn, adapt, and evolve in order to keep users safe. At the same time, though, the freedom to be playful and inventive in how you speak and interact with others is an important part of how dating apps work for people. While the medium might be digital, it’s still (hopefully) a real person on the other side of the screen – and, just as with bumping into someone on the street or meeting them in a bar, the element of surprise and fun is essential.

There’s a fine line to tread, then, when moderating dating apps. We need to react quickly to new threats, but intervene gently; we need to be strong in safeguarding users, but permissive about how they act; and we need to listen and learn attentively about how people interact. Needless to say, it’s a job that takes real expertise.

What do you think? Do dating apps get it right when protecting and helping their users? How can we respond to the ways language is evolving online?


If you want to talk about it, reach out today.

Axel Banér

Sales Director – EMEA

In parts one and two of this blog post series about the evolution of language, we talked about how moderating user-generated content (UGC) echoes the long history of shaping how people communicate, and how the rapid evolution of language online is now making that job harder.

In short: there’s nothing new about setting rules for acceptable speech, but we have to get faster about how we do it.

However, it’s also worth thinking about how online communication doesn’t just build on how offline communication works but offers something genuinely new and different. The fact that email was created to be a digital equivalent of postal mail, for example, is right there in the name – but today, email offers much more than the post ever could, from uniquely personalized content to embedded video.

Across the internet, there’s a wealth of communication options, ranging from adding simple emoji to broadcasting yourself live to millions of viewers, which don’t have a direct offline equivalent. In a way, of course, pointing this out is stating the obvious; those otherwise impossible options are, to a large extent, precisely why the internet is so powerful and popular.

Risk-reward?

And yet, from a business perspective, it would be easy to look at this UGC and see something quite similar to cybersecurity, where attackers are often locked in a kind of arms race with security professionals, each trying to identify weaknesses first and develop more robust tactics than the other. A wide variety of communication options also means a wide range of potential ways to get around the policies that a platform might want to impose – whether that is stopping people from conducting business through other channels or monitoring for much graver issues of abuse or hate speech.

Giving shoppers the power to post videos of products they purchase, for example, has clear benefits in building credibility. But, conversely, users can use that feature to publish irrelevant or even maliciously untrue content. Or, building reaction gifs into an online dating messaging platform might enrich conversations but could also be used as an avenue for guerilla marketing.

The sheer variety at play here marks a real difference from the offline reality of (mostly) speech and writing.

While these concerns are well-founded, thinking about this kind of UGC in these terms runs the risk of missing how vital it is as an engine of growth for online businesses: the perception of danger might cloud sight of the benefits.

The most successful moderation approaches are about enabling interactions as much as they are about blocking them; not an arms race, but teamwork.

New moderation for new communication

It’s becoming more widely understood that offering advice about, examples of, and benefits in return for positive behaviors on platforms is ultimately more effective than punishing negative behavior. This is something that research has shown, and it’s a method that large online platforms are increasingly turning to.

Here we might be looking at something fundamentally different from the long offline history of moderating speech, which has typically relied on limiting certain expressions and interactions.

When businesses make themselves open to users and customers communicating in richer ways, we think that the best approaches will focus on how moderation can empower users in ways that enable growth. An entirely conservative approach will only stifle the potential of audiences, customers, and users.

These new worlds of content will not be effectively moderated using tools and methods adopted to deal with purely text-based interactions. As users’ interactions become more complex, we will need human input to oversee and understand how those interactions are working.

Petter Nylander

CEO

Dating apps are once again preparing to be abuzz with activity for Valentine’s Day. Even though outlooks toward dating apps have become increasingly positive over the past few years, with platforms gaining in both popularity and users, they have, throughout their short existence, continued to attract a great deal of attention to the risks they pose to users from a personal safety perspective.

Any dating app user will be familiar with the anxiety involved with moving from digital to in-person interactions, and unfortunately, that anxiety has a legitimate source. According to the Pew Research Center, one in two online dating users in the US believes that people setting up fake accounts to scam others is very common.

The financial details back them up, too: the FTC recently highlighted that, with $1.3b in losses over the last five years, romance scams are now the biggest fraud category they track.

And people who strike up online relationships between Christmas and Valentine’s Day might be at particular risk of romance fraud. Last March, for example, the UK’s National Fraud Intelligence Bureau experienced a spike of romance fraud reports. It’s little wonder, then, that Netflix chose the start of February to release its true-crime documentary The Tinder Swindler.

With online dating apps now entirely mainstream as one of the default ways of meeting people, with over 300m active users, it is more important than ever that the businesses running them take strong steps to protect user safety. This is a moral imperative, of course, in terms of working for users’ best interests – but, as the market matures, it’s also quickly becoming a potentially existential problem for dating platforms.

Challenges faced by those looking for love

When it comes to managing a company’s online reputation, user experience and business outcomes are often one and the same thing, making moderation an important measure to consider. Disgruntled customers, for instance, often use social media to publicly criticize companies, leading to a backlash that can rapidly spiral out of control.

It’s not easy, however: online dating is, understandably, a highly sensitive and personal area. Users who might otherwise be highly cautious online are more likely to let their guard down when it comes to looking for love. Platforms have a duty of care to put a stop to fraudulent behavior and protect their users in a way that does not feel ‘intrusive’.

Effective moderation in this space demands a range of approaches. A well-moderated dating app generates a more seamless and convenient user experience which in turn reduces spam content and unhappy user feedback. Keeping users safe, creating the right brand experience, and building loyalty and growth go hand in hand.

How it works in practice

As we enter a peak season for online dating, a moderation strategy that brings users closer to the people they want to connect with, with less spam and a clearer sense of safety, will be a real competitive differentiator. Ensuring a safe and positive user experience should be at the heart of dating sites’ content moderation strategy.

AI-enabled content moderation processes are essential to catch and remove these fraudulent profiles before they target vulnerable end-users. The online dating app Meetic improved its moderation quality and speed, reaching 90% automation at 99% accuracy through an automated moderation platform.

With dating apps relying so heavily on user trust, it is essential that platforms are able to detect and remove scammers, whilst maintaining a low false-positive rate to ensure minimal impact on genuine users. Content moderation teams must also be continuously trained and updated on the ever-evolving tricks of romance scammers.

A content moderation partner can be a great way to ensure high accuracy and automated moderation that maintains a smooth customer experience. Only with a team of highly trained experts coupled with precise filters and customized AI models will online dating sites be truly efficient at keeping end-users safe.

Platforms cannot afford to make this a ‘non-issue’ – even if users do not experience it themselves, many will see others being harassed online and experience negative feelings towards the brand and platform. For platforms, everything is at stake for both their reputation and ultimately, the wellness of their users.

Martin Wahlstrand

Regional Sales Director Americas

Martin is Besedo’s Regional Sales Director Americas. While you can’t swipe right on anyone here at Besedo, Martin and his team would love to give you a demo of how content moderation can help your users be safer and have a great user experience.

Evolution of language, part two: flexing with the speed of conversation

Have you ever been overheard and misunderstood by a stranger? It’s not, thankfully, an everyday occurrence, but it is something that most of us have probably experienced at least once. Imagine you’re talking to a friend somewhere public, perhaps in a café or on public transport. Maybe the conversation turns to discussing a film you both like or joking about a recent political event. Suddenly, you realize that, a few meters away, someone has caught a few words midway through your chat, and doesn’t look at all happy about how you’re speaking.

Words don’t have to mean anything offensive in order to cause concern when taken out of their context – of being a fictional story, for instance, or an inside joke between friends. Language is always social, and seeing it from a different social vantage point can cause serious errors in interpretation.

The social question in content moderation

Being mistakenly overheard is a very small version of something which happens all the time on a much larger scale. Particularly in an age where cultural ideas spread virally, it’s not unusual for people to find it hard to keep up with what the conversations around them mean. For example, a new hit crime series on Netflix may leave someone confused, at least for a day or two, as to why they keep hearing people describing gruesome murders.

If this kind of event can temporarily wrong-foot human beings, though, it’s a much more persistent problem for content moderation systems. After all, while an office worker can ask their colleagues what is going on, content moderation generally can’t directly ask the user what they mean, even when human moderators are involved. Automated systems, meanwhile, can maintain full awareness of anything happening on-platform – but often have little scope to understand it in terms of the wider world.

In one way, this situation is unsurprising: content moderation systems have evolved to meet specific business needs, such as protecting brand reputation, maintaining revenue, and protecting user safety, and the state of the art is extremely effective at achieving this. Tools from filter lists to machine learning are very powerful when the aim is to create clear boundaries for acceptable speech.

They are less well-suited, however, to the situations that cause the greatest friction with users: when seemingly normal interactions are punished without any apparent explanation. No matter how well trained and refined a business’s content moderation is, a system focused on the platform will always have the potential to be surprised by the world changing around it.

The cultural stakes of moderation

As user-generated content takes on an ever more central position in people’s daily lives, content moderation likewise takes on ever more responsibility to behave effectively, and the potential for disagreement between businesses and userbases grows ever more severe. At the extreme end, this can lead to international media attention on a system trying to deal with content it was not specifically designed for.

To put it another way, while protecting (for example) brand reputation may once have meant strictly enforcing rules for user interaction, expectations of user content are evolving and the demands on businesses that rely on user content are becoming more subtle. Automated moderation which doesn’t keep pace with cultural shifts, therefore, is becoming less palatable.

One consequence of this is that we still rely on humans to help decide which side of the line to err on in ambiguous situations. While they can’t directly address users, human moderators nonetheless outperform their machine colleagues when it comes to making subtle distinctions. This still leaves problems, however, of establishing large enough teams to cope with the volume of content and ensuring that the manual review process engages with the right content in the first place.

In the longer term, the expertise of the content moderation community will have to be seriously applied to thinking about how to help create healthier, more human conversations – not just limiting those which are clearly negative. We think that user-generated content presents a far greater opportunity for sustainable growth than it does a risk factor for brand damage; as we consult with our customers (and speak to you through blogs like this) we’re always keen to hear more about how the next, more subtle generation of content moderation tools might best be designed.

Petter Nylander

CEO

If you mention the word ‘forum’ to someone today, their immediate thought will, almost certainly, be of an internet forum, with its members discussing anything from astrophysics to finding cheap offers in supermarkets. It’s a way of using the internet which goes back to the network’s very earliest days: even before the World Wide Web, early adopters were talking and arguing through services like USENET. And, while much online conversation has moved to social media and other spaces, it’s fair to say that forums laid the groundwork for those platforms, establishing expectations about how online interaction operates.

In short, it’s easy to talk and think about forums without being conscious of the word’s original meaning: in the Roman Empire, a forum was the public space built at the center of every town or habitation. Established principally as a place for people to buy and sell goods, the forum also became the heart of Roman social life, as a place to meet, chat, and debate. The modern idea of the Roman forum, then, is often one of political and philosophical discussion – or argument.

Moderation through the ages

Just as behavior in a Roman forum would have been kept in check by social standards and peer pressure, early online forums relied on oversight from individual users with special privileges volunteering to maintain safety and civility.

Of course, the speed and scale of online conversation soon outstripped individuals’ abilities to keep up, and more formal solutions had to be found. Professional moderation teams are now common, and meeting the challenge also necessitated moves towards automated moderation, first in the form of word filters and more recently with AI-based approaches; an always-on solution to an always-on problem.

What, though, does it mean to be always-on for online speech? These systems don’t just need to oversee speech; they also need to adapt and learn in order to keep pace with the way language (especially on the internet) is adapted and evolved. ‘Forum’, after all, is not the only word that has had its meaning changed by the digital age: one of the remarkable things about the internet is the speed at which it generates both new words and new meanings for old words.

The crux of this problem lies in the fact that, while automated digital systems tend toward categorizing the world into neat boxes, such as ‘acceptable’ and ‘unacceptable’ speech, language itself is fundamentally ambiguous.

The dictionary definition of the word ‘dead’, for instance, would make it a fairly unambiguously negative piece of speech. Posted as a reply under a joke, however, it (or a skull emoji) would actually signify an exaggerated way of saying that the user found it hilarious. This kind of ambiguity is rife, too, in online gaming, where ‘gg’, meaning ‘good game’, is used as a virtual handshake with one’s opponent. After a particularly one-sided match, however, saying ‘gg’ might actually be deeply antagonistic behavior.
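
A toy sketch makes the problem concrete: a plain filter list only sees tokens, never context. The `context` label passed in below is an assumption made for the example; real systems have to infer context, such as whether ‘dead’ is a reply to a joke or ‘gg’ follows a one-sided match, from the surrounding thread.

```python
# Toy illustration of the ambiguity problem: the same token is benign or
# hostile depending on context a plain word filter never sees. The context
# labels are assumptions for the example, not a real moderation API.

FILTER_LIST = {"dead"}  # a naive always-block list

def naive_filter(message: str) -> bool:
    """Flag if any blocked word appears, regardless of context."""
    return any(word in FILTER_LIST for word in message.lower().split())

def context_aware(message: str, context: str) -> bool:
    """Sketch of a context check: 'dead' replying to a joke means 'hilarious'."""
    if "dead" in message.lower() and context == "reply_to_joke":
        return False  # exaggerated praise, not a threat
    return naive_filter(message)

print(naive_filter("dead"))                    # True  - blocked on sight
print(context_aware("dead", "reply_to_joke"))  # False - it's praise here
```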

While these are relatively light examples, the same pattern can be found in the darkest areas of online speech, where hate groups, aware that content moderators will try to keep them out of communities, frequently change how their language is coded to disguise their intent.

The next frontier

All of this is coming in the context of a level of activity that, across the internet, amounts to billions or trillions of interactions a day. In these kinds of edge cases, the content being posted can be too new and variable for AI-powered solutions to respond to, and too voluminous for human operators to keep up with.

This is something which, as we look to a healthier future for content and content moderation, the industry as a whole will need to work on and take seriously. Historically, businesses have tended towards being over-cautious, preferring to accidentally block safe speech than accidentally allow unsafe speech. However, the tide is turning on this approach, and we think the next frontier for content moderation will be to take a consultative approach toward more subtle solutions.

As we talk to customers, we’re always keen to learn more about how their users use language, where it slips through the cracks of the systems we’re building, and what kind of insight they would need in order to manage user interactions more effectively – and if you’d like to join the conversation, we’d love to hear from you too.

In the meantime, the most effective approaches will draw on the full content moderation toolbox, applying word filters, AI oversight, and human intervention where their respective strengths are most impactful. As we do so, we are building the insight, expertise, and experience which will deliver an approach to content moderation that is fully alive to how language evolves.

Petter Nylander

CEO

So, as another year draws down and a new one dawns, it’s again that time when we reflect on what has happened – and take a moment to think about what the future holds. Living in these disrupted, dramatic times, though, it can be hard to see the wood for the trees: with so much going on, and so much uncertainty about what it all means, we might feel like it’s a tougher job than usual to take stock this year.

In the case of content moderation, for example, so many of the important recent changes have actually been impacts felt by the technology industry as a whole. Even as the world, on average, emerges from the other side of the pandemic, we’re seeing little pull-back from the spike in technology usage that it triggered. Commerce is more online than ever, work is more remote than ever, lives are more tied to platforms than ever, and social expectations about how we use technology have been set on a new path.

Looking back

All of which makes it easy to miss the fact that, even taken in isolation, content moderation has had a really interesting year. Even while they have been dealing – along with everyone else – with changing user habits, professionals in the space have had to keep one eye on how upcoming legislation will soon rework how businesses work with user-generated content.

Perhaps most prominent has been the EU’s Digital Services Act, which, when ratified, will require much more extensive reporting, complaints mechanisms, and (for the largest platforms) auditing. It’s not alone: Australia’s Online Safety Act 2021, passed this summer, created a governmental eSafety Commissioner position with oversight of online content, while the UK is currently working on an Online Safety Bill to regulate content.

It’s likely that businesses will still have a long way to go in order to prepare for these regulations: research we ran towards the start of the year found that few were prepared for – and many were unaware of – the Digital Services Act.

Even the growing attention from governments on how content moderation operates, however, might not be the biggest thing facing the industry right now.

Looking forward

That’s because businesses are looking at an even more immediate impact in terms of how users actually use their platforms. We can tell the story in a few key statistics: nearly 50% of internet users look for videos related to a product or service before visiting a store; 72% of customers would rather learn about a product or service by way of video; social video generates 1200% more shares than text and image content combined.

In other words, before thinking about changes in how content is moderated, we need to deal with the fact that what is being moderated is changing rapidly. Whether the task is automated or taken on by a human, video is significantly more difficult and time-consuming to moderate than plain text. Even outside of video, as we’ve recently discussed, AI-led content moderation needs to improve its capacity to keep up with the speed at which human communication evolves online.

If we’re going to make a prediction for the next year of content moderation trends, then, we should start where businesses should always start: by thinking about the user or customer, what they need and want, and how we can step up to meet those desires.

From that perspective, here at Besedo we think that the story of 2022 for content moderation is going to be one of rising as a strategic priority in many different kinds of business. User habits and expectations are clearly changing, but the kinds of user-generated content available (and, more importantly, the quality of that content) are still very unevenly spread across different businesses and platforms. Where one clothes shop, for example, might enable users to upload videos, another might only just have introduced text reviews. That makes content, when done well, a powerful competitive differentiator, in a way that will come to the fore as our new assumptions about how we use the internet solidify.

Historically, content moderation has often been seen as a defensive measure, protecting businesses against negative outcomes, and new legislation may well sharpen what that looks like. The real opportunity coming up, however, is to see how it can be an asset to the customer experience, ensuring that this is not just content they have the option of seeing, but the content they really want to see.

Martin Wahlstrand

Sales Director – Americas

It is no surprise that record numbers of people use the internet during the festive season. Customers turn to e-commerce for all their Christmas shopping needs, dating apps see spikes in users as people swipe right to find new people to talk to, and online gaming proves a popular option for those with extra time off during the holidays, looking to relax and take part in some online recreational fun.

It is a huge opportunity for online retailers, especially online marketplaces, where people will be looking for the best deals available – but it also presents an opportunity too good to miss for scammers and fraudsters looking to take advantage of what can only be described as a chaotic season online.

Safety of the user, not profit for the platform

For brands, this is the time of year when the hard work put into creating good experiences and strengthening reputation really pays off. No one wants to be the brand that ruins Christmas or falls short at a critical moment. In online business, the safety of the user, not the profit of the platform, should come first, with user experience at the heart of priorities to increase customer trust, acquisition, and retention.

Christmas is a high-pressure moment in the calendar; it’s also a time when good experiences are more impactful and negative ones have greater consequences. While individuals should naturally take additional care online and remain vigilant, we should also expect platforms – whether shopping, dating, or gaming – to protect users from potential harm, even more so at times of peak traffic.

It’s the most fraudulent time of the year

All marketplaces need to take precautions to prevent fraud on their platforms in the lead-up to Christmas and beyond, keeping user experience front of mind. Larger marketplaces may see a smaller percentage of fraudulent posts, but considering their much larger user base, even small percentages can lead to thousands of users becoming victims and associating a negative experience with the site. These sorts of harmful experiences risk long-term reputational damage and the potential for fraud on the platform to spiral out of control.

As passions run high, it’s not surprising that bad actors take the opportunity to engage in scams and fraud during the festive period. Last year, we decided to prove it. Our moderators investigated nearly three thousand listings of popular items on six popular UK online marketplaces to understand whether marketplaces have content moderation pinned down or whether fraudulent activity was still slipping through the net. The findings revealed that 15% of items reviewed showed signs of being fraudulent or dangerous.

This year, the risk is set to be higher than ever: a survey from an independent UK parcel carrier Yodel suggests that almost a third of people plan to do the entirety of their festive shopping online this year, more than a fourfold increase from last Christmas.

Good user experience, not just for Christmas

While this spike in usage means that brands have to be more vigilant than ever, it’s also the case that these trends are unlikely to reverse. This makes this Christmas an important learning opportunity, one that will stress-test businesses’ moderation systems at activity levels that might soon become the norm.

A positive and seamless customer experience at Christmas will not only drive sales in the short term but will also help to engage customers, building an emotional bond with the brand and, in turn, increasing customer loyalty. But it should be remembered – a positive customer experience is essential not only for the festive season but throughout the year.

With seasonal greetings

Axel Banér

Sales Director – EMEA

User-generated content, or UGC, can open businesses up to a great deal of risk. While the text, video, and audio that a user creates are, generally, tied to their profile and listed under their name, all of that content is nonetheless presented as part of the business’s identity.

Even if it’s only unconscious, interacting with unwanted content can significantly affect a brand’s reputation – and if a user is led off-platform by a piece of UGC, any negative consequences are likely to still be associated in their mind with the platform they originally came from.

In spite of this, the adoption of UGC in online business is continuing at a high pace. This is obviously the case for companies for which content and interaction are their raison d’être, such as social media networks, but it might make less obvious sense for those where this content is optional, such as online retail.

If you can trade without it, why take the risk?

The consequences of creativity

There are many good answers to this question, of course. UGC brings users’ endorsements of the platform’s value into the heart of the user experience. It creates feedback loops of interaction that can encourage people to stay on the platform. It enables businesses to develop a richer identity and culture with less up-front investment.

It could be argued, however, that these and other answers are really just examples of a larger, more fundamental point about the value of UGC. Where interacting with a traditional business means choosing options from a menu of what that business thinks a person might like, UGC capabilities give people a much more authentic sense of choice and agency.

This freedom to shape one’s own path is what leads to all other outcomes, whether positive or negative.

People, in short, love to create, and it is the emotional driver of creativity that businesses are tapping into when they allow users to set up their own storefront, create and join sub-communities, or craft an online dating profile that really feels like them.

As in the physical world, however, that freedom and agency do not come without potential issues. The freedom for users to present their authentic selves could be misused to imitate someone else. The creative leeway for sellers to brand their online shopfronts could be misused as an opportunity to lead buyers away to other platforms.

The consequences of creativity, then, are emotional fulfillment – but also a serious threat to a business’s sustainability.

Learn how to moderate without censoring

Why moderating content without censoring users demands consistent, transparent policies.


The art of moderating art

Understanding creativity as the driver of the value businesses can glean from UGC, however, has important consequences for how we might think about managing and moderating that content.

In the example of users being led off-platform, for instance, the immediate consequences might include revenue loss, as users transact outside of the platform’s channel, and reputational damage when users who are out of reach of the platform’s protections suffer losses.

A traditional view of content moderation might be to maximize the ability to identify and eliminate these interactions; a well-trained AI system can spot signs of such activity, such as disguised URLs or phone numbers, and elevate cases to a human team of moderators when the nature of the interaction is ambiguous.
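
As a rough sketch of that first step, the patterns below catch some lightly disguised URLs and phone numbers. The regexes are illustrative assumptions, not a production ruleset; a deployed system would pair patterns like these with trained models and human review for the ambiguous cases.

```python
# Minimal sketch of the kind of signal described above: spotting lightly
# disguised URLs and phone numbers that try to lead users off-platform.
# These regexes are illustrative assumptions only, not a production ruleset.
import re

DISGUISED_URL = re.compile(
    r"\b\w+\s*(?:\.|\(dot\)|\[dot\])\s*(?:com|net|org)\b", re.I
)
# Seven or more digits padded with separators, or 'o' standing in for zero.
PHONE_LIKE = re.compile(r"(?:\d[\s\-.oO]*){7,}")

def flag_off_platform(message: str) -> bool:
    return bool(DISGUISED_URL.search(message) or PHONE_LIKE.search(message))

print(flag_off_platform("msg me at mysite (dot) com"))     # True
print(flag_off_platform("call 0 7 7 o 0 9 0 0 1 2 3"))     # True
print(flag_off_platform("is this bike still available?"))  # False
```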

A mature approach might further use those tools to generate insight into how a platform is performing, and what the context of inappropriate actions tends to be. If issues are consistently being flagged around a particular product category, or in certain markets, businesses can take action such as modifying the user interface or adding targeted warning messages to make those events less likely.
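
That insight step can be as simple as tallying flags per category, as in this brief sketch (the flag records here are invented for illustration):

```python
# Sketch of the insight step described above: tally moderation flags by
# product category to see where interventions such as UI changes or
# targeted warnings would help most. The flag records are invented.
from collections import Counter

flags = [
    {"category": "bikes", "reason": "off_platform_contact"},
    {"category": "phones", "reason": "scam_pattern"},
    {"category": "bikes", "reason": "off_platform_contact"},
]

by_category = Counter(f["category"] for f in flags)
print(by_category.most_common())  # [('bikes', 2), ('phones', 1)]
```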

If, on the other hand, we see what users are doing on a platform not just as interaction, but as creativity, that might point us towards the need to use moderation in a way that maximizes their scope for self-expression. Rather than relying only on the ‘stick’ approach of punishing bad content – which will always shift the experience closer to the traditional model of having limited options from a business – we can also look to offer a ‘carrot’ approach which avoids a sense of limitation on what users can do.

This might, for instance, involve automatically promoting content that closely matches the brand’s values to the forefront of a user’s experience, giving them a clear social model of how they could or should behave on the platform.

It might respond to potentially problematic content by asking the user to reconsider their approach, rather than immediately putting it in a queue for approval by a human moderator. It might even allow people to manage what kinds of content they are comfortable seeing, giving other users greater leeway to express themselves freely.
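
One way to picture that ‘carrot’ flow is a tiered response keyed to a model’s risk score, sketched below. The score, thresholds, and action names are all assumptions made for illustration rather than any particular platform’s policy.

```python
# Hypothetical sketch of the 'carrot' flow described above: gray-zone
# content prompts the author to reconsider instead of going straight to a
# removal queue, while clearly on-brand content gets promoted. The score
# function, thresholds, and action names are assumptions for illustration.

def moderation_action(risk_score: float) -> str:
    """Map a (hypothetical) model risk score to a gentle-first response."""
    if risk_score >= 0.9:
        return "queue_for_human_review"    # clearly problematic
    if risk_score >= 0.5:
        return "ask_user_to_reconsider"    # nudge before publishing
    if risk_score <= 0.1:
        return "promote_as_model_content"  # showcase on-brand behavior
    return "publish"

for score in (0.95, 0.6, 0.3, 0.05):
    print(score, "->", moderation_action(score))
```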

Ultimately, the goal of offering UGC options is to attract and retain the users who best match a brand’s personality and values. That means allowing them to exercise their creative instincts – and content moderation tools can be just as valuable here as they are for limiting inappropriate speech.

Find out more about working with us and request a demo today.

Otis Burris


VP – Partnerships, Mergers & Acquisitions

This is Besedo

Global, full-service leader in content moderation

We provide automated and manual moderation for online marketplaces, online dating, sharing economy, gaming, communities and social media.
