In this blog post, we continue digging into the concepts presented in our model of the 3 pillars of marketplace success. Our last post focused on the content quality pillar; now we will delve into user trust and how content moderation can strengthen it.
The concept of trust is an ingrained part of human society. It is fundamental for smooth-flowing commerce and critical for online marketplaces. To quickly assess how well your site is doing regarding user trust, ask yourself the following question: “Do users trust that they’re safe when browsing your site, interacting with other users, or completing transactions through it?”
It’s important to note that what constitutes a perfectly safe user experience for one target group could easily be considered questionable or even harmful by another. Users of a marketplace for computer games might not feel unsafe due to bad language, but parents shopping for second-hand children’s clothes would probably not feel secure buying from a site full of profanities.
Here are some things you can do to raise trust in your marketplace.
Scams
If your users fall for scams listed on your site, it damages your brand. Apart from losing the victim as a user, you also risk bleeding users through word of mouth and bad PR.
But even if the scam is so blatantly obvious that no one would fall for it, it still doesn’t look good, and users will start questioning the legitimacy of all listings on your site. If one is a scam, how do they know the rest are genuine?
Make sure that:
- Scams are caught and removed in your moderation process before they go live.
- Reported scams are investigated promptly and thoroughly and removed if the complaint is valid.
Counterfeits
Allowing counterfeits on your site carries two major dangers. First, you risk getting sued for trademark infringement, as has happened to eBay multiple times. Secondly, users will most likely be disappointed when they think they’re purchasing a Louis Vuitton bag through your site and end up with a cheap imitation.
Make sure that:
- Your content moderation team knows how to spot counterfeit versions of the biggest, most well-known brands sold on your site.
- Your external (and internal) policies make it clear that counterfeits are not allowed.
Harassment or discrimination
Unfortunately, with the anonymity offered by the Internet, harassment and discrimination are frequent online. But just because it happens often doesn’t mean that users are desensitized to it. As such, users finding harassment like racism or discrimination based on gender identity, lawful sexual activity, or sexual orientation on your site are likely to get turned off from further usage.
Make sure that:
- Your rules and content moderation team understand and fit the culture of your user base. What constitutes harassment or discrimination in one part of the world might not seem offensive in another.
- Your rules adhere to local laws.
Illegal items or services
Recently, the CEO of Backpage.com was charged with pimping due to illegal content posted on the site. In many countries, the site owner is liable for the user-generated content posted on their site. While that alone should be enough to keep your site free of illegal items and services, striving to build and maintain user trust is another good reason to only allow ads compliant with local laws.
Most law-abiding citizens will feel at least a bit apprehensive about purchasing a car found in an ad lodged between prostitution offers and drug adverts.
Make sure that:
- Your team is trained to spot obfuscation. Ads for illegal items or services often contain coded language or masked offers to avoid notice.
- As with harassment and discrimination, your rules adhere to local laws.
Nudity or obscene images
As we saw in the recent Napalm Girl controversy, in which Facebook first deleted the Pulitzer Prize-winning photo and then reinstated it after severe backlash, there are times and places where nude photos are perfectly okay. In most cases, however, they are not and will likely turn users away from your site.
Make sure that:
- Your content moderation team understands that there are gray-area cases and seeks a second opinion when in doubt.
- You deal with reported cases promptly and check through the offender’s other pictures to ensure they aren’t also obscene.
Profanity
Some audiences are more sensitive to profanity than others, and in some online groups, swearing is part of the culture. On most marketplaces, however, it fosters an unwelcoming atmosphere and should not be allowed.
Make sure that:
- Your team knows common internet slang, abbreviations, and other ways to mask insults. Profanities are often hidden or obfuscated to avoid detection.
- You understand your community and target audience. What is considered profanity in one culture may not be in another, and vice versa.
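To illustrate the obfuscation problem, here is a minimal sketch of text normalization before wordlist matching. The substitution map and blocklist terms are illustrative placeholders, not a production ruleset; real moderation systems combine approaches like this with machine learning and human review.

```python
import re

# Hypothetical character-substitution map for common leetspeak tricks.
LEET_MAP = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a",
                          "5": "s", "7": "t", "@": "a", "$": "s"})

# Stand-ins for a real blocklist of disallowed terms.
BLOCKLIST = {"badword", "scamword"}

def normalize(text: str) -> str:
    """Lowercase, undo leetspeak substitutions, and strip separators
    used to break words apart (e.g. 'b.a.d w o r d')."""
    text = text.lower().translate(LEET_MAP)
    return re.sub(r"[^a-z]+", "", text)

def contains_blocked_term(text: str) -> bool:
    flat = normalize(text)
    return any(term in flat for term in BLOCKLIST)
```

For example, `contains_blocked_term("b@dw0rd here")` catches the masked term even though a naive exact-match filter would miss it.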
Why user trust is crucial for marketplaces
For digital marketplaces, user trust is everything. Without it, you won’t be able to run your business. While building trust online has been a challenge for the whole e-commerce industry, it’s even more difficult for sites facilitating commerce between two parties who are unknown to each other. Regular e-commerce sites “only” have to build confidence in their own brand for shoppers to trust them enough to go through with transactions.
On the other hand, two-sided marketplaces have to convince the buyer that the seller, a third party and often anonymous individual, is genuine, which is a much harder task.
Proving the trustworthiness of one seller might not be too troublesome, but when you grow and scale your business and suddenly have hundreds of thousands of sellers, it becomes impractical. Instead, you have to shift that trust onto your brand.
When trust in your brand is strong enough, it doesn’t matter how anonymous the seller is. People will trust that you, as a marketplace, take the measures needed for safe transactions. Just look at eBay. Since it’s such an established brand, people trust the site enough to believe its promise of a refund or replacement if products bought through the site don’t live up to expectations. As such, the buyer does not need to trust the seller. Their trust is placed in eBay even though they are buying from an often anonymous seller.
How to ensure user trust
The best way to build trust in your brand is always to provide users with a safe experience on your site. If they never see anything that makes them feel uncomfortable, they will be more prone to trust and buy from third-party sellers on your site.
To ensure a recurring safe experience, you need to moderate the content and have good rules and processes in place for your moderation efforts to be effective.
It’s also a great idea to use AI moderation, or at least some form of automation, to ensure that content is reviewed quickly, catching harmful listings without hurting the seller’s experience through a long time-to-site.
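The automation described above can be sketched as a simple routing step: listings the system is confident about are published or rejected immediately, keeping time-to-site short, while uncertain cases go to a human moderator. The `score_listing` heuristic and the thresholds below are illustrative assumptions; a real system would use a trained model.

```python
from dataclasses import dataclass

@dataclass
class Listing:
    id: int
    text: str

def score_listing(listing: Listing) -> float:
    """Stand-in risk score in [0, 1] based on a few example phrases;
    a production system would use an ML classifier instead."""
    risky_terms = ("wire transfer only", "replica", "xxx")
    hits = sum(term in listing.text.lower() for term in risky_terms)
    return min(1.0, hits / 2)

# Illustrative thresholds: confident-clean below, confident-bad above.
APPROVE_BELOW, REJECT_ABOVE = 0.2, 0.8

def route(listing: Listing) -> str:
    score = score_listing(listing)
    if score < APPROVE_BELOW:
        return "approve"        # publish immediately
    if score > REJECT_ABOVE:
        return "reject"         # block before it goes live
    return "manual_review"      # a human moderator decides
```

The design point is the middle bucket: automation handles the clear-cut cases at speed, and the moderation team spends its time only on the gray areas.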
This is Besedo
Global, full-service leader in content moderation
We provide automated and manual moderation for online marketplaces, online dating, sharing economy, gaming, communities and social media.