In this blog post we continue digging into the concepts presented in our model “3 pillars of marketplace success”. Our last post focused on the content quality pillar, but now we are going to delve into the concept of user trust and how moderation can positively affect it.
It's important to note that what constitutes a perfectly safe user experience for one target group could easily be considered questionable or even harmful to another. Users of a marketplace for computer games might not feel unsafe due to bad language, but parents shopping for second-hand children's clothes would probably not feel secure buying from a site full of profanities.
Here are some things you can do to raise the trust in your marketplace.
Scams
If your users fall for scams listed on your site, it's of course very damaging to your brand. Apart from losing the victim, you also risk bleeding users due to word of mouth and bad PR.
But even if the scam is so blatantly obvious that no one would fall for it, it still doesn’t look good and users will start questioning the legitimacy of all listings on your site. If one is a scam, how do they know the rest are genuine?
Make sure that:
- Scams are caught and removed in your moderation process before they go live.
- Reported scams are investigated promptly and thoroughly and that they are removed if the complaint is valid.
Counterfeits
Allowing counterfeits on your site carries two major dangers. First of all, you risk getting sued for trademark infringement, as has happened to eBay multiple times. Secondly, users will most likely be disappointed when they think they're purchasing a Louis Vuitton bag through your site and end up with a cheap imitation.
Make sure that:
- Your moderation team knows how to spot a counterfeit version of the largest, most well-known brands listed on your site.
- Your external (and internal) policies make it clear that counterfeits are not allowed.
Harassment or Discrimination
Unfortunately, with the anonymity offered by the internet, harassment and discrimination are frequent occurrences online. But just because it happens a lot doesn't mean that users are desensitized to it. Users who encounter racism, or discrimination based on gender identity, lawful sexual activity, or sexual orientation, on your site are likely to be turned off from further usage.
Make sure that:
- Your rules and moderation team understand and fit the culture of your user base; what constitutes harassment or discrimination in one part of the world might not seem offensive in another.
- You check local laws and make sure your rules adhere to them.
Illegal Items and Services
Recently the CEO of Backpage.com was charged with pimping due to illegal content posted on the site. In many countries the site owner is liable for the user-generated content posted on their site. While that should be enough reason to keep your site free of illegal items and services, striving to build and maintain user trust offers another good reason to only allow ads that are compliant with local laws.
Most law-abiding citizens will feel at least a bit apprehensive about purchasing a car found in an ad lodged between prostitution offers and drug adverts.
Make sure that:
- Your team is trained in spotting obfuscation; ads for illegal items or services will often contain coded language or masked offers to avoid notice.
- As with harassment and discrimination, you check local laws and make sure your rules adhere to them.
Nudity
As we saw in the recent "Napalm Girl" controversy, where Facebook first deleted and then, after severe backlash, reinstated the Pulitzer Prize-winning photo, there are times and places when nude photos are perfectly okay. In most cases, however, they are not and will likely turn your users off from using your site.
Make sure that:
- Your moderation team understands that there are gray-area cases and seeks a second opinion when in doubt.
- You deal with reported cases promptly and check through your offenders’ other pictures to ensure that they aren’t also obscene.
Profanity
Some audiences are more sensitive to profanities than others, and in some online groups swearing is even part of the culture. On most marketplaces, however, it will foster an unwelcoming atmosphere and as such should not be allowed.
Make sure that:
- Your team knows common internet slang, abbreviations, and other ways to mask insults; profanities are often hidden or obfuscated to avoid detection.
- You understand your community and target audience. What is considered profanity in one culture may not be in another, and vice versa.
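To make the first point concrete, a basic automated pass for masked profanity might normalize common character substitutions before matching against a blocklist. The sketch below is purely illustrative: the blocklist, substitution map, and function names are invented for the example, and a production moderation system would need far broader coverage than this.

```python
import re

# Illustrative examples only; a real blocklist and leetspeak map
# would be far larger and tuned to the community's language.
BLOCKLIST = {"scam", "idiot"}
LEET_MAP = str.maketrans({"0": "o", "1": "i", "3": "e",
                          "4": "a", "5": "s", "$": "s", "@": "a"})

def normalize(text: str) -> str:
    """Lowercase, undo common character substitutions, and strip
    punctuation so spaced-out masks like 'i.d.i.o.t' collapse."""
    text = text.lower().translate(LEET_MAP)
    return re.sub(r"[^a-z\s]", "", text)

def flag_profanity(text: str) -> set:
    """Return any blocklisted words found after normalization."""
    return {w for w in normalize(text).split() if w in BLOCKLIST}
```

For instance, `flag_profanity("You 1d10t, this is a $c4m")` flags both masked words, while a clean listing returns an empty set. The point of the normalization step is that naive string matching alone misses exactly the obfuscations your team is trained to spot.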
Why Is User Trust Crucial for Marketplaces?
For digital marketplaces, user trust is everything. Without it you won't be able to run your business. While building trust online has been a challenge for the ecommerce industry as a whole, it's even more difficult for sites facilitating commerce between two parties who are unknown to each other. Regular ecommerce sites "only" have to build confidence in their brand for shoppers to trust them enough to go through with transactions. Two-sided marketplaces, on the other hand, have to convince the buyer that the seller, a third party and often anonymous individual, is genuine, and that is a much harder task.
Proving the trustworthiness of one seller might not be too troublesome, but when you grow and scale your business and suddenly have hundreds of thousands of sellers, it becomes harder. Instead you have to transfer that trust to your brand.
When trust in your brand is strong enough, it doesn't matter how anonymous the seller is; people will trust that you as a marketplace take the measures needed for safe transactions to take place. Just look at eBay. Since it's such an established brand, people trust the site enough to believe its promise of a refund or replacement if products bought through the site don't live up to expectations. As such, the buyer has no need to trust the seller; their trust is transferred to eBay even though they are buying from an often anonymous seller.
How to Ensure User Trust
The best way to build trust in your brand is to always provide users with a safe experience on your site. If they never see anything that makes them feel uncomfortable they will be more prone to trust and buy from third party sellers on your site.
To ensure a consistently safe experience you need to moderate your content, and you need good rules and processes in place for your moderation efforts to be effective.
It's also a great idea to utilize AI moderation, or at least some form of automation, to ensure that content is reviewed quickly, catching harmful listings without long time-to-site hurting the seller's experience.
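As a sketch of what such automation can look like, the toy pipeline below auto-approves low-risk listings, blocks clear violations before they go live, and escalates gray areas to human moderators. The keywords, scoring function, and thresholds are invented for illustration; a real system would use trained models rather than keyword counting.

```python
# Toy pre-moderation pipeline: auto-approve clear cases, auto-reject
# obvious violations, and queue gray areas for human review.
# Scores and thresholds are invented for illustration only.
APPROVE_BELOW = 0.2   # risk below this: publish immediately
REJECT_ABOVE = 0.8    # risk above this: block before going live

def risk_score(listing: str) -> float:
    """Stand-in for a trained model; naively counts red-flag phrases."""
    red_flags = ("wire transfer", "gift card", "too good")
    hits = sum(flag in listing.lower() for flag in red_flags)
    return min(1.0, hits * 0.45)

def moderate(listing: str) -> str:
    score = risk_score(listing)
    if score < APPROVE_BELOW:
        return "approve"       # minimal time-to-site for good sellers
    if score > REJECT_ABOVE:
        return "reject"        # harmful listing never goes live
    return "human_review"      # gray area: escalate to moderators
```

The design choice here is the two thresholds: the wider the gap between them, the more content lands in the human-review queue, trading moderator workload against the risk of publishing something harmful.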
Protect your users by knowing when scammers strike: feel free to use our scam awareness calendar.