The 3 trust mechanisms of online conversion and how to boost them through content moderation

For many of us, online shopping has become commonplace. Speed, convenience, and safety are a recipe for success. Ten years ago, people were far more cautious about shopping online, but eCommerce players and online marketplaces have gradually built up a general trust in online shopping.

Still, this is a fragile trust: it asks buyers to have faith that their items will arrive and that their personal data will be kept safe, even though they are dealing with total strangers. It rests on three key trust mechanisms that build, maintain, and reinforce trust constantly, facilitating the conversions that every online marketplace strives for.

In this article, we take a closer look at the three main types of trust needed to maintain user loyalty and promote conversion. It’s a delicate balance to get right, but additional measures, like content moderation, can help.


User Trust

The main trust challenge in online marketplaces and classified sites is that the vendor worries, ‘Will I get paid?’ while the buyer wonders, ‘What if they don’t send my purchase?’ While these are genuine concerns, neither party can completely guarantee the honesty of the other.

This is why online marketplaces implement a growing set of features that support interpersonal trust online. Simple features like peer reviews and links to social profiles (something Airbnb has perfected through its Social Connections feature) go a long way toward proving each party’s ‘realness’.

An in-marketplace chat function can also help users feel more connected. It benefits both parties: it enables buyer and seller to vet each other, and these channels can also be monitored and reviewed by moderation teams, catching fraud that might otherwise move off the platform.

Other easily implemented measures can be taken to help support user trust. For instance, content moderation teams can vet all new users, so veteran users have an additional assurance that those they’re dealing with are as genuine as themselves.

Read up on the 6 ways to generate user trust through moderation.


Brand Trust

Let’s face it: even reviews can be faked, and a great digital reputation can be bought for a few dollars on Fiverr.
That is why general trust in other users needs to be combined with a more overarching trust in the brand.
Users of online marketplaces should feel secure that the platform they are interacting on is committed to keeping them safe.

That means that to create brand trust, each marketplace needs to hold itself responsible for upholding its own rules, so that users don’t encounter anything out-of-place, off-brand, or unsavory. Content moderation – whether manual or automated – is a way of ensuring anything questionable is prevented from being posted or is removed as quickly as possible. Human moderators, automated tools such as filters and AI moderation, or a combination of the two are well suited to monitoring suspicious activity and identifying keywords or phrases used by scammers. Having good moderation processes in place, and communicating this to users, helps build brand trust and credibility.
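A basic automated filter of the kind described above can be sketched in a few lines. This is a minimal illustration, not a production moderation system; the scam phrases below are hypothetical examples, and a real rule set would be much larger and maintained by the moderation team.

```python
# Hypothetical scam phrases; a real list would be curated and
# updated continuously by the moderation team.
SCAM_PHRASES = [
    "wire the money first",
    "western union only",
    "contact me outside the site",
]

def flag_listing(text: str) -> list[str]:
    """Return any scam phrases found in a listing's text."""
    lowered = text.lower()
    return [phrase for phrase in SCAM_PHRASES if phrase in lowered]

listing = "Great price! Western Union only, contact me outside the site."
flagged = flag_listing(listing)  # two phrases match this listing
```

In practice a filter like this would feed a review queue rather than block listings outright, since simple keyword matches inevitably produce false positives.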


Transactional Trust

Presuming that a marketplace has enough brand credibility to perpetuate the effect of ‘situational normality’, there’s still the question of trust in online transactions – when money changes hands.

Content moderators often raise warning flags when online marketplace vendors insist on payment by PayPal (or Western Union) only, primarily because these payment methods require a minimum of recipient details – all you need to transfer money to someone is an email address. Yet because PayPal remains a quick and convenient way to send and receive small amounts of cash, it’s highly popular. This is why the company has gone to great lengths to assert its credibility, offering features like buyer protection.

While online marketplaces can certainly offer something similar, from a moderation perspective it’s not viable to treat every mention of an online payment service as potential fraud. That’s why moderation tools can be configured to identify combinations of terms and fraud markers that trigger an investigation only when they appear together in a single listing.
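The combination rule described above can be sketched as a simple weighted score: each marker alone is weak evidence, and a listing is escalated for review only when several markers co-occur. The markers, weights, and threshold below are hypothetical examples chosen for illustration.

```python
# Hypothetical fraud markers with weights; any single marker is
# too weak to justify an investigation on its own.
FRAUD_MARKERS = {
    "paypal only": 1,
    "no returns": 1,
    "urgent sale": 1,
    "payment before shipping": 2,
}
THRESHOLD = 3  # only combinations of markers reach this score

def needs_review(listing_text: str) -> bool:
    """Escalate a listing only when enough markers co-occur in it."""
    text = listing_text.lower()
    score = sum(weight for marker, weight in FRAUD_MARKERS.items()
                if marker in text)
    return score >= THRESHOLD

needs_review("Nice bike, PayPal only.")                            # one weak marker
needs_review("Urgent sale! PayPal only, payment before shipping.")  # several together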


Trust & Transparency

Ultimately, the more access users have to information about each other, the more trustworthy their dealings become. But the problem lies in ensuring the validity of the information shared.

Good online marketplaces should act as mediators, validating the trustworthiness of each user while reinforcing trust right across the platform. It’s certainly possible (just look at eBay’s continued success), and with the help of good content moderation, it’s getting a lot easier too.

Want to learn more?
Join the crowds who receive exclusive content moderation insights.