Reviews can make or break a business. The same applies to online marketplaces, classifieds, and even dating sites. And they don’t just impact these platforms – they affect how people see the brands that advertise on them, as well as individual vendors, and of course, those looking for love and companionship.
However, in a world where user-generated content (UGC) is so prevalent, anyone, anywhere can leave a good or bad review and have it seen very publicly.
While it’s clear why bad reviews can hurt businesses and brands, fake positive ones can damage reputations too.
Confused? It’s a tricky area to navigate.
Let’s consider the ways in which reviews can build trust and how online marketplaces can address this moderation challenge.
Reviews Build Consumer Trust
As we’ve discussed in previous articles, trust is at the epicentre of the digital economy. As consumers we take ‘trust leaps’ when deciding if a particular online product or service is suitable for us. This is why reviews matter so much – they help us form an opinion.
In a practical sense, many of these sentiments (which can largely be attributed to economist and TED speaker Rachel Botsman) are grounded in our search for social proof, which forms one of the key cornerstones of the ‘Trust Stack’: trust in the idea, trust in the platform, and (as is the case here) trust in the user.
Because the three have an interdependent relationship, they reinforce each other – meaning that user trust leads to trust in the platform and the idea, and vice versa.
If it sounds improbable that consumers are more likely to trust complete strangers, consider the numbers: 88% of consumers trust online reviews as much as personal recommendations, and 76% say they trust online reviews as much as recommendations from family and friends.
Needless to say, they factor in a great deal. Customer reviews are therefore essential indicators of trust – which is why bad reviews can negatively impact businesses so heavily.
While on some marketplaces a 3.5 out of 5 for ‘average’ service might be deemed acceptable, for many businesses even a small slip in their ratings is perceived to have disastrous consequences.
Some companies have fought back against negative reviews; but instead of challenging customers over their comments, or trying to figure out where they could do better, they’ve actively tried to sue their critics.
One hotel in New York State, US, even stated in its ‘small print’ that visitors would be charged $500 for negative Yelp reviews. Other service providers have slated – and even looked to sue – online review giant Yelp for the way it has ‘prioritised’ reviews, displaying the most favourable first.
But why are overly positive reviews so detrimental? Surely a positive review is what all companies are striving for? The issue is inauthenticity: a true reflection of any experience rarely commands 5 stars across the board, and businesses, marketplaces, and consumers are wise to it.
Authenticity Means ‘No Astroturfing’
It’s clear that many companies want to present themselves in the best possible light. There’s absolutely nothing wrong with that. However, when it comes to reviews of their products and services, if every single rating is overwhelmingly positive, consumers would be forgiven for being suspicious.
In many cases, it seems, they probably are. Creating fake reviews – a practice known as ‘astroturfing’ – has been relatively widespread since the dawn of online marketplaces and search engines. But many are now wise to it and actively doing more to prevent the practice.
For example, Google has massively cracked down on companies buying fake Google reviews designed to positively influence online listings – removing businesses that do so from local search results. Similarly, Amazon has pledged to put a stop to the practice of testers being paid for reviews and being reimbursed for their purchases.
Astroturfing isn’t just frowned upon, it’s also illegal. Both the UK’s Competition and Markets Authority (CMA) and the US Federal Trade Commission have strict rules in place over misleading customers.
In Britain, the CMA has taken action against the social media agency Social Chain for failing to disclose that a series of posts were in fact part of a paid-for campaign, and has taken issue with an online knitwear retailer posting fake reviews.
While some may consider astroturfing a victimless crime, when you consider the faith that shoppers place in online reviews – and the fact that their favourite sites may be deliberately trying to mislead them – it’s clear that there’s a major trust issue at stake.
For owners of classified sites, dating apps, and online marketplaces, who have spent so long building credibility, gaining visibility, and getting users and vendors on board, a culture where fake reviews persist can be disastrous.
But when so many sites rely on user-generated content, the task of monitoring and moderating real reviews, bad reviews, and fake reviews is an enormous undertaking – and often a costly one.
Manual Vs. Automated Moderation
While many fake reviews are easy to spot (awkwardly put together, with bad spelling and grammar), when they appear at scale, moderating them manually becomes unsustainable – even for a small team of experts.
That’s why new ways to detect and prevent them are starting to gain traction. For example, many sites and marketplaces now limit review posting to those who have bought something from a specific vendor. However, as the Amazon example above shows, this is a practice that is easy to circumvent.
A more reliable method is automated moderation – using machine learning algorithms that can be trained to detect fake reviews, as well as other forms of unwanted or illegal content, on a particular classifieds site or marketplace. By using filters, the algorithm is continually fed examples of good and bad content until it can automatically distinguish between the two.
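To make the idea concrete, here is a minimal sketch of that train-on-examples approach, using a tiny naive Bayes word model written from scratch. The training data, labels, and function names are illustrative assumptions – a production system would train on thousands of moderator-labelled reviews and far richer features.

```python
from collections import Counter
import math

# Hypothetical labelled examples; a real filter would be fed a continuous
# stream of moderator-labelled reviews, as described above.
TRAINING = [
    ("amazing product best ever buy now", "fake"),
    ("best seller five stars amazing amazing", "fake"),
    ("arrived on time but the box was dented", "genuine"),
    ("decent quality though shipping took two weeks", "genuine"),
]

def train(examples):
    """Count word frequencies per label (a tiny naive Bayes model)."""
    counts = {"fake": Counter(), "genuine": Counter()}
    totals = Counter()
    for text, label in examples:
        words = text.lower().split()
        counts[label].update(words)
        totals[label] += len(words)
    return counts, totals

def score(text, counts, totals, label):
    """Log-likelihood of the text under one label, with add-one smoothing."""
    vocab = len(set(w for c in counts.values() for w in c))
    return sum(
        math.log((counts[label][w] + 1) / (totals[label] + vocab))
        for w in text.lower().split()
    )

def classify(text, counts, totals):
    """Return whichever label explains the text better."""
    return max(("fake", "genuine"),
               key=lambda lbl: score(text, counts, totals, lbl))

counts, totals = train(TRAINING)
print(classify("amazing amazing best product ever", counts, totals))  # → fake
```

The point of the sketch is the feedback loop: every newly labelled review updates the counts, so the filter’s notion of “good” and “bad” content keeps improving as moderators work.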
It’s a process that also works well in tandem with manual moderation efforts. When a suspicious review is flagged, a notification can be sent to the moderation team, allowing them to make the final judgement call on its authenticity.
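That tandem workflow can be sketched as a simple routing rule: the automated filter publishes or rejects clear-cut cases and queues borderline ones for a human. The thresholds and `fake_probability` scores below are illustrative assumptions, not values from any real platform.

```python
# Reviews the system has already decided on, plus the manual queue.
published, rejected, review_queue = [], [], []

def route(review_id, fake_probability):
    """Auto-handle clear-cut cases; send borderline ones to a moderator."""
    if fake_probability < 0.2:
        published.append(review_id)    # confidently genuine: goes live
    elif fake_probability > 0.9:
        rejected.append(review_id)     # confidently fake: blocked outright
    else:
        review_queue.append(review_id) # human makes the final judgement call

# Hypothetical scores from an upstream model.
for rid, p in [("r1", 0.05), ("r2", 0.95), ("r3", 0.6)]:
    route(rid, p)

print(review_queue)  # only the borderline review awaits manual moderation
```

Tuning the two thresholds trades off moderator workload against the risk of a fake review slipping through automatically.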
Ultimately, in a world where online truths can often be in short supply, companies – whether brands or marketplaces – that are open enough to let customers leave honest, reasonable reviews stand a better chance of building trust among their users.
While businesses clearly have a right to encourage positive online reviews as part of their marketing efforts, any activity that attempts to obscure the truth (no matter how scathing) or fabricate a rose-tinted fake review can have an even more negative impact than a humdrum review itself.
The bad review report
To help you better understand how to navigate the tricky world of bad reviews, we have compiled a report highlighting the top 5 most frequent complaints on online marketplaces. Our insights will help you improve user experience, reduce churn, and increase user acquisition on your platform.
In the report you will learn:
The top 5 most frequent complaints on online marketplaces
How to transform negative reviews about your site into action points
How to improve user experience, reduce churn and increase user acquisition on your platform