The term ‘sharing economy’ is famously difficult to define. For some, it refers to any digital platform that connects people more directly than traditional business models do. For others, a business only truly belongs to the sharing economy if it enables people to make money from things they would buy and own anyway.
What all forms of the sharing economy share, though, is a reliance on trust. Whether you are hailing a ride, staying in someone’s spare room, borrowing a lawn mower, or paying someone to do a small one-off job, you’re entering into a transaction that starts with a decision to trust a stranger.
Encouraging that decision is made harder by the fact that, from the user’s perspective, getting it wrong can be a high-stakes mistake: while sometimes the cost is merely a disappointing product, interactions like borrowing a car or renting a room can pose serious risks to health and wellbeing.
Content’s double-edged sword
The question for platforms, then, is what kinds of structures and tools best encourage both positive outcomes and – almost as importantly – a sense of trust amongst the userbase.
Alongside approaches like strict rules on what can be listed and physical checks of users’ offerings, many sharing economy platforms turn to user-generated content (UGC) for this purpose. User reviews, photos of what is on offer, communication options, and even selfies can all help to humanize the platform, validate that users are real people, and generate a sense of trustworthiness.
At the same time, however, allowing UGC can open the door to specific risks. Low-quality images, for example, can worry people and erode trust, while giving users more control over how listings are presented creates greater potential for scams, fraud, and fake profiles. A permissive approach to content can also lead to users conducting business off-platform, side-stepping both safety and monetization systems.
This is why there is such a variety of approaches to UGC in the sharing economy. Where some platforms, like Airbnb, encourage users to share as much about themselves and their property as possible, others, like Uber, allow only a small selfie and the ability to rate riders and drivers out of five stars. In between these open and defensive approaches, there are any number of combinations of content and sharing permissions a business might choose – but what delivers the best outcome?
Using the carrot, not just the stick
Intuitively, many might assume that the platforms which feel safest will be those with the strictest rules, only allowing interaction between users when absolutely necessary and banning those who engage in damaging behavior. In recent research, a group of organizational psychologists described this as ‘harsh’ regulation, as opposed to the ‘soft’ regulation of supporting users, encouraging interaction, and influencing them to engage in positive behavior.
Perhaps surprisingly, the research found that soft regulation has a stronger positive impact than harsh regulation. The sharing economy, after all, digitalizes something humans have always done in the physical world: helping one another in mutually beneficial ways. Just as we take our cues on how to behave in everyday life from the people around us, seeing positive engagements on platforms sets a standard for how we treat each other – and trust each other – in digital spaces. Being able to talk, share, and humanize helps people to engage, commit, and trust.
This suggests that we may need to shift how we think about managing content in order to make the most of its potential to drive long-term growth. Content moderation is seen, first and foremost, as a way of blocking unwanted content – and that’s certainly something it achieves. At the same time, though, having clear insight into and control over how, when, and where content is presented gives us a route towards lifting the best of a platform into the spotlight and giving users clear social models of how to behave. In essence, it’s an opportunity to align the individual’s experience with the best a platform has to offer.
Ultimately, creating high-trust sharing economy communities is in everyone’s best interest: users are empowered to pursue new ways of managing their daily lives, and businesses build communities where people want to stay for the long term – and which they recommend to friends and family. To get there, we need to focus on tools and approaches that enable and promote positive interactions.