The term sharing economy is famously difficult to define. For some, it refers to any digital platform that connects people more directly than traditional business models. For others, a business only truly belongs in the sharing economy if it enables people to make money from things they would buy and own anyway.
What all forms of the sharing economy share, though, is a reliance on trust. Whether you are hailing a ride, staying in someone’s spare room, borrowing a lawn mower, or paying someone to do a small one-off job, you’re entering into a transaction that starts with a decision to trust a stranger.
The difficulty of encouraging that decision is exacerbated by the fact that, from the user’s perspective, getting it wrong can be a high-stakes issue: while sometimes it might mean merely receiving a disappointing product, interactions like borrowing a car or renting a room can pose serious risks to health and wellbeing.
Content’s double-edged sword
The question for platforms, then, is what kinds of structures and tools best encourage both positive outcomes and – almost as importantly – a sense of trust amongst the user base.
Alongside approaches like strict rules on what can be listed and physical checks of users’ offerings, many sharing economy platforms turn to user-generated content (UGC) for this purpose. User reviews, photos of what is on offer, communication options, and even selfies can help humanize the platform, validate that users are real people, and generate a sense of trustworthiness.
At the same time, however, allowing UGC can open the door to specific risks. Low-quality images, for example, can worry people and erode trust, while giving users more control over how listings are presented creates greater potential for scams, fraud, and fake profiles. A permissive approach to content can lead users to conduct business off-platform, side-stepping safety and monetization systems.
This is why there are various approaches to UGC in the sharing economy. Some platforms, like Airbnb, encourage users to share as much about themselves and their property as possible, while others, like Uber, allow only a small selfie and the ability to rate riders and drivers out of five stars. Between these open and defensive approaches, there are any number of combinations of content and sharing permissions a business might choose – but what delivers the best outcome?
Using the carrot not just the stick
Intuitively, many might assume that the safest platforms will be those with the strictest rules, only allowing user interaction when necessary and banning those who engage in damaging behavior. In recent research, a group of organizational psychologists described this as ‘harsh’ regulation, as opposed to the ‘soft’ regulation of supporting users, encouraging interaction, and influencing them to engage in positive behavior.
Perhaps surprisingly, the research found that soft regulation has a stronger positive impact than harsh regulation. The sharing economy, after all, digitalizes something humans have always done in the physical world: try to help one another in mutually beneficial ways. Just as we take our cues on how to behave in everyday life from the people around us, seeing positive engagements on platforms sets a standard for how we treat and trust each other in digital spaces. Being able to talk, share, and humanize helps people to engage, commit, and trust.
This suggests we may need to shift how we manage content to maximize its potential to drive long-term growth. Content moderation is seen, first and foremost, as a way of blocking unwanted content – and that’s certainly something it achieves. At the same time, having clear insight into and control over how, when, and where content is presented gives us a route toward lifting the best of a platform into the spotlight and giving users clear social models of how to behave. Essentially, it’s an opportunity to align the individual’s experience with the best of the platform.
Summary and key takeaways
Creating high-trust sharing economy communities is in everyone’s best interest: users are empowered to pursue new ways of managing their daily lives, and businesses create communities where people want to stay and that they recommend to friends and family for the long term. To get there, we need to focus on tools and approaches that enable and promote positive interactions.