
The User-Generated Content Conundrum: Exploring its Impact on the Trust in the Sharing Economy


    The term sharing economy is famously difficult to define. For some, it refers to any digital platform that connects people more directly than traditional business models. For others, a business only truly belongs to the sharing economy if it enables people to make money from things they would buy and own anyway.

    What all forms of the sharing economy share, though, is a reliance on trust. Whether you are hailing a ride, staying in someone’s spare room, borrowing a lawn mower, or paying someone to do a small one-off job, you’re entering into a transaction that starts with a decision to trust a stranger.


    The difficulty of encouraging that decision is exacerbated by the fact that, from the user’s perspective, getting it wrong can be a high-stakes issue: sometimes it might mean nothing worse than an unsatisfactory product, but interactions like borrowing a car or renting a room can pose serious risks to health and wellbeing.

    Content’s double-edged sword

    The question for platforms, then, is what kinds of structures and tools best encourage both positive outcomes and – almost as importantly – a sense of trust amongst the user base.

    Alongside approaches like strict rules on what can be listed and physical checks of users’ offerings, many sharing economy platforms turn to user-generated content (UGC) for this purpose. User reviews, photos of what is on offer, communication options, and even selfies can help humanize the platform, validate that users are real people, and generate a sense of trustworthiness.

    At the same time, however, allowing UGC can open the door to specific risks. Low-quality images, for example, can worry people and erode trust, while giving users more control over how listings are presented creates greater potential for scams, fraud, and fake profiles. A permissive approach to content can lead users to conduct business off-platform, side-stepping safety and monetization systems.

    This is why sharing economy platforms take such varied approaches to UGC. Some, like Airbnb, encourage users to share as much about themselves and their property as possible, while others, like Uber, allow only a small selfie and the ability to rate riders and drivers out of five stars. Between these open and defensive approaches there is any number of combinations of content and sharing permissions a business might choose – but which delivers the best outcome?

    Using the carrot not just the stick

    Intuitively, many might assume that the safest platforms will be those with the strictest rules, only allowing user interaction when necessary and banning those who engage in damaging behavior. In recent research, a group of organizational psychologists described this as ‘harsh’ regulation, as opposed to the ‘soft’ regulation of supporting users, encouraging interaction, and influencing them to engage in positive behavior.

    Perhaps surprisingly, the research found that soft regulation has a stronger positive impact than harsh regulation. The sharing economy, after all, digitalizes something humans have always done in the physical world: try to help one another in mutually beneficial ways. Just as we take our cues on how to behave in everyday life from the people around us, seeing positive engagements on platforms sets a standard for how we treat and trust each other in digital spaces. Being able to talk, share, and humanize helps people to engage, commit, and trust.

    This suggests we may need to shift how we manage content to maximize its potential to drive long-term growth. Content moderation is seen, first and foremost, as a way of blocking unwanted content – and that’s certainly something it achieves. At the same time, having clear insight into and control over how, when, and where content is presented gives us a route toward lifting the best of a platform into the spotlight and giving users clear social models of how to behave. Essentially, it’s an opportunity to align the individual’s experience with the best of the platform.

    Summary and key takeaways

    Creating high-trust sharing economy communities is in everyone’s best interest: users are empowered to pursue new ways of managing their daily lives, and businesses build communities where people want to stay and which they recommend to friends and family for the long term. To get there, we need to focus on tools and approaches that enable and promote positive interactions.

    This is Besedo

    Global, full-service leader in content moderation

    We provide automated and manual moderation for online marketplaces, online dating, sharing economy, gaming, communities and social media.

