When British journalist Sali Hughes found herself the victim of online trolls, she took it upon herself to understand why. As part of these efforts, she even met one of them in person.
While it’s vital that victims like Ms. Hughes speak out against the individuals responsible, what of the sites where these comments are posted? Negative User Generated Content (UGC) continues to be a problem for all kinds of online marketplaces, classified sites, and chat forums. But to what extent can they be held accountable for enabling these viewpoints to be aired, and to what extent should they be punished?
While the human impact is a critical concern that many businesses, governments, and advocacy groups are striving to curb, there’s a similarly damaging impact on the sites themselves: not just reputational harm or a poor user experience, but a financial hit too.
Let’s consider the ways in which negative UGC can affect the business bottom line and look at ways companies can put a stop to it.
Bad Content: The Bigger Picture
The need for UGC platforms to respond swiftly and decisively to negative content is a given, and legislators have repeatedly weighed in. For example, last year (as mentioned in a previous blog) the EU voted to give online businesses one hour, from notification by law enforcement authorities, to remove terrorist-related content. Failure to comply could see a business fined up to 4% of its global revenue.
Similar efforts would have governments taking a more proactive stance, such as British media watchdog Ofcom’s proposals earlier this year to police social media. But given concerns over freedom of speech and expression, such moves are bound to provoke a backlash from businesses and individuals alike.
Another solution is for businesses to regularly audit their own sites, which larger platforms like Facebook and YouTube already do (with varying degrees of success, given Facebook’s continued fines for under-reporting illegal activity). In addition, organizations like GARM (the Global Alliance for Responsible Media) bring advertisers, agencies, media companies, platforms, and industry bodies together to improve digital safety.
While the vast majority of online platforms do everything they can to keep their sites safe for all of their users and customers, even these combined actions don’t stop trolls and cybercriminals.
The Business Impact
In addition to the ongoing regulatory maelstrom, the urgency to respond is heightened by a host of business concerns: retention, conversion, engagement, reputation, and customer satisfaction, all of which can be easily damaged by bad or harmful User Generated Content. This in turn can pave the way for other types of negative online behavior, from scams to fake ads.
Retention, Conversion, & Engagement
There’s a negative impact when customers lose faith in an online platform – be it a service or a marketplace.
It follows that if users leave, revenues will drop. Lower engagement leads to fewer conversions. But that’s not all: costs increase too, as it becomes increasingly expensive to win back old customers or entice new ones.
Lower user retention stemming from a negative experience pushes up the cost of user acquisition. Similarly, higher user leakage means that the lifetime value of users will drop too.
Given that a bad UGC experience can trigger a fairly rapid downward spiral, the case for prevention rather than reaction is stronger than most site owners realize.
Reputation
A company’s online reputation matters just as much as, if not more than, how it’s perceived offline. After all, content has a habit of lingering online. That’s why bad UGC can be so damaging.
It can often be hard for brands to shake the stigma of bad content published about them. By the same token, their reputation can also be damaged by unknowingly hosting it. This can be disastrous for online marketplaces, classifieds, and chat sites, which often need to rebuild trust from the ground up.
Then, of course, there are the legal and liability issues that can stem from unauthorized UGC as well as harmful content. Take the 2017 case of Kayla Kraft vs. Anheuser-Busch: an image of the claimant was supposedly submitted as part of a campaign to crowdsource advertising images, and she filed a lawsuit asserting the image had been used without her consent.
Customer Support Costs
While many businesses focus on providing multichannel support and making it as easy as possible for customers to reach them, reducing the number of support requests rarely gets the same attention.
This is a mistake. Ultimately, the more calls, emails, and support tickets there are, the higher the cost of customer service, as better-trained staff are needed to handle incoming queries. With a robust, preventative solution in place, however, the need for a larger support operation shrinks significantly.
Take our client Connected2Me, a social media platform where users can chat to each other anonymously. While the idea itself is intended to be fun, the team were experiencing more negative User Generated Content than they could handle. As a result, support tickets rose to a level the in-house team struggled to keep on top of.
When they contacted us in 2018, Connected2Me had tried adding automation to their content moderation workflow but had not found a solution that met their accuracy requirements. The team moderated content manually but needed 24/7 monitoring to reduce the volume of support tickets and provide the best possible user experience.
With our help, Connected2Me now has an accurate moderation solution in place covering numerous languages. They can move forward confidently, meeting user expectations and providing the experience they originally aimed for. These efforts are also helping them attract new investment and build a loyal, happy user base.
User Safety = Sales Success
Task most people with introducing themselves to a crowd of strangers in person, and the chances are you’ll see them do their very best to present a positive version of themselves.
But transpose this to an online environment, add a degree of anonymity, allow people to share content, and all kinds of intriguing behaviors can manifest themselves.
Ultimately, online platforms can unwittingly play host to a torrent of negativity, so preventative action is wholly necessary at a site level.
From a company perspective, the need to counteract it is as much a business concern as it is a user-centric one. But, when you think about it, the two are in fact one and the same. A trusted site known for quality content, reliable customers, and a great user experience will attract more prospects than a platform where users are likely to be subjected to scams and abuse.
And as for Sali Hughes? She was surprised by the person she met. It wasn’t some bitter, twisted, housebound hacktivist; it was a well-dressed professional woman in her mid-30s, the kind of person she might even be friends with in other circumstances.
It just goes to show: you can never second-guess when, where, or from whom online abuse will come. That’s why a moderation strategy that can be applied at scale, and that is specifically designed to uphold your site’s rules and procedures, is a safer bet for all.
If, like the companies mentioned here, your online business relies on User Generated Content, then you need to make sure that every single customer gets the best possible experience.
Here at Besedo, our goal is to help companies do that.