So, as another year draws down and a new one dawns, it’s that time when we reflect on what has happened – and take a moment to think about what the future holds. Living in these disrupted, dramatic times, though, it can be hard to see the wood for the trees: with so much going on and so much uncertainty about what it all means, taking stock might feel like a tougher job than usual this year.
In the case of content moderation, for example, so many of the important recent changes have been felt by the technology industry as a whole.
Even as the world, on average, emerges from the other side of the pandemic, we see only a little pullback from the spike in technology usage that it triggered. Commerce is more online than ever, work is more remote than ever, lives are more tied to platforms than ever, and social expectations about how we use technology have been set on a new path.
All of this makes it easy to miss that content moderation has had a really interesting year, even taken in isolation. While dealing – along with everyone else – with changing user habits, professionals in the space have had to keep one eye on upcoming legislation that will soon reshape how businesses handle user-generated content.
Perhaps most prominent has been the EU’s Digital Services Act which, once it comes into force, will require much more extensive reporting, complaints mechanisms, and (for the largest platforms) auditing. It’s not alone: Australia’s Online Safety Act 2021, passed this summer, strengthened the role of the governmental eSafety Commissioner, who has oversight of online content, while the UK is currently working on an Online Safety Bill to regulate content.
Businesses likely still have a long way to go to prepare for these regulations: research we ran towards the start of the year found that few were prepared for – and many were unaware of – the Digital Services Act.
However, even the growing attention from governments to how content moderation operates might not be the biggest thing facing the industry right now.
That’s because businesses face an even more immediate shift in how users actually use their platforms. We can tell the story in a few key statistics: nearly 50% of internet users look for videos related to a product or service before visiting a store; 72% of customers would rather learn about a product or service by way of video; social video generates 1200% more shares than text and image content combined.
In other words, before thinking about changes in how content is moderated, we need to deal with the fact that what is being moderated is changing rapidly. Whether the task is automated or taken on by a human, video is significantly more difficult and time-consuming to moderate than plain text.
Even outside of video, as we’ve recently discussed, AI-led content moderation needs to improve its capacity to keep up with the speed at which human communication evolves online.
If we’re going to make a prediction for the next year of content moderation trends, we should start where businesses should always start: by thinking about the user or customer, what they need and want, and how we can step up to meet those desires.
From that perspective, here at Besedo, we think that the story of 2022 for content moderation will be one of rising as a strategic priority in many different kinds of business. User habits and expectations are changing, but the kinds of user-generated content available (and, more importantly, the quality of that content) are still very unevenly spread across different businesses and platforms.
Where one clothing retailer, for example, might already enable users to upload videos, another might only just have introduced text reviews.
That makes user-generated content, when done well, a powerful competitive differentiator that will come to the fore as our new assumptions about how we use the internet solidify.
Historically, content moderation has often been seen as a defensive measure, protecting businesses against negative outcomes, and new legislation may well sharpen what that looks like. The real opportunity ahead, however, is to see how it can become an asset to the customer experience, ensuring that users are shown not just content they have the option of seeing but content they actually want to see.