What is content moderation?
Content moderation is the process by which an online platform screens and monitors user-generated content, based on platform-specific rules and guidelines, to determine whether or not the content should be published on the platform.
In other words, when a user submits content to a website, that content goes through a screening process (the moderation process) to make sure it upholds the website's regulations and is not illegal, inappropriate, or harassing.
Content moderation as a practice is common across online platforms that rely heavily on user-generated content, such as social media platforms, online marketplaces, sharing economy services, dating sites, and communities and forums.
There are a number of different forms of content moderation: pre-moderation, post-moderation, reactive moderation, distributed moderation, and automated moderation. In this article we’re looking closer at human moderation and automated moderation, but if you’re curious to learn more, here’s an article featuring the 5 moderation methods.
What is human moderation?
Human moderation, or manual moderation, is the practice of humans manually monitoring and screening user-generated content submitted to an online platform. The human moderator follows platform-specific rules and guidelines to protect online users by keeping unwanted, illegal, and inappropriate content, as well as scams and harassment, away from the website.
What is automated moderation?
Automated moderation means that any user-generated content submitted to an online platform will be accepted, refused, or sent to human moderation, automatically – based on the platform’s specific rules and guidelines. Automated moderation is the ideal solution for online platforms that want to make sure that quality user-generated content goes live instantly and that users are safe when interacting on their site.
According to a study by Microsoft, humans stay attentive for only 8 seconds on average. Online platforms therefore cannot afford slow time-to-site for user-generated content, or they risk losing their users. At the same time, users who encounter poor-quality content, spam, scams, inappropriate content, and the like are likely to leave the site instantly. So where does that leave us? To avoid jeopardizing either quality or time-to-site, online platforms need to consider automated moderation.
When talking about automated moderation, we often refer to machine learning AI (AI moderation) and automated filters. But what are they really?
What is AI moderation?
AI moderation, or tailored AI moderation, is a machine learning model built from platform-specific data to efficiently and accurately catch unwanted user-generated content. An AI moderation solution makes highly accurate automated moderation decisions – refusing, approving, or escalating content automatically.
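To make the refuse/approve/escalate pattern concrete, here is a minimal sketch of how such decision routing might work. The function name, score source, and thresholds are invented for illustration; they are not Besedo's actual logic.

```python
# Hypothetical sketch of automated decision routing. Assumes a machine
# learning model has already produced a probability (0.0 - 1.0) that a
# piece of content is acceptable; the thresholds are illustrative.

def route_content(score: float,
                  approve_above: float = 0.95,
                  refuse_below: float = 0.05) -> str:
    """Map a model's acceptability score to a moderation decision."""
    if score >= approve_above:
        return "approve"   # high confidence it's fine: publish instantly
    if score <= refuse_below:
        return "refuse"    # high confidence it breaks the rules
    return "escalate"      # uncertain: send to a human moderator

print(route_content(0.99))  # -> approve
print(route_content(0.50))  # -> escalate
print(route_content(0.01))  # -> refuse
```

Tightening or loosening the thresholds trades automation rate against accuracy: the wider the "uncertain" band, the more content humans review.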
One example that showcases the power of AI moderation is the Swiss online marketplace, Anibis, which successfully automated 94% of its moderation whilst achieving 99.8% accuracy.
As long as you have a high-quality dataset that models can be built on, AI moderation is going to be great for routine decisions. It excels at dealing with cases that almost always look the same or very similar. This usually includes the vast majority of items that are posted to online marketplaces and as such most platforms can benefit from using AI moderation.
It should also be mentioned that AI moderation can be built on generic data. These models can be effective, but they are not as accurate as a tailored AI solution since they don’t take site-specific rules and circumstances into account.
What is automated filter moderation?
Automated filter moderation uses a set of rules to automatically highlight and catch unwanted content. The filters (or rules) are efficient at finding content that can’t be misinterpreted or that is an obvious scam.
Filters are also great for covering sudden rule changes where the AI has not yet gotten up to speed (training takes some time and a quality dataset). This was well illustrated when the coronavirus pandemic suddenly made masks and toilet paper problematic. This makes filters a solid complementary automation tool for your moderation setup.
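As a rough illustration of how such filters work, here is a minimal sketch using regular expressions. The rule names and patterns are invented for the example (including the pandemic-era items mentioned above); real platforms maintain their own, much larger rule sets.

```python
import re

# Illustrative filter rules: each rule is a named regex pattern.
# These examples are hypothetical, not drawn from any real platform.
FILTER_RULES = {
    "phone_number": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "restricted_item": re.compile(r"\b(face\s?masks?|toilet\s?paper)\b",
                                  re.IGNORECASE),
}

def matched_rules(text: str) -> list[str]:
    """Return the names of every filter rule the text triggers."""
    return [name for name, pattern in FILTER_RULES.items()
            if pattern.search(text)]

print(matched_rules("Bulk toilet paper for sale, call 555-123-4567"))
# -> ['phone_number', 'restricted_item']
```

Because a rule like `restricted_item` can be added or removed in seconds, filters can react to a sudden policy change long before a retrained AI model is ready.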
Automated filters can easily be created, edited, and deleted in our all-in-one content moderation tool, Implio – learn how to create filters here.
Do’s and don’ts of content moderation
What to do and not to do in content moderation may vary from site to site. Many elements and factors need consideration to get the moderation setup best suited to your specific needs.
However, regardless of whether you’re running an online marketplace, social media platform, or sharing economy site, some things hold true about what to do and not to do when it comes to content moderation.
Do’s of content moderation
- Do: Select the moderation method that’s right for your needs – Start by looking at what kind of content your site hosts and who your users are. This will give you a clear picture of what’s required from your moderation method and setup. For example, the user-generated content found on Medium versus Facebook is very different, as is their users’ behavior. Their moderation methods and setups therefore look different, each fitting their platform’s specific needs.
- Do: Create clear rules and guidelines for your content moderation – Rules and guidelines need to be clear for everyone directly involved with your online platform’s content moderation, from the data scientist developing your AI moderation to the human moderator reviewing content, whether they sit in-house or are outsourced to partners. Uncertainty in your rulebook can set your moderation efforts back, both from a financial and a user experience perspective.
- Do: Moderate all types of content – Regardless of whether you’re running an online marketplace, dating site, or social media platform, your users are key contributors to your platform. Making sure they enjoy pleasant experiences and are met with quality content should be in your interest, and to achieve this you need to make sure your content moderation is done right. In a perfect world, you would moderate all types of content on your site, from text and images to videos and 1-to-1 messages. In reality, though, this approach is not possible for all online platforms, for financial and technical reasons. If that’s your case, as a minimum, identify your high-risk categories and content and start your moderation efforts there.
Don’ts of content moderation
- Don’t: misinterpret what good content is – Quality content is key to building user trust and achieving a splendid user experience on your online platform, but it’s important to understand what good content actually is. Don’t make the mistake of misinterpreting good content and rejecting user-generated content simply because it is negative in nature. For example, a negative comment or review following a transaction can still be good content, provided no harsh language is used. Genuine content is what you want, as it enhances quality and user trust.
- Don’t: wait too long before you get started with moderation – If you’re in the early stages of establishing your online platform, getting started with content moderation might feel like it’s miles away. It’s not. Don’t get us wrong: perhaps it shouldn’t be your main priority right out of the gate, but you need a plan for how to handle user-generated content, from a moderation perspective, when you scale. As you grow and the network effect kicks in, you often see a rapid increase in content flooding into your site. You need to be prepared to handle that; if not, your big break might actually end up hurting you in the long run.
- Don’t: waste resources – Don’t reinvent the wheel. With multiple content moderation tools and solutions, like Implio, available on the market, it’s important to prioritize your resources carefully. Innovation and growth are what will boost your online platform to success, and this is where your dev resources will give you the most competitive advantage. Find ways to free up your resources for innovation without falling behind on your moderation efforts.
This is Besedo
Global, full-service leader in content moderation
We provide automated and manual moderation for online marketplaces, online dating, sharing economy, gaming, communities and social media.