Content moderation is the practice of screening and monitoring user-generated content against platform-specific rules and guidelines to determine whether it should be published on the online platform.
In other words, when a user submits content to a website, that content goes through a screening process (the moderation process) to ensure it complies with the website’s regulations and is not illegal, inappropriate, or harassing.
Content moderation is common across online platforms that rely heavily on user-generated content, such as social media platforms, online marketplaces, sharing economy services, dating sites, communities, and forums.
There are several methods by which companies can moderate content. These are often referred to as:
- Reactive moderation
- Distributed moderation
- Automated moderation
In practice, these methods combine automation with a human touch to achieve the best results.
Let’s look at how these methods work in practice by examining human and automated content moderation. If you’re curious to learn more about the other types of content moderation, check out this other article about the 5 moderation methods.
What is human content moderation?
Human moderation, or manual moderation, is when humans manually monitor and screen user-generated content submitted to an online platform. The human moderator follows platform-specific rules and guidelines to protect users by keeping unwanted, illegal, or inappropriate content, scams, and harassment off the website.
What is automated content moderation?
Automated content moderation means that any user-generated content submitted to an online platform will automatically be accepted, refused, or sent to human moderation based on the platform’s specific rules and guidelines. Automated moderation is the ideal solution for online platforms that want to ensure quality user-generated content goes live instantly and that users are safe when interacting on their websites.
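To make the accept/refuse/escalate flow concrete, here is a minimal sketch of an automated moderation decision. The rule lists and terms are invented for illustration only; a real platform would encode its own guidelines.

```python
from enum import Enum

class Decision(Enum):
    APPROVE = "approve"    # goes live instantly
    REFUSE = "refuse"      # blocked automatically
    ESCALATE = "escalate"  # sent to a human moderator

# Hypothetical platform-specific rules, for illustration only.
BANNED_TERMS = {"free money", "wire transfer"}
SUSPICIOUS_TERMS = {"urgent", "limited offer"}

def moderate(listing_text: str) -> Decision:
    """Apply the platform's rules and return a moderation decision."""
    text = listing_text.lower()
    if any(term in text for term in BANNED_TERMS):
        return Decision.REFUSE
    if any(term in text for term in SUSPICIOUS_TERMS):
        return Decision.ESCALATE
    return Decision.APPROVE
```

The key design point is the third outcome: content the rules cannot confidently approve or refuse is escalated to a human rather than guessed at, which is how automation and manual moderation complement each other.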
According to a study by Microsoft, the average human attention span is only eight seconds. Online platforms therefore cannot afford slow time-to-site for user-generated content, or they risk losing their users. Likewise, users who encounter poor-quality content, spam, scams, or inappropriate content are likely to leave the site instantly.
So, where does that leave us?
To avoid jeopardizing either quality or time-to-site, online platforms need to consider automated content moderation.
We often refer to machine learning AI (AI moderation) and automated filters when talking about automated content moderation. But what are they really?
What is AI content moderation?
AI content moderation, or tailored AI moderation, is a machine learning model built from platform-specific data to efficiently and accurately catch unwanted user-generated content. An AI moderation solution will automatically make highly accurate moderation decisions – refusing, approving, or escalating content.
One example that showcases the power of AI moderation is the Swiss online marketplace, Anibis, which successfully automated 94% of its moderation whilst achieving 99.8% accuracy.
AI moderation is great for routine decisions as long as you have a high-quality dataset to build models on. It excels at cases that almost always look identical or similar, which covers the vast majority of items posted to online marketplaces, so most platforms can benefit from using AI moderation.
It should also be mentioned that AI moderation can be built on generic data. These models can be effective, but they are not as accurate as a tailored AI solution because they don’t consider site-specific rules and circumstances.
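To show the kind of model that underpins AI moderation, here is a purely illustrative Naive Bayes text classifier in plain Python. The tiny training set and its labels are invented for this sketch; a production solution would be trained on large volumes of platform-specific moderation data.

```python
import math
from collections import Counter, defaultdict

# Toy training set standing in for platform-specific moderation data.
TRAIN = [
    ("win free prize click now", "refuse"),
    ("free cash click here now", "refuse"),
    ("selling my used mountain bike", "approve"),
    ("vintage chair in good condition", "approve"),
]

def train(examples):
    """Count class frequencies and per-class word frequencies."""
    class_counts = Counter(label for _, label in examples)
    word_counts = defaultdict(Counter)
    vocab = set()
    for text, label in examples:
        for word in text.split():
            word_counts[label][word] += 1
            vocab.add(word)
    return class_counts, word_counts, vocab

def classify(text, class_counts, word_counts, vocab):
    """Pick the label with the highest log-probability for the text."""
    total = sum(class_counts.values())
    best_label, best_score = None, -math.inf
    for label, count in class_counts.items():
        score = math.log(count / total)  # class prior
        denom = sum(word_counts[label].values()) + len(vocab)
        for word in text.split():
            # Laplace smoothing so unseen words don't zero out the score.
            score += math.log((word_counts[label][word] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label
```

The point of the sketch is the data dependency the article describes: the model is only as good as the labeled examples it learns from, which is why a tailored dataset beats generic data.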
What is automated filter content moderation?
Automated filter content moderation is a set of rules that highlights and catches unwanted content automatically. The filters (or rules) efficiently find content that can’t be misinterpreted, such as obvious scams.
Filters are also great for covering sudden rule changes where the AI has not yet caught up (training takes time and a quality dataset). This was well illustrated when the coronavirus pandemic suddenly made listings for masks and toilet paper problematic. This makes filters a solid complementary automation tool for your moderation setup.
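As an illustration of why filters can be deployed faster than a retrained model, here is a minimal sketch of rule-based filtering. This is not Implio’s actual rule syntax; the filter names and patterns are invented for the example.

```python
import re

# Hypothetical filters; each pairs a name with a compiled pattern.
# A new rule (like the pandemic-era one below) can be added in minutes,
# long before an AI model could be retrained to catch the same content.
FILTERS = {
    "pandemic_items": re.compile(r"\b(face\s*masks?|toilet\s*paper)\b", re.IGNORECASE),
    "phone_in_text": re.compile(r"\b\d{3}[-\s]?\d{3}[-\s]?\d{4}\b"),
}

def matched_filters(text: str) -> list[str]:
    """Return the names of all filters the text triggers."""
    return [name for name, pattern in FILTERS.items() if pattern.search(text)]
```

Because each filter is an explicit, human-readable rule, it can be created, edited, or deleted instantly, which is exactly the gap-covering role the article describes.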
Automated filters can easily be created, edited, and deleted in our all-in-one tool, Implio – learn how to create filters here.
Do’s and don’ts of content moderation
Determining what to do and not to do in content moderation may vary from site to site. Many elements and factors need consideration to get the moderation setup best suited to your specific needs.
However, whether you’re running an online marketplace, social media platform, or sharing economy site, some things hold true about what to do and not to do when it comes to content moderation.
Do’s of content moderation
Do select the method that’s right for your needs
Start off by looking at what kind of content your site hosts and who your users are. This will help you clearly understand your content moderation method and setup requirements. For example, the type of user-generated content found on Medium versus Facebook is very different, as is their users’ behavior. This makes their moderation methods and setups look different to fit each platform’s specific needs.
Do create clear rules and guidelines
Rules and guidelines must be clear to everyone directly involved with your online platform’s content moderation, from the data scientist developing your AI moderation to the human moderator reviewing content, regardless of whether they sit in-house or are outsourced to partners. Uncertainty in your rulebook can set your moderation efforts back both financially and from a user experience perspective.
Do moderate all types of content
Whether you’re running an online marketplace, dating site, or social media platform, your users are key contributors to your platform. Making sure they enjoy pleasant experiences and are met with quality content should be of interest to you. To achieve this, you must ensure your content moderation is done right. In a perfect world, you would moderate all types of content on your site, from text and images to videos and 1-to-1 messages. In reality, that approach is not possible for all online platforms, for financial and technical reasons. If that’s your case, as a minimum, identify your high-risk categories and content and start your moderation efforts there.
Don’ts of content moderation
Don’t misinterpret what good content is
Quality content is key to building user trust and achieving a splendid user experience on your online platform, but it’s important to understand what good content is. Don’t make the mistake of misinterpreting good content and rejecting user-generated content simply because it’s negative in nature. For example, a negative comment or review following a transaction can still be good content as long as no harsh language is used. Genuine content is what you want, as it enhances quality and user trust.
Don’t wait too long before you get started with moderation
If you’re in the early stages of establishing your online platform, getting started with content moderation might feel like it’s miles away. It’s not. Don’t get us wrong, perhaps it shouldn’t be your main priority right out of the gate, but you need to plan how to handle user-generated content from a moderation perspective when you scale.
As you grow and the network effect kicks in, you’ll often see a rapid increase in content flooding into your site. You need to be prepared to handle that; if not, your big break might actually end up hurting you in the long run.
Don’t waste resources
Don’t reinvent the wheel. With multiple content moderation tools and solutions available in the market, it’s important that you prioritize your resources carefully. Besedo can help you with any content moderation need; just send us a message.
Innovation and growth are what will boost your online platform to success, and this is where your dev resources will give you the most competitive advantage. Find your way to free up your resources for innovation without risking falling behind with your moderation efforts.