What is content moderation?

Content moderation is the practice of screening and monitoring user-generated content against platform-specific rules and guidelines to determine whether or not that content should be published on the online platform.

In other words, when a user submits content to a website, that piece of content goes through a screening process (the moderation process) to make sure it upholds the website’s regulations and is not illegal, inappropriate, harassing, and so on.

Content moderation is common practice across online platforms that rely heavily on user-generated content, such as social media platforms, online marketplaces, sharing economy services, dating sites, and communities and forums.

There are a number of different forms of content moderation: pre-moderation, post-moderation, reactive moderation, distributed moderation, and automated moderation. In this article we’re looking closer at human moderation and automated moderation, but if you’re curious to learn more, here’s an article featuring the 5 moderation methods.

 

What is human moderation?

Human moderation, or manual moderation, is the practice of humans manually monitoring and screening user-generated content submitted to an online platform. The human moderator follows platform-specific rules and guidelines to protect online users by keeping unwanted content, such as illegal, scam, inappropriate, or harassing material, off the site.

 

What is automated moderation?

Automated moderation means that any user-generated content submitted to an online platform is automatically accepted, refused, or sent to human moderation, based on the platform’s specific rules and guidelines. Automated moderation is the ideal solution for online platforms that want to make sure quality user-generated content goes live instantly and that users are safe when interacting on their site.
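
To make this concrete, here’s a minimal sketch in Python of what an accept/refuse/escalate decision can look like. The banned terms, the spam_score input, and the thresholds are all invented for illustration – this is not Implio’s API or any specific platform’s rulebook.

```python
from enum import Enum

class Decision(Enum):
    APPROVE = "approve"      # publish instantly
    REFUSE = "refuse"        # keep off the site
    ESCALATE = "escalate"    # send to a human moderator

# Illustrative rule list; a real platform maintains its own rulebook
BANNED_TERMS = {"free money", "wire transfer fee"}

def moderate(text: str, spam_score: float) -> Decision:
    """Route one piece of user-generated content.

    spam_score (0.0-1.0) is assumed to come from an upstream model.
    """
    lowered = text.lower()
    if any(term in lowered for term in BANNED_TERMS):
        return Decision.REFUSE      # clear rule violation
    if spam_score >= 0.9:
        return Decision.REFUSE      # confidently bad
    if spam_score <= 0.2:
        return Decision.APPROVE     # confidently fine, goes live instantly
    return Decision.ESCALATE        # ambiguous middle ground

print(moderate("Get free money now!!!", spam_score=0.5))  # Decision.REFUSE
```

The escalation branch is the important part of the design: automation handles the clear-cut cases, while ambiguous content still reaches a human.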

According to a study by Microsoft, the average human attention span is only eight seconds. Online platforms therefore cannot afford slow time-to-site for user-generated content, or they risk losing their users. At the same time, users who encounter poor-quality content, spam, scams, inappropriate content, and the like are likely to leave the site instantly. So, where does that leave us? For online platforms not to jeopardize either quality or time-to-site, they need to consider automated moderation.

When talking about automated moderation, we often refer to machine learning AI (AI moderation) and automated filters. But what are they really?

What is AI moderation?

AI moderation, or tailored AI moderation, consists of machine learning models built from platform-specific data to efficiently and accurately catch unwanted user-generated content. An AI moderation solution makes highly accurate automated moderation decisions – refusing, approving, or escalating content automatically.

One example that showcases the power of AI moderation is the Swiss online marketplace Anibis, which successfully automated 94% of its moderation whilst achieving 99.8% accuracy.

It should also be mentioned that AI moderation can be built on generic data. These models can be very effective but are in most cases not as accurate as a tailored AI solution.
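
For illustration, here’s a minimal sketch of what training a tailored model on platform-specific data can look like, using Python and scikit-learn. The listings and labels are invented, and a production model would be trained on far more data with richer features.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Historical moderation decisions from your own platform (invented here)
texts = [
    "Barely used mountain bike, pick up downtown",
    "GUARANTEED income!!! Send a deposit to reserve",
    "Vintage lamp, small scratch on the base",
    "Click this link to claim your prize",
]
labels = ["approve", "refuse", "approve", "refuse"]

# Text features plus a simple classifier, fit to the platform's own data
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

# predict_proba gives a confidence score, so low-confidence items
# can be escalated to human moderators instead of auto-decided
proba = model.predict_proba(["Cheap phone, send payment first"])[0]
print(dict(zip(model.classes_, proba.round(2))))
```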

What is automated filter moderation?

Automated filter moderation is a set of rules that automatically highlight and catch unwanted content. The filters (or rules) are efficient at finding content that can’t be misinterpreted, such as obvious scams. This makes them a solid complementary automation tool for your moderation setup. Automated filters can easily be created, edited, and deleted in our all-in-one content moderation tool, Implio – learn how to create filters here.
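
As a rough illustration, here’s what filter-style rules can look like in Python. The patterns below are invented examples of unambiguous content (contact details, a known scam payment method, external links); they are not Implio’s built-in filters.

```python
import re

# Each rule pairs a pattern with the reason it flags content
FILTER_RULES = [
    (re.compile(r"(?:\+?\d[\s.-]?){9,13}\d"), "phone number in listing"),
    (re.compile(r"western\s+union", re.IGNORECASE), "known scam payment method"),
    (re.compile(r"https?://\S+", re.IGNORECASE), "external link"),
]

def apply_filters(text: str) -> list[str]:
    """Return the reason for every rule the text triggers."""
    return [reason for pattern, reason in FILTER_RULES if pattern.search(text)]

print(apply_filters("Pay via Western Union, call +46 70 123 45 67"))
# ['phone number in listing', 'known scam payment method']
```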

 

Do’s and don’ts of content moderation

Determining what to do and not to do in content moderation may vary from site to site. Many elements and factors need consideration to get the moderation setup best suited to your specific needs.

However, whether you’re running an online marketplace, a social media platform, or a sharing economy site, some do’s and don’ts of content moderation hold true.

Do’s of content moderation

  • Do: Select the moderation method that’s right for your needs

    Start off by looking at what kind of content your site hosts and who your users are. This will help you create a clear picture of what’s required from your moderation method and setup. For example, the type of user-generated content found on Medium versus Facebook is very different, and so is their users’ behavior. This makes their moderation methods and setups look different in order to fit each platform’s specific needs.

  • Do: Create clear rules and guidelines

    Your content moderation rules and guidelines need to be clear to everyone who is directly involved with your online platform’s content moderation: from the data scientist developing your AI moderation to the human moderator reviewing content, whether they sit in-house or are outsourced to partners. Uncertainty in your rulebook can set your moderation efforts back, both from a financial and from a user experience perspective.

  • Do: Moderate all types of content

    Whether you’re running an online marketplace, a dating site, or a social media platform, your users are key contributors to your platform. Making sure they enjoy pleasant experiences and are met with quality content on your site should be in your interest. To achieve this, you need to make sure your content moderation is done right.

    In a perfect world, moderating all types of content on your site, from text and images to videos and 1-to-1 messages, would be ideal. The reality, though, is that this approach isn’t possible for all online platforms, for financial and technical reasons. If that’s your case, as a minimum, identify your high-risk categories and content and start your moderation efforts there.

Don’ts of content moderation

  • Don’t: Misinterpret what good content is

    Quality content is key to building user trust and achieving a splendid user experience on your online platform, but it’s important to understand what good content is. Don’t make the mistake of misinterpreting good content and ending up rejecting user-generated content simply because it’s negative in nature.

    For example, a negative comment or review following a transaction can still be good content, as long as no harsh language is used, of course. Genuine content is what you want, as it enhances quality and user trust.

  • Don’t: Wait too long before you get started with moderation

    If you’re in the early stages of establishing your online platform, getting started with content moderation might feel like it’s miles away. It’s not.

    Don’t get us wrong; perhaps it shouldn’t be your main priority right out of the gate, but you need to have a plan for how to handle user-generated content, from a moderation perspective, when you scale. As you grow and the network effect kicks in, you often see a rapid increase of content flooding into your site. You need to be prepared to handle that; if not, your big break might actually end up hurting you in the long run.

  • Don’t: Waste resources

    Don’t reinvent the wheel. With multiple content moderation tools and solutions, like Implio, available on the market, it’s important to prioritize your resources carefully. Innovation and growth are what will boost your online platform to success, and that is where your dev resources will give you the most competitive advantage. Find a way to free up your resources for innovation without risking falling behind on your moderation efforts.
