
What Is Content Moderation? (Plus Best Practices)


    Content moderation is the process of reviewing and monitoring user-generated content on online platforms to ensure that it meets certain standards and guidelines. This includes removing inappropriate or offensive content and enforcing community guidelines and terms of service.

    In other words, when a user submits content to a website, that content undergoes a screening process (known as the moderation process) to ensure it complies with the website’s rules and is not illegal, inappropriate, or harassing.


    Content moderation as a practice is common across online platforms that rely heavily on user-generated content, such as social media platforms, online marketplaces, sharing economy services, dating sites, communities, and forums.

    There are many different methods companies can use to decide how content should be moderated. These are often referred to as:

    1. Pre-moderation
    2. Post-moderation
    3. Reactive moderation
    4. Distributed moderation
    5. Automated moderation

    These methods typically combine automation with a human touch to secure the best results.

    Let’s see how these methods are put into practice by looking at human and automated content moderation. If you’re curious to learn more about the other types of content moderation, check out this article about the 5 moderation methods.

    What is human content moderation?

    Human moderation, or manual moderation, is when humans manually monitor and screen user-generated content submitted to an online platform. The human moderator follows platform-specific rules and guidelines to protect online users by keeping unwanted content, such as illegal or inappropriate material, scams, and harassment, off the website.

    What is automated content moderation?

    Automated content moderation means that any user-generated content submitted to an online platform will automatically be accepted, refused, or sent to human moderation based on the platform’s specific rules and guidelines. Automated moderation is the ideal solution for online platforms that want to ensure quality user-generated content goes live instantly and that users are safe when interacting on their websites.
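    As a rough sketch, this decision flow can be pictured as a function that returns one of three outcomes. The rules, field names, and thresholds below are hypothetical and purely illustrative; a real setup would encode your platform’s own guidelines.

```python
# A minimal, hypothetical sketch of an automated moderation decision flow.

BANNED_PHRASES = {"western union only", "upfront processing fee"}  # invented examples

def moderate(listing: dict) -> str:
    """Return 'approve', 'refuse', or 'escalate' for a submitted listing."""
    text = listing["text"].lower()

    # Hard rules: content that can't be misinterpreted is refused outright.
    if any(phrase in text for phrase in BANNED_PHRASES):
        return "refuse"

    # Ambiguous signals are sent to a human moderator rather than guessed at.
    if listing["price"] <= 0 or len(text) < 10:
        return "escalate"

    # Everything else goes live instantly, keeping time-to-site low.
    return "approve"

print(moderate({"text": "Barely used mountain bike, local pickup only", "price": 120}))
# -> approve
```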

    According to a study by Microsoft, humans stay attentive for only eight seconds on average. Online platforms therefore cannot afford a slow time-to-site for user-generated content, or they risk losing their users. At the same time, users who encounter poor-quality content, spam, scams, or inappropriate content are likely to leave the site instantly.

    So, where does that leave us?

    To avoid jeopardizing either quality or time-to-site, online platforms need to consider automated content moderation.

    We often refer to machine learning AI (AI moderation) and automated filters when talking about automated content moderation. But what are they really?

    What is AI content moderation?

    AI content moderation, or tailored AI moderation, uses a machine learning model built from platform-specific data to efficiently and accurately catch unwanted user-generated content. An AI moderation solution automatically makes highly accurate moderation decisions: refusing, approving, or escalating content.

    One example that showcases the power of AI moderation is the Swiss online marketplace Anibis, which successfully automated 94% of its moderation while achieving 99.8% accuracy.

    AI moderation is great for routine decisions if you have a high-quality dataset on which models can be built. It excels at cases that almost always look identical or similar, which usually covers the vast majority of items posted to online marketplaces; as such, most platforms can benefit from using AI moderation.
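    To make this concrete, here is a minimal sketch of how a model’s confidence score is commonly used to drive the approve/refuse/escalate decision. The function name and thresholds are hypothetical, invented for this illustration, and not Besedo’s implementation.

```python
# Hypothetical sketch: routing content by an AI model's confidence score.
# The thresholds are illustrative; in practice they would be tuned on the
# platform's own historical moderation data.

def route_by_confidence(p_violation: float) -> str:
    """Map the model's predicted probability of a policy violation
    to a moderation decision."""
    if p_violation >= 0.95:   # near-certain violation: refuse automatically
        return "refuse"
    if p_violation <= 0.05:   # near-certain clean: approve and publish instantly
        return "approve"
    return "escalate"         # uncertain middle band goes to a human

for score in (0.99, 0.50, 0.01):
    print(score, "->", route_by_confidence(score))
```

    Only the uncertain middle band needs manual review; the more confident the model, the larger the share of decisions that can be fully automated.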

    It should also be mentioned that AI moderation can be built on generic data. These models can be effective, but they are not as accurate as a tailored AI solution because they don’t consider site-specific rules and circumstances.

    There is also real-time content moderation to consider if, for instance, your platform includes chats.

    What is automated filter content moderation?

    Automated filter content moderation uses a set of rules to automatically highlight and catch unwanted content. The filters (or rules) efficiently catch content that can’t be misinterpreted, such as obvious scams.

    Filters are also great for covering sudden rule changes that the AI has not caught up with yet (training takes some time and a quality dataset). This was well illustrated when the COVID-19 pandemic suddenly made masks and toilet paper problematic. This makes filters a solid complementary automation tool for your moderation setup.
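    For illustration, a couple of such filter rules might look like the regular expressions below. The patterns are invented for this example and don’t reflect Implio’s actual filter syntax.

```python
import re

# Illustrative, hypothetical filter rules expressed as regular expressions.
FILTERS = {
    # Added overnight when the pandemic made these items problematic;
    # the rule takes effect immediately, with no model retraining required.
    "pandemic_restricted_items": re.compile(r"\b(face ?masks?|toilet paper)\b", re.I),
    # Content that can't be misinterpreted: a classic advance-fee scam phrase.
    "obvious_scam": re.compile(r"\badvance fee\b", re.I),
}

def matching_filters(text: str) -> list[str]:
    """Return the names of all filters a piece of content triggers."""
    return [name for name, pattern in FILTERS.items() if pattern.search(text)]

print(matching_filters("Selling 50 face masks, small advance fee required"))
# -> ['pandemic_restricted_items', 'obvious_scam']
```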

    Automated filters can easily be created, edited, and deleted in our all-in-one tool, Implio – learn how to create filters here.

    Do’s and don’ts of content moderation

    What to do and what not to do in content moderation may vary from site to site. Many elements and factors need consideration to get the moderation setup best suited to your specific needs.

    However, regardless of whether you’re running an online marketplace, social media platform, or sharing economy site, some things hold true about what to do and what not to do when it comes to content moderation.

    Do’s of content moderation

    Do select the method that’s right for your needs

    Start by looking at what kind of content your site hosts and who your users are. This will give you a clear understanding of your content moderation method and setup requirements. For example, the user-generated content found on Medium versus Facebook differs greatly, as does the behavior of their users. That is why their moderation methods and setups look different, each fitted to the platform’s specific needs.

    Do create clear rules and guidelines

    Rules and guidelines must be clear to everyone directly involved with your online platform’s content moderation, from the data scientist developing your AI moderation to the human moderator reviewing content, whether they sit in-house or work for an outsourcing partner. Uncertainty in your rulebook can set your moderation efforts back, both financially and in terms of user experience.

    Do moderate all types of content

    Whether you’re running an online marketplace, dating site, or social media platform, your users are key contributors to your platform. Making sure they enjoy a pleasant experience and are met with quality content should be in your interest. To achieve this, you must ensure your content moderation is done right. In a perfect world, you would moderate all types of content on your site, from text and images to videos and 1-to-1 messages.

    The reality is that this approach isn’t possible for all online platforms, for financial and technical reasons. If that’s your case, then as a minimum, identify your high-risk categories and content and start your moderation efforts there.
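    As a hypothetical illustration of that minimum approach, moderation policy can be mapped per category so that high-risk areas are pre-moderated while lower-risk ones are post-moderated. The categories and risk assessments below are invented for the example.

```python
# Hypothetical mapping from listing categories to moderation policies.
CATEGORY_POLICY = {
    "electronics": "pre-moderate",   # high fraud risk: review before publishing
    "tickets": "pre-moderate",       # commonly targeted by scammers
    "furniture": "post-moderate",    # lower risk: publish first, review after
    "books": "post-moderate",
}

def policy_for(category: str) -> str:
    # Unknown categories default to the cautious option.
    return CATEGORY_POLICY.get(category, "pre-moderate")

print(policy_for("tickets"))   # -> pre-moderate
print(policy_for("books"))     # -> post-moderate
```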

    Don’ts of content moderation

    Don’t misinterpret what good content is

    Quality content is key to building user trust and achieving a splendid user experience on your online platform, but it’s important to understand what good content actually is. Don’t make the mistake of rejecting user-generated content simply because it’s negative.

    For example, a negative comment or review following a transaction can still be good content as long as no harsh language is used. Genuine content is what you want, as it enhances quality and user trust.

    Don’t wait too long before you get started with moderation

    If you’re in the early stages of establishing your online platform, getting started with content moderation might feel like it’s miles away. It’s not. Don’t get us wrong: it perhaps shouldn’t be your main priority right out of the gate, but you need to plan how to handle user-generated content from a moderation perspective when you scale.

    As you grow and the network effect kicks in, you will often see a rapid increase in content flooding into your site. You need to be prepared to handle that; if not, your big break might actually end up hurting you in the long run.

    Don’t waste resources

    Don’t reinvent the wheel, and if possible, start early. With multiple content moderation tools and solutions available on the market, you must prioritize your resources carefully. Besedo can help you with any content moderation need.

    Innovation and growth will boost your online platform to success, and this is where your dev resources will give you the most competitive advantage. Find a way to free up your resources for innovation without risking falling behind on your moderation efforts.

    Glossary

    We recently added a glossary about content moderation. Get in the know with our ultimate glossary of content moderation, from deciphering UGC to navigating the murky waters of hate speech. And just what in the world is “time to site,” anyway?

    This is Besedo

    Global, full-service leader in content moderation

    We provide automated and manual moderation for online marketplaces, online dating, the sharing economy, gaming, communities, and social media.
