To reach high accuracy and precision, your content moderation team needs tools that provide enough insight to make the right decisions.
“It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts.”— Sir Arthur Conan Doyle, through Sherlock Holmes
With that in mind, we are developing Implio to present as much relevant data as possible to users when faced with a moderation decision. The keyword here is relevant. We want to ensure that key information isn’t drowned out by filler data but also that moderators have easy access to earlier conclusions and historical data pertaining to the person or item they are reviewing.
Our most recent step on the road to full representation is Implio’s newest feature: moderation notes.
With moderation notes, moderators can share insights about end users and the content items they post. These insights then automatically appear next to items being reviewed, whenever they are relevant.
Moderation notes are, for instance, very powerful in fighting fraud. When a moderator rejects an item as fraudulent, they can leave a note stating exactly that and even add additional information. The next time an item comes in from the same user, IP address, or email, the moderator in charge will see the note that was left behind and know to be extra diligent when reviewing the item.
How to leave notes
The more your team uses notes, the more powerful they become.
Once you start using the feature, ask your team to leave notes with insights that contributed to their moderation decision or that may be useful in the future.
Notes can be left by clicking the note icon located in the top-right corner of an item.
Clicking that icon reveals a text field which allows you to leave a note, up to 2,000 characters long:
You can create as many notes as you need to, but notes cannot be edited or deleted. This is to ensure that important data isn’t accidentally removed.
How do moderation notes work?
Implio looks for relevant notes for every incoming item in a moderation queue and displays them.
This happens for any note left on an item sharing one or more of the following attributes with the item currently being reviewed:
- same item ID
- same user ID
- same IP address
- same email address
- same phone number
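The matching described above can be illustrated with a short sketch. This is our own simplified illustration of the behavior, not Implio's actual implementation; the field names (`item_id`, `user_id`, etc.) are assumptions for the example:

```python
# Attributes that can link a past note to an incoming item
# (per the list above; names are illustrative, not Implio's API).
MATCH_KEYS = ("item_id", "user_id", "ip_address", "email", "phone")

def matching_notes(item, notes):
    """Return (note, shared_attributes) pairs for every note that shares
    at least one identifying attribute with the item being reviewed.
    The shared attributes are what the UI highlights as icons."""
    results = []
    for note in notes:
        shared = [k for k in MATCH_KEYS
                  if item.get(k) and item.get(k) == note.get(k)]
        if shared:
            results.append((note, shared))
    return results
```

A note left on an item from IP `1.2.3.4` would, under this sketch, surface for any later item from that same IP, with `ip_address` reported as the shared attribute.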
Attributes in common between the note and the item being reviewed are symbolized by icons displayed above the note itself.
If an icon is greyed out, there is no match between that specific data point and the item being reviewed.
For instance, if the name and email are different but the IP address and phone number are the same, the former will be greyed out while the latter are highlighted.
The moderator who left the note and the date at which it was left are indicated below the note.
It’s important to remember that moderation notes are meant as additional information to help moderators make the right decision. On their own, they are not enough to give a full picture of a user and their actions. They are, however, an important piece of the puzzle when dealing with grey-area cases and a powerful complement to existing insights.
There’s more to come
This is the first version of the moderation notes feature, but we have big plans for making it an even better tool in our ongoing effort to improve efficiency and accuracy.
“A feature like moderation notes might sound simple, but used collaboratively in moderation teams, it can be incredibly powerful.
We’ve designed the feature around the needs of our customers, with a strong focus on ease of use. But we’ve also looked ahead, ensuring that notes can be leveraged by other parts of Implio to make them even more useful.
The next step is to have automation rules make use of moderation notes. For instance, by automatically sending new content for manual review if the user has received a note with a specific keyword like ‘fraud’ in the past.”
– Maxence Bernard, Chief R&D Officer at Besedo
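The planned rule integration described in the quote could look something like the sketch below. This is purely a hypothetical illustration of the idea, since the feature has not shipped; the function name and note structure are our assumptions, not an Implio API:

```python
def needs_manual_review(item, past_notes, keyword="fraud"):
    """Hypothetical automation rule: route the item to manual review if
    any earlier note for the same user mentions the given keyword.
    The 'user_id'/'text' fields are assumed for this illustration."""
    return any(
        note["user_id"] == item["user_id"]
        and keyword in note["text"].lower()
        for note in past_notes
    )
```

In other words, a single ‘fraud’ note left by one moderator could automatically tighten the scrutiny applied to all of that user’s future submissions.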