Media consumption is rapidly shifting from text to images to video. Social video now generates 1200% more shares than text and image content combined. It won’t stop there: this year we have also seen audio formats like podcasts and voice notes drive increasing amounts of engagement.

Many companies have responded to changes in content consumption and enriched their customer experience by adding video formats. Since Instagram introduced Stories and Reels, more marketplaces have incorporated videos of products, as well as allowing users to post video reviews so shoppers can see items in action.

As new types of content are added to enhance the customer experience and drive business growth, businesses become ever more shaped by the content their users share. This is a powerful opportunity to take the transformative interactivity of digital technology to the next level, building a brand’s userbase into its identity and value. It’s also a risk, of course – no matter the format, if the content is harmful to the user, it’s harmful to the brand.

This is why content moderation technology is critical. Without moderation, harmful content goes undetected and valuable content goes underutilized. When users encounter content that damages their experience, their loyalty suffers and, ultimately, business growth is diminished.

Act now – reacting to harmful content is too late

Businesses must have proactive control over the user-generated content (UGC) on their website. It’s not enough to react when a harmful video appears on the site, as the damage will already be done. Businesses need to take a preventative approach where they stay ahead of damaging UGC to protect their users.

As more and more user-generated content is created, especially in several different formats, it becomes harder for businesses to keep up with content moderation needs. Businesses must constantly adapt their content moderation efforts to keep up with new consumer behaviors, but this product development takes time. Developer teams are then placed under pressure to deliver constant innovation to be able to handle all types of content.

This is a huge challenge – time-consuming and demanding constant agility – but if it's not addressed, the potential for damage to the customer experience is equally huge.

CTOs and the tech teams they lead have a lot on their plate. Across different industries we can see incredible innovation happening with UGC, making the customer experience more sincere, more useful, more delightful, and more meaningful by empowering user interaction. Managing that content, however, can force an unpleasant decision between building that capability in-house (and stretching developers yet thinner) or buying a solution off the shelf (and risking the discovery that it is not truly fit for purpose).

Learn how to moderate without censoring

Why moderating content without censoring users demands consistent, transparent policies.


Lean on an expert business partner

Technologies like video UGC are powerful growth drivers – but without control over what appears on the platform, that growth is unsustainable. So how can businesses ensure their teams keep up with moderation needs whilst reaping the benefits new formats bring to the user experience?

Working with a business partner who can offer content moderation tools that address new formats like video is extremely valuable. At Besedo we are currently developing moderation for video content. Outsourcing makes the partner a source of innovation for the business – but we also take on the product development pressure, and our teams specialize in content moderation alone.

At the same time, Besedo offers a partnership, not just off-the-shelf tools, so moderation is fit-for-purpose and tailored to needs.

We realize that every company’s UGC needs are different, and we build customized content moderation solutions that address the differing needs of businesses – from dating apps where users are connecting, to marketplaces where sellers are interacting with customers.

An expert partner can also ensure that all types of UGC are moderated proactively. We use a combination of artificial intelligence and human moderation to ensure no bad content slips through the cracks. Businesses need a trusted partner with whom they know their user experience will be safe.

When harmful content is prevented, the user experience improves, users stay loyal, and lifetime user value increases. Being able to innovate your approach to content moderation to keep up with users’ content consumption habits is key to sustainable growth.

By Maxence Bernard

Global Head of Product – Besedo/ Implio

This is Besedo

Global, full-service leader in content moderation

We provide automated and manual moderation for online marketplaces, online dating, sharing economy, gaming, communities and social media.


Video is quickly becoming a must-have for any digital platform that wants to engage users. That’s no different for online marketplaces.

According to a study by Animoto, consumers are 73% more likely to purchase a product or service that’s presented with a video.

With that in mind, it’s not surprising that we’re seeing an increasing number of our clients embracing video as a way to increase conversions and engagement.

But with the new content type comes new moderation challenges.

That’s why we at Besedo have been hard at work to ensure that both videos and audio can be managed alongside the traditional content types in Implio.

As a first step, it is now possible to send video and audio through our API. They will await review in moderation queues like other content pieces, with a response sent back to your servers once moderated.
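
As a rough sketch of what such a submission might look like (the endpoint, field names, and callback mechanism below are illustrative assumptions, not Implio's actual schema – consult the API documentation for the real contract):

```python
import json

# Hypothetical payload builder for submitting a video item for moderation.
# All field names and URLs here are examples only.
API_ENDPOINT = "https://api.example.com/v1/items"  # placeholder URL

def build_moderation_payload(item_id, user_id, video_url, text=None):
    """Assemble one content item combining a video and optional text."""
    content = {"video": {"src": video_url}}
    if text is not None:
        content["text"] = {"body": text}
    return {
        "id": item_id,
        "user": {"id": user_id},
        "content": content,
        # The verdict would arrive asynchronously, e.g. via a callback
        # to your servers once the item has been moderated.
        "callback_url": "https://yourserver.example.com/moderation-result",
    }

payload = build_moderation_payload("item-42", "user-7",
                                   "https://cdn.example.com/clip.mp4")
body = json.dumps(payload)  # this JSON body would be POSTed to API_ENDPOINT
```

The item then waits in a moderation queue, and your `callback_url` receives the accept/refuse decision once a moderator (or automation rule) has processed it.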

How does video and audio moderation in Implio work?

Like traditional moderation items, you access video and audio content through moderation queues. To help moderators make quicker and more accurate decisions, the video and audio clips are surrounded by all available supporting user and content fields, such as previously moderated items, geolocation, etc.

Agents can enlarge the video to better inspect details, and of course mute, pause and seek through the video.

If the video breaks the rules of the platform, agents can remove it. They can reverse the decision later if a mistake was made.

Video and audio can be moderated as standalone items, or combined with text and images. In the latter case it is possible to refuse a video while accepting everything else or vice versa.

With the addition of multimedia moderation capabilities, Implio empowers digital platform owners to keep site quality high and their users safe as they embrace video and incorporate the format into their service offering.


To reach high accuracy and precision, your moderation team needs tools that provide enough insight to make the right decisions.

It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts.

– Sir Arthur Conan Doyle through Sherlock Holmes.

With that in mind we are developing Implio to present as much relevant data as possible to users when they are faced with a moderation decision. The keyword here is relevant. We want to ensure that key information isn’t drowned out by filler data, but also that moderators have easy access to earlier conclusions and historical data pertaining to the person or item they are reviewing.

Our most recent step on the road to full representation is Implio’s newest feature: moderation notes.

With Moderation Notes, moderators can share insights about end users and the content items they post. These insights then automatically appear next to items being reviewed, as relevant.

Moderation notes are for instance very powerful in fighting fraud. When a moderator rejects an item as fraudulent, they can leave a note stating exactly that and even add additional information. Next time an item comes in from the same user or from the same IP address or email, the next moderator in charge will see the note that was left behind and know that they should be extra diligent when reviewing the item.

How to leave notes

The more your team uses notes, the more powerful they become.

Once you start using the feature, ask your team to leave notes with insights that contributed to their moderation decision or that may be useful in the future.

Notes can be left by clicking the note icon located in the top-right corner of an item.
Clicking that icon reveals a text field which allows you to leave a note, up to 2,000 characters long:

moderation notes icon

You can create as many notes as you need to, but notes cannot be edited or deleted. This is to ensure that important data isn’t accidentally removed.

How do moderation notes work?

For any incoming moderation item in a moderation queue, Implio will look for any relevant notes and display them.

This happens for any note left on an item sharing one or more of the following attributes with the item currently being reviewed:

  • same item ID
  • same user ID
  • same IP address
  • same email address
  • same phone number
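
The matching behavior described above can be sketched roughly as follows (the field names are illustrative, not Implio's actual schema):

```python
# A note is surfaced when it shares at least one identifying attribute
# with the item currently under review.
MATCH_FIELDS = ("item_id", "user_id", "ip_address", "email", "phone")

def relevant_notes(item, notes):
    """Return (note, shared_fields) pairs for every note overlapping the item."""
    results = []
    for note in notes:
        shared = [f for f in MATCH_FIELDS
                  if note.get(f) and note.get(f) == item.get(f)]
        if shared:  # at least one attribute in common
            results.append((note, shared))
    return results

# Example: a past fraud note left on a different user but the same IP
notes = [{"user_id": "u1", "ip_address": "203.0.113.9",
          "text": "Rejected as fraudulent"}]
item = {"user_id": "u2", "ip_address": "203.0.113.9"}
# the note is shown because the IP address matches
```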

Attributes in common between the note and the item being reviewed are symbolized by icons displayed above the note itself.

moderation notes attributes

If the icon is greyed out, it means there’s no relation between that specific data point and the item being reviewed.

For instance, if the name and email are different but the IP and phone number are the same, you will see the former greyed out while the latter are highlighted.

The moderator who left the note and the date at which it was left are indicated below the note.

It’s important to consider that moderation notes are to be used as additional information to help moderators make the right decision. On their own they are not enough to give a full picture of the user and their actions. They are, however, an important piece of the puzzle when dealing with grey area cases, and a powerful complement to existing insights.

There’s more to come.

This is the first version of the moderation notes feature, but we have big plans on how to make it an even better tool in our ongoing objective to improve efficiency and accuracy.

A feature like moderation notes might sound simple, but used collaboratively in moderation teams, it can be incredibly powerful.

We’ve designed the feature around the needs of our customers, with a strong focus on ease of use. But we’ve also looked ahead, ensuring that notes can be leveraged by other parts of Implio to make them even more useful.

The next step is to have automation rules make use of moderation notes – for instance, by automatically sending new content for manual review if the user has received a note with a specific keyword like ‘fraud’ in the past.
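
As a rough sketch of how such a rule could behave (the function and keyword list below are hypothetical, not part of Implio's actual rule engine):

```python
# Route an incoming item to manual review when any past note on the same
# user contains a flagged keyword. Keywords and queue names are examples.
FLAG_KEYWORDS = {"fraud", "scam"}

def route(item, user_notes):
    """Return the queue an incoming item should go to."""
    # `item` is unused in this minimal sketch; a real rule would combine
    # note matches with the item's own fields.
    for note in user_notes:
        text = note["text"].lower()
        if any(kw in text for kw in FLAG_KEYWORDS):
            return "manual_review"
    return "automatic"
```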

– Maxence Bernard, Chief R&D Officer at Besedo


Most online platforms would agree that images are among the most important elements of a user profile. When it comes to building trust between strangers, having an actual picture of the person you are about to engage with is vital. Whether you are looking for a date, booking a ride or renting a holiday home, interacting with a total stranger can be daunting. In the real world, visual cues like facial expression and body language are intuitively used to decode intent. In the digital world we must emulate these trust markers, and images are a crucial part of this.

There’s one problem though. While good images can put a face on a stranger and boost user trust, bad images can have the exact opposite effect. This is one of the reasons we advocate for image quality, and why we’re continuously expanding Implio’s capabilities for catching and managing bad, inappropriate or low-quality images.

The latest tool in Implio’s image moderation toolbox is an AI module that detects misoriented images.

Why should you care about misoriented images?

The use case is straightforward: misoriented images (e.g. wrongly rotated or upside down) will be caught by the model and sent for manual moderation.
Catching misoriented images is important for the overall impression of your site. A bunch of upside-down faces will make browsing time-consuming and confusing at best, or make your platform look unprofessional and scammy at worst.
As more users access and create their profiles using mobile phones, the number of misoriented images increases, and the need to deal with the issue efficiently grows accordingly.

That’s why we’re excited to announce that Implio can now help you automatically identify misoriented images.

How to automatically detect misoriented images

The misoriented module will soon be available to all Implio users off the shelf. For now, to gain access, just reach out to us and we’ll activate it for you. When the module is active, all images will be scanned by the AI and tagged as misoriented if they are rotated wrongly.

This tag can then be utilized in Implio’s powerful rule creator, where you can decide to send the image to manual moderation, reject it outright (not recommended), or take no action if you want to use the rule for tracking purposes only.
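
In pseudocode terms, such a rule boils down to a simple decision on the tag (the names below are illustrative, not Implio's actual rule syntax):

```python
# Act on the "misoriented" tag that the AI module attaches to an image.
def misoriented_rule(image_tags, action="manual_moderation"):
    """Return the action to take for an image, based on its AI tags."""
    if "misoriented" in image_tags:
        # e.g. send for manual review; outright rejection is possible
        # but not recommended, and "track_only" records the match only.
        return action
    return "no_action"
```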

Here’s an example of an image caught by the new misoriented module. As you can see, the picture is upside down, and it has been tagged by the AI with “face” and “misoriented”.

rule match for misoriented images

To the right you can see that it has matched the misoriented rule.

If you decide to send misoriented images to the manual queue, moderators will be able to fix the issue. Here’s a view of Implio’s image editing tool, where you can crop and rotate images as you see fit.

rule match for misoriented images

This version of the misoriented image model works best with human subjects, but we’re hard at work expanding it, and we’ll soon add capabilities that allow the model to tag items with the same level of accuracy.

If you’re looking for a way to optimize how you handle misoriented images on your site or platform, get in touch. We can help you with the setup and take a look at your site for other low-hanging content quality issues that can easily be resolved with a good moderation setup.


The newest feature to be released in Implio is “Quick Add to List”. It’s a functionality that makes it possible to add information to automation lists while moderating without breaking flow.

When hovering over an IP address, a phone number, an email address, or an item or user ID, moderators with the automation specialist or admin role will now see a small icon allowing them to add the info to an existing list.

Smoothly populate lists with the click of a button

This new feature makes it much easier to populate all kinds of black/white/grey lists. Use cases include auto-refusing select IP addresses (for instance those confirmed to be abused by scammers), or auto-approving items of select user IDs (professional users for example).

How to use the Quick Add to List Feature

To illustrate how “Quick Add to List” works, let’s assume you’d like a list of blacklisted IPs. If you already have a list, you can import the current IPs into it, but in this example we’ll start from scratch and build a new list.

Create a list in Implio which will contain all your blacklisted IPs


Create a rule that acts whenever an IP from that list is detected


When moderating, hover over the IP to make the quick-add-to-list icon appear, and click it to add the IP to a list. You can choose any list you have; in this case, we’ll add it to the Scam_IP list.
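
The whole workflow – a named list plus a rule that fires on it – can be modeled in a few lines (a simplified illustration; the names are hypothetical, not Implio's API):

```python
# The "Scam_IP" list the rule will act on.
scam_ips = set()

def quick_add(lst, value):
    """What the quick-add icon does: put a field value on a list."""
    lst.add(value)

def ip_rule(item):
    """The rule: auto-refuse items whose IP is on the blacklist."""
    return "refuse" if item["ip_address"] in scam_ips else "no_action"
```

A moderator clicking the icon on a scammer's IP effectively calls `quick_add(scam_ips, ip)`, and every later item from that IP is then caught by `ip_rule`.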


Quick add to list works with the following standard field values:

  • Item ID
  • User ID
  • Email address
  • Phone number
  • IP address

The quick add to list feature is just one of many small “quality of life” improvements we’re building into Implio. The goal is to continue offering our users the most efficient content moderation platform on the market.


Sometimes small features can have a big impact. With our newly implemented user counter you get a whole new level of insights about your users.

What it does

The user counter shows you how many items the user has had approved and how many they’ve had rejected. You can also quickly access an overview of the actual listings that were approved or rejected, giving insight into user behavior and listing habits.

How it works

Click an item in the Item log.


This brings up the item overview window. Here, next to the User ID, you’ll find the user counter. The number in green shows how many listings this user has had approved; the number in red, how many they’ve had rejected.

Use cases for user counter

If you have experience with content moderation, you’ve probably already thought of several use cases for the user counter.

Here are a couple of examples of how it can be used in Implio.

1. Qualifying returning users

Need to understand the quality of a user? Check their listings history. If they have only rejections, this user may cause problems going forward as well.

2. Assistance in grey area decisions

When manually moderating items you sometimes come across grey area cases, where it’s hard to judge whether the listing is genuine or problematic. In those cases where you have to make a snap decision either way, having the user’s previous history to lean on can be helpful. A user with only approved listings in the past is unlikely to have suddenly turned abusive. Be cautious, though: some scammers turn this logic to their advantage through Trojan Horse scams, first posting a couple of benign listings and then, once their profile looks good, posting scams.

3. Spotting users in need of education

Have you found a user who consistently gets their listings rejected for non-malign reasons? A quick educational email might help them out and cut down on your moderation volumes.

4. Identifying new users

It’s always good to pay extra attention to new users as you don’t yet know whether they are bad actors. Knowing that a user has no previous history of listing items can act as a sign to be extra thorough when moderating. On the flip side, seeing a user with only approved listings allows you to speed up moderation of the item in question, as it’s likely OK too. Just keep an eye out for the aforementioned Trojan Horse scammers.

To give a better understanding of how the user counter helps increase productivity and quality of moderation, we’ve asked a couple of our moderators for their experience working with the new feature.

“The user counter helps me get a perspective on the profile. If I see that a user has had listings refused more than two times, I access the profile to see the reason of the refusals. That allows me to make a better decision on the profile. It allows me to spot scammers quickly and make faster decisions.”

– Cristian Irreño. Content moderator at Besedo

“The user counter has allowed me to see the trends on profile decisions. It makes me be more careful when I see accounts with a higher number of refusals. Also, when I am working on a new account, I know I must be more careful with my decision.”

– Diego Sanabria. Content moderator at Besedo

“The counter helps me identify profiles that have frequent acceptance or refusals, and to spot new users.”

– Cristian Camilo Suarez. Content moderator at Besedo

The user counter is available to all Implio users regardless of plan. Want to start using Implio for your moderation? Let us know and we’ll help you get started.


Our new Implio feature enhances the quality of the images shared on your platform. Your manual moderation team can now crop and rotate user-generated images quickly and efficiently in the moderation tool.

For online platforms like online marketplaces and dating sites, creating a good user experience and a trustworthy environment is essential, and high-quality pictures are crucial in that regard. In our user study, users unanimously picked quality images as the reason to prefer one site over another.

Profile pictures and item images are crucial for users to trust the person on the other side of the screen, and what they want to sell or buy. As a company, you want to create and maintain that trust for your users.

On dating sites or online marketplaces, the cropping and rotating feature helps you moderate pictures to comply with your company’s guidelines – for instance, cropping profile pictures so that only one person appears, or ensuring that the user’s face is distinctly visible. On top of this, images submitted upside down, or at the wrong angle, can easily be rotated and corrected.

The cropping and rotation feature in Implio helps you improve trust and user experience for both your sellers and buyers.

Here’s how the feature works:

Implio cropping and rotation

Curious to learn more about our new feature?

Have a look at our Implio features list or sign up to our all-in-one content moderation tool Implio, and try it out.


Efficiency and accuracy are two of the most valuable KPIs online marketplaces track to evaluate their manual moderation performance. The key to an optimized manual moderation team is to find the right balance between efficiency and accuracy.

However, here’s the pitfall: if you push your moderators too hard to achieve efficiency, this can, in time, lessen their accuracy and jeopardize the quality of the content published on your marketplace. Low-quality content is likely to slip through the cracks, threatening the reputation of your platform, damaging user trust, and putting your users at risk – with consequences ranging from user experience issues to more serious threats such as identity theft or scams.

For your online marketplace to succeed and to keep potential issues at bay, it’s imperative to provide your moderation team with the right moderation tools to help them be as efficient and accurate as possible.

At Besedo, we are continually looking to improve our all-in-one content moderation tool, Implio, by adding features to ensure your content moderators perform at their best.
Whether it’s highlighting keywords, working with specialized moderation queues, enabling quick links or warning messages, many features available in Implio are created to ease your moderators’ daily work and improve their overall performance.

Keyboard shortcuts – efficient manual moderation

Implio’s brand new feature, Keyboard shortcuts, helps your moderators make decisions with a single keystroke and navigate through listings without leaving their keyboard, making manual moderation both efficient and accurate.

From our initial tests, we found that keyboard shortcuts increased manual moderation efficiency by up to 40%, and we’re expecting that number to rise as moderators grow more familiar with the feature.

Here’s how the keyboard shortcuts work:

Ready to improve your moderation efficiency?

Get in touch with a content moderation expert today, or try Implio for free.


Is your site suffering from ‘marketplace leakage’? If so, it’s because your customers are sharing their personal details with each other – to avoid paying site fees. But by doing so they also put themselves at risk. Here’s how to make sure your business protects itself – and its users – from marketplace leakage.

Marketplace leakage (also referred to as ‘breakage’) is a real problem for many online businesses. According to venture capital firm Samaipata, the term can be defined as ‘what happens when a buyer and seller agree to circumvent the marketplace and continue transacting outside the platform.’

Broadly speaking, there are several ways in which personal details are shared – via listings, embedded in images, and within one-to-one chats. Information shared typically includes phone numbers, email addresses, WhatsApp details, and money transfer account details.

From a user perspective, sharing details in this way might make sense. However, many don’t realize the wider ramifications of marketplace leakage and the negative impact it can have on the platforms they transact on – and on their own businesses.

Let’s look more closely at the impact of sharing personal details online via marketplaces and what can be done to prevent it.

How personal details do damage

As we see it, there are three key ways in which sharing personal details can have a negative impact.

1. Conversions

From eBay to Airbnb; Amazon to Fiverr – the vast majority of marketplaces facilitate the trade of goods and services. As a result, a core part of each platform is its payment infrastructure.

But not only do these solutions offer a trusted way for users to transact, they can also be used to collect fees – a percentage paid for using the platform.

In the early days of a platform’s existence, many sites may be available to both buyers and sellers for free – whilst the marketplace is trying to scale and get as many users as possible. However, once it’s reached a certain threshold and network effects are visible, it’s common for them to begin charging, often through the transaction.

This is often when users – primarily those selling on these sites – will try to circumvent the platform and include their contact details in each post. It might be that they paste their email address in the product description itself, or create an image that has details included within it.

When this occurs, your marketplace loses out on conversions. It’s something that’s easy to overlook and – on the odd occasion – let slide. But in the long-term, activities like this will seriously dent your revenue generation.

2. Retention

One of the major differentiating factors between online marketplaces is whether they’re commoditized or non-commoditized – particularly where service-focused platforms are concerned.

While commoditized service providers are more about getting something specific fixed, delivered, or completed (think Uber or TaskRabbit), non-commoditized providers (e.g. Airbnb) take into account a number of determining factors – such as location, quality, and available amenities.

Due to the nature of these sorts of services, they are more likely to encourage personal interactions – particularly when repeat transactions with the same vendor are involved. Once trust and reliability are established, there’s little incentive for either party to remain loyal to the platform – meaning conversions are more likely to be forfeited.

Leakage of this nature was partly to blame for the demise of Homejoy – an on-demand home services recruitment platform. The nature of the work involved increased the likelihood of recurring transactions. However, it transpired that the features facilitated by the site – in-person contact, location proximity, and reliable workmanship – were of greater value than the incentives offered by using the site itself in many cases.

As a result, more and more transactions began happening outside of the marketplace, meaning that the site lost out on recurring revenues.

3. User safety

Losing control of the conversation and having users operate outside of your marketplace increases the risk of them being scammed.

This is particularly prevalent in online dating, where even experienced site users can be duped into providing their personal details to another ‘lonely heart’ in order to take the conversation in a ‘different direction’.

eHarmony offers some great advice on what users should be wary of, but the general rule of thumb is to never disclose personal details of any kind until a significant level of trust between users has been established.

While similar rules apply to online marketplace users too, some telltale signs of a scammer are requests for alternative payment methods – such as bank or money transfers, or even checks.

An urgency to trade outside of the marketplace itself is also a sign to be aware of. So it’s important to advise your users to be cautious of traders that share their personal details. Also, make a point of telling them to be wary of vendors who are ‘unable’ to speak directly to them – those who request funds before any arrangements have been made.

In all cases, marketplaces that don’t monitor and prevent this kind of activity put their customers at risk. And if their transaction is taken away from your site, they forfeit the protection and assurances your online marketplace provides.

But unless your users understand the value and security of your platform, they’ll continue to pursue conversations off your site and expose themselves to potential scammers.

Preventing marketplace leakage

The best way to overcome these issues and prevent marketplace leakage is to do all you can as a marketplace owner to keep buyer-seller conversations on your site, and to reinforce why it’s in your users’ interest (and to some extent yours) not to share personal details and to remain on your platform.

There are several ways to do this.

Stronger communication

The stronger the communication channels are within your platform, the less incentive there is for customers to navigate away from your site.

From eBay and Airbnb’s messaging functionality (which looks and feels like an email client) to one-to-one chat platforms (similar to Facebook Messenger or WhatsApp), or even on-site reviews and ratings – the more user-friendly and transparent you make conversations between different parties, the greater the likelihood they’ll remain on your site. This is a point we also highlighted in our webinar about trust building through UX design.

In addition, it’s always worth reinforcing exactly what your marketplace offers users – and reminding them of their place within it. For example, telling them they’re helping build a trust-based peer-to-peer network is a powerful message – one that speaks to each user’s role as part of a like-minded online community.

Provide added value services

If users feel as though there’s no real value to using your site – other than to generate leads or make an occasional purchase – there’s very little chance that you’ll establish any meaningful connection.

The best way to foster user loyalty is to make using your marketplace a better experience than the alternative. In short, you need to give users a reason to remain on your site.

In addition to safety and security measures, consider incentives, benefits, and loyalty programs for both vendors and buyers.

Turo, the peer-to-peer car rental site, is an example of a company that does this very well: it offers insurance to both car owners and travelers, which is at once a perk and a security feature.

In a similar way, eBay’s money-back guarantee and Shieldpay’s escrow payment service (which ensures all credible parties get paid, whether they’re buying or selling) demonstrate marketplaces acting in both their customers’ interests and their own.

Another way in which marketplaces offer better value is through back-end tools that help vendors optimize their sales. Consider OpenTable’s booking solution, for example. The restaurant reservation platform doesn’t just record bookings and show instant availability; it also helps its customers fill empty seats during quieter services.

Platforms that can see past their initial purpose and focus on their customers’ needs are the ones that thrive. They offer a holistic, integrated solution that addresses a wider range of pain points, which is a great way of keeping users loyal to your business and, ultimately, reducing leakage.

Filter and remove personal details

A relatively straightforward way to prevent marketplace leakage is to monitor and remove any personal details that are posted on your site.

However, this can turn into quite a task, especially as the amount of user-generated content increases.

The next logical step is to invest in your content moderation: either expand your manual moderation team, or set up an automated moderation solution.

An automated filter is a great way to prevent personal details from being shared. Although the filter creation process can be complex, it’s entirely possible to create highly accurate filters that automatically detect and remove personal details in moderation tools like Implio.
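Filters of this kind are often pattern-based. As a minimal sketch (the patterns and the `[removed]` placeholder below are illustrative assumptions, not Implio’s actual rules), a filter that redacts email addresses and phone numbers from listing text might look like this:

```python
import re

# Illustrative patterns only -- production filters cover many more
# formats (spelled-out digits, obfuscated addresses, etc.).
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def redact_personal_details(text: str) -> str:
    """Replace detected email addresses and phone numbers with a placeholder."""
    text = EMAIL_RE.sub("[removed]", text)
    text = PHONE_RE.sub("[removed]", text)
    return text
```

Run against a listing description, this would turn “Call me on +46 70 123 45 67” into “Call me on [removed]” while leaving clean listings untouched; the real work lies in tuning such patterns to catch evasions without flagging legitimate numbers like prices or dimensions.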

Machine learning is another powerful automated moderation solution, one that helps prevent the sharing of personal details and much more. Built on your platform-specific data, a tailored AI moderation setup is developed to meet your marketplace’s unique needs, making it a great option for online marketplaces looking for a completely customized solution.
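To illustrate the idea (a toy sketch only: real moderation models are trained on large volumes of platform-specific labelled data with far richer features), here is a minimal word-count classifier that learns to flag listings that try to steer buyers off-platform:

```python
import math
from collections import Counter

# Toy training data -- a real setup would use thousands of labelled
# moderation decisions from the marketplace itself.
TRAIN = [
    ("brand new bike for sale", "ok"),
    ("vintage chair good condition", "ok"),
    ("call me directly to pay outside the site", "leak"),
    ("send money first contact me on whatsapp", "leak"),
]

def train(examples):
    """Count word occurrences per label (a naive Bayes-style model)."""
    counts = {"ok": Counter(), "leak": Counter()}
    for text, label in examples:
        counts[label].update(text.split())
    return counts

def classify(counts, text):
    """Score each label with add-one-smoothed word likelihoods."""
    scores = {}
    for label, words in counts.items():
        total = sum(words.values())
        vocab = len(words)
        scores[label] = sum(
            math.log((words[w] + 1) / (total + vocab + 1)) for w in text.split()
        )
    return max(scores, key=scores.get)
```

With this model, `classify(train(TRAIN), "pay outside the site")` comes back as `"leak"`, while an ordinary listing description scores as `"ok"`; the point is that the model generalizes from the platform’s own data rather than relying on hand-written rules.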

Added value and moderation – a mutual benefit

Trust, security, and accountability are the most valuable features that any marketplace or classifieds site can offer its users. However, they’re not always the most visible components.

But when they’re part of a broader benefit, such as an optimized user experience or a suite of useful features, the need to share personal details and transact away from a site is mitigated.

That said, shared personal details will always contribute to marketplace leakage. And without the right monitoring and moderation processes in place, it’s impossible for marketplace owners to overcome the challenge of marketplace leakage.

At Besedo, we work with online marketplaces and classifieds sites to help them make the right choices when it comes to safeguarding their businesses and users by removing personal details.

To learn more about how you can keep personal details off your marketplace, specifically through automated filters, check out our on-demand Filter Creation Masterclass.

This is Besedo

Global, full-service leader in content moderation

We provide automated and manual moderation for online marketplaces, online dating, sharing economy, gaming, communities and social media.


Stockholm-based Besedo already has over 400 employees in 6 countries and reviews over 500 million ads and listings each year. With the help of new capital, primarily from the investment company AB Spiltan, Besedo can now intensify the launch of its enhanced services, automating a large part of the work through artificial intelligence and machine learning.

“Up to every fifth post on ad or dating sites can be deceptive. We help classifieds sites and online marketplaces like Blocket or eBay Germany, as well as global dating sites, find fake profiles and fraudulent ads, fight scams and keep millions of users safe. Through the new investment, we are even better equipped for the next step and gain resources to accelerate our sales. I’m especially pleased that my management team and I now also join as owners to continue pushing this company to an even stronger position in the market,” says Patrik Frisk, CEO of Besedo.

Petter Nylander, chairman of the board and previously CEO of Unibet and TV3 Sweden, participated in the total investment of approximately EUR 3 million.

“Besedo is my next big commitment and I really look forward to seeing the company grow, as more and more companies realize the need to keep their sites safe and remove unwanted content. With our automated product, we can quickly, smoothly and efficiently help our customers ensure that their websites provide the right information at a high level of quality. We are also developing image recognition methods, yet despite the rapid technological developments, AI and algorithms cannot solve all problems. We also need people to continuously develop and improve our services and take Besedo into more markets,” comments Petter Nylander.

The majority of the new investment comes from Investment AB Spiltan, which now becomes a new major shareholder with just over 22 per cent of the shares.

“This deal fits very well into Spiltan’s ambition to help talented entrepreneurs continue to develop their businesses. The fact that the CEO and management now also become owners strengthens Besedo for the continued journey. The company has an exciting mix, where the existing, already profitable business together with the new, more automated services makes this a very nice addition to our portfolio of growing entrepreneurial companies,” says Göran Pallmar, Investment Manager at Investment AB Spiltan, who will now also join the company’s board.

About Besedo
Sweden-based Besedo has around 400 employees in 6 countries. The company offers a platform for content moderation of websites. Every year it reviews more than 570 million ads and listings for classifieds sites and online marketplaces like Schibsted and eBay Europe, as well as dating sites, helping them remove unwanted content and prevent fraud. On a yearly basis, Besedo removes more than 40 million scam attempts and irrelevant content pieces. Read more at

About Investment AB Spiltan
Investment AB Spiltan is an investment company with access to around 500 million euro, which it invests in companies with proven business models and growth potential, led by talented entrepreneurs. Through long-term, personal engagement and an entrepreneurial approach, Spiltan creates the foundations for stable development and return on investment. Spiltan has approximately 2,900 shareholders and is traded at listan. More info at


CEO at Besedo Patrik Frisk | | 0705-70 92 02

Chairman Petter Nylander | 0765-25 09 55

Investment Manager Göran | 0708-89 81 12

Communications Manager Gustav | 0707-74 74 00
