So you have a product, or you’re developing one, leveraging user-generated content (UGC). Great! Would you rather spend time designing and developing your product – and business – or start your mornings reviewing inappropriate content? 

That’s a somewhat leading question.

An illustration displaying the various layers of content moderation

As a startup, it’s important to focus on your core idea, product development, and design to create a successful product. Those efforts are what will set your product apart from the competition. However, distractions and clutter can pull your focus from what you need to do toward what you feel you must do.

The expression ‘hire your weakness’ means doubling down on your strengths while leveraging the benefit of others’ expertise. In some cases, you hire for roles like marketing and customer support. In others, you outsource tasks like accounting or content moderation. By outsourcing these tasks, you can free up your time to focus on product development and design.

5 tips for early-stage development

  1. Set rules: Your content will only be as good as the rules you set up as a business. What type of content do you want to allow, and what don’t you like? Set these rules early and put them clearly in your Terms of Service. Your users will have a much better experience if everyone obeys the house rules.
  2. Be proactive: Be clear about your standards and expectations for content posted on your site or apps. Start moderating content from the get-go and set the tone for what is and isn’t acceptable. Being reactive is much harder; you’ll always be playing catch-up.
  3. Be transparent: If users see that you’re being proactive about moderation, they’ll be more likely to trust the process and feel comfortable submitting their own content. Use user-generated content to improve your product and reach new customers. Take feedback seriously, use it to inform your marketing strategy, and always look for ways to turn negative experiences into positive ones.
  4. Be smart and nice: Comedian Jimmy Carr said on Mike Birbiglia’s podcast Working It Out, “Never tell a joke if you gotta look around before telling it.” The same applies to apps and websites; if you need to take a quick look around and say, “it’s just us here – look at what is on this website I run,” that’s an indicator you’ve been neither smart nor nice.
  5. Ask for help: Hiring your weakness also means asking for help when needed. When developing a product, there’s no shame in not being the best accountant or lawyer in the world. You ask for help. The same goes for content moderation. You’ll be surprised by what you can learn from an email or a phone call with an expert.

“Never tell a joke if you gotta look around before telling it.”

— Jimmy Carr

5 reasons why you should outsource content moderation

There are several ways to outsource content moderation for your startup. We are, of course, somewhat biased on the matter *ahem* but here are five tips on outsourcing content moderation for your startup. 

  1. It costs less: When you outsource content moderation, you only pay for the service itself. There are no additional costs, such as advertising the role or hiring a full-time moderator. Doing this in-house would require staff working around the clock to keep up with everything, and you’d also have to consider all the languages they need to speak and the time zones their shifts need to cover.
  2. It’s faster and more efficient: An outsourced, automated content moderation service can handle a large volume of content quickly and efficiently. The best content moderation providers offer a hybrid solution where a human touch is applied when needed. How are you supposed to know the difference between bullying and banter? And how do you train your staff to understand leet speak? Besedo uses Natural Language Processing, and our filters can detect millions of variations of each unwanted word.
  3. The quality is higher: A hybrid model combining automation and manual moderation is the only way to ensure the highest possible quality. AI is scalable, hard to deceive, and enables high automation levels. Combine that with filters, a control panel, and manual moderation with extremely high accuracy. Manual moderators will also understand context, something an AI might not always grasp.
  4. You don’t have to worry about it: Doing content moderation in-house can be a logistical nightmare. But when you outsource content moderation, you don’t have to plan for holidays, sick leave, vacations, overtime, etc. 
  5. It’s a good investment: Outsourcing content moderation allows businesses to regulate all posts, comments, and reviews that users post on the company’s website. This includes websites, apps, and other digital platforms where fake reviews, NSFW material, and disinformation are rampant. Monitoring scams and threats is a time-consuming process that requires constant attention.
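To illustrate why detecting "millions of variations" of a word is harder than a simple word list, here is a toy leet-speak normalizer. This is a minimal sketch, not Besedo's actual technology; the character map and the blocklist terms are placeholder assumptions, and a production filter would handle far more substitution tricks.

```python
import re

# Hypothetical substitution map; real filters cover many more tricks,
# including multi-character substitutions like "|-|" for "h".
LEET_MAP = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a",
                          "5": "s", "7": "t", "@": "a", "$": "s"})

BLOCKLIST = {"scam", "spam"}  # stand-in terms for illustration only

def is_flagged(text: str) -> bool:
    """Normalize leet speak, collapse repeated letters, then check a blocklist."""
    normalized = text.lower().translate(LEET_MAP)
    # Collapse repeated characters so "ssspam" matches "spam"
    collapsed = re.sub(r"(.)\1+", r"\1", normalized)
    words = re.findall(r"[a-z]+", collapsed)
    return any(word in BLOCKLIST for word in words)

print(is_flagged("Great deal, no 5c4m here!"))   # True: "5c4m" normalizes to "scam"
print(is_flagged("Looking for a nice bike"))     # False
```

Even this toy version shows the trade-off: aggressive normalization catches evasion attempts but risks false positives, which is exactly where human review in a hybrid setup earns its keep.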

The consequences of harmful user-generated content

Harmful user-generated content can have several consequences for your startup. First and foremost, it can damage your brand reputation. Users posting negative comments or reviews about your product or service can dissuade other potential customers from doing business with you. Harmful user-generated content can also lead to decreased traffic and engagement on your website or social media platforms. If users constantly see negative content about your company, they may be less likely to return to your site or interact with your brand online.

Finally, harmful user-generated content will damage your relationship with the people who generate content. No one wants to see trolls run rampant in the comment section or inappropriate content constantly uploaded to a platform they generally like. If you continually let others destroy a platform they’re investing time and effort into helping grow, they may become frustrated and stop contributing. Strike a balance between giving users a platform to share their thoughts and maintaining the quality of your site.

You can also download our checklist “Set up your content moderation operations in 6 simple steps” for marketplaces. That’s an excellent start. We’d love to hear what your biggest challenges are right now. 

With the rise of online dating comes the inevitable problem of fake profiles. These are profiles created by people who are not interested in dating but are looking to scam others out of their money or boost their own ego. Learn to spot a fake profile and protect yourself from being scammed, fooled, or harassed.

Photo by Victoria Heath on Unsplash

The problem with fake profiles is bigger than you think

The problem with fake profiles is more widespread than you think. Some sites estimate that as many as 10% of dating profiles are fake. That means that for every 10 people you see on a dating site, one of them is likely not even a real person.

So why do people create fake profiles?

There are a few reasons. Some people do it to scam others out of money. They create a profile, build up a relationship with someone, and then ask for money. Others do it to boost their ego. They create a profile with photos that make them look much more attractive than they really are and then message people they know they will never meet.

Whatever the reason, fake profiles are a problem because they ruin the user experience for everyone else. That is just as bad for customers as it is for businesses.

There are a few things you can do to spot a fake profile, but the best thing you can do is to be aware of the problem and be cautious when you’re interacting with people online. If something seems too good to be true, it probably is. Trust your gut and be careful out there!

5 ways to detect fake profiles

How do you detect fake profiles? It’s actually not as difficult as you might think. There are a few key things to look for that can help you spot a fake profile pretty easily.

  1. The first thing to look for is a lack of personal information. If a profile doesn’t have much information about the person, it’s likely fake.
  2. Look for inconsistencies in the information that is provided. If a person’s age, location, or other details don’t seem to match up, it’s probably because they’re not being truthful.
  3. Ask to see their social media accounts. If they can’t provide you with their Instagram account in this day and age, well, we don’t know what to tell you.
  4. Have a look at the follower/following ratio. A fake profile often has zero or fewer than ten followers. Come on, how many people do you know with so few friends on Instagram or Facebook?
  5. Take a look at the photos that are posted on the profile. Fake profiles will often use stock photos or photos clearly not of the person claiming to be behind the profile. If something looks too good to be true, it probably is!

Also, be wary if someone you match with seems to get you all too quickly. You have the same interests, and they mention music or TV shows you like. That’s not necessarily a red flag on its own, but if you look at the big picture, it might be someone you know who is catfishing you.

How do we stop harassment in dating apps?

We’re all too familiar with the scenario. You mind your own business, chatting away on your favorite dating app, when suddenly the conversation takes a turn for the worse, and you’re bombarded with messages from someone you don’t know. It’s annoying, it’s intrusive, and it can even be scary.

Well, this is where content moderators play an important role in keeping chat apps safe for everyone. We help to stop harassment and bullying by enforcing the policies set by the company behind the platform. Most dating apps have a chat functionality in place that you can use once you have matched with someone. That’s all great.

Say what you want, but be nice and obey the house rules.

Besedo offers chat moderation in real-time

When someone doesn’t obey the house rules, technology like Besedo’s will ensure you don’t get a chance to see the offensive messages sent to you. Real-time filtering for profanity or nudity will ensure that we can keep chats clean and civilized.

That way, Besedo also helps keep the app user-friendly and respectful by enforcing the app’s terms of use.
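The "you never see the offensive message" idea boils down to filtering before delivery rather than after. Here is a minimal sketch of that flow; the word list and tokenization are placeholder assumptions, not Besedo's actual filtering rules.

```python
# Minimal sketch of pre-delivery chat filtering: each message is checked
# *before* it reaches the recipient, so a rejected message is never seen.
BLOCKED_WORDS = {"insult", "slur"}  # stand-ins for a real profanity list

def deliver(message: str, inbox: list[str]) -> bool:
    """Append to the recipient's inbox only if the message passes the filter."""
    tokens = {t.strip(".,!?").lower() for t in message.split()}
    if tokens & BLOCKED_WORDS:
        return False  # rejected: in practice, routed to moderator review instead
    inbox.append(message)
    return True

inbox: list[str] = []
deliver("Hi, nice to match with you!", inbox)
deliver("you are an insult ...", inbox)
print(inbox)  # only the clean message was delivered
```

In a real system, this check would run server-side with far more sophisticated detection (context, leet speak, images), but the ordering is the key point: filter first, deliver second.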

I have been harassed on a dating app

You can do a few things to protect yourself from harassment in chat apps.

  1. Make sure you have the latest version of the app installed. Many chat apps include features that allow you to block or report users who are being abusive.
  2. If someone is harassing you, use these reporting tools to protect yourself. Always report the user and block them.
  3. Take screenshots to provide proof. It can also help protect others.
  4. Be careful about who you add to your chat app contacts list. If you don’t know someone well or if they seem sketchy, it’s best not to add them. This will help reduce the chances of being harassed by someone you don’t know.
  5. Finally, remember that you can always leave a chat app if it’s making you uncomfortable. There’s no shame in doing so, and it’s often the best way to protect yourself from abuse. If someone makes you feel unsafe or uncomfortable, just hit the exit button and move on.

At a time when people are looking for love and connection more than ever, it’s important to be aware of the risks of fake dating profiles. While most dating platforms do their best to keep users safe, there are always going to be some bad actors who slip through the cracks.

That’s why it’s important to be vigilant in online dating. If something seems too good to be true, it probably is. Be wary of anyone who asks for money early on, or who wants to move too fast without getting to know you first.

Scammers will hurt your brand with poor reviews

Dating apps have been accused of promoting a hook-up culture and fostering an environment where users endlessly swipe in search of someone better. But another side of dating apps that can be just as problematic is the proliferation of fake profiles.

Fake profiles are not only a problem for users but also pose a risk to the app itself. If users come across too many fake profiles, they may start to question the legitimacy of the app and its users. This can lead to them deleting the app and leaving negative reviews, which will damage the app’s reputation.

If you’re running a dating app, it’s important to ensure that you’re taking steps to prevent fake profiles from being created. This includes things like requiring verification for new users, using artificial intelligence to identify suspicious activity, and monitoring user reviews for feedback about fake profiles. 

Hi, we’re Besedo; we should talk! Nudge, nudge 😉

Taking these measures can help protect your app from being tainted by scams and poor reviews.

Don’t waste your time with dishonest people

With the prevalence of dating apps and websites, it’s no surprise that there are fake profiles out there. The important thing to remember is to be vigilant and do your research before meeting anyone in person. If you suspect that someone you’re talking to is a fake, report them to the site or app so that they can be removed. And most importantly, don’t waste your time on someone who isn’t being honest with you. 

There are plenty of fish in the sea, so keep swimming!

William Singam

Sales Director APAC

William is the Besedo Sales Director for APAC and you can meet him at the GDI Singapore Conference in July 2022 where he is one of the speakers. When he is not on stage, he’ll be happy to share his wealth of experience about 1-2-1 chat moderation, user experience, app reviews, and just about anything content related.

A good website loads fast, boasts a beautiful design, is search engine friendly, and offers a brilliant user experience. Conversely, a website with poor design could make users feel like your brand is low-quality or untrustworthy.

*record scratch*

But if you peel off that top layer of design elements – what is a user experience, really? 

Nielsen Norman Group probably says it best: “user experience encompasses all aspects of the end-user’s interaction with the company, its services, and its products.”

All your design efforts will come up short if your website, or app, is not supporting your users’ goals. To most business owners, these goals are so fundamental that they risk being forgotten when you’re focused on all aspects of your business. With user-generated content platforms such as dating apps, marketplaces, video streaming, etc., you’re essentially handing over a massive chunk of your user experience to your community.

Consider this: You are interested in buying a bike, so you hop on your favorite marketplace app and search for bikes. The search result shows hundreds of postings near you. Great! The only catch: first, you must wade through four pages of inappropriate images, scams, and harassment.

Two apps showing content with and without content moderation
Moderated content is a big part of creating a great user experience

To quote Donald Miller, “a caveman should be able to glance at it and immediately grunt back what you offer.” This is referred to as the Grunt Test; it’s a real thing.

Many marketing reports show that poor design decisions are a key reason customers leave your site. That’s a given. One report says that 88% of online consumers are unlikely to return to a website after a poor experience.

With user-generated content platforms you’re essentially handing over a massive chunk of your user experience to your community.

Those numbers would most likely be closer to 99% if we removed content moderation from the user experience equation.

The User Experience Honeycomb

At the core of UX is ensuring that users find value in what you provide. Peter Morville presents this magnificently through his User Experience Honeycomb.

The user experience honeycomb as presented by Peter Morville

One of the 7 facets of his honeycomb is “credible,” as Morville notes that for there to be a meaningful and valuable user experience, information must be:

Credible: Users must trust and believe what you tell them.

So what if your information and content are user-generated? Then you aren’t the one providing the credibility.

User Experience in user-generated content

We would argue that Credible (or Trust) serves best as the base for your user experience when it comes to user-generated content apps and websites. After all, the user experience is more than just something intuitive to use.

When User Experience Fails Despite Good Design

Few things will hurt your users’ confidence in your app faster than harassment or irrelevant content. In-game chats and, to some extent, dating apps are breeding grounds for trolling. Flame wars can create an unfriendly online environment, making other users feel compelled to respond to abusers or leave your platform entirely. 

Harassment still happens, and no one is immune, despite your platform’s fantastic design.

The emphasis on trust and credibility cannot be overstated when your platform relies on user-generated content.

Online reviews and comments from social media are the new word-of-mouth advertising. With a growing pool of information available online to more consumers, this form of content can either become an effective branding tool or a brand’s undoing.

Trust user reviews, images, and videos

If handing over a big part of your customers’ user experience to largely unknown users feels like a scary ordeal, you’re in for a rollercoaster when it comes to reviews.

Fake online reviews are more prevalent than you might think and could lead you to purchase a product you would not have otherwise. Fake customer reviews are usually glowing, even over-the-top, reading more like infomercials than reviews. One MIT study found that fake reviews typically contained more exclamation points than genuine reviews. Fake reviewers believe that by adding these marks, they’ll amplify the emotions behind their feedback.

Conversely, it is not uncommon for sellers to purchase fake, one-star reviews to flood competitors’ pages.

According to research, 91% of people read online reviews regularly or occasionally, and 84% trust online reviews as much as personal recommendations.

Building trust into the user journey

Your online business aims to attract, retain, and engage users; creating an experience that turns them off is definitely not a smart step in that direction. Keep in mind that users should have an accessible and user-friendly experience throughout this journey with you. We even published a webinar about building trust into the user journey if you’re interested.

Download the presentation

Fill out your email below to get your free copy of the presentation


Presentation: From 0% to 94% automation

Hear Jelena Moncilli, Anti-Fraud Specialist at Anibis, share their journey to high moderation efficiency. Here’s what you’ll learn:

  • How to handle specific challenges facing online marketplaces.
  • Anibis’ journey to reach 94% automation.
  • Tangible results in moderation efficiency, when working with AI.
  • How Besedo has helped Anibis throughout this process.

This is Besedo

Global, full-service leader in content moderation

We provide automated and manual moderation for online marketplaces, online dating, sharing economy, gaming, communities and social media.




Presentation: I searched for T-shirt and it kept suggesting cars

Explore the results of our study on user search experience:

  • How to engage your users within the first 10 seconds.
  • How irrelevant content affects your users’ behavior.
  • The importance of accurate categorization.
  • Honest comments from marketplace users.



eBook: Keeping the Faith

How online marketplaces can build trust and loyalty

You will learn about:

  • Tools and techniques for more effective trust building
  • How to build trust in the platform and the seller
  • The science of confidence and how to apply it



eBook: Why moderating content without censoring users demands consistent, transparent policies.

You will learn:
  • Why moderation and freedom of speech aren’t mutually exclusive
  • How content moderation can help you facilitate community growth
  • How you can implement moderation in a way that builds trust with your users


For many, online dating is now the default way to meet new people. As we become increasingly time-poor, digital devices act as a way for us to navigate our day-to-day lives and how we interact with others, including our relationships.

Despite attitudes towards dating apps becoming more positive and platforms gaining popularity in recent years, throughout their short history they have attracted a great deal of attention to the risks they pose to users. While dating apps are an incredibly convenient way to maintain our love lives, they come with their own threats.

Risk vs risqué

Like any form of dating, connecting with strangers doesn’t come without risk. This is also the case when using an online dating platform. The exchange of information, be it a phone number, address, or other personal details, can be exploited if it falls into the wrong hands. Dating scams, catfishing, and abuse attract headlines – and for platforms, advertising, misuse, and nudity also threaten to damage the user experience and brand reputation.

Finding the right balance between restricting content to protect users and allowing organic interactions to flourish is crucial if platforms are to grow and realize their true potential. The power of online dating is its ability to make connections virtually; the same freedom that makes negative interactions possible is also what makes genuine, authentic, and meaningful relationships possible.

Growing a dating platform means harnessing the opportunities in the content it creates. Platforms cannot be seen to ‘scaremonger’ users, but it’s imperative they provide substantial safety features and guidelines to protect users and brand reputation, whilst using technology to enhance user experience and focus on retention to grow their platforms.

Learn how to moderate without censoring

Why moderating content without censoring users demands consistent, transparent policies.


Creating a safe space, without killing the mood

The recent context of lockdowns demonstrated the power of online dating; even without in-person interaction, it functioned as a place to make human connections. It works best, therefore, when it delivers the same surprise, joy, and meaningfulness of speaking to someone new in real life.

With online dating, it is tempting to see shutting down opportunities to interact as the only way to remove risk. But this isn’t what users want. They want to feel protected and to trust the systems in place so they can interact in confidence. It is now an expectation, not a “nice to have”, for platforms to filter out harmful content, from fake profiles to indecent imagery. Providing a sophisticated app that lets users interact with whom they choose is likely to result in increased brand loyalty, as opposed to blocking every connection that could be deemed harmful.

An engaging and reliable messaging experience is the foundation of retention on a successful dating platform. Creating a positive space to connect, however, relies on really understanding how people use the platform and what works for them. With many users engaging in conversations to meet new partners, it’s important technology doesn’t get in the way and ‘kill the mood’ with an unstable or over-censored chat platform.

Content moderation can help strike the right balance. As well as blocking the most objectionable – or illegal – content, it delivers insight that enables dating sites to encourage sincere, positive behaviours. Online dating is a space of rapid innovation and as brands create new ways to help people connect more effectively, platforms need to ensure interactions remain safe, with custom moderation approaches.

Ultimately, stopping deceitful users from harming the user experience and removing unwanted content to keep people safe will protect brand reputations. With content moderation, your dating site can become the brand you want it to be.  

Find out more about working with us and request a demo today.


By Edmond Vassallo

Head of Customer Success


Why creating sustainable growth means looking beyond the digital present

Over the past decade, it has become common to suggest that every company is now a tech company.

The exponential growth in digital usage quickly outgrew what we traditionally think of as the technology sector and, for users, the agility of the internet didn’t stay confined to the online world. Technology has shifted expectations about how everything can or should work. Soon, companies selling everything from furniture to financial services started to look and act more like innovative tech companies, finding new ways to solve old problems through digital channels.

In other words, business leaders seeking to guarantee growth turned to digital technology – to the point that, now, the Chief Technology Officer is a key part of the C-suite.

After a year when we’ve all relied on the internet more than ever, in every aspect of our lives, growth through digital has never been more apparent. For business, digital communication has at times been the only possible way of staying in touch with customers, and there’s no sign that the CEO’s focus on agility and technology is fading. In recent surveys, IBM found that 56% of CEOs are ‘aggressively pursuing operational agility and flexibility’, PwC found that they see cyber threats as the second biggest risk to business, and Deloitte found that 85% think the pandemic accelerated digital transformation.

If the exponential growth of digital has made every company a technology company, though, it has also made terms like ‘technology’ and ‘agility’ less useful. If every CEO is pursuing a digital strategy, that term must encompass a vast range of different ideas. As we look towards the next decade of growth – focused on managing the challenge of achieving more responsible and sustainable business along the way – we will need to think carefully about what comes next once digitalisation is universal.

Supercharged tech growth has skyrocketed user-generated content

Of course, the importance of agile technology has never been the tech itself, but what people do with it. For customers, we’ve seen tech innovation create new ways of talking, direct access to brands, and large changes in how we consume media and make purchases.

As digital channels take on a greater share of activity than ever, one of the effects of an exponential growth in digital is an exponential growth in user-generated content (UGC).

This user-led interaction, from product reviews to marketplace listings to social interactions, fully embodies the agility that companies have spent the last decade trying to bring to their processes; because it is made by people, UGC is rapid, diverse, and flexible by default. While it may be too soon to say that every business will become a content business, it’s clear that this will become an increasingly important part of how businesses operate. Certainly, it’s already a major driving force for sectors as diverse as marketplaces, gaming, and dating.

A UGC business must be protected to maximise opportunity

In the move towards UGC, a business’s user interaction and user experience will have consequences across the organisation – from profit margin, to brand positioning, to reputational risk, to technological infrastructure. Across all of these, there will be a need to uphold users’ trust that content is being employed responsibly, that they are being protected from malign actors, and that their input is being used for their benefit. Turning content into sustainable growth, then, is a task that needs to be addressed across the company, not confined to any one business function.

Marketers, for instance, have benefited from digitalisation’s capacity to make the customer experience richer and more useful – but it has also introduced an element of unpredictability in user interactions. When communities are managed and shaped, marketers need to ensure that those efforts produce a public face in line with the company’s ethos and objectives.

While tech teams need to enable richer user interaction, their rapid ascent to become a core business function has left them under pressure to do everything, everywhere. Their innovation in how content is managed therefore needs a middle path between the unsustainable workload of in-house development and the unsustainable compromises of off-the-shelf tooling.

With the ultimate outcomes of building user trust being measured in terms of things like brand loyalty and lifetime user value, finance departments will also need to adapt to this form of customer relationship. The creation of long-term financial health needs investments and partnerships which truly understand how the relationship between businesses and customers is changing.



UGC as a vital asset for sustainable business growth

Bringing this all together will be the task needed to create sustainable growth – growth which is fit for and competitive in the emerging context of UGC, sensitive to the increasing caution that users will have around trusting businesses, and transparent about the organisation’s ethos, purpose, and direction. It will require not just investing in technology, but understanding how tech is leading us to a more interactive economy at every scale.

As digitalisation continues to widen and deepen, we may find UGC, and the trust it requires, becoming just as vital an asset for businesses as product stock or intellectual property. To prepare for that future and maximise their business growth from their UGC, businesses need to start thinking and planning today.

By Petter Nylander

CEO Besedo Global Services


Having quality tools is key to performing well. That’s true for content moderation as well. An outdated or feature-incomplete platform affects everyone from the CEO to the content moderator. And the consequences can be dire, ranging from missing user and operational insights to decreased productivity or even the inability to perform important tasks.

If you are looking for your first content moderation tool, or considering replacing your in-house platform, it's a good idea to take a step back and consider the options available. It might be tempting to hand the task and your specifications to your dev team and await delivery. On the other hand, it might be better to buy an off-the-shelf solution that's plug and play. In short: should you build or buy?

The answer is, of course, very dependent on your situation, your company, and your current requirements. We've created a list of questions you should ask yourself, and an overview of pros and cons, to help you make a more informed decision.

  1. Is content moderation business defining or business critical?
  2. What are the time constraints for implementing a moderation solution?
  3. What developer resources do you have available?
  4. What is your budget?

Is content moderation business defining or business critical?

Ten years ago, most online platforms didn't prioritize content moderation, as there was far less understanding of the negative impact of bad content. Today, most agree that content moderation is business critical to ensure user trust and drive conversion. However, business critical isn't the same as business defining.

For most customer-facing businesses, a help-desk tool is critical, but unless customer support and how it's delivered is a core part of your USP, the software you use for customer queries doesn't have to be unique to your business.

What you do want, though, is a platform that is stable, has all the required features, and is updated regularly as requirements shift.

What are the time constraints for implementing a new moderation solution?

What is the timeline for implementing your new solution? Building a moderation platform from scratch can be a year-long project, depending on how many developers you can throw at the task.

On the other side of the coin, even out-of-the-box solutions usually require some level of integration with your current systems. It's good to get an idea of the requirements early in the decision process so you understand the timeline for either option. If you do decide to buy, shop around a bit to understand the differences between vendors and how much effort they will put into supporting you during the integration step.

What developer resources do you have available?

Before committing to building or buying, you should figure out what developer resources you realistically have at your disposal. Keep in mind that a content moderation tool is almost a living entity that needs to evolve with your product and ongoing trends.

When you evaluate your developer need for an in-house solution, remember to include ongoing maintenance and new feature development.

For bought solutions, as mentioned before, you should do a discovery call with potential vendors to understand how much time integration will take.

What is your budget?

Budget is always an important aspect when deciding whether to build or buy, yet it can be really hard to estimate the cost of a content moderation solution either way. Many vendors have onboarding fees, a monthly fee, and a price per item, which can make it hard to predict the actual monthly bill.
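To make that pricing structure concrete, here is a minimal sketch of how an effective monthly cost might be estimated under such a model. All figures are illustrative assumptions, not any vendor's actual rates, and the one-off onboarding fee is simply spread over the first year:

```python
def monthly_bill(monthly_fee, price_per_item, items_moderated,
                 onboarding_fee=0.0, amortize_months=12):
    """Estimate an effective monthly cost under a typical vendor model:
    flat monthly fee + per-item price, with the one-off onboarding fee
    amortized over the first year."""
    return (monthly_fee
            + price_per_item * items_moderated
            + onboarding_fee / amortize_months)

# Hypothetical figures: $500/month base, $0.02 per item,
# 100,000 items moderated per month, $3,000 onboarding fee
estimate = monthly_bill(500, 0.02, 100_000, onboarding_fee=3_000)
print(f"Estimated effective monthly bill: ${estimate:,.2f}")
```

Running the numbers yourself like this, for a few realistic volume scenarios, makes it much easier to compare vendor quotes on an equal footing.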

For in-house projects, on the other hand, it's easy to forget the cost of project management, salaries, meetings, and ongoing maintenance and feature updates.

Most importantly, many companies forget to keep opportunity cost in mind: what unique features could our developers have created for our core product instead of building a moderation platform?

Whether you decide to build or buy, spend some time investigating potentially hidden costs to avoid unpleasant surprises.

If you buy, go with a partner that is transparent and straightforward.

If you build, map out every cost: not just direct developer costs, but also adjacent expenditure, especially after the project has been delivered.
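One way to make that mapping exercise tangible is to write the cost items down explicitly rather than as a single gut-feel number. The sketch below uses entirely made-up figures purely to show the shape of such an exercise; the point is that line items like project management and post-delivery maintenance appear alongside the obvious development cost:

```python
# Illustrative first-year build-cost map; every figure is a
# made-up assumption to be replaced with your own estimates.
build_costs = {
    "initial_development": 6 * 2 * 10_000,       # 6 months x 2 devs x $10k/month
    "project_management": 15_000,                 # meetings, planning, coordination
    "ongoing_maintenance_year1": 6 * 10_000,      # roughly half a developer, ongoing
    "infrastructure_year1": 6_000,                # hosting, monitoring, tooling
}

total_year1 = sum(build_costs.values())
print(f"Estimated first-year build cost: ${total_year1:,}")
```

Comparing a total like this against a vendor's yearly bill, and asking what else those developer months could have produced, is exactly the opportunity-cost question raised above.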



Building a Content Moderation Platform: pros and cons overview

Buying a Content Moderation Platform: pros and cons overview

The decision to build or buy is never easy and depends heavily on the business and its use cases. In most cases, it comes down to asking yourself whether building an in-house solution will give you a competitive edge, or whether your needs can be sufficiently covered by an existing content moderation platform.

If you want to better understand your options, feel free to reach out to us and get a transparent offer for a tailored content moderation solution.

Or check out Kaidee's experience with our content moderation tools.
