Sometimes small features can have a big impact. With our newly implemented user counter, you get a whole new level of insight into your users.

What it does

The user counter shows you how many items the user has had approved and how many they’ve had rejected. You can also quickly access an overview of the actual listings that were approved or rejected, giving insight into user behavior and listing habits.

 

How it works

Click an item in the Item log.

Item listing

This brings up the item overview window. Here, next to the User ID, you’ll find the user counter. The number in green shows how many listings this user has had approved; the number in red, how many they’ve had rejected.

 

Implio user counter feature

Use cases for user counter

If you have experience with content moderation you’ve probably already thought of several use cases for the user counter.

Here are a couple of examples of how it can be used in Implio.

1. Qualifying returning users

Need to understand the quality of a user? Check their listing history. If they have only rejections, this user may cause problems going forward as well.

2. Assistance in grey area decisions

When manually moderating items you sometimes come across grey area cases, where it’s hard to judge whether the listing is genuine or problematic. In those cases where you have to make a snap decision either way, having the user’s previous history to lean on can be helpful. A user with only approved listings in the past is unlikely to have suddenly turned abusive. Be cautious, though: some scammers turn this logic to their benefit through Trojan Horse scams, where they first post a couple of benign listings, then, once their profile looks good, start posting scams.

3. Spotting users in need of education

Have you found a user who consistently gets their listings rejected for non-malign reasons? A quick educational email might help them out and cut down on your moderation volumes.

4. Identify new users

It’s always good to pay extra attention to new users as you don’t yet know whether they are bad actors. Knowing that a user has no previous history of listing items can act as a sign to be extra thorough when moderating. On the flip side, seeing a user with only approved listings allows you to speed up moderation of the item in question as it’s likely OK too. Just keep an eye out for the aforementioned Trojan Horse scammers.

To give a better understanding of how the user counter helps increase the productivity and quality of moderation, we asked a few of our moderators about their experience working with the new feature.


“The user counter helps me get a perspective on the profile. If I see that a user has had listings refused more than two times, I access the profile to see the reason of the refusals. That allows me to make a better decision on the profile. It allows me to spot scammers quickly and make faster decisions.”

– Cristian Irreño, Content moderator at Besedo

“The user counter has allowed me to see the trends on profile decisions. It makes me be more careful when I see accounts with a higher number of refusals. Also, when I am working on a new account, I know I must be more careful with my decision.”

– Diego Sanabria, Content moderator at Besedo

“The counter helps me identify profiles that have frequent acceptance or refusals, and to spot new users.”

– Cristian Camilo Suarez, Content moderator at Besedo

The user counter is available to all Implio users regardless of plan. Want to start using Implio for your moderation? Let us know and we’ll help you get started.

COVID-19 continues to create new challenges for all. Businesses and consumers are spending an increasing amount of time online – using different chat and video conferencing platforms to stay connected and combat social distancing and self-isolation.

We’ve also seen the resurgence of interaction via video games during the lockdown, as we explore new ways to entertain ourselves and connect with others. However, a sudden influx of gamers also brings a new set of content moderation issues – for platform owners, games developers, and gamers alike.

Let’s take a closer look.

Loading…

The video game industry was already in good shape before the global pandemic. In 2019, ISFE (Interactive Software Federation of Europe) reported a 15% rise in revenue between 2017 and 2018, with the sector turning over a combined €21bn. Another ISFE report shows that over half of the EU’s population played video games in 2018 – some 250 million players, gaming for an average of nearly 9 hours per week, with a pretty even gender split.

It’s not surprising that the fastest-growing demographic was the 25-34 age group – the generation who grew up alongside Nintendo, Sony, and Microsoft consoles. However, gaming has broader demographic appeal too. A 2019 survey conducted by AARP (American Association of Retired Persons) revealed that 44% of Americans aged 50 and over enjoyed video games at least once a month.

According to GSD (Games Sales Data), in the week commencing 16th March 2020 – right at the start of the lockdown – video game sales increased by 63% on the previous week. Digital sales have outstripped physical sales too, and console sales rose by 155% to 259,169 units in the same period.

But stats aside, when you consider the level of engagement possible, it’s clear that gaming is more than just ‘playing’. In April, the popular game Fortnite held a virtual concert with rapper Travis Scott, which was attended by no fewer than 12.3 million gamers around the world – a record audience for an in-game event.

Clearly, for gaming the only way is up right now. But given the sharp increases, and the increasingly creative and innovative ways gaming platforms are being used as social networks – how can developers ensure every gamer remains safe from bullying, harassment, and unwanted content?

Ready Player One?

If all games have one thing in common, it’s rules. The influx of new gamers presents new challenges in a number of ways where content moderation is concerned. Firstly, uninitiated gamers (often referred to as noobs/newbies/nubs) are likely to be unfamiliar with the established rules of online multiplayer games, or with the accepted social niceties and jargon of different platforms.

From a new user’s perspective, there’s often a tendency to carry over offline behaviours into the online environment – without consideration or a full understanding of the consequences. The Gamer has an extensive list of etiquette guidelines that online multiplayer gamers frequently break, from common courtesies such as not swearing in front of younger users on voice chat and not spamming chat boxes, to not ‘rage-quitting’ a co-operative game out of frustration.

However, when playing in a global arena, gamers might also encounter subtle cultural differences and behave in a way which is considered offensive to certain other groups of people.

Another major concern, outlined by Otis Burris, Besedo’s Vice President of Partnerships, in a recent interview – one which affects all online platforms – is the need to “stay ahead of the next creative idea in scams and frauds or outright abuse, bullying and even grooming to protect all users” because “fraudsters, scammers and predators are always evolving.”

Multiplayer online gaming is open to exploitation by individuals with malicious intent, including grooming, simply because of the potential anonymity and the sheer number of gamers taking part simultaneously around the globe.

The Gamer list spells out that kids (in particular) should never use someone else’s credit card to pay for in-game items. But when you consider just how open gaming can be from an interaction perspective, the fact that these details could easily be obtained by deception or coercion also needs to be tackled.

A New Challenger Has Entered

In terms of multiplayer online gaming, cyberbullying and its regulation continue to be a prevalent issue. Some of the potential ways in which users can manipulate gaming environments in order to bully others include:

  • Ganging up on other players
  • Sending or posting negative or hurtful messages (using in-game chat-boxes for example)
  • Swearing or making negative remarks about other players that turn into bullying
  • Excluding the other person from playing in a particular group
  • Anonymously harassing strangers
  • Duping more vulnerable gamers into revealing personal information (such as passwords)
  • Using peer pressure to push others into performing acts they wouldn’t normally perform

Whilst cyberbullying amongst children is fairly well researched, negative online interactions between adults are less well documented and studied. The 2019 report ‘Adult Online Harms’ (commissioned by the UK Council for Internet Safety Evidence Group) investigated internet safety issues amongst UK adults and acknowledges the lack of research into the effects of cyberbullying on adults.

With so much to be on the lookout for, how can online gaming become a safer space to play in for children, teenagers, and adults alike?

Pause

According to a 2019 report for the UK’s converged communications regulator Ofcom: “The fast-paced, highly-competitive nature of online platforms can drive businesses to prioritize growing an active user base over the moderation of online content.

“Developing and implementing an effective content moderation system takes time, effort and finance, each of which may be a constraint on a rapidly growing platform in a competitive marketplace.”

The stats show that 13% of people have stopped using an online service after observing harassment of others. Clearly, targeted harassment, hate speech, and social bullying need to stop if games manufacturers want to minimize churn and avoid losing gamers to competitors.

So how can effective content moderation help?

Let’s look at a case study cited in the Ofcom report. As an example of effective content moderation, they refer to the online multiplayer game ‘League Of Legends’ which has approximately 80 million active players. The publishers, Riot Games, explored a new way of promoting positive interactions.

Users who logged frequent negative interactions were sanctioned with an interaction ‘budget’ or ‘limited chat mode’. Players who then modified their behavior and logged positive interactions gained release from the restrictions.

As a result of these sanctions, the developers noted a 7% drop in bad language in general and an overall increase in positive interactions.
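To make the mechanism concrete, here is a minimal sketch in Python of how such an interaction ‘budget’ might work. The class name, thresholds, and release criteria are illustrative assumptions only, not Riot Games’ actual implementation.

```python
# Minimal sketch of a chat "budget" sanction, loosely modeled on the
# limited chat mode described above. All thresholds are hypothetical.

class ChatBudget:
    def __init__(self, daily_limit=10):
        self.daily_limit = daily_limit   # messages allowed while restricted
        self.sent_today = 0
        self.positive_streak = 0         # consecutive positive interactions
        self.restricted = False

    def record_interaction(self, negative: bool):
        """Restrict players who log negative interactions; release them
        again after a sustained run of positive ones."""
        if negative:
            self.positive_streak = 0
            self.restricted = True
        else:
            self.positive_streak += 1
            if self.restricted and self.positive_streak >= 50:
                self.restricted = False  # earned release from limited chat

    def may_send_message(self) -> bool:
        """Gate outgoing chat messages against the remaining budget."""
        if not self.restricted:
            return True
        if self.sent_today < self.daily_limit:
            self.sent_today += 1
            return True
        return False

    def reset_day(self):
        self.sent_today = 0              # called once per in-game day
```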

Continue

As the ‘League Of Legends’ example shows, a combination of human and AI (Artificial Intelligence) content moderation can encourage more socially positive content.

For example, a number of social media platforms have recently introduced ways of offering users alternatives to potentially harmful or offensive UGC (user-generated content), giving users a chance to self-regulate and make better choices before posting. In addition, offensive language within a post can be translated into a non-offensive form, and users are presented with an optional ‘clean version’.
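As a rough sketch of that ‘clean version’ idea, the snippet below substitutes flagged terms and offers the user an optional rewrite before posting. The word list and replacements are placeholder assumptions, not any platform’s real vocabulary.

```python
# Sketch of a "clean version" suggestion using simple word substitution.
# The OFFENSIVE mapping is a tiny placeholder, not a real lexicon.
import re

OFFENSIVE = {"idiot": "player", "trash": "not great"}

def clean_version(text: str) -> str:
    """Return a softened rewrite of the text with flagged terms replaced."""
    for bad, mild in OFFENSIVE.items():
        text = re.sub(rf"\b{re.escape(bad)}\b", mild, text, flags=re.IGNORECASE)
    return text

draft = "That move was trash, you idiot"
suggestion = clean_version(draft)
if suggestion != draft:
    print("Optional clean version:", suggestion)
```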

Nudging is another technique which can be employed to encourage users to question and delay posting something potentially offensive, by creating subtle incentives to make the right choice and thereby help reduce the overall number of negative posts.

Chatbots, disguised as real users, can also be deployed to make interventions in response to specific negative comments posted by users, such as challenging racist or homophobic remarks and prompting an improvement in the user’s online behavior.

Finally, applying a layer of content moderation to ensure that inappropriate content is caught before it reaches other gamers will help keep communities positive and healthy, ensuring higher engagement and less user leakage.

Game Over: Retry?

Making good from a bad situation, the current restrictions on social interaction offer a great opportunity for the gaming industry to draw in a new audience and broaden the market.

It also continues to inspire creative innovations in artistry and immersive storytelling, offering new and exciting forms of entertainment, pushing the boundaries of technological possibility, and generating new business models.

But the gaming industry also needs to ensure it takes greater responsibility for the safety of gamers online by incorporating robust content management strategies – even if doing so at scale, especially when audience numbers are so great, takes a lot more than manual player intervention or reactive strategies alone.

This is a challenge we remain committed to at Besedo – using technology to meet the moderation needs of all digital platforms. Through a combination of machine learning, artificial intelligence, and manual moderation techniques we can build a bespoke set of solutions that can operate at scale.

To find out more about content moderation and gaming, or to arrange a product demonstration, contact our team!

As part of our ongoing efforts to help our clients manage the challenges of user-generated content to improve the customer experience on their platforms, Besedo has recently hired Otis Burris as VP of Partnerships.

The goal is to build a vast network of quality-driven companies who all strive to add value to online platforms through tech or expertise.

 

Interviewer: Tell us a bit about yourself.

Otis: I recently joined Besedo, back in November 2019. I have a long working history in technology solution sales & partnerships – helping clients drive efficiencies and transition from traditional, less tech-driven ways of working towards more innovative solutions.

From SaaS to PaaS, I have worked with early mobile application platforms long before the “mobile-first” approach, and with digital innovations in online services and AI solutions. Now I’m at Besedo and very excited about both the mission and the approach, which is combining technology with specialized knowledge to help platforms keep their users safe.

 

Interviewer: How does Besedo help protect and improve users’ online experiences?

Otis: Besedo has made its name by being skilled at detecting and preventing improper or inappropriate behavior online. Many companies rightly put a lot of effort into making sure only “good” users get access to their platform. But what happens when “good” users misbehave?

Besedo provides the technology and expertise needed to monitor and control this behavior and ensure that users are not abusing the platform policies or creating a negative experience for everyone else.

Fraudsters, scammers and predators are always evolving, and it’s difficult for many companies to keep up if Content Moderation is not their core business. Our goal has always been to stay ahead of the next creative idea in scams and frauds or outright abuse, bullying and even grooming to protect all users.

Besedo focuses heavily on content moderation, and as such, we’re able to deliver very high quality and focused service to our clients.

 

Interviewer: What are the industries Besedo directs their services towards?

Otis: Besedo has predominantly worked with online marketplaces, and we have a strong history of delivering to online classifieds marketplaces.

But UGC occurs in many other places, and our moderation knowledge can be transferred and applied to most digital platforms. We now moderate dating site profiles and support a lot of the up-and-coming sharing economy platforms. And then there are online communities, where content is continuously shared in a variety of channels, making content moderation a must.

 

Interviewer: Why is Besedo launching a partnership program?

Otis: Our clients’ and their users’ journeys start long before a user becomes active on the platform, run as long as they participate on the site, and extend even after they leave.

From onboarding to payments, or customer support and reviews, there is a set of activities that are complementary to a successful experience. To deliver a high-level of satisfaction to all users, several components need to work seamlessly. Our goal is to connect those pieces into a single flow that enhances the overall perception of the client’s platform and that’s where we need partners.

Building a reliable eco-system of partners creates value for all participants. The end result is an offering that attracts more users, increases engagement and reduces churn.

In short, we’re trying to create a one-stop-shop where platforms can go to pick from a portfolio of services that can help them improve their user experience.

 

Interviewer: What kind of partnerships is Besedo seeking?

Otis: There are four main types of partners that we believe will help us deliver a high-quality offering for the online experience, whether driven by our partners or by us.

Technology partners can be complementary when reviewing clients’ onboarding journeys.

Service partners are critical to scaling or supporting spikes during periods of heavy or unpredictable traffic – a great example is Covid-19 keeping people at home, with more time on their hands to create online content and complaints.

Our System Integration partners, who are great at delivering complete IT solution projects on a significant scale, sometimes lack competence in certain niche areas (content moderation, for example) that are outside their core deliveries. Besedo’s history and standing in the content moderation industry makes us a credible partner for those large projects, particularly because we can deliver on a global scale in a range of languages.

Last but not least are our Industry advocates, who not only work continuously to educate and evangelize the different industries on the do’s and don’ts of Content Moderation, but also share the latest challenges, ideas, and innovations that can help companies stay ahead of fraudsters, scammers and other online predators.

 

Interviewer: Who should partner with us?

Otis: User Verification companies – They cover the first hurdle. They are the bouncers at the doors who ensure good users arrive on the platform. We then monitor the user’s behaviors once they arrive. Consider us the surveillance and alert team that can remove any negative or harmful elements that managed to get past the first stage.

Payment Solutions – They complete the behavioral transactions and add value to the endpoint conversion cycle and provide essential attributes that can be tagged to user behavior.

Service providers / BPOs – Often, they utilize their clients’ systems, which are typically not purpose-built. Partnering with us allows service providers to introduce solutions to their clients for higher efficiency and greater control of the QA process, earning credibility by improving the client’s capabilities, as well as helping us to scale when there is a sudden spike in demand for our services.

Industry Advocates & Marketing Events help us to stay connected with our eco-system and provide opportunities to discover new trends and technologies occurring in the space.

Finally, since starting this role, I’ve had a lot of conversations I struggle to classify, but they often have exciting technology or approaches that we can leverage. Having a partner space allows those types of companies to reach out to us as well with their proposal on how we can collaborate.

 

Interviewer: How can partnering with Besedo help your business?

Otis: Partnering with Besedo can benefit your business in several ways.

It’s a great way to reduce the risks to your clients and deliver an improved user experience at a higher value since, with our combined capabilities, we’ll cover more ground.

If content moderation is not your core business, partnering with us will help build competence and trust in your content moderation delivery projects in a scalable way.

Once you earn credibility within the Besedo eco-system, it will open more opportunities and new revenue streams.

You will gain access to industry-leading moderation expertise in multiple languages built on 18 years of experience. We’re happy to share our knowledge so you can succeed.

Finally, Besedo are friendly towards our partners in the commercial sense. We are happy to aggressively share revenue, especially when it comes to new business or existing business where you’ve proven your ability to add value.

We’re very keen on improving ourselves, both from a tech and service standpoint. Any partner, any organization, any technology that can deliver results, we’d like to make sure you’re rewarded for that. So please reach out if you feel your goals align with ours.

 

Interviewer: What are the options for partnering with us?

Otis: It depends on the approach of your company, but here are a couple of possible setups:

Partners can be set up as a revenue share which is based on how active the partner is in the discovery, development and delivery of the opportunity we collaborate on.

You work as a referral partner, where you identify an opportunity, but Besedo does all the meetings, and so on. That’ll be considered a referral. There’ll be a referral bonus there that can be standardized.

If you look at a reseller scenario, you will have some training on Besedo’s platform Implio, so you can demo it and handle the sales process upfront while we support you in the background. That will warrant a bigger bonus, since you as the partner are taking on more of the work.

Typically, partners who are more involved in the process earn a larger share of the revenue.

We also want to include consulting and marketing partners. For marketing collaboration, we’d usually not have a monetary exchange. Instead, we’ll work with blog exchange and other types of content or campaign collaboration. Cost-saving, increased reach and exposure are just some of the benefits to gain from presenting us to the market with a unified front.

 

Interviewer: How can partnerships bring value to companies for a safer internet?

Otis: If you go alone, you can go faster; if you go together, you can go further. If Besedo tries to make the Internet safer alone, we’re not going to be able to do it all. As I mentioned, our user verification partners do a great job of KYC (know your customer). They do a lot of homework before the user gets on the platform. That’s a critical step. But their software doesn’t extend onto the platform in the way Besedo’s does. So, we have to do a good job as well to add to the safety stack, and the same thing goes for the payment solutions, delivery services and so on. There’s a dependency on each solution being of high quality. And if one is missing or of questionable quality, the safety of your platform decreases significantly.

Data sharing is another area where partnerships can help improve Internet safety.

In the age of data regulation – GDPR, Patriot Acts & other local privacy requirements – it’s becoming harder to leverage or share datasets across platforms or continents. This means that we’re missing out on valuable knowledge and insights that could be used to catch bad actors faster.

Partnerships allow companies to benefit from each other’s complementary expertise without exposing their core secrets, data insights, or user data. If we can work together to create a safer internet, we will together attract more users to the eco-system, simply because they feel safe.

It’s human nature – more people fly than ever before because planes crash less. The same goes for online users. Fewer bad experiences will attract more users, which in turn increases the revenue opportunities for all members of the eco-system.

 

Interviewer: I want to partner with Besedo. What’s the next step?

Otis: Reach out to me directly, I’m the VP of Partnerships and I’m always interested in hearing ideas on how we can collaborate. otis.burris@besedo.com

You can also fill in the form on our partner page. Add as much info as you can and I’ll be in touch to schedule a call.

And of course, grab us when you see us at any of the many industry events we attend. We’re always happy to have a chat about potential partnerships.

 

Interviewer: Any final thoughts you’d like to share?

Otis: It’s exciting times! New platforms are popping up everywhere. Content moderation is such a relevant space. It touches everyone, from parents to kids. Everyone is consuming content. Everyone is on a platform doing something, and they need to be protected. I don’t think we can be in a more relevant space at this time.

I’m really excited about Besedo having such a vast amount of experience and now taking it one step further by adding partnerships and technology to improve the solution moving forward.

I’m looking forward to having a lot of great conversations with potential partners.

The outbreak of COVID-19, or Coronavirus, has thrown people all over the world into fear and panic over their health and economic situation. Many have been flocking to stores to stock up on essentials, emptying the shelves one by one. Scammers are taking advantage of the situation by maliciously playing on people’s fear. They’re targeting items that are hard to find in stores and making the internet – and especially online marketplaces – their hunting ground, to exploit desperate and vulnerable individuals and businesses. Price gouging (charging unfairly high prices), fake medicine, and non-existent loans are all ways scammers try to exploit marketplace users.

In this worldwide crisis, now is a great time for marketplaces to step up and show social responsibility by making sure that vulnerable individuals don’t fall victim to corona-related scams, and that malicious actors can’t profit from stockpiling and selling medical equipment sorely needed by nurses and doctors fighting to save lives.

Since the start of the Covid-19 epidemic, we’ve worked closely with our clients to update moderation coverage to include Coronavirus-related scams, and we have helped them put new rules and policies in place.

We know that all marketplaces are currently struggling to get on top of the situation, so to help, we’ve decided to share some best practices for handling moderation during the epidemic.

Here are our recommendations on how to tackle the Covid-19 crisis to protect your users, your brand and retain the trust users have in your platform.

Refusal of coronavirus-related items

Ever since the outbreak started, ill-intentioned individuals have made the prices of some items spike to unusually high levels. Many brands have already taken the responsible step of refusing certain items they wouldn’t usually reject, and some have set bulk-buying restrictions (just like some supermarkets have done) on ethical and integrity grounds.

Google stopped allowing ads for masks, and many other businesses have restricted the sale or price of certain items. Amazon removed thousands of listings for hand sanitizer, wipes and face masks and has suspended hundreds of sellers for price gouging. Similarly, eBay banned all sales of hand sanitizer, disinfecting wipes and healthcare masks on its US platform and announced it would remove any listings mentioning Covid-19 or the Coronavirus except for books.

In our day to day work with moderation for clients all over the world we’ve seen a surge of Coronavirus related scams and we’ve developed guidelines based on the examples we’ve seen.

To protect your customers from being scammed or falling victim to price gouging, and to preserve your users’ trust, we recommend you refuse ads for the following items or set up measures against stockpiling them.

  • Surgical masks and face masks (types FFP1, FFP2, FFP3, etc.) have been scarce and have seen their price tags spike dramatically. Overall, advertisements for all kinds of medical equipment associated with Covid-19 should be refused.
  • Hand sanitizer and disposable gloves are also very prone to being sold by scammers at incredibly high prices. We suggest either banning these ads altogether or enforcing regular prices on these items.
  • Empty supermarket shelves have caused toilet paper, usually a cheap item, to be sold online at extortionate prices. We suggest you monitor and ban these ads accordingly.
  • Any ad that mentions Coronavirus or Covid-19 in its text should be manually checked to ensure that it wasn’t created with malicious intent.
  • ‘Magic’ medicines claiming to miraculously cure the virus should be refused.
  • Depending on the country and its physical distancing measures, ads for home services such as hairdressers, nail technicians and beauticians should be refused.
  • In these uncertain times, scammers have been offering loans or cash online, preying on the most vulnerable. Make sure to look for these scams on your platform.
  • Similarly, scammers have been targeting students with claims that interest rates are being adjusted.

Optimize your filters

Scammers have grown more sophisticated as the crisis has progressed, finding loopholes to circumvent security measures. To promote their scams, they use alternative wordings such as ‘Sars-CoV-2’, or describe masks by their reference numbers, such as 149:2001 or A1 2009. Make sure your filters are optimized and your moderators continuously briefed and educated to catch all coronavirus-related ads.
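As a sketch of what optimizing those filters can look like, the snippet below flags both the obvious keywords and the alternative wordings mentioned above. The patterns and the function name are illustrative assumptions, not Implio’s actual rule syntax.

```python
# Minimal sketch of a keyword filter that also catches alternative wordings.
import re

COVID_PATTERNS = [
    r"\bcovid[\s-]?19\b",
    r"\bcorona\s*virus\b",
    r"\bsars[\s-]?cov[\s-]?2\b",        # alternative naming for the virus
    r"\b149:2001\b", r"\bA1\s*2009\b",  # mask standard reference numbers
    r"\bffp[123]\b",                    # mask types
]
COVID_RE = re.compile("|".join(COVID_PATTERNS), re.IGNORECASE)

def needs_manual_review(listing_text: str) -> bool:
    """Flag listings mentioning Covid-19 or known workarounds for review."""
    return COVID_RE.search(listing_text) is not None

print(needs_manual_review("Brand new masks, EN 149:2001 certified!"))  # True
```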

Right now, we suggest that you tweak your policies and moderation measures daily to stay ahead of the scammers. As the crisis evolves, malicious actors will without doubt continue to find new ways to exploit the situation. As such, it’s vital that you pay extra attention to your moderation efforts over the coming weeks.

If you need help tackling coronavirus-related scams on your platform, get in touch with us.

How can online marketplaces convert more? This question has many answers, but there is one undeniable truth: if your users can’t find what they are looking for on your online marketplace, your conversion rates will plummet.

Internal search is a great tool to help users find what they are seeking on your online marketplace. Unfortunately, many sites still implement search as an afterthought rather than establishing it as a prerequisite for conversion optimization.

A common mistake – which you might be making on your site – is to hide the search tool behind a magnifying glass in the top right corner, which will undoubtedly be detrimental to your conversion rates. If your users can’t see it, they won’t use it, so why hide it?

According to Luigi’s Box, site searchers are 70% more likely to buy than non-searchers.

So, if you make it accessible to your users, imagine what that could mean for your conversion rates.

But it doesn’t end there.

In a recent webinar, we invited Michal Barla, Co-Founder and CPO at Luigi’s Box, to break down the steps you need to take if you want to turn your internal search into a powerful conversion tool.

Optimized site search: the solution for improved online marketplace conversion

Get started with our handy checklist about site search, where we take you through the necessary steps to optimize your search experience and convert more.

Our new Implio feature enhances the quality of the images shared on your platform. Your manual moderation team can now crop and rotate user-generated images quickly and efficiently in the moderation tool.

For online platforms like online marketplaces and dating sites, creating a good user experience and a trustworthy environment is essential, and high-quality pictures are crucial in that regard. In our user search study, users unanimously picked quality images as the reason to prefer one site over another.

Profile pictures and listing images are crucial for users to trust the person on the other side of the screen, and what they want to sell or buy. And as a company, you want to create and maintain that trust for your users.

On dating sites or online marketplaces, the cropping and rotating feature helps you moderate pictures to comply with your company’s guidelines – for instance, cropping profile pictures so that only one person appears, or ensuring that the user’s face is distinctly visible. On top of this, images can also be rotated, so that images submitted upside down or from the wrong angle can easily be corrected.
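For the curious, here is what the crop and rotate operations involve under the hood, sketched with the Pillow library. In Implio itself these are point-and-click actions in the moderation UI; the file names and coordinates below are made-up examples.

```python
# Minimal sketch of crop/rotate using Pillow (pip install Pillow).
from PIL import Image

img = Image.open("listing_photo.jpg")   # hypothetical input file

# Rotate an image submitted sideways; expand=True grows the canvas
# so no part of the picture is clipped.
img = img.rotate(-90, expand=True)

# Crop to a (left, upper, right, lower) box, e.g. to frame one person.
img = img.crop((100, 50, 600, 550))

img.save("listing_photo_fixed.jpg")
```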

The cropping and rotation feature in Implio helps you improve trust and user experience for both your sellers and buyers.

Here’s how the feature works:

Curious to learn more about our new feature?

Have a look at our Implio features list or sign up to our all-in-one content moderation tool Implio, and try it out.

Could you tell us a bit about yourself?

My name is Kevin Ducón from Bogotá, Colombia. I hold an MSc in Computer Science from Universidad Politecnica de Madrid and a BSc in Computer Science from Universidad Distrital de Bogotá.

I have been working in information and communications technology for more than fifteen years and began working at Besedo five years ago, specializing in IT Service Management and Information Security. I started as a local ICT Administrator in our Colombian center, then as an ICT Supervisor, and currently, I am the Global Head of ICT-IS (information and communications technology – information security).

Over the past five years, I have applied my knowledge and skills to this ever-changing industry by creating policies and processes aligned with the industry’s best practices, supporting our clients, and continuously improving our ongoing projects.

What are your responsibilities as Global Head of ICT-IS?

As the Global Head of ICT-IS at Besedo, I’m in charge of all levels of support in information technology and communications.
I oversee the Global ICT work, and together with my ICT team, I make sure that we fulfill our most important metrics – availability, service-level agreement, and customer satisfaction.

On top of that, I manage and provide insights into our security guidelines and develop strategic and operational plans for the ICT department to ensure that all necessary tools and processes are fully functional to achieve the company’s overarching goals and ambitions.

I also have hands-on technical responsibilities in supporting and developing mission-critical systems, which are running 24/7, to make sure our moderation services are successfully delivered to our customers worldwide.

From an ICT point of view, what are the key elements that must go right when running a content moderation operation?

The essential part from an ICT standpoint when running a content moderation operation is to truly understand the priorities and needs specific to the operation. Having an IT strategy to translate business needs into functioning IT operations is vital for a successful moderation setup.

Furthermore, ensuring good practices in network infrastructure and server setup, device management, and IT support is key to achieving a solid moderation operation. Finally, it’s crucial to have knowledgeable and committed IT staff behind the scenes.

What are the common things that can go wrong?

When running a moderation operation, many potential issues can occur. Some of the most common hazards include Internet connections, networks, or servers going down, power outages, and failed infrastructure deployments.

For instance, content moderation relies heavily on a stable Internet connection, and you cannot blindly trust that it will just work. Instead, you need to make sure that your Internet service always works to its full capacity.

What safety measures are needed to make sure the moderation operation runs smoothly?

It’s important to have proactive safety measures in place to guarantee that the moderation operation is always carried out correctly. A good first step is to plan the implementation of the moderation services thoroughly before putting disaster mitigation plans in place.

For example, at Besedo, we work with several Internet service providers in case one of those fails to deliver correctly. We also work with fault-tolerant networks, a resilient infrastructure, third-party support, etc., to ensure that our IT operations remain stable when potential risks materialize.

On top of this, we run daily IT checklists and use monitoring systems that allow us to prevent potential issues during IT operations. We also have backup routines to avoid any information loss or damage, and use UPS units (uninterruptible power supplies) to keep our critical devices running.

All in all, for anyone looking to run a successful moderation operation, many countermeasures must be put in place to make sure that IT operations run smoothly.

What’s the best thing about your job?

My job allows me to work in the different areas of the ICT Function and with all the disciplines that contribute to the business. For some people, ICT only assists with end-user tickets because that’s what’s visible to them. However, IT is not just a commodity but a strategic ally for us to deliver the highest level of services to our customers.

I’m proud to apply my skill set and knowledge to Besedo’s purpose and values, which I genuinely believe in. When I took the role of Global Head of ICT-IS, I set out to implement our promise ‘grow with trust’ in everything we do in our team. This has shaped the ICT team’s goal: to help all functions grow with trust, through efficient processes, guaranteed quality of services, and high customer satisfaction.

At Besedo, we have an excellent ICT team of committed and ambitious individuals who love what they do and work hard to improve the company every day.


Kevin Ducón

Kevin Ducón is Besedo’s Global Head of ICT-IS. He has been working in information and communications technology for more than fifteen years. Over the past five years at Besedo, he has applied his knowledge and skills to the ever-changing content moderation industry.

Efficiency and accuracy are two of the most valuable KPIs online marketplaces track to evaluate their manual moderation performance. The key to an optimized manual moderation team is to find the right balance between efficiency and accuracy.

However, here’s the pitfall: if you push your moderators too hard to achieve efficiency, this can, in time, lessen their accuracy and jeopardize the quality of the content published on your marketplace. Low-quality content is likely to slip through the cracks, threatening the reputation of your platform, damaging user trust, and putting your users at risk – from user experience issues to more serious threats such as identity theft or scams.

For your online marketplace to succeed and to keep potential issues at bay, it’s imperative to provide your moderation team with the right moderation tools to help them be as efficient and accurate as possible.

At Besedo, we are continually looking to improve our all-in-one content moderation tool, Implio, by adding features to ensure your content moderators perform at their best.
Whether it’s highlighting keywords, working with specialized moderation queues, enabling quick links or warning messages, many features available in Implio are created to ease your moderators’ daily work and improve their overall performance.

Keyboard shortcuts – efficient manual moderation

Implio’s brand new feature, keyboard shortcuts, helps your moderators make decisions with a single keystroke and navigate through listings without leaving their keyboard, making manual moderation both efficient and accurate.

From our initial tests, we found that keyboard shortcuts increased manual moderation efficiency by up to 40%, and we expect that number to rise as moderators grow more familiar with the feature.

Here’s how the keyboard shortcuts work:

Ready to improve your moderation efficiency?

Get in touch with a content moderation expert today, or try Implio for free.

What is a content moderator? Why not ask one? We sat down with Michele Panarosa, Online Content Moderator Level 1 at Besedo, to learn more about a content moderator’s daily work, how to become one, and much more.

 

Hi Michele! Thank you for taking the time to sit down with us. Could you tell us a bit about yourself?

My name is Michele Panarosa, I’m 27 years old and I come from Bari, Puglia, Italy. I’ve been an online content moderator for nine months now, formerly an IT technician with a passion for technology and videogames. In my spare time, I like to sing and listen to music. I’m a shy person at first, but then I turn into an entertainer because I like to have a happy environment around me. They call me “Diva” for a good reason!

 

What is a content moderator?

A content moderator is responsible for reviewing user-generated content submitted to an online platform. The content moderator’s job is to make sure that items are placed in the right category, are free from scams, don’t include anything illegal, and much more.

 

How did you become a content moderator?

I became an online content moderator by training with a specialist during my first weeks of work, but it’s a never-ending learning curve. At first, I was scared of accidentally accepting fraudulent content, or of not doing my job properly. My teammates, along with my manager and team leaders, were nice and helped me throughout the entire process. As I kept on learning, I started to understand fraud trends and patterns. That helped me spot fraudulent content with ease, and I could confidently escalate items to second-line moderation agents who made sure they got refused.

Communication is essential in this case. There are so many items I didn’t even know existed, which is an enriching experience. The world of content moderation is very dynamic, and it has so many interesting things to learn.

 

What’s great about working with content moderation?

The great part of content moderation is the mission behind it. The internet can sometimes seem like a big and unsafe place where scammers are the rulers. I love this job because I get to make the world a better place by blocking content that’s not supposed to be online. It’s a blessing to be part of a mission where I can help others and feel good about what I do. Besides, it makes you feel important and adds that undercover aspect of a 007 agent.

 

How do you moderate content accurately and fast?

Speed and accuracy can go hand in hand, but you need to be focused and keep your eyes on the important parts of a listing. A single piece of information in a listing can be very revealing and tell you what your next step should be. On top of that, it’s crucial to stay updated on the latest fraud trends so you don’t fall into any traps. Some listings and users may appear very innocent, but it’s important to take each listing seriously, and it’s always better to slow down a bit before moving on to the next listing.

 

What’s the most common type of content you refuse?

The most common type of item I refuse must be weapons – any kind of weapon. Some users try to make them seem harmless, but in reality, they’re not. It’s important to look at the listing images, and if the weapon is not shown in the image, we simply gather more information about the item. Usually, users who want to sell weapons try to hide it by not using images and by keeping their descriptions very short (sometimes no description at all). It’s our task, as content moderators, to collect more details and refuse the item if it turns out to be a weapon – even if it’s a soft air gun or used for sports.

 

What are the most important personal qualities needed to become a good content moderator?

The most important personal qualities needed to become a good content moderator are patience, integrity, and curiosity.

  • Patience

    Moderating content is not always easy and sometimes it can be challenging to maintain a high pace while not jeopardizing accuracy. When faced with factors that might slow you down, it’s necessary to stay patient and not get distracted.

  • Integrity

    It’s all about work ethic, staying true to who you are and what you do. Always remember why you are moderating content, and don’t lose track of the final objective.

  • Curiosity

    As a content moderator, you’re guaranteed to stumble onto items you didn’t even know existed. It’s important to stay curious and research the items to make sure they’re in the right category – or refuse them if they don’t meet the platform’s rules and guidelines.


Michele Panarosa

Michele is an Online Content Moderator Level 1 and has worked in this role for nine months. He previously worked as an IT technician. Michele is passionate about technology and videogames, and in his spare time he enjoys music, both singing and listening to it.

Is your site suffering from ‘marketplace leakage’? If so, it’s because your customers are sharing their personal details with each other – to avoid paying site fees. But by doing so, they also put themselves at risk. Here’s how to make sure your business protects itself – and those that use it – from marketplace leakage.

Marketplace leakage (also referred to as ‘breakage’) is a real problem for many online businesses. According to the venture capital firm Samaipata, the term can be defined as ‘what happens when a buyer and seller agree to circumvent the marketplace and continue transacting outside the platform.’

Broadly speaking, there are several ways in which personal details are shared – via listings, embedded in images, and within one-to-one chats. Information shared typically includes phone numbers, email addresses, WhatsApp details, and money transfer account details.

From a user perspective, it might make sense to try and do so. However, many don’t realize the wider ramifications of marketplace leakage and the negative impact it can have on the platforms they transact on – and on their own businesses.

Let’s look more closely at the impact of sharing personal details online via marketplaces and what can be done to prevent it.

How personal details do damage

As we see it, there are three key ways in which sharing personal details can have a negative impact.

1. Conversions

From eBay to Airbnb; Amazon to Fiverr – the vast majority of marketplaces facilitate the trade of goods and services. As a result, a core part of each platform is its payment infrastructure.

But not only do these solutions offer a trusted way for users to transact, they can also be used to collect fees – a percentage paid for using the platform.

In the early days of a platform’s existence, many sites may be available to both buyers and sellers for free – while the marketplace is trying to scale and get as many users as possible. However, once it’s reached a certain threshold and network effects are visible, it’s common for platforms to begin charging, often through the transaction.

This is often when users – primarily those selling on these sites – will try to circumvent the platform and include their contact details in each post. It might be that they paste their email address in the product description itself, or create an image that has details included within it.

When this occurs, your marketplace loses out on conversions. It’s something that’s easy to overlook and – on the odd occasion – let slide. But in the long term, activities like this will seriously dent your revenue generation.

2. Retention

One of the major differentiating factors between online marketplaces is whether they’re commoditized or non-commoditized – particularly where service-focused platforms are concerned.

While commoditized service providers are more about getting something specific fixed, delivered, or completed (think Uber or TaskRabbit), non-commoditized providers (e.g. Airbnb) take into account a number of determining factors – such as location, quality, and available amenities.

Due to the nature of these sorts of services, they are more likely to encourage personal interactions – particularly when repeat transactions with the same vendor are involved. Once trust and reliability are established, there’s little incentive for either party to remain loyal to the platform – meaning conversions are more likely to be forfeited.

Leakage of this nature was partly to blame for the demise of Homejoy – an on-demand home services recruitment platform. The nature of the work involved increased the likelihood of recurring transactions. However, it transpired that the features facilitated by the site – in-person contact, location proximity, and reliable workmanship – were of greater value than the incentives offered by using the site itself in many cases.

As a result, more and more transactions began happening outside of the marketplace, meaning that the site lost out on recurring revenues.

3. User safety

Losing control of the conversation and having users operate outside of your marketplace increases the risk of them being scammed.

This is particularly prevalent in online dating, where even experienced site users can be duped into providing their personal details to another ‘lonely heart’ in order to take the conversation in a ‘different direction’.

eHarmony offers some great advice on what users should be wary of, but the general rule of thumb is to never disclose personal details of any kind until a significant level of trust between users has been established.

While similar rules apply to online marketplace users too, some telltale signs of a scammer are requests for alternative payment methods – such as bank or money transfers, or even checks.

An urgency to trade outside of the marketplace itself is also a sign to be aware of. So it’s important to advise your users to be cautious of traders that share their personal details. Also, make a point of telling them to be wary of vendors who are ‘unable’ to speak directly to them – those who request funds before any arrangements have been made.

In all cases, marketplaces that don’t monitor and prevent this kind of activity put their customers at risk. And if their transaction is taken away from your site, they forfeit the protection and assurances your online marketplace provides.

But unless your users understand the value and security of your platform, they’ll continue to pursue conversations off your site and expose themselves to potential scammers.

Preventing marketplace leakage

The best way to overcome these issues and prevent marketplace leakage is to do all you can, as a marketplace owner, to keep buyer-seller conversations on your site, and to reinforce why it’s in your users’ (and to some extent your own) interest not to share personal details and to remain on your platform.

There are several ways to do this.

Stronger communication

The stronger the communication channels are within your platform, the less incentive there is for customers to navigate away from your site.

From eBay and Airbnb’s messaging functionality (which looks and feels like an email client) to one-to-one chat platforms (similar to Facebook Messenger or WhatsApp), or even on-site reviews and ratings – the more user-friendly and transparent you make conversations between different parties, the greater the likelihood they’ll remain on your site. This is a point we also highlighted and covered in our webinar about trust building through UX design.

In addition, it’s always worth reinforcing exactly what your marketplace offers users – and reminding them of their place within it. For example, telling them they’re helping build a trust-based peer-to-peer network is a powerful message – one that speaks to each user’s role as part of a like-minded online community.

Provide added value services

If users feel as though there’s no real value to using your site – other than to generate leads or make an occasional purchase – there’s very little chance that you’ll establish any meaningful connection.

The best way to foster user loyalty is to make using your marketplace a better experience than the alternative. In short, you need to give users a reason to remain on your site.

In addition to safety and security measures, consider incentives, benefits, and loyalty programs for both vendors and buyers.

Turo, the peer-to-peer car rental site, is an example of a company that does this very well – by offering insurance to lenders and travelers: both a perk and a security feature.

In a similar way, eBay’s money-back guarantee and Shieldpay’s ‘escrow’ payment service – which ensures all credible parties get paid, regardless of whether they’re buying or selling – demonstrate marketplaces acting in both their customers’ and their own interests.

Another way in which marketplaces offer better value is through the inclusion of back-end tools, which can help vendors optimize their sales. Consider OpenTable’s booking solution, for example. The restaurant reservation platform doesn’t just record bookings and show instant availability; it also helps its customers fill empty seats during quieter services.

Platforms that can see past their initial purpose and focus on their customers’ needs are those that thrive. They offer a holistic, integrated solution that addresses a wider range of pain points – a great way of ensuring customers remain loyal to your business, ultimately reducing leakage.

Filter and remove personal details

A relatively straightforward way to prevent marketplace leakages is to monitor and remove any personal details that are posted on your site.

However, this can become quite a task, especially as the amount of user-generated content increases.

The next logical step is to direct efforts towards improving your content moderation: either improve your manual moderation and expand your team, or look at setting up an automated moderation solution.

An automated filter is a great solution to help prevent personal details from being shared, and although the filter creation process can be complex, it’s definitely possible to create highly accurate filters that automatically detect and remove personal details in moderation tools like Implio.
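As a rough sketch of what such a filter can look like, the snippet below redacts email addresses and phone-number-like sequences with regular expressions. These patterns are deliberately simple illustrations; production filters in tools like Implio are more sophisticated and tuned per platform.

```python
# Minimal sketch of an automated personal-details filter. The regexes are
# illustrative, not exhaustive; real filters need per-market tuning.
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,}\d")  # loose phone-like match

def redact_personal_details(text: str) -> str:
    """Replace emails and phone-like sequences before a listing goes live."""
    text = EMAIL_RE.sub("[email removed]", text)
    text = PHONE_RE.sub("[phone removed]", text)
    return text

print(redact_personal_details("Call +44 7700 900123 or mail joe@example.com"))
```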

Machine learning AI is another great automated moderation solution that can help prevent the sharing of personal details, and much more. Built on your platform-specific data, a tailored AI moderation setup is developed to meet your marketplace’s unique needs. This is a great option for online marketplaces looking for a completely customized solution.
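To make the idea concrete, here is a minimal sketch of training a text classifier on labeled listings using scikit-learn. The tiny inline dataset and binary label are purely illustrative; a real setup would be trained on large volumes of platform-specific data with far richer labels.

```python
# Minimal sketch of an ML moderation classifier (pip install scikit-learn).
# The four example listings and labels below are made up for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

listings = [
    "iPhone 11 for sale, contact me at 555-0100",
    "Selling a barely used mountain bike, great condition",
    "Email me at cash4u@example.com for a quick loan",
    "Wooden dining table, pick-up only",
]
labels = [1, 0, 1, 0]  # 1 = shares personal details / violates policy

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(listings, labels)

print(model.predict(["Brand new sofa, message me on 555-0199"]))
```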

Added value and moderation – a mutual benefit

Trust, security, and accountability are the most valuable features that any marketplace or classifieds sites can offer its users. However, they’re not always the most visible components.

But when they’re part of a broader benefit – such as an optimized user experience or a suite of useful features – the need to share personal details and transact away from a site is mitigated.

That said, shared personal details will always contribute to marketplace leakage. And without the right monitoring and moderation processes in place, it’s impossible for marketplace owners to overcome that challenge.

At Besedo, we work with online marketplace and classified sites to help them make the right choices when it comes to safeguarding their businesses and users by removing personal details.

To learn more about how you can prevent personal details from being shared on your marketplace, specifically through automated filters, check out our on-demand Filter Creation Masterclass.