Our new Implio feature enhances the quality of the images shared on your platform. Your manual moderation team can now crop and rotate user-generated images quickly and efficiently in the moderation tool.
For online platforms like marketplaces and dating sites, creating a good user experience and a trustworthy environment is essential, and high-quality pictures play a crucial part in that. In our user research study, participants unanimously picked image quality as a reason to prefer one site over another.
Profile pictures and listing images are crucial for users to trust the person on the other side of the screen and the items they want to buy or sell. As a company, you want to create and maintain that trust for your users.
On dating sites or online marketplaces, the cropping and rotating feature helps you moderate pictures to comply with your company’s guidelines. For instance, cropping profile pictures so that only one person appears, or ensuring that the user’s face is distinctly visible. On top of this, images submitted upside down or from the wrong angle can easily be corrected with rotation.
The cropping and rotation feature in Implio helps you improve trust and user experience for both your sellers and buyers.
Here’s how the feature works:
Curious to learn more about our new feature?
Could you tell us a bit about yourself?
My name is Kevin Ducón from Bogotá, Colombia. I hold an MSc in Computer Science from Universidad Politécnica de Madrid and a BSc in Computer Science from Universidad Distrital de Bogotá.
I have been working in information and communications technology for more than fifteen years and began working at Besedo five years ago, specializing in IT Service Management and Information Security. I started as a local ICT Administrator in our Colombian center, then as an ICT Supervisor, and currently, I am the Global Head of ICT-IS (information and communications technology – information security).
Over the past five years, I have applied my knowledge and skills to this ever-changing industry by creating policies and processes aligned with the industry’s best practices, supporting our clients, and continuously improving our ongoing projects.
What are your responsibilities as Global Head of ICT-IS?
As the Global Head of ICT-IS at Besedo, I’m in charge of all levels of support in information technology and communications.
I oversee the Global ICT work, and together with my ICT team, I make sure that we fulfill our most important metrics – availability, service-level agreement, and customer satisfaction.
On top of that, I manage and provide insights into our security guidelines and develop strategic and operational plans for the ICT department to ensure that all necessary tools and processes are fully functional to achieve the company’s overarching goals and ambitions.
I also have hands-on technical responsibilities in supporting and developing mission-critical systems, which are running 24/7, to make sure our moderation services are successfully delivered to our customers worldwide.
From an ICT point of view, what are the key elements that must go right when running a content moderation operation?
The essential part from an ICT standpoint when running a content moderation operation is to truly understand the priorities and needs specific to the operation. Having an IT strategy to translate business needs into functioning IT operations is vital for a successful moderation setup.
Furthermore, ensuring good practices in network infrastructure and server setup, device management, and IT support is key to achieving a solid moderation operation. Finally, it’s crucial to have knowledgeable and committed IT staff behind the scenes.
What are the common things that can go wrong?
When running a moderation operation, many potential issues can occur. Some of the most common hazards include the Internet connection, network, or servers going down; power outages; and failed infrastructure deployments.
For instance, content moderation relies heavily on a stable Internet connection, and you cannot blindly trust that it will just work. Instead, you need to make sure that your Internet service always works to its full capacity.
What safety measures are needed to make sure the moderation operation runs smoothly?
It’s important to have proactive safety measures in place to guarantee that the moderation operation is always carried out correctly. A good first step is to plan the implementation of the moderation services thoroughly before putting disaster mitigation plans in place.
For example, at Besedo, we work with several Internet service providers in case one of those fails to deliver correctly. We also work with fault-tolerant networks, a resilient infrastructure, third-party support, etc., to ensure that our IT operations remain stable when potential risks materialize.
On top of this, we run daily IT checklists and use monitoring systems that allow us to prevent potential challenges during IT operations. We also have backup routines to avoid any information loss or damage and use uninterruptible power supplies (UPS) to keep our critical devices running.
All in all, for anyone looking to run a successful moderation operation, many countermeasures must be put in place to make sure that IT operations run smoothly.
What’s the best thing about your job?
My job allows me to work in the different areas of the ICT Function and with all the disciplines that contribute to the business. For some people, ICT only assists with end-user tickets because that’s what’s visible to them. However, IT is not just a commodity but a strategic ally for us to deliver the highest level of services to our customers.
I’m proud to apply my skill set and knowledge to Besedo’s purpose and values, which I genuinely believe in. When I took the role of Global Head of ICT-IS, I set out to implement our promise ‘grow with trust’ in everything we do in our team. This has shaped the ICT team’s goal to help all functions grow with trust, through efficient processes, guaranteed quality of services, and high customer satisfaction.
At Besedo, we have an excellent ICT team of committed and ambitious individuals who love what they do and work hard to improve the company every day.
Kevin Ducón is Besedo’s Global Head of ICT-IS. He has been working in information and communications technology for more than fifteen years. Over the past five years at Besedo, he has applied his knowledge and skills to the ever-changing content moderation industry.
Efficiency and accuracy are two of the most valuable KPIs online marketplaces track to evaluate their manual moderation performance. The key to an optimized manual moderation team is to find the right balance between efficiency and accuracy.
However, here’s the pitfall: if you push your moderators too hard to achieve efficiency, this can, in time, lessen their accuracy and jeopardize the quality of the content published on your marketplace. Low-quality content is likely to slip through the cracks, threatening the reputation of your platform, damaging user trust, and putting your users at risk, ranging from user experience issues to more serious threats such as identity theft or scams.
For your online marketplace to succeed and to keep potential issues at bay, it’s imperative to provide your moderation team with the right moderation tools to help them be as efficient and accurate as possible.
At Besedo, we are continually looking to improve our all-in-one content moderation tool, Implio, by adding features to ensure your content moderators perform at their best.
Whether it’s highlighting keywords, working with specialized moderation queues, enabling quick links or warning messages, many features available in Implio are created to ease your moderators’ daily work and improve their overall performance.
Keyboard shortcuts – efficient manual moderation
Implio’s brand new feature, keyboard shortcuts, helps your moderators make decisions with a single keystroke and navigate through listings without leaving their keyboard, making manual moderation both efficient and accurate.
From our initial tests, we found that keyboard shortcuts increased manual moderation efficiency by up to 40%, and we expect that number to grow as moderators become more familiar with the feature.
Here’s how the keyboard shortcuts work:
Ready to improve your moderation efficiency?
What is a content moderator? Why not ask one? We sat down with Michele Panarosa, Online Content Moderator Level 1 at Besedo, to learn more about a content moderator’s daily work, how to become one, and much more.
Hi Michele! Thank you for taking the time to sit down with us. Could you tell us a bit about yourself?
My name is Michele Panarosa, I’m 27 years old and I come from Bari, Puglia, Italy. I’ve been an online content moderator for nine months now, formerly an IT technician with a passion for technology and videogames. In my spare time, I like to sing and listen to music. I’m a shy person at first, but then I turn into an entertainer because I like to have a happy environment around me. They call me “Diva” for a good reason!
What is a content moderator?
A content moderator is responsible for user-generated content submitted to an online platform. The content moderator’s job is to make sure that items are placed in the right category, are free from scams, don’t include anything illegal, and much more.
How did you become a content moderator?
I became an online content moderator by training with a specialist during the first weeks of work, but it’s a never-ending learning curve. At first, I was scared of accidentally accepting fraudulent content, or of not doing my job properly. My teammates, along with my manager and team leaders, were nice and helped me throughout the entire process. As I kept on learning, I started to understand fraud trends and patterns. That helped me spot fraudulent content with ease, and I could confidently escalate items to second-line moderation agents, who made sure they got refused.
Communication is essential in this case. There are so many items I didn’t even know existed, which is an enriching experience. The world of content moderation is very dynamic, and it has so many interesting things to learn.
What’s great about working with content moderation?
The great part of content moderation is the mission behind it. The Internet can sometimes seem like a big, unsafe place where scammers are the rulers. I love this job because I get to make the world a better place by blocking content that’s not supposed to be online. It’s a blessing to be part of a mission where I can help others and feel good about what I do. Besides, it makes you feel important and adds that undercover aspect of a 007 agent.
How do you moderate content accurately and fast?
Speed and accuracy can go hand in hand, but you need to be focused and keep your eyes on the important parts of a listing. A single piece of information in a listing can be very revealing and tell you what your next step should be. On top of that, it’s crucial to stay updated on the latest fraud trends so you don’t fall into any traps. Some listings and users may appear very innocent, but it’s important to take each listing seriously, and it’s always better to slow down a bit before moving on to the next one.
What’s the most common type of content you refuse?
The most common type of item I refuse must be weapons – any kind of weapons. Some users try to make them seem harmless, but in reality, they’re not. It’s important to look at the listing images, and if the weapon is not shown in the image, we’ll simply gather more information about the item. Usually, users who want to sell weapons try to hide it by not using images and being very brief in their descriptions (sometimes no description at all). It’s our task, as content moderators, to collect more details and refuse the item if it turns out to be a weapon – even if it’s a soft air gun or used for sports.
What are the most important personal qualities needed to become a good content moderator?
The most important personal qualities needed to become a good content moderator are patience, integrity, and curiosity.
Moderating content is not always easy and sometimes it can be challenging to maintain a high pace while not jeopardizing accuracy. When faced with factors that might slow you down, it’s necessary to stay patient and not get distracted.
It’s all about work ethic, staying true to who you are and what you do. Always remember why you are moderating content, and don’t lose track of the final objective.
As a content moderator, you’re guaranteed to stumble onto items you didn’t even know existed. It’s important to stay curious and research the items, to make sure they’re in the right category, or should be refused – if the item doesn’t meet the platform’s rules and guidelines.
Michele is an Online Content Moderator Level 1 and has worked in this role for nine months. Previously he worked as an IT technician. Michele is passionate about technology and videogames, and in his spare time, he enjoys music, both singing and listening.
Is your site suffering from ‘marketplace leakage’? If so, it’s because your customers are sharing their personal details with each other – to avoid paying site fees. But by doing so, they also put themselves at risk. Here’s how to make sure your business protects itself from marketplace leakage and the users who engage in it.
Marketplace leakage (also referred to as ‘breakage’) is a real problem for many online businesses. According to the venture capital firm Samaipata, the term can be defined as ‘what happens when a buyer and seller agree to circumvent the marketplace and continue transacting outside the platform.’
Broadly speaking, there are several ways in which personal details are shared – via listings, embedded in images, and within one-to-one chats. Information shared typically includes phone numbers, email addresses, WhatsApp details, and money transfer account details.
From a user perspective, it might make sense to try and do so. However, many don’t realize the wider ramifications of marketplace leakage and the negative impact it can have on the platforms they transact on – and on their own businesses.
Let’s look more closely at the impact of sharing personal details online via marketplaces and what can be done to prevent it.
How personal details do damage
As we see it, there are 3 key ways in which sharing personal details can have a negative impact.
1. Lost transaction fees
From eBay to Airbnb; Amazon to Fiverr – the vast majority of marketplaces facilitate the trade of goods and services. As a result, a core part of each platform is its payment infrastructure.
But not only do these solutions offer a trusted way for users to transact, they can also be used to collect fees – a percentage paid for using the platform.
In the early days of a platform’s existence, many sites may be available to both buyers and sellers for free – whilst the marketplace is trying to scale and get as many users as possible. However, once it’s reached a certain threshold and network effects are visible, it’s common for them to begin charging, often through the transaction.
This is often when users – primarily those selling on these sites – will try to circumvent the platform and include their contact details in each post. It might be that they paste their email address in the product description itself, or create an image that has details included within it.
When this occurs, your marketplace loses out on conversions. It’s something that’s easy to overlook and – on the odd occasion – let slide. But in the long-term, activities like this will seriously dent your revenue generation.
2. Recurring transactions moving off-platform
One of the major differentiating factors between online marketplaces is whether they’re commoditized or non-commoditized – particularly where service-focused platforms are concerned.
While commoditized service providers are more about getting something specific fixed, delivered, or completed (think Uber or TaskRabbit); non-commoditized providers (e.g., Airbnb) take into account a number of determining factors – such as location, quality, and available amenities.
Due to the nature of these sorts of services, they are more likely to encourage personal interactions – particularly when repeat transactions with the same vendor are involved. Once trust and reliability are established, there’s little incentive for either party to remain loyal to the platform – meaning conversions are more likely to be forfeited.
Leakage of this nature was partly to blame for the demise of Homejoy – an on-demand home services recruitment platform. The nature of the work involved increased the likelihood of recurring transactions. However, it transpired that the features facilitated by the site – in-person contact, location proximity, and reliable workmanship – were of greater value than the incentives offered by using the site itself in many cases.
As a result, more and more transactions began happening outside of the marketplace; meaning that the site lost out on recurring revenues.
3. User safety
Losing control of the conversation and having users operate outside of your marketplace increases the risk of them being scammed.
This is particularly prevalent in online dating, where even experienced site users can be duped into providing their personal details to another ‘lonely heart’ in order to take the conversation in a ‘different direction’.
eHarmony offers some great advice on what users should be wary of, but the general rule of thumb is to never disclose personal details of any kind until a significant level of trust between users has been established.
While similar rules apply to online marketplace users too, some telltale signs of a scammer are requests for alternative payment methods – such as bank or money transfers, or even checks.
An urgency to trade outside of the marketplace itself is also a sign to be aware of. So it’s important to advise your users to be cautious of traders that share their personal details. Also, make a point of telling them to be wary of vendors who are ‘unable’ to speak directly to them – those who request funds before any arrangements have been made.
In all cases, marketplaces that don’t monitor and prevent this kind of activity put their customers at risk. And if their transaction is taken away from your site, they forfeit the protection and assurances your online marketplace provides.
But unless your users understand the value and security of your platform, they’ll continue to pursue conversations off your site and expose themselves to potential scammers.
Preventing marketplace leakage
The best way to overcome these issues and prevent marketplace leakage is to do all you can as a marketplace owner to keep buyer-seller conversations on your site and reinforce why it’s in their (and to some extent your) interest not to share personal details and remain on your platform.
There are several ways to do this.
The stronger the communication channels are within your platform, the less incentive there is for customers to navigate away from your site.
From eBay and Airbnb’s messaging functionality (which look and feel like email clients) to one-to-one chat platforms (similar to Facebook Messenger or WhatsApp), or even on-site reviews and ratings; the more user-friendly and transparent you make conversations between different parties, the greater the likelihood they’ll remain on your site. A point we also highlighted and covered in our webinar about trust building through UX design.
In addition, it’s always worth reinforcing exactly what your marketplace offers users – and reminding them of their place within it. For example, telling them they’re helping build a trust-based peer-to-peer network is a powerful message – one that speaks to each user’s role as part of a like-minded online community.
Provide added value services
If users feel as though there’s no real value to using your site – other than to generate leads or make an occasional purchase – there’s very little chance that you’ll establish any meaningful connection.
The best way to foster user loyalty is to make using your marketplace a better experience than the alternative. In short, you need to give them a reason to remain on your site.
In addition to safety and security measures – consider incentives, benefits, and loyalty programs for both vendors and buyers.
Turo, the peer-to-peer car rental site is an example of a company that does this very well – by offering insurance to lenders and travelers: both a perk and a security feature.
In a similar way, eBay’s money-back guarantee and Shieldpay’s ‘escrow’ payment service – which ensures all credible parties get paid; regardless of whether they’re buying or selling – demonstrate marketplaces acting in both customers and their own interests.
Another way in which marketplaces offer better value is through the inclusion of back-end tools, which can help vendors optimize their sales. Consider OpenTable’s booking solution, for example. The restaurant reservation platform doesn’t just record bookings and show instant availability; it also helps its customers fill empty seats during quieter services.
Platforms that can see past their initial purpose and focus on their customers’ needs are those that thrive. They offer a holistic, integrated solution that addresses a wider range of pain points – a great way of ensuring customers remain loyal to your business and, ultimately, of reducing leakage.
Filter and remove personal details
A relatively straightforward way to prevent marketplace leakages is to monitor and remove any personal details that are posted on your site.
However, this can become quite a task, especially as the amount of user-generated content increases.
The next logical step here would be to direct efforts towards improving your content moderation. Either improve your manual moderation and expand your team or look at setting up an automated moderation solution.
An automated filter is a great solution to help prevent personal details from being shared, and although the filter creation process can be complex, it’s definitely possible to create highly accurate filters to automatically detect and remove personal details in moderation tools like Implio.
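The exact filter syntax depends on the moderation tool, but the underlying idea can be sketched in Python. The patterns below are illustrative assumptions, not Implio’s actual rules; production filters need locale-aware patterns and ongoing tuning.

```python
import re

# Illustrative patterns only; real filters cover many more formats.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    # Sequences of 8+ digits, optionally separated by spaces, dots, or dashes.
    "phone": re.compile(r"(?:\+?\d[\s().-]?){7,14}\d"),
}

def find_personal_details(text: str) -> dict:
    """Return the types of personal details detected in a piece of content."""
    return {name: pattern.findall(text)
            for name, pattern in PATTERNS.items()
            if pattern.search(text)}

listing = "Great bike for sale! Call +44 7700 900123 or mail bob@example.com"
hits = find_personal_details(listing)
# A non-empty result would route the listing to removal or manual review.
```

In a real setup, a match like this would trigger a moderation action (reject, edit out the detail, or escalate) rather than just being reported.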
Machine learning AI is another great automated moderation solution that helps prevent personal details from being shared, and much more. Built on your platform-specific data, a tailored AI moderation setup is developed to meet your marketplace’s unique needs. This solution is a great option for online marketplaces looking for a fully customized solution.
Added value and moderation – a mutual benefit
Trust, security, and accountability are the most valuable features that any marketplace or classifieds sites can offer its users. However, they’re not always the most visible components.
But when they’re part of a broader benefit – such as an optimized user experience or a suite of useful features – the need to share personal details and transact away from a site is mitigated.
That said, shared personal details will always contribute to marketplace leakage. And without the right monitoring and moderation processes in place, it’s impossible for marketplace owners to overcome the challenge of marketplace leakage.
At Besedo, we work with online marketplace and classified sites to help them make the right choices when it comes to safeguarding their businesses and users by removing personal details.
To learn more about how you can keep personal details off your marketplace, specifically through automated filters, check out our on-demand Filter Creation Masterclass.
If user-generated content is the lifeblood, then the portal is the spine of any successful marketplace. Without a quality portal, your entire business will quickly collapse.
Developing and maintaining the framework of your marketplace can be hard and expensive work that takes focus away from other critical business areas like monetization strategies and product evolution.
Luckily there are a lot of great partners out there providing platform solutions. Russmedia Solutions is one of the most experienced companies in the business, having completed more than 100 successful projects.
We had a chat with their CRO Adrian Daniels about common challenges, pitfalls and tips for marketplace owners setting up their portal.
What does Russmedia do?
RussMedia Solutions is an international company that offers software services for online businesses. Our solutions include Job Board Software and other specific solutions for Real Estate Platforms, Car Portals and other online classifieds. We apply a structured approach to development, supported by dedicated project managers and product owners. Today our expertise includes digital marketing, user experience, SEO and conversion rate optimization.
Some of our key features include semantic search, matching, and classification based on machine learning, artificial intelligence, great filtering capabilities, user alerts plus very powerful analytics.
We began as an internal software development department of the Russmedia group, but soon after, we started to develop online portal solutions not only for our internal products but also for external partners who began requesting our solutions. We also took on the development and maintenance of news, job, car, and real estate portals. Since then it’s been 15 years and more than 100 successful projects.
Why should marketplace founders partner with Russmedia to build their marketplace instead of using an in-house team?
We’ve built many projects from scratch – first our own portals and then many for our partners. When you have an in-house team, the accumulated experience of your team is rarely that rich and diverse. Finding a solution outside your company might very often prove more effective, not only cost-wise but also from a human resources management perspective. You don’t have a handful of people specialized in very specific technologies, but an entire company at your disposal with a lot of experience and expertise in multiple areas. You have support around the clock, and you also get to be part of a community.
Having a company that offers software as a service also saves on the cost of developing new features or functionality. Many times we develop these and then offer them to our clients either as part of their subscription or at a better cost than if they had developed them in-house.
You’ve recently joined us in a webinar around migrating from one marketplace tech to another, could you give us one example of pitfalls you’ve seen marketplace owners fall into when going through this process?
For sure, we have assisted with and conducted many migration processes – some very successful, some with many challenges. From my experience, there is a huge risk if planning has not been done thoroughly. Transparency and setting up clear goals are crucial, and unfortunately, it happens that some of these checkpoints are not completed.
Here is a very short list:
- Schedule proper training for all users of the new tool
- Make a test migration with live data – it’s important to test with live data, to measure migration timings and find possible data issues before the actual migration. Once the data is there, more complete tests can be performed.
- Make sure all involved parties are present on migration date
What are some of the most common challenges marketplaces have when reaching out to Russmedia?
There are multiple situations, but 3 of the most common are:
- Undersized infrastructure – companies that did not plan their growth in advance and went for a lower-budget solution that is not scalable, and then end up needing to migrate to a different solution for this very reason.
- The platform they currently use is not flexible enough – We often have prospects that tell us their current solution does not support multiple languages or cannot be integrated with certain apps or simply some desired design upgrades are not possible on the current platform
- Start-up in need of a solution
There are many vendors out there that can help build a marketplace. What sets Russmedia apart?
Indeed, many great companies and many good solutions are out there. I believe our key differentiator is the fact that our platforms are highly customizable. We are very flexible when it comes to integration or any other custom requests. We developed various projects for our clients from classic real-estate portals or general job portals to jobs in aviation or a marketplace for horses. We can adapt our platforms to nearly any language.
Last, but not least our partners become part of a community. We take on the innovation of new features, so the client no longer has to worry about this. Sometimes we give them new features that they did not even consider developing, but which end up being a great revenue stream for them.
Any examples of past projects that you’re really proud of? What made that project, in particular, a success?
We are proud of all our projects. Vol.at, CVOnline.hu, Laendleimmo.at, Simplysalesjobs.co.uk, Laendlejob.at, Immo.tt.com, Jobs.tt.com, just to name a few. If we had to pick one, maybe Laendlejob.at. It is also part of our group, it looks great, it works great, and it is highly profitable.
You’ve just recently partnered with Besedo. How does that fit into your strategy? Does it allow you to provide an even better or more complete offering?
Absolutely. Besedo has great tools. AI-powered moderation tools are something that every marketplace should have. This partnership allows us to give more to our clients and it takes the pressure off our team as we no longer need to invest in research to develop something like this.
Last but not least Besedo’s solutions, like ours, are tailor-made so every client gets their own customized solution.
If you could give one piece of advice to those setting up a marketplace portal what would it be?
Plan. Make some plans and when you are done revise those plans. You also need to make sure you plan for the long term. Make sure you take into consideration at least the first three years of your existence and make sure that when you build your business it will be scalable.
This should also reflect on the infrastructure you are setting up. It might seem cheaper to get an off-the-shelf solution at first, but after a few months you might realize it wasn’t a great idea: higher costs and more effort are involved in migrating, scouting for a different platform, and all the back and forth. You lose time and money.
About Adrian Daniel
Adrian is Chief Revenue Officer at Russmedia Solutions. He’s worked his way up in the company, so he knows the business inside out. He started as a junior programmer at Russmedia more than nine years ago and believes that technology is here to help us.
He mostly enjoys the fact that technology can create endless possibilities to help businesses thrive. Classifieds portals and their growth have been his focus for nearly a decade, and now, as CRO, he is looking at all the potential that Russmedia’s solutions can offer partners.
He is highly motivated and believes that any problem has a solution.
It’s 2018 and AI is everywhere. Every company and their grandmom are now offering AI-powered solutions. With so many options to pick from, does it really matter who you partner with for AI moderation?
When we started building our AI for moderation in 2008, machine learning had hardly been applied to content moderation. Since then others have understood the value automation brings in keeping marketplace users safe.
Every time we go to a tradeshow or conference we see new companies with AI offers and we understand that as the market gets more saturated it can be hard to decide which vendor to bet on.
To help you navigate the AI jungle we wanted to highlight some very specific areas where our AI is unique in the market.
A lot of AI models work based on a sliding scale and the output you get is a probability score. The score gives you a picture of how likely the content piece is to be whatever the algorithm is looking for. So if a content piece receives a high probability score from a model looking to detect unwanted content, there’s a good chance that the content piece falls into that category.
However, a scoring system is often arbitrary. When should you reject an item as a scam? When the probability score is 100%? 99%? Or is 85% good enough?
Our AI doesn’t operate this way. We want to provide our clients with clear answers that they can apply straight away. As such, we do not send back an abstract score; instead, our algorithm provides a concrete answer.
We operate with three distinct answers that are easy to apply a moderation action to. The three values we expose are OK, NOK (not okay), and uncertain.
Let’s use the unwanted content model as an example. Our algorithms will look at the content and determine whether it’s unwanted content. If it is, the model will return “NOK” and you should reject the content piece; if it isn’t, you will get “OK” back and you can accept it. If the model isn’t sure, it will send back “Uncertain”. This doesn’t happen often, but when it does you should send the content for manual review.
That’s how simple it is. There’s no grey zone, only clear actionable answers to each content piece you run through the model.
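The three-value scheme described above can be sketched in a few lines. This is a hypothetical illustration, not Implio's actual API: the verdict names (OK, NOK, Uncertain) come from the article, while the function and action names are invented for the example.

```python
# Map a three-value AI verdict to a moderation action.
# Verdict names are from the article; everything else is illustrative.

def route(verdict: str) -> str:
    """Return the moderation action for a given AI verdict."""
    actions = {
        "OK": "accept",                # good content, publish it
        "NOK": "reject",               # unwanted content, block it
        "UNCERTAIN": "manual_review",  # rare case: send to a human moderator
    }
    try:
        return actions[verdict.upper()]
    except KeyError:
        raise ValueError(f"unknown verdict: {verdict!r}")

print(route("NOK"))        # -> reject
print(route("Uncertain"))  # -> manual_review
```

Because every verdict maps to exactly one action, there is no threshold to tune and no grey zone to argue about, which is the point the article makes about scores versus answers.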
A holistic AI approach
We believe that the value of AI is often mistakenly judged on the accuracy of the models alone. The reality is that it’s more complex than that. To explain why, we need to get a bit technical and quickly outline some AI terminology. (If you are interested, you can read more about the basic concepts of AI moderation here.)
When evaluating an AI there are multiple KPIs you can look at; accuracy is just one of them. To determine if our AI is performing to our standards, we look at a wide array of metrics. We can’t cover them all in this article, but here are some of the most important ones.
Precision is a number that describes how often the model’s predictions were actually correct. If there are 100 content pieces and the machine determines 10 of them to be unwanted content, but only 8 of them are actually unwanted content, then the model has a precision of 80%.
Recall shows how many of the actual unwanted content pieces the algorithm correctly identifies. If we go back to our example with 100 content pieces: the AI correctly identified 8 unwanted content pieces out of the 100, but there were 16 unwanted content pieces in total. In this case, the recall of the model is 50%, as it only found half of the unwanted content present.
Accuracy describes the share of all decisions the model gets correct. If we have 100 content pieces and 16 of them are unwanted content, the accuracy of the model will be negatively impacted both by the unwanted content it fails to identify and by any good content it wrongly identifies as bad.
This means that if, out of 100 content pieces, a model correctly identified 8 unwanted content pieces when there were 16 present, and it wrongly identified 2 good content pieces as unwanted, the model would have an accuracy of 90%.
Automation rate is a way to measure exactly how much of the total content volume is being handled by AI. If you have 100,000 content pieces per day and 80,000 of them are dealt with by the models, then you have an automation rate of 80%.
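The worked example above can be checked with a short calculation. This sketch uses the article's numbers (100 items, 16 truly unwanted, 10 flagged of which 8 are correct); the function name and structure are illustrative.

```python
# Compute precision, recall, and accuracy from a confusion-matrix-style
# breakdown: tp = true positives, fp = false positives, fn = false negatives.

def moderation_metrics(tp: int, fp: int, fn: int, total: int):
    tn = total - tp - fp - fn        # good content correctly left alone
    precision = tp / (tp + fp)       # how often a "flagged" call is right
    recall = tp / (tp + fn)          # how much of the bad content was found
    accuracy = (tp + tn) / total     # share of all decisions that are correct
    return precision, recall, accuracy

# Article's example: 8 unwanted pieces caught, 2 good pieces wrongly flagged,
# 8 unwanted pieces missed, out of 100 total.
p, r, a = moderation_metrics(tp=8, fp=2, fn=8, total=100)
print(p, r, a)  # 0.8 0.5 0.9  (80% precision, 50% recall, 90% accuracy)

# Automation rate: share of total volume handled by the models.
automated, total_volume = 80_000, 100_000
print(automated / total_volume)  # 0.8
```

Note how the same run of the model scores very differently on the three metrics, which is exactly why no single number tells the whole story.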
When judging how well an AI works, we believe the judgment needs to be based on how it performs across all four of these metrics, as that will give you a truer picture of how well the AI is dealing with your content challenges.
You can never have perfect accuracy, precision, recall, and automation at the same time. Our AI is unique in that it is calibrated to meet your business objectives and to find the right balance between all of these indicators.
Supervised and continuous learning
Machine learning models can be taught in different ways and the way they are taught has a huge impact on how well they perform.
Our AI is trained on structured and labeled data of high quality. What this means is that the data sets we train our models on have been reviewed manually by expert content moderators who have taken a yes or no decision on every single piece of content.
We also update the models regularly, ensuring that they stay current and adhere to new rules and to global changes or events that could impact moderation decisions.
A calibrated solution
One of the benefits of designing our AI with an eye on multiple metrics is that we can tailor-make a solution to ensure the perfect fit for your business.
We have multiple levers we can pull to adjust the output allowing us to tweak accuracy and automation ensuring that everything is calibrated as your business requires.
With our solution the accuracy and degree of automation are elastic and that makes our AI setup much more flexible than other available options.
Adaptive AI Solution
One of the few drawbacks of Machine Learning is that it’s rigid and static. To change the model, you need to retrain it with a quality dataset. This makes it hard for most AI setups to deal with sudden changes in policies.
We’ve solved this problem by deeply integrating our AI into our content moderation tool, Implio. Implio has a powerful filter feature which adds flexibility to the solution, so you can quickly adapt to change.
For example, when a new iPhone comes out, the AI models will not pick up the new scams until they have been trained on a new dataset that includes them, but you can add filters in Implio until there’s time to retrain the machine learning models. The same is true for other temporary events like the Olympic Games or global disasters, except that these are over so quickly that it’s likely not feasible to update the models. Instead, you can add Implio filters that ensure high accuracy even during periods with special moderation demands.
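The kind of stopgap filter described above can be sketched as a simple rule: catch listings that mention a just-released product at a too-good-to-be-true price and hold them for manual review until the models catch up. Everything here is hypothetical, including the product name and price threshold; Implio's real filters are configured in the tool, not written in Python.

```python
import re

# Hypothetical stopgap rule for a newly released phone model.
# The pattern and the 300 price threshold are illustrative assumptions.
SUSPICIOUS_PRODUCT = re.compile(r"iphone\s*15", re.IGNORECASE)

def stopgap_filter(title: str, price: float) -> str:
    """Hold likely scams for review; pass everything else to the trained models."""
    if SUSPICIOUS_PRODUCT.search(title) and price < 300:
        return "manual_review"  # new flagship far below market price: likely scam
    return "pass_to_ai"         # let the existing models decide

print(stopgap_filter("iPhone 15 Pro, sealed box!", 150.0))  # -> manual_review
print(stopgap_filter("Used bicycle, good condition", 80.0))  # -> pass_to_ai
```

Once the models have been retrained on data that includes the new scam pattern, a rule like this can simply be removed.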
In addition, we have a team dedicated to studying moderation trends and best practices and all our AI customers benefit from their knowledge and our 16 years of experience to support and guide them.
ML Tailored to content moderation
Most of the AI solutions on the market were created to solve a general problem that occurs in multiple industries. This means that the AI works okay for most companies, but it’s never a perfect fit.
We took the other route and dedicated our efforts to creating an AI that’s perfect for content moderation.
When we develop our AI we do it based on the 16 years of experience we have helping companies of all sizes keep their users safe and the quality of their user-generated content high. That has made our stack uniquely tailored to content moderation ensuring unparalleled results in our field.
We also have a team of experts supporting our AI developers with insights, internal learnings from moderating global sites of all sizes and research into industry trends and the challenges faced by online marketplaces and classifieds in particular.
Our research team feeds their insights to Besedo as a whole, ensuring a high level of expertise at every level of our organization, from moderation agents to managers and developers. This ensures that our experience and expertise are infused into all our services and products.
Get an AI solution that fits your needs
There is no question about it, AI will play a huge role in marketplace growth over the next couple of years. However, to truly benefit from machine learning, make sure you get models that will work well for you.
We often talk to marketplace owners who have become slightly disillusioned after testing AI solutions that weren’t properly calibrated for their business. They have wasted time implementing a solution that didn’t solve their issue in a proper way and now they are wary of AI as a whole.
That’s a shame; when applied correctly, AI is a great money saver and provides other benefits like fast time to site and user privacy protection.
To avoid spending money on the wrong AI, have a chat with our solution designers and they will give you a good idea of which setup would work for you and the results you can expect. Together you can tailor a solution that fits your exact needs.
Working with user-generated content moderation is not an easy task. Moderators need to be able to spot the slightest details to find fraud, scams, counterfeits, and more. Therefore, it’s important to provide your manual workforce with the best possible conditions to work efficiently. To help simplify work for both you and your moderators, we now introduce multiple queues in our all-in-one content moderation tool, Implio. This new feature will help you streamline your manual moderation setup and the day-to-day work for your moderators.
How do multiple queues help my site?
We’ve implemented multiple queues to be as flexible as possible. This means that you can decide how many different queues to create, edit their functions and names, or delete them whenever you want. Make sure to customize your queues so that daily operations run as smoothly as possible.
There are numerous ways multiple queues can be of use to your online marketplace. One way is to create a queue per language supported by your site. Utilize geolocation, available in Implio, to ensure that content is automatically placed in the correct queue, making it easier for your moderation teams to specialize and work with one language only.
This use of multiple queues is very valuable to multi-language marketplaces, but our new Implio feature can also help marketplaces that only support one language. Multiple queues can, for instance, be set up to automatically sort content based on price, category, or risk and funnel it into different queues, allowing you to direct it to specific expert teams or agents.
You can also create queues for items that are time-sensitive and need a shorter SLA; for example, funnel flagged content or new users into individual queues.
In Implio, we always have two predetermined queues, one default queue and one escalated queue. Your moderators can easily select which queue to work in from the manual interface. When working in a specific queue, your moderators can escalate an item to a supervisor at any time or send the item to another queue.
Multiple queues help you enable specialized moderation teams, which will simplify your moderators’ day-to-day work and make your overall moderation setup more effective.
How does it work?
Begin by creating a new queue in Implio. Then navigate to automation and create a new rule. Set the rule to send matching content to the queue you just created. Here’s what a language queue setup looks like:
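The routing logic behind such a rule can be sketched as follows. This is a minimal illustration of the idea, assuming a detected-language field on each item; the queue names and field names are invented, and Implio's actual rules are configured in its interface rather than in code.

```python
# Hypothetical sketch of a language-based queue-routing rule.
# Field and queue names are illustrative, not Implio's real schema.

LANGUAGE_QUEUES = {
    "en": "english-queue",
    "fr": "french-queue",
    "es": "spanish-queue",
}
DEFAULT_QUEUE = "default"  # Implio always has a default queue

def assign_queue(item: dict) -> str:
    """Route an item to the queue matching its detected language."""
    lang = item.get("detected_language", "")
    return LANGUAGE_QUEUES.get(lang, DEFAULT_QUEUE)

print(assign_queue({"detected_language": "fr"}))  # -> french-queue
print(assign_queue({"detected_language": "de"}))  # -> default (no match)
```

The same shape works for the other routing criteria mentioned earlier: swap the language lookup for a price band, category, or risk score and the rule funnels content to the matching specialist queue.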
Try it out yourself.
Create your very own account in Implio; it’s free to use for up to 10,000 items per month. Follow the steps above to set up your unique queue. Make sure to use the CSV importer so you can test multiple queues, and all the other features available in Implio, with your very own data.
If you want to learn more about multiple queues and Implio, get in touch with one of our content moderation experts.
Every feature we include in Implio has been carefully chosen based on feedback from stakeholders (internal and external) and after careful analysis of current and future needs within the industry (read more about how we plan our roadmap). As such it is always exciting when we launch something new since we know it is anticipated by our users and will increase their efficiency and quality of life when working in our tool.
Our developers work hard to ensure regular updates and feature additions to Implio. Here are the biggest improvements we released in 2017.
Debuting almost an entire year ago, this particular feature helps manual moderators create a customized UI template in Implio. This allows users to display the necessary moderation information whichever way suits them best. For example, they could configure the layout to prioritize the image shown, user details, customer data, and moderation feedback – among other information.
Our second big feature of last year was the new Implio search tool. Never underestimate the power, speed, and usefulness of a good search function! The always-visible search bar is found at the top of each page within Implio. Users can search by keywords, exact quotes, and specific contact information – including email addresses and phone numbers.
The results can be ordered by relevance, newest first, oldest first; and displayed as a list or using images. We think this feature is going to be particularly useful for moderators as they review posts, or monitor accounts and items coming into Implio.
In May we launched Implio’s updated manual interface. It was the culmination of months of hard work from our developers; especially our front-end team.
We spent a lot of time performing usability tests and getting client feedback; fine-tuning the new interface to make sure it benefits everyone.
Key improvements added to this version include:
- Data is organized to follow the API’s structure, to make things more consistent.
- Revisions of a single item are grouped together so that the moderator only reviews the latest version and can disregard previous ones.
- Content can be edited directly within the page. Plus type and category can be changed using a simple drop-down menu.
- A status bar helps you track your progress on the page.
- It’s also much easier (for a developer) to configure a number of settings for each client. These include the number of items displayed per page, and the ability to enable or disable pre-approved items in the queue.
Our fourth biggest Implio feature involved the rollout of different user role permissions. Each user role now comes with a specific list of permissions, allowing admins, automation specialists, moderators, and spectators full or restricted access to certain functionalities. As you’d expect, admins have the greatest level of authority, but being able to manage rules and search items will undoubtedly make moderators’ jobs a lot easier.
Our final feature for 2017 launched just before Christmas: our geolocation filter, which we’ve covered in a dedicated blog post.
Essentially it’s used to detect inconsistencies between where users say they’re based, and where their IP address actually shows them to be; ideal for helping online marketplace owners protect their users from scammers.
Geolocation is fully integrated into Implio and is visible in the manual moderation interface. However, users can also create their own rules, helping them quickly compare information, making it easier for moderators to detect discrepancies.
So… what does 2018 hold? Don’t worry, there’s a whole lot more where these came from! We already have a number of features, functions, and updates planned for the next 12 months.
Watch. This. Space.
Sue Scheff is an author, parent advocate and cyber advocate who is promoting awareness of cyberbullying and other online issues. She is the author of three books, Wit’s End, Shame Nation and Google Bomb.
We had the opportunity to conduct an interview with her, in which she talked about victims’ experiences of online sexual harassment and online shaming, and shared her opinion on what sites can do to help fight the problem.
Interviewer: Hi Sue, thanks a lot for taking the time to share your knowledge, I know you are extremely busy! You’re the author of Shame Nation and Google Bomb, what were you hoping to achieve by releasing them?
Sue Scheff: Awareness. Most importantly, giving a voice to the voiceless.
After I wrote Google Bomb, I was stunned by the outpouring of people from all walks of life – from all over the world – who contacted me with their stories of Internet defamation, shaming, and harassment. People were silently suffering from cyber-bullets, like myself, many on the verge of financial ruin, and all emotionally struggling.
Google Bomb was the roadmap to helping people know there are legal ramifications and consequences of online behavior.
By 2012, I was taken aback by the constant headlines of bullycide. Names like Tyler Clementi, Amanda Todd, Rebecca Sedwick, Audrie Potts – I knew how they felt – like there was no escaping this dark hole of cyber-humiliation. At 40 years old, when this happened to me, I had the maturity to know it would eventually get better. These young people don’t.
Google Bomb was the book to help people understand their legal rights, but with the rise of incivility online, Shame Nation needed to be written to help people know they can survive digital-embarrassment, revenge porn, sextortion and other forms of online hate. I packed this book with over 25 contributors and experts from around the world – to share their first-hand stories to help readers know they can overcome digital disaster. I also include digital wisdom for online safety and survival.
Interviewer: You’re a victim of online harassment and won a landmark case of internet defamation and invasion of privacy. Can you please try to explain your experience?
Sue Scheff: In 2003, I was attacked online by what I refer to as a disgruntled client, definitely a woman that didn’t like me. Once she started her attack, the gang-like mentality of trolls joined in. These trolls and this woman created a smear campaign that took an evil twist: calling me a child abuser, saying I kidnap kids, exploit families, am a crook, and more. Things went towards the sexual side when they claimed to be auctioning my panties (of course they never met me – or had anything of mine), but to anyone reading this, how do you explain that these are malicious trolls out to destroy me?
As an educational consultant, I help families with at-risk teens find residential treatment centers. These online insults nearly destroyed me. I ended up having to close my office, hire an attorney and fight.
By 2006 I was both emotionally and financially crippled. In September 2006 I won the landmark case in Florida for Internet defamation and invasion of privacy, with an $11.3M jury verdict. Lady Justice cleared my name, but the Internet never forgets. Fortunately for me, the first online reputation management company opened its doors that summer. I was one of their first clients. To this day I say my lawyer vindicated me – but it’s ORM that gave me my life back.
Interviewer: You’ve also met many other victims of online harassment, online shaming, revenge porn etc. How are victims affected, both in short and long-term?
Sue Scheff: Trust and resilience.
I’ve spoken to many victims of online hate. The most common theme I hear is the lack of trust we (they) have of others (both online and offline) initially. With me, I know I became very isolated and reserved. My circle of trusted friends became extremely small; the fact is, no one understands this pain unless they have walked in your shoes. When researching Shame Nation, others expressed feeling the same way.
The good news is, with time we learn to rebuild our trust in humanity through our own resilience. This doesn’t happen overnight. It’s about acceptance – understanding that the shame doesn’t define you and it’s your opportunity to redefine yourself.
The survivors you will read about in Shame Nation have inspiring stories of hope. They all learned to redefine themselves – out of negative experiences. It’s what I did – and realized that many others have done the same.
Interviewer: Where do you see the biggest risk of being exposed to online sexual harassment?
Sue Scheff: Online reputation and emotional distress.
Today, the majority of businesses and universities will use the Internet to search your name prior to “interviewing” you. How your name survives a Google rinse cycle will dictate your financial future, career- or job-wise.
Just because you have a job – doesn’t mean you’re out of hot water. More than 80% of companies have social media policies in place. If your name is involved in sexual misconduct (scandal) online – you could risk losing your job. Colleges are also implementing these social media policies.
Pew Research says the most common way for adults to meet is online. If you’re a victim of cyber-shame, online sexual harassment, revenge porn, or sextortion, this content could hinder your chances of meeting your soul mate.
The emotional distress is overwhelming. You feel powerless and hopeless. Thankfully today there are resources you can turn to for help.
Interviewer: Do you think this issue is growing or are we any closer to solving it?
Sue Scheff: Yes… and no.
In a 2017 Pew survey, over 80% of researchers predicted that online harassment will get worse over the next decade – this includes revenge porn and sexual harassment. This is a man-made disaster, and it can only be remedied by each of us taking responsibility for our actions online and educating others. Education is the key to prevention. I believe the #MeToo and Time’s Up movements have brought more awareness to this topic, but I fear not enough is being done about it in the online world. It’s too easy to use a keypad as a lethal weapon.
The good news is that we are seeing stronger revenge porn laws being put in place, and more social platforms are responding by removing content when it is flagged as abusive. Years ago, we didn’t have this; though progress may be slow, it’s moving in the right direction.
Interviewer: What would be your advice to internet users today on how to avoid, prevent and fight harassment?
Sue Scheff: Digital wisdom.
I’m frequently asked, “How can I safely sext my partner?” I give the same answer every time. The Internet and social media were not and are not intended for privacy. We only have to think of the Sony email hacking or the Ashley Madison leaks to know that no one is immune to having their private habits exposed to the world wide web. You should have zero expectation of privacy when sending any sexual message via text or otherwise. Several studies concur: a majority of adults will share personal and private messages and images of their partner without their partner’s consent.
Your friend today could quickly turn into a foe tomorrow. Divorce rates are climbing; what used to be offline revenge, like charging up your ex’s credit cards, now has longer-term consequences when your nudes or other compromising images or content can go viral. E-venge (such as revenge porn) is how exes will take out their anger. Don’t give them that power.
If you find you are a victim of online harassment or online hate, report it and flag it to the social platform. Be sure to fill out a form outlining how the content violates their code of conduct, and email them professionally (never use profanity or a harsh tone).
I encourage victims not to engage with the harasser. Be sure to screenshot the content – then block them. If you feel this is a case that will get worse and it needs to be monitored, you can ask a friend to monitor it for you so you don’t have to be emotionally drained from it. I also tell the friend not to engage – and to let you know if it gets to a point that it may need legal attention – that your life is in danger or your business is suffering.
Interviewer: What is your opinion on what sites can do to help fight this problem?
Sue Scheff: In a perfect world – we would say stricter consequences offline for the perpetrators – which would hinder them from doing this online in the first place.
Strengthen the gatekeepers: user-friendlier reporting and a speedier response time.
Although sites such as Facebook, Twitter, and Instagram are stepping up and want to alleviate online harassment, many people still struggle to figure out the reporting methods, and especially with the poor response times. Where are the forms? And after reporting, the response time can be troubling, from what victims have shared with me. When you’re a victim of sexual harassment, these posts are extremely concerning; every minute feels like a year.
I personally had a good experience on Facebook – when I wrote about a cyber-stalker on my public page. It was addressed and handled within 48 hours.
Systems should be in place so that if a comment or image is flagged as abusive (harassment) by more than 3-5 unique visitors, it is taken down until it can be investigated by the social platform’s team. I think we can all relate to the fact that the volume of online abuse reported daily likely overwhelms social media platforms; however, I believe they should give us the benefit of the doubt until they can investigate our complaint.
Interviewer: What do you think about the idea of using computer vision (AI) to spot and block nude pictures before they are submitted on a dating site?
Sue Scheff: If dating sites were able to implement AI for suspicious content, it would be a great start to cutting back on sexual harassment and keeping users safer.
Interviewer: Where can victims turn for support?
Are you a victim of online sexual harassment or cyberbullying?
Please heed Sue’s advice and reach out for support.
Are you a site looking to help in the fight?
Contact us to see how AI and content moderation can help keep your users safe.
Sue Scheff is a Nationally Recognized Author, Parent Advocate and Internet Safety Advocate. She founded Parents Universal Resources Experts, Inc. in 2001.
She has 3 published books, Wit’s End, Google Bomb and her latest, Shame Nation: The Global Epidemic of Online Hate with a foreword by Monica Lewinsky.
Sue Scheff is a contributor to Psychology Today, HuffPost, Dr. Greene, Stop Medicine Abuse, EducationNation, and others. She has been featured on ABC 20/20, CNN, Fox News, Anderson Cooper, Nightly News with Katie Couric, the Rachael Ray Show, Dr. Phil, and more. Scheff has also appeared in USA Today, the LA Times, the New York Times, the Washington Post, the Wall Street Journal, and AARP, just to name a few.