The Christmas season is here, and as the festivities kick off, online retailers hold their breath to see whether all of the preparations they have diligently made will pay off in revenue and sales during this ‘Golden Quarter.’ Will the website be able to handle the extra demand? Will all orders ship before Christmas?
Yet the National Cyber Security Centre (NCSC) has highlighted another pressing concern, one that can have a lasting impact on revenue. Last week it launched a major awareness campaign, Cyber Aware, warning shoppers of an increase in fraud on online platforms this year. Millions of pounds are stolen from customers through fraud every year – including £13.5m from November 2019 to the end of January 2020, according to the National Fraud Intelligence Bureau.
Fraud is a major concern for marketplaces, which are well aware of the trust and reputational damage that such nefarious characters can inflict on their platforms. While consumer awareness and education can help, marketplaces know that keeping only one eye on the ball when it comes to fraud, especially within User Generated Content (UGC), is not enough. Fraudulent activity deserves full attention and careful monitoring. Tackling fraud is not a one-off activity but a commitment to constant, consistent, rigorous, quality moderation in which learnings are continuously applied for the ongoing safety of the community.
With that in mind, our certified moderators investigated nearly three thousand listings of popular items across six popular UK online marketplaces to understand whether marketplaces have content moderation pinned down, or whether fraudulent activity is still slipping through the net. After conducting the analysis during the month of November, including the busy Black Friday and Cyber Monday shopping weekend, we found that:
· 15% of items reviewed showed signs of being fraudulent or dangerous; this rose to 19% on Black Friday and Cyber Monday
· Pets and popular consumer electronics are particular areas of concern, with 22% of PlayStation 5 listings likely to be scams, rising to more than a third of PS5 listings being flagged over the Black Friday weekend
· 19% of listings on marketplaces for the iPhone 12 were also found to show signs of being scams
· Counterfeit fashion items are also rife on popular UK marketplaces, with 15% of listings found to be counterfeits.
The research demonstrates that, even after whatever filtering and user protection measures marketplaces have in place, a significant number of the products for sale leave customers open to having their personal details stolen or receiving counterfeit goods. We know that many large marketplaces already have a solution in place but are still letting scams slip through the net, while smaller marketplaces may not yet have put robust content moderation practices and processes in place.
Both situations are potentially dangerous if not tackled. While quickly identifying and removing problematic listings is certainly challenging, it is deeply concerning to see such high rates of scams and counterfeiting in this data. Powerful technological approaches, using AI in conjunction with human analysts, can very effectively mitigate these threats. Ultimately, the safety of the user should be placed at the heart of every marketplace’s priorities. It’s a false dichotomy that fail-safe content moderation is too expensive a problem to deal with: in the longer term, addressing even the small amount of fraud slipping through the net can have a large, positive impact on the financial health of the marketplace through increased customer trust, acquisition, and retention.
2020 was a year we would not want to repeat from a fraud perspective – we have not yet won the battle against criminals. As we move into 2021, we’ll be helping the industry work towards a zero-scam future, one where we take the lessons of 2020 together to build a better, safer community for users and customers – both for their safety and for the long-term, sustainable financial health of marketplaces.
2020 was a year no one could have predicted. E-shopping has been accelerated by years, and whole new segments of customers have been forced online.
Meanwhile, marketplaces have had to deal with new, unpredictable moderation challenges, as illustrated very poignantly by the spike in this Google Trends graph showing the past year’s search volumes for face masks.
The entire world has had to get used to a market that was very volatile and extremely sensitive to any new development in the pandemic, good as well as bad.
Now, however, at the end of 2020, there’s light at the end of the tunnel. The vaccine is finally within reach, and hopefully social distancing and face masks will soon be curious terms in a bizarre chapter of our history books.
A return to “normality” will, however, not erase the incredible jump the world has made towards a fully digitized society.
What will the new world order mean for marketplaces going forward? What can eCommerce, sharing economy and online platforms in general expect from 2021?
We’ve asked 8 industry experts to give their trend predictions for the new year.
7% of reviewed listings for popular items across six UK marketplaces were found to be scams.
This Christmas, British shoppers will be turning to online shopping more than ever before as Internet retailers offer quick and convenient services. Consumers increasingly rely upon these services as the UK faces its second lockdown of the year, with many shoppers preferring not to visit physical stores. The opportunity is for online marketplaces to gain new customers, especially as many consumers turn to shopping for goods online for the first time. In fact, according to research by O2 Business and Retail Economics, 44% of consumers believe their shift to online shopping during the last COVID peak will stay the same going forward.
The good news does not stop there. In a saturated industry where monoliths such as Amazon dominate, consumers have become much more aware of where they shop and are much more willing to try a new service – over a third (39%) of respondents to a survey undertaken by Bazaarvoice said that they had bought from a new brand during the first quarantine. For small to medium-sized marketplaces, this shift in shopping habits has opened up an opportunity to appeal to these newly adventurous shoppers, willing to discover a new platform.
Yet, while there is a huge opportunity for marketplaces, there is also considerable threat. The Association of Certified Fraud Examiners found that, in August this year, 77% of anti-fraud experts said that the pandemic has created an increase in fraud, while the New York Times reports that more than 200,000 coronavirus-related scam complaints have already been filed in the US this year.
To uncover the scale of this problem in the UK and help online marketplaces to understand how to keep consumers safe in the run up to Christmas, our certified content moderators started tracking six popular UK marketplaces over the peak shopping period. By reviewing listings of the items most associated with fraudulent selling, they are identifying how many show tell-tale signs of risk despite having made it through the marketplaces’ safety measures.
Having so far reviewed over 1000 listings during the first two weeks of November, we have found that:
- 7% of reviewed listings are likely to be scams
- Puppies are particularly risky, with 23% being found to be scams
- 14% of fashion listings are for counterfeit items
- Smaller marketplaces are particularly rife with scam products
The data demonstrates that no matter how large the marketplace, all marketplaces need to take precautions to prevent fraud on their platforms in the lead up to Christmas and onwards – as these scams and counterfeit posts are certainly not one-offs. Marketplaces need to rid their platforms of fraud. Larger marketplaces may see a smaller percentage of fraudulent posts but considering that they have a much larger user base, even small percentages can lead to thousands of users becoming victims. If marketplaces – of all sizes – do not act now, then they could risk long-term reputational damage and the potential for fraud on their platform to spiral out of control.
Recent research undertaken by Next Consulting on our behalf also demonstrates that:
- Up to 73% of users would never return to a platform after seeing fraudulent content
- 76% of users would not recommend a marketplace after having seen fraudulent content
- 80% would not buy from a brand that they had seen fraudulent posts for previously
Tackling fraudulent activity on marketplace platforms is not an easy challenge. The scam posts and counterfeit goods listed above were found post-review, meaning they had either slipped through a content moderation process or there was no process or analysis in place to begin with. Marketplaces need to review their content moderation strategy and explore a holistic approach – one that layers AI and filter automation with manual moderation to provide high-quality review and prevention. AI and automation are effective as a first line of defense against harmful content, but alone they are not a panacea. AI is fantastic for routine health checks – such as catching suspicious content that follows a consistent pattern. However, some scam posts venture out of the ‘routine’ and break the patterns typically picked up by AI. That is when human moderation needs to step in to ensure rigorous analysis of the context.
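The layered approach described here can be sketched in a few lines. The snippet below is a minimal illustration with made-up signals and thresholds – the keyword scorer stands in for a real AI model, and none of the names or cut-offs reflect an actual production pipeline:

```python
# Minimal sketch of a layered moderation pipeline.
# The scorer and thresholds are illustrative assumptions only.

def automated_risk_score(listing: dict) -> float:
    """First line of defense: a crude pattern scorer (stand-in for an AI model)."""
    score = 0.0
    text = listing.get("description", "").lower()
    # Classic payment-scam phrases that follow a continuous pattern.
    if any(kw in text for kw in ("wire transfer only", "gift card", "pay outside the site")):
        score += 0.6
    # A price far below market value is another routine red flag.
    if listing.get("price", 0) < 0.3 * listing.get("market_price", 1):
        score += 0.3
    return min(score, 1.0)

def route(listing: dict) -> str:
    """Auto-reject clear scams, auto-approve clean listings, escalate the rest to humans."""
    score = automated_risk_score(listing)
    if score >= 0.8:
        return "reject"            # pattern is unambiguous
    if score >= 0.3:
        return "manual_review"     # humans judge the context
    return "approve"
```

The key design point is the middle band: automation handles the obvious cases at scale, while anything that breaks the routine patterns lands in a human moderation queue rather than being silently approved or rejected.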
Working with a trusted partner who can provide guidance, best practice approaches and support for content moderation will be crucial for many marketplaces in the lead up to Christmas and beyond.
Do you need help with your content moderation setup? Get in touch.
When British journalist, Sali Hughes, found herself the victim of online trolls, she took it upon herself to understand why. As part of these efforts, she actually met one of them in person.
While it’s vital that victims, like Ms. Hughes, speak out against the responsible individuals – what of the sites where all of these comments are posted? Negative User Generated Content (UGC) continues to be problematic for all kinds of online marketplaces and classified sites – as well as chat forums. But to what extent can they be held accountable for enabling these kinds of viewpoints to be aired – and to what extent should they be punished?
While the human impact is a critical concern that many businesses, governments, and advocacy groups are striving to curb, there’s a similarly disastrous impact for the sites themselves – not just in terms of reputation or bad user experience; there’s a financial impact too.
Let’s consider the ways in which negative UGC can affect the business bottom line and look at ways companies can put a stop to it.
Bad Content: The Bigger Picture
The need for UGC platforms to respond swiftly and decisively to negative content is a given. Legislators have continually weighed in on these issues. For example, last year (as mentioned in a previous blog) the EU voted to give online businesses one hour from being contacted by law enforcement authorities to remove terrorist-related content from their sites. Failure to comply could incur a fine of up to 4% of the business’s global revenue.
Similar efforts would have governments taking a more proactive stance – such as British media watchdog, Ofcom’s proposals to police social media earlier this year. But given concerns over freedom of speech and expression, such moves are bound to provoke a backlash – from businesses and individuals.
Another solution is for businesses to regularly audit their own sites, which larger platforms like Facebook and YouTube already do (with varying degrees of success, given Facebook’s continued fines over under-reporting illegal activity). In addition, organizations like GARM (Global Alliance for Responsible Media) bring advertisers, agencies, media companies, platforms, and industry organizations together to improve digital safety.
While the vast majority of online platforms do everything they can to ensure their sites remain safe places for all of their users and customers, the issue is that all of these combined actions don’t stop trolls and cyber criminals.
The Business Impact
In addition to the ongoing regulatory maelstrom, the urgency to respond is exacerbated by a myriad of business concerns. These include retention, conversion, engagement, reputation, and customer satisfaction – all of which can be easily damaged and disrupted by bad or harmful User Generated Content. This in turn can pave the way for other types of negative online behaviours: from scams to fake ads.
Retention, Conversion, & Engagement
When customers lose faith in an online platform – be it a service or a marketplace – there’s a negative impact.
It follows that if users leave, revenues will drop. Lower engagement leads to fewer conversions. But that’s not all. Costs increase too, as it becomes increasingly expensive to win back old customers or entice new ones.
Lower user retention stemming from a negative experience pushes up the cost of user acquisition. Similarly, higher user leakage means that the lifetime value of users will drop too.
Given that a bad UGC experience can contribute to a fairly rapid downward spiral, the case for prevention rather than reaction is stronger than most site owners realize.
A company’s reputation online matters just as much – if not more so – than how it’s perceived offline. After all, content tends to have a habit of lingering online. That’s why bad UGC can be so damaging.
It can often be hard for brands to shake the stigma of bad content published about them. By the same token, their reputation can be damaged by (unknowingly) hosting it as well. This can be disastrous for online marketplaces, classifieds, and chat sites, who often need to rebuild trust from the bottom up.
Then of course there are legal and liability issues that can stem from unauthorized UGC as well as harmful content. Take the 2017 case of Kayla Kraft vs Anheuser-Busch. When an image of the claimant was supposedly submitted as part of a campaign to crowdsource advertising images, she filed a lawsuit asserting the image had been used without her consent.
While many businesses will focus on providing multichannel support, and making it as easy as possible for customers to access their business and support channels, reducing the number of support requests doesn’t always factor in as highly.
This is a mistake. Ultimately, the more calls, emails, and support tickets there are, the higher the cost of customer service – as better trained staff are needed to deal with incoming queries. But, with a more robust, preventative solution in place, the need for a bigger support offering reduces significantly.
Take our client Connected2Me – a social media platform where users can chat with each other anonymously. While the idea itself is intended to be fun, the team was experiencing more negative User Generated Content than it could handle. As a result, they were getting an increasing number of support tickets, which were proving difficult for the in-house team to keep on top of.
When they contacted us in 2018, Connected2Me had tried adding automation to their content moderation workflow, but had not been able to find a solution which could live up to their required accuracy. The team was manually moderating content but needed to ensure 24/7 monitoring in order to reduce the amount of support tickets and provide the best possible user experience.
With our help, Connected2Me now has an accurate moderation solution in place covering numerous languages. They can now move forward confidently – meeting user expectations and providing the experience they were originally aiming for. These efforts are also helping them attract new investment and develop a loyal, happy user base.
User Safety = Sales Success
Task most people with introducing themselves to a crowd of strangers in person, and the chances are you’ll see them do their very best to present a positive version of themselves.
But transpose this to an online environment, add a degree of anonymity, allow people to share content, and all kinds of intriguing behaviors can manifest themselves.
Ultimately, online platforms can unwittingly play host to a torrent of negativity – which is why preventative action is wholly necessary at a site level.
From a company perspective, the need to counteract it is as much a business concern as it is a user-centric one. But, when you think about it, the two are in fact one and the same. A trusted site that’s known for quality content, reliable customers, and a great user experience will attract more prospects than a platform in which they’re likely to be subject to scams and abuse.
And as for Sali Hughes? She was surprised by the person she met. It wasn’t some bitter, twisted, housebound hacktivist – it was a well-dressed, professional woman in her mid 30s; the kind of person she might even be friends with in other circumstances.
It just goes to show: you can never second-guess when, where, or from whom online abuse will come – which is why a moderation strategy that can be applied at scale and is specifically designed to uphold your site’s rules and procedures is a safer bet for all.
If, like the companies mentioned here, your online business relies on User Generated Content then you need to make sure that every single customer gets the best experience possible.
Here at Besedo, it’s our goal to help companies do exactly that.
Interested in learning more about how we can help counter the effects of bad UGC 24/7? Then talk to our team today.
Having quality tools is key to performing well, and that’s as true for content moderation as for anything else. An outdated or feature-incomplete platform affects everyone from the CEO to the content moderator, and the consequences can be dire – ranging from missing user and operational insights to decreased productivity or even the inability to perform important tasks.
If you are looking for your first content moderation tool, or considering replacing your in-house platform, it’s a good idea to take a step back and consider the options available. It might be tempting to hand the task and your specifications to your dev team and await delivery. On the other hand, it might be better to buy an off-the-shelf solution that’s plug and play. In short: should you build or buy?
The answer is, of course, very dependent on your situation, your company, and your current requirements. We’ve created a list of questions you should ask yourself, along with an overview of pros and cons, to help you make a more informed decision.
- Is content moderation business defining or business critical?
- What are the time constraints for implementing a moderation solution?
- What developer resources do you have available?
- What is your budget?
Is content moderation business defining or business critical?
Ten years ago, most online platforms didn’t prioritize content moderation, as there was far less understanding of the negative impact of bad content. Today, most agree that content moderation is business critical for ensuring user trust and conversion. However, business critical isn’t the same as business defining.
For most customer facing businesses, a help-desk tool is critical, but unless customer support and how it’s delivered is a core part of your USP the software you use for customer queries doesn’t have to be unique to your business.
What you do want, though, is a platform that is stable, has all the required features, and is updated regularly as requirements shift.
What are the time constraints for implementing a new moderation solution?
What is the timeline for implementing your new solution? Building a moderation platform from scratch can be a yearlong project depending on how many developers you can throw at the task.
On the other side of the coin, even out-of-the-box solutions usually require some level of integration with your current systems. It’s good to get an idea of the requirements early in the decision process so you understand the timeline for either option. If you do decide to buy, shop around a bit to understand the differences between vendors and how much effort they will put into supporting you during the integration step.
What developer resources do you have available?
Before committing to building or buying, you should figure out what developer resources you realistically have at your disposal. Keep in mind that a content moderation tool is an almost living entity that needs to evolve with your product and with ongoing trends.
When you evaluate your developer need for an in-house solution, remember to include ongoing maintenance and new feature development.
For bought solutions, as mentioned before, you should do a discovery call with potential vendors to understand how much time integration will take.
What is your budget?
Budget is always an important aspect when deciding whether to build or buy. It can be really hard to estimate the cost of a content moderation solution either way. Many vendors have onboarding fees, a monthly fee, and per-item pricing, which can make it hard to predict the actual monthly bill.
For in-house projects, on the other hand, it’s easy to forget the cost of management, salaries, project meetings, and ongoing maintenance and feature updates.
Most importantly, many companies forget to keep opportunity cost in mind: what unique features could our developers have created for our core product instead of building a moderation platform?
Whether you decide to build or buy, spend some time investigating potentially hidden costs to avoid unpleasant surprises.
If you buy, go with a partner that is transparent and straightforward.
If you build, map out every cost, not just direct developer costs, but also adjacent expenditures, especially after the project has been delivered.
Building a Content Moderation Platform:
| Pros | Cons |
| --- | --- |
| Greater feature control (within the project and budget scope) | Even well-planned projects are subject to scope creep, going over budget, or missing deadlines |
| You can build the exact tool you need and ensure that it fits with all your other in-house tools | It’s likely that there’s already a product on the market matching most of your requirements; developing from scratch might waste money and time when an off-the-shelf solution would fit your needs |
|  | Significant upfront and hidden ongoing costs |
|  | Timescale for building an in-house tool is significantly longer than buying an off-the-shelf solution |
Buying a Content Moderation Platform:
| Pros | Cons |
| --- | --- |
| Ongoing feature additions from vendors whose main focus is content moderation software development | Less customization to your particular needs; if a solution is missing features, check with the vendor to see if they’re on the roadmap |
| Low upfront cost | While the initial cost may not be high, make sure you understand the ongoing monthly cost and any additional fees in case your volumes grow |
| Ongoing support, training, and maintenance |  |
| Fast go-live date | While you will be able to deploy a purchased tool faster, make sure you still consider implementation time |
| Tech knowledge loss protection: when buying, you are not dependent on in-house developers with specific knowledge and won’t have to worry about them leaving. Buying protects you from development delays and the additional costs of rehiring and training. |  |
The decision to buy or build is never easy and depends heavily on the business and its use cases. In most cases, it comes down to asking yourself whether building an in-house solution will give you a competitive edge, or whether your needs can be sufficiently covered by an existing content moderation platform.
If you want to better understand your options, feel free to reach out to us and get a transparent offer for a tailored content moderation solution.
Or check out Kaidee’s experience with our content moderation tools.
Most online platforms would agree that images are among the most important elements of a user profile. When it comes to building trust between strangers, having an actual picture of the person you are about to engage with is vital. Whether you are looking for a date, booking a ride, or renting a holiday home, interacting with a total stranger can be daunting. In the real world, visual cues like facial expressions and body language are intuitively used to decode intent. In the digital world, we must emulate these trust markers, and images are a crucial part of that.
There’s one problem, though. While good images can put a face on a stranger and boost user trust, bad images can have the exact opposite effect. This is one of the reasons we advocate for image quality and why we’re continuously expanding Implio’s capabilities for catching and managing bad, inappropriate, or low-quality images.
The latest tool in Implio’s image moderation toolbox is a misoriented-image AI module.
Why should you care about misoriented images?
The use case is straightforward: misoriented images (e.g. wrongly rotated or upside down) are caught by the model and sent for manual moderation.
Catching misoriented images is important for the overall impression of your site. A bunch of upside-down faces will make browsing time-consuming and confusing at best or make your platform look unprofessional and scammy at worst.
As more users access and create their profiles using mobile phones, the number of misoriented images increases, and the need to deal with the issue efficiently grows accordingly.
Which is why we’re excited to announce that Implio can now help you automatically identify misoriented images.
How to automatically detect misoriented images
The misoriented module will soon be available to all Implio users off the shelf. For now, to gain access, just reach out to us and we’ll activate it for you. When the module is active, all images are scanned by the AI and tagged as misoriented if they are rotated incorrectly.
This tag can then be used in Implio’s powerful rule creator, where you can decide to send the image to manual moderation, reject it outright (not recommended), or take no action if you want to use the rule for tracking purposes only.
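Conceptually, the tag-then-rule flow works like this. The sketch below is a simplified illustration of the idea only – the function and action names are ours, not Implio’s actual rule API:

```python
# Simplified illustration of a tag-based moderation rule.
# The AI has already tagged each image; a rule then decides what happens next.
# This is a conceptual sketch, not Implio's actual rule interface.

def apply_misoriented_rule(image: dict) -> str:
    """Route an already-tagged image based on the 'misoriented' tag."""
    tags = image.get("tags", [])
    if "misoriented" in tags:
        # Recommended: let a human rotate/crop rather than rejecting outright.
        return "manual_moderation"
    return "no_action"
```

Because the rule acts on tags rather than raw pixels, the same mechanism could track misoriented images silently (always returning "no_action") while still counting how often the tag fires.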
Here’s an example of an image caught by the new misoriented module. As you can see, the picture is upside down and has been tagged by the AI with “face” and “misoriented”.
To the right you can see that it has matched the misoriented rule.
If you decide to send misoriented images to the manual queue, moderators will be able to fix the issue. Here’s a view of Implio’s image editing tool, where you can crop and rotate images as you see fit.
This version of the misoriented-image model works best with human subjects, but we’re hard at work expanding it, and soon we’ll add capabilities that allow the model to tag items with the same level of accuracy.
If you’re looking for a way to optimize how you handle misoriented images on your site or platform, get in touch. We can help you with the setup and have a look at your site for other low-hanging content quality issues that can easily be resolved with a good moderation setup.
We can’t deny that 2020 hasn’t gone according to plan. It’s smashed through any notion of normality and has pushed companies all over the world to the very limit – in many different ways.
Yes, the events of 2020 have been a massive catalyst for change – particularly for online marketplaces, classified sites, and all kinds of digitally-native businesses. However, it’s not just companies that are changing. We’re also seeing a significant behavioral shift from customers.
It appears that a greater reliance on digital services is bringing about a need for more transparency and trust; as well as a desire for more personalized products and services. More accurate data is also helping to instill more empathy between customers and online vendors; and it appears as though quality is now more important than ever to shoppers.
Is this lasting change? How is it manifesting? How are online marketplaces and classified sites turning these challenges into opportunities? Let’s take a closer look at why empathy, personalization, and quality are just as highly prized by consumers as availability, cost, and convenience.
First let’s consider how things look from a technology standpoint. It’s clear that there has been a massive shift here too. The adoption of digital services rose by nearly a decade’s worth of growth in just a few short weeks – across retail, grocery shopping, deliveries, payments, and much more besides.
According to the UK Office for National Statistics, between February and May 2020 internet sales jumped nearly 14 percentage points (from 19% to 32.8%) – a rise that had previously taken some nine years.
Now, we’d be remiss to ignore the reasons behind this dramatic increase: it took place during peak lockdown season. While the usual suspects (remember Amazon’s $11,000-per-second earnings?) undoubtedly profited, so too did peer-to-peer-oriented online marketplaces that were able to mobilize themselves in the right way.
But how were they able to do so? Competing with the likes of Amazon when it comes to cost, choice, and availability is seemingly impossible.
It seems that the challenges we collectively faced brought about a renewed focus in not only what we wanted as shoppers, but the way in which we perceived value.
Empathy has been in short supply for too long, but in recent years it has started to permeate company and buyer behaviors, in line with wider awareness of CSR (corporate social responsibility).
To the cynical shopper, corporate altruism is easy to pick holes in. But looking more closely at shopping trends, it’s clear that the initiatives being put in place by many online marketplaces and classified sites are largely customer-driven. In short, they make good business sense as well.
An Accenture report conducted in April 2020 shows that consumers are now factoring health and environmental considerations into their shopping choices – with 45% of consumers saying they’re making more sustainable choices when shopping and will likely continue to do so.
Earlier this year we interviewed senior representatives from both eBay Kleinanzeigen (a dedicated Germany-focused classifieds version of the online platform) and the Norwegian marketplace Finn.no – both of which managed to grow during the pandemic by focusing on helping their communities.
Other digital companies with the capacity to be of broader service are adapting their business models too – at a time when overall demand for their original service has suffered setbacks.
Take ride-sharing app BlaBlaCar, which created a volunteer service during the pandemic – BlaBlaHelp – where drivers could sign up to help deliver groceries and medicine to those in need. Competitor mobility services could have elected to do the same, at a time when the entire transportation sector faced the same challenges, but didn’t.
The upshot is that the business was able to double down on the community aspect of its service – something that will no doubt continue to positively impact BlaBlaCar’s brand moving forward.
Ultimately, empathy impacts loyalty – but it also has a positive impact on a company’s bottom line. A 2016 study on the link between empathy and profitability showed that the top 10 empathy-driven companies increased in value more than twice as much as the bottom 10 and generated 50% more earnings.
Personalization Makes A Play
Deeply understanding your customer’s journey – across all touch points – is also critical to ongoing success for online marketplaces. This understanding can be translated into personalization which in turn drives loyalty and conversions.
With all of the digital technology at our fingertips, personalization is nothing new to digitally-native businesses. In fact, it’s arguably one of the cornerstones of online commerce. After all, the more user data is available, the easier it becomes to provide and suggest products and services to customers based on their buyer behavior.
But why has this become of particular importance in 2020? Well, the more online retail options there are, the harder it becomes for shoppers to differentiate between similar products and services.
Competition for attention is fiercer than ever. But rather than simply bombard shoppers with yet more ads, companies are focusing on personalizing their overall customer experience – opting for attraction over interruption as their calling card.
For example, take Spotify’s Discover Weekly option – an algorithm-curated playlist based on user preferences – or app-based Atom Bank’s customization and naming options; even Starbucks’ much hackneyed practice of baristas scrawling customer names across their morning cup of coffee. When businesses – both digital and analogue – take the time to show that customers are more than just a dollar sign, they open the doors to more meaningful interactions.
The stats concur. According to a personalization development study, conducted by customer optimization experts, Monetate, 93% of companies with an “advanced personalization strategy” saw revenue growth. Similarly, companies that spent at least 20% of their marketing budget on personalization saw double the amount of ROI (return on investment).
The study shows that personalization has a positive impact on loyalty too – which will become more essential as companies that have experienced growth look to maintain their new customers.
In a similar way to how digital businesses are personalizing their customer experiences, more and more online marketplaces are focusing on improving the quality of their overall digital offering.
With potentially more customers coming to different sites, the need for a welcoming, user-friendly experience is crucial. If a site or marketplace is difficult to navigate, doesn’t offer different languages or even accessibility options, companies could be losing out on much needed revenue.
After all, if customers can’t find what they’re looking for, they have plenty of other options out there. However, this need for better quality extends to the actual products on offer too.
The ‘finding discounts strategy’ many shoppers have favored in the past has been replaced by a search for the ‘best possible option’. This is true across many different categories – from electrical goods to cars and furniture purchases, according to a survey carried out by First Insights.
The study shows that up to 98% of shoppers (at least where furniture is concerned) stated that price had no bearing on their purchase decisions.
But what does this all tell us? Well, if price has less impact than quality, then it’s clear that online marketplaces and classified sites that specialize in a particular niche should look more closely at their customer group’s needs and provide a product range and customer experience that matches higher expectations.
Building Long-Term Trust Is Crucial
All things considered, although many businesses face an uncertain future, as more shopping and interaction takes place remotely, there’s never been a better time to be an online marketplace or classifieds site.
To truly thrive, however, companies need to continually focus on customers’ needs and work on building communities rather than empires. Trust is more important than ever before, which is why, among other things, online content moderation should be a priority for sites that rely on user-generated content.
Knowing that a site or marketplace is a safe, well-maintained place to transact will become even more important. This not only reassures customers that your site is safe; it’s a broader indication that you genuinely care about your community, that you’re committed to providing the best possible user experience, and that you understand why these things matter to your customers.
The pandemic will hopefully prove to be a short-term concern. But there are a lot of long-term lessons we can learn from how businesses and consumers are behaving. The challenge now is to commit to implementing the positive ones here and now. For everyone’s sake.
Behind every successful company is a team of hardworking tech people diligently keeping the IT infrastructure safe, optimized, and resilient to unexpected events. They are the unseen and sometimes unsung heroes of digital operations. At Besedo, the head of that team is Kevin E. Ducón Pardey. We sat down with him to understand what it takes to ensure that we can deliver high-quality services to our clients 24/7, all year long.
Interviewer: Could you introduce yourself?
Kevin: I’m just a guy who dreamt about doing what he loves and who worked and studied hard to achieve it. I’m Colombian, 34 years old, married with one kid and a dog. I hold an MSc in Computer Science from Universidad Politécnica de Madrid and a BSc in Computer Science from Universidad Distrital de Bogotá, and I have various certifications in IT service management, IT security, and cloud computing. I have been working in ICT for more than fifteen years, almost seven of them at Besedo. I started as a local ICT Administrator in our Colombian branch office, then I was promoted to ICT Supervisor, and currently I am the Global Head of ICT-IS, which means I’m in charge of all levels of ICT support. Together with my amazing team, we make sure that we fulfill our most important metrics, service-level agreements, and customer satisfaction.
I have applied my knowledge and skills to this ever-changing industry by creating policies and processes aligned with the industry’s best practices of IT service management. I’ve developed strategic and operational plans aligned with security guidelines for the ICT department to ensure that all necessary tools and processes are fully functional to achieve the company’s goals. Our commitment is to keep our mission-critical systems alive and running 24/7, to ensure that our moderation services are successfully delivered to our customers worldwide.
Interviewer: How does your team manage such a variety of clients’ ICT needs and setups?
Kevin: We need to be aligned with the business guidelines to properly onboard the clients. So, with the help of Sales, Customer Success, and Service delivery teams, we are able to translate business and operational needs to ICT needs. It all starts with a cross-functional plan which is the key to understanding the whole process, IT requirements, and compliance aspects.
Once this plan is clear, ICT needs to provide all the necessary tools and fulfill all requirements. Our local ICT teams manage the local implementation according to the plan and ensure that everything required is ready to go live within the defined time frames. They also tackle any post-implementation issues immediately.
When you have a capable, knowledgeable, and committed team like I do, things are always easier.
Interviewer: How did you make the transition from office to home office so efficiently?
Kevin: We’ve actually had a remote workforce project running since 2017. We were quite early in identifying remote work as our future and as a necessity in order to explore new markets and cope with some specific requirements. Working from home also boosts the team’s morale, and it is a real benefit in countries where some people have commutes as long as two hours. COVID was a challenge for all ICT departments, but it also made our work visible. Talking with different colleagues, the likelihood of a pandemic was low, but the impact, given the government measures taken in each country, was high – it’s what we refer to as a black swan. So COVID gave a boost to the investment in the required resources. It also changed the business culture: we now focus on identifying and covering areas of vulnerability, we’re even more resilient, we’re managing internal resources more efficiently, and we’re taking a people-centric approach to IT security. As we were already working with cloud and virtualization, it was relatively painless to scale up. But I won’t lie, it was not easy. As we moved from 30% to 100% of the operation working from home, we had to speed up some processes, provide resources, and tackle challenges fast, since we were operating under tight deadlines. However, we made it. Of course, there were challenges at the beginning (as there always are when a new technology is implemented) in terms of support requests and provision of services, but we are now in a stable situation. The next challenge is to improve our delivery and make it more efficient. That’s an on-going process.
Interviewer: How do we manage disaster recovery in Besedo?
Kevin: It’s important to have proactive safety measures in place to guarantee that the moderation operation is always carried out correctly. A good first step is to plan the implementation of the moderation services before putting disaster mitigation plans in place. As I mentioned earlier, a good onboarding process is necessary, but so is a fault-tolerant and highly available infrastructure.
For example, at Besedo offices we work with different Internet service providers in case one fails to deliver correctly. We also work with fault-tolerant networks, a resilient infrastructure, third-party support, etc., to ensure that our IT operations remain stable when potential risks materialize. We also believe that a remote workforce can support on-premises delivery, and that a mixed model (on-premises and remote) makes sense.
To complement proactive measures, we run IT checklists, backup routines and we have monitoring systems that allow us to prevent potential challenges during IT ops.
Interviewer: What is the next big ICT project you have planned in your team?
Kevin: We plan to go live with a new office in Germany this year. We are preparing everything required to have an operational center that is compliant with specific guidelines that are imperative to Besedo’s worldwide commitment to client satisfaction. We are also researching Desktop as a Service (DaaS) solutions that could improve our current remote workforce delivery. That will be beneficial when we explore new markets, and will make it easier to scale up or down according to business demands.
From ancient Greek merchants attempting to claim insurance by sinking their ships, to Roman armies ‘selling the Emperor’s throne’(!): since time began, whenever there’s been a system open to exploitation, there have been fraudsters willing to try their luck. And succeeding.
However, most historical crimes were essentially isolated events. While a single fraudster could be a repeat offender, compared to the sheer number of people who can be duped at scale across different digital channels – by largely ‘invisible’ cyber criminals – it’s clear that fraud has become much more of an everyday concern for all of us.
But how did we get to this point? What risks do we need to be aware of right now? What can we do about it?
Let’s consider the issues in more detail.
A History Of Digital Deviance
In a similar way to other forms of fraud, digital scams date back further than you might think. Email phishing allegedly first took place in the early 1970s, although it’s generally accepted that the term was coined and the practice became commonplace in the mid-1990s.
Since then, the online world has seen con artists try their hand at everything from fake email addresses to using information gleaned from massive data breaches – with $47 million being the largest amount a single person has lost to an email scam.
(Incidentally, the most famous email scam – the ‘419’, aka ‘Advance-fee’, aka ‘the Nigerian Prince’ scam – surfaced as mail fraud some 100 years ago.)
But email isn’t the only digital channel that’s been hijacked. The very first mobile phone scams came about during the high-flying 80s, when mobile phones first became available – way before they were popular.
Given the high cost of these now infamous brick-sized devices, the wealthy were pretty much the only people in possession of them (so it makes sense that they quickly became fraud targets too).
SMS messages requesting funds be ‘quickly sent’ to a specific account by a ‘family member’ began to appear soon after, though again didn’t surge in number until well into the 90s, when uptake soared.
Of course, these aren’t the only forms of online fraud that surfaced at the start of the Internet’s popularity. Password theft, website hacks, and spyware – among others – proliferated at an alarming rate around the world at a similar time.
So, if we take it that digital scams have been around for some 25 years, why do they persist – especially when awareness is so high? One of the biggest problems we face today is the ease with which online fraud can take place.
Hackers, of course, continue to evolve their skills in line with advances in tech. But when you consider the number of sites that anyone can access – the marketplaces and classified sites/apps that rely on user-generated content – pretty much anyone can find a way to cheat these systems and those that use them.
Fraud Follows Trends
As we’ve explored previously, scammers operate with an alarming regularity all year round. However, they’re much more active around specific retail events – such as peak online shopping periods like Black Friday, the January sales, back-to-school accommodation searches, and the Chinese New Year.
However, while 2020 was shaping up to be a landmark year for fraudsters – given the many different sporting and cultural events, such as the Euro 2020 and Copa América football tournaments, and of course the Summer Olympics – it seems that fate had very different plans for all of us: in the form of the COVID-19 pandemic.
But true to form, scammers are not above using an international healthcare crisis to cheat others. COVID-19 has given rise to different challenges and opportunities for online businesses. For example, video conferencing services, delivery apps, dating websites, and marketplaces themselves have largely been in an advantageous position financially, given that they’re digital services.
However, given the knock-on economic factors of coronavirus and the danger of furlough drifting into long-term unemployment – among other things – there may be wider-reaching behavioral shifts to consider.
That said, fraudulent behavior simply seems to adapt to any environment we find ourselves in. In the UK, research shows that over a third (36%) of people have been the target of scammers during lockdown – with nearly two thirds stating that they were concerned that someone they knew could be targeted.
Examples playing on fear of contamination include the sale of home protection products, while other more finance-focused scams include fake government grants (requesting personal information), help with credit applications (for a fee), and even investment opportunities promising recession-proof returns for those with enough Bitcoin to put into such schemes.
The Gap Is Closing
It’s clear that online fraud is closely aligned with wider trends. In fact, the ‘newer’ something is, the more likely scammers are to step in. Looking at the timelines of many of these scams against when the technology itself was first invented, it’s clear that as time progresses, the gap between the two is closing.
There’s a very good reason for this: the pace of adoption. Basically, the more people there are using a particular device or piece of software, the more prolific the scams targeting it become.
Consider the release of the iPhone 5S back in 2013 and the popularity of the mobile game Pokémon Go three years later. Both events provided fraudsters with enough of an incentive to target innocent users.
As with most (if not all) eCommerce scams, the launch of the iPhone 5S played upon consumer desire and ‘FOMO’ (fear of missing out), manipulating Apple enthusiasts with the promise of getting their hands on the latest technology ahead of the official launch date.
Not only that, but the accompanying hype around the launch proved the perfect time for fraudsters to offer other mobile phone-orientated scams.
In general, new tech and trends lead to new scams. For instance, Pokémon Go gave rise to scams such as the Pokémon Taxi (‘expert’ drivers literally taking users for a ride to locations where rare Pokémon were said to appear) and the advanced user profile (basically paying for accounts that other players had already leveled up).
Because the game was so new and its popularity surged in such a short period of time, it was a whole lot easier for fraud to materialize. Essentially, there was no precedent set – no history of usage. No-one knew exactly what to anticipate. As a result, scams were materializing as quickly as new users signed up.
In one case, players were tricked into paying for access because ‘new server space was needed’. Not paying the $12.99 requested would supposedly result in their hard-fought Pokémon Go accounts being frozen. While that might be a small price to pay for one person, at scale it would mean a significant amount.
Regardless of which methods they use, fraudsters are ultimately focused on one thing. Whether they’re using ransomware to lock out users from their data, posting ‘too-good-to-be-true’ offers on an online marketplace, or manipulating lonely and vulnerable people on a dating app – the end result is cold hard cash through scalable methods.
However, while hackers will simply barge their way into digital environments, those using online marketplaces and classifieds sites essentially need to worm their way into their chosen environment. In doing so they often leave behind a lot of clues that experienced moderators can detect.
For example, the practice of ad modification or use of Trojan ads on public marketplaces follow particular patterns of user behavior that can cause alarm bells to ring.
So what can marketplace and classified site owners do to stay ahead of fraudsters? A lot, in fact. Awareness is undoubtedly the first step to countering scams, but on its own it will only raise suspicion rather than act as a preventative measure.
Data analysis is another important step. But, again, the biggest issue is reviewing and moderating at scale. When you have an international platform, how can a small moderation team police every single post or interaction when thousands are created every day?
This is where moderation technology – such as filters – can help weed out suspicious activity and flag possible fraud.
In order to stay ahead of fraudsters, you need a combination of human expertise, AI, and filters. While it’s possible for marketplace owners to train AI to recognize these patterns at scale, completely new scams won’t be picked up by AI (as it relies on being trained on a dataset). This is where experienced and informed moderators can really add value.
People who follow scam trends and spot new instances of fraud quickly are on full alert during big global and local events. They can very quickly create and apply the right filters and begin building the dataset for the AI to be trained on.
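To make the filter idea concrete, here is a minimal sketch of how keyword filters might route listings between auto-approval and a human review queue, with reviewed verdicts later feeding a training dataset for an AI model. All patterns, weights, and thresholds here are hypothetical illustrations, not real fraud rules:

```python
import re

# Hypothetical filter rules a moderation team might maintain.
# Patterns and risk weights are illustrative only, not a real rule set.
FILTER_RULES = [
    (re.compile(r"western union|wire transfer|gift card", re.I), 3),
    (re.compile(r"pay outside the (site|platform)", re.I), 3),
    (re.compile(r"limited stock.*act now", re.I), 1),
]

def score_listing(text):
    """Sum the weights of every rule the listing text triggers."""
    return sum(weight for pattern, weight in FILTER_RULES if pattern.search(text))

def route_listings(listings, threshold=3):
    """Send high-scoring listings to human review; approve the rest.
    Human verdicts on flagged items can become labeled AI training data."""
    flagged, approved = [], []
    for listing in listings:
        (flagged if score_listing(listing) >= threshold else approved).append(listing)
    return flagged, approved
```

In practice a real rule set would be far larger, updated continuously by moderators as new scam trends appear, and combined with user-behavior signals rather than listing text alone.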
Ultimately, as tech advances, so too will scams. And while we can’t predict what’s around the corner, adopting an approach to digital moderation that’s agile enough to move with the demands of your customers – and with fast intervention when new scam trends appear – is the only way to future-proof your site.
Prevention, after all, is much better than a cure. But where fraud is concerned, a blend of speed, awareness, and action are just as critical.
Why is the quality of content important?
Every marketplace professional knows that quality content is the key to success. Without good user generated content, you won’t drive traffic to your platform.
The quality of your content helps build trust (a vital element to marketplace success) and directly impacts conversions, engagement and retention. In fact, visitors are more than twice as likely to return to your site if they encounter high quality content on their first visit.
Now, what constitutes good content can differ wildly depending on your audience, the services your platform facilitates, and the overall competition in your specific niche.
However, we’ve collected some general rules of thumb that rarely fail.
What’s a good online marketplace profile?
A good online marketplace profile helps build trust in the owner, makes it easy to identify and re-identify them and provides the reader with enough relevant information for them to decide whether it makes sense to engage or not.
Polina Yelkina, Manager Russia & Ukraine, Community Relations at BlaBlaCar, breaks down their view of a good profile as follows:
- Maximized useful info (bio, preferences, photo, certified phone/email, car details).
- Real information.
- Relevant content (no gibberish).
- No surname mentioned.
- No contact details publicly visible.
- No advertising/spam.
- No offensive/insulting content in the profile.
- Content which complies with the rules of the platform.
Expanding on this and based on our experience, it’s clear that there’s some universal criteria quality profiles meet:
Good profiles contain relevant information that helps the reader understand whether the profile matches their current needs. Which information is relevant will depend on the platform and the service exchanged between the profile owner and the person viewing it.
For a profile on a car sharing platform for instance, it might be relevant to include reviews or information such as whether the driver accepts smokers or not.
Spam and advertising are of course completely unacceptable and links that lead off the site are inadvisable due to the risk of fraud and platform leakage.
Profiles should be streamlined to communicate the exact message the reader needs in order to understand who they are about to engage with. It’s important that it’s clear who the profile owner is, so pictures of multiple people are usually a no-go. While the profile should be informative, it also shouldn’t contain irrelevant information. Short and to the point is better than rambling and incoherent.
We live in a visual age. Regardless of whether the profile is for a classifieds site, a gamer account or a car sharing platform, having a visual representation of the user is important. In general, this should be in the form of a picture of the user but could also be an avatar for platforms that are more lenient on anonymity (like community sites for kids or gaming platforms). The image should however be recognizable, so the representation is always easy to identify as the profile owner.
We’ll get further into this in the next section, but make sure the image is clear, sharp, properly formatted, and inviting.
What’s a good image for an online marketplace?
What constitutes a good image is highly reliant on context. However, the following is usually true:
Blurry images communicate low quality and don’t instill trust in the person looking at it. Make sure users upload sharp images where it’s easy to see the subject.
Images that are formatted wrong, or are too dark or too bright, signal low value and should generally be avoided.
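Checks like these lend themselves to partial automation before a human moderator ever sees the image. As a minimal sketch (assuming images have already been decoded to grayscale pixel grids; the thresholds are illustrative guesses, not tuned values), a variance-of-Laplacian score can flag likely blur, and a mean-brightness check can flag over- or under-exposure:

```python
def laplacian_variance(gray):
    """Variance of a 4-neighbour Laplacian over a grayscale pixel grid
    (a list of rows of 0-255 values). Low variance suggests a blurry image."""
    h, w = len(gray), len(gray[0])
    responses = [
        4 * gray[y][x] - gray[y - 1][x] - gray[y + 1][x]
        - gray[y][x - 1] - gray[y][x + 1]
        for y in range(1, h - 1)
        for x in range(1, w - 1)
    ]
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)

def image_quality_flags(gray, blur_threshold=50.0, dark=40, bright=215):
    """Return human-readable flags for a moderator to review.
    All thresholds are illustrative and would need tuning per platform."""
    flags = []
    if laplacian_variance(gray) < blur_threshold:
        flags.append("possibly blurry")
    brightness = sum(map(sum, gray)) / (len(gray) * len(gray[0]))
    if brightness < dark:
        flags.append("too dark")
    elif brightness > bright:
        flags.append("too bright")
    return flags
```

A flat, featureless grid scores a Laplacian variance of zero, while a sharp, high-contrast image scores well above any sensible threshold; either way, flagged images would still go to a human for the final call.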
Photos that look genuine are preferable so try to get users to stay away from the stock photo look. There is such a thing as too perfect. In general, if your users are trying to sell an item it’s good to have multiple images. A couple that show the item, some that show details of the item in closeup and one or two that show the item in action (preferably in a natural, yet non-home-video-style fashion).
For profile pictures, a good image ensures that the face is easily distinguishable. After all, you’re trying to simulate a face to face encounter. Sunglasses, shots from behind and other images that remove information should be discouraged.
Depending on your site policies you may also want to remove inappropriate images where people have gone in the opposite direction and reveal too much.
In a qualitative survey on user search experience, every participant highlighted good images as a reason to click on an advert – underlining that great pictures drive engagement and conversions.
“The pictures are showing that it actually belongs to a real person. It looks less blurry, and like the seller really cares to sell the item” – Louise
Anibis.ch’s Product Support Lead Ana Castro and Anti-Fraud Specialist Jelena Moncilli reveal that their internal guidelines echo this call for genuine and clear images.
“A good quality image is an authentic image captured by the user, directly related to the listing, without contacts or web links, and where the product or service offered can be clearly visualized, without being blurred.”
The same message of clarity is repeated by Polina Yelkina, Manager Russia & Ukraine, Community Relations at BlaBlaCar:
“In pictures of cars we want the following to be true:
- The whole car should be visible.
- The car from the picture should be easily recognized by passengers.
- No people in the picture.
- No contact details/advertising visible in the picture or on the car itself.”
What’s a good marketplace listing?
What is the purpose of listing text? It’s to provide the potential buyer or user with enough information for them to decide whether to engage with the seller.
If it takes too much effort on the buyer’s side to decode the item or service specifications, it’s likely the buyer will give up and move on to the next listing – or worse, to the next platform.
As such the goal with listing text should be to clearly and concisely describe the service or item for sale. Misinformation, spammy, scammy or inappropriate content should obviously be disallowed and immediately removed.
Detailed descriptions aren’t just good for the browsing experience; they also help with SEO, driving traffic to the listing and the platform as a whole.
Listing text is also very important in building long-term trust in the platform brand. Since the buyer can’t physically examine the item, they have to rely on the description. If this turns out to be dishonest or lacking, trust will quickly deteriorate. As such, an honest and detailed description is the recommended basis for good listing text.
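Some of these listing-text basics can be checked automatically at submission time, nudging the seller to improve the listing before it ever needs moderation. A minimal sketch, where the field names and the word-count threshold are hypothetical examples rather than any platform’s actual rules:

```python
# Hypothetical required fields for a listing; real platforms define their own.
REQUIRED_FIELDS = ("title", "description", "category", "price")

def listing_issues(listing, min_description_words=20):
    """Return a list of quality problems found in a listing dict.
    Field names and the word-count threshold are illustrative only."""
    issues = [f"missing field: {f}" for f in REQUIRED_FIELDS if not listing.get(f)]
    description = listing.get("description", "")
    # Thin descriptions hurt both the browsing experience and SEO.
    if description and len(description.split()) < min_description_words:
        issues.append("description too short to be useful")
    # Off-site links risk fraud and platform leakage.
    if "http://" in description or "https://" in description:
        issues.append("off-site link in description")
    return issues
```

Surfacing these issues back to the seller as they type tends to be more effective than silently rejecting the listing afterwards.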
Ana Castro and Jelena Moncilli back the need for detailed descriptions, stating:
“A good quality listing is a listing in compliance with our insertion rules, offering only one service or product, placed in the right category at a fair price, with a clear title with good key words which help to find the service or product offered by going through a list of results. Description should contain all necessary information like the size, the color etc. A good listing is completed by a picture illustrating the item or the service which is sold. All necessary contact information should be correctly filled in the specific field.
For the three points above, all of them must respect our insertion rules as well as the Swiss law.”
The perils of low-quality content
We’ve listed a lot of benefits to high quality content, but what issues does low-quality content cause?
Imagine walking by your local clothing store and seeing dirty, tattered dresses tossed around in the window. You’d be unlikely to enter the store and even less likely to purchase from there.
The page a visitor lands on when first visiting your site is your storefront. It should be inviting and beautiful. You want the user generated content to show your platform in the best light possible. Blurry images, lacking descriptions and user profiles that don’t give insight into the owner will deter visitors. Guide users to create and submit better content and take care of the bad pieces that inevitably slip through. That way you’ll ensure a more successful platform with more engagement, traffic and higher retention.
Need help moderating your content? Get in touch and we’ll have a look at how you can optimize your current content moderation setup.