When Facebook CEO Mark Zuckerberg recently came under fire for the company’s admittedly tepid approach to political fact-checking (as well as some revelations about just what constitutes ‘impartial press’), it became clear that where content moderation is concerned, there’s still a big learning curve – for companies large and small.
So given that a company like Facebook – with all of the necessary scale, money, resources, and influence – struggles to keep on top of moderation activities, what chance do smaller online marketplaces and classified sites have?
When the stakes are so high, marketplaces need to do everything they can to detect and remove negative, biased, fraudulent, or just plain nasty content. Not doing so will seriously damage their credibility, popularity, and ultimately, their trustworthiness – which, as we’ve discussed previously, is a surefire recipe for disaster.
However, we can learn a lot from the mistakes of others – and avoid repeating them by putting the right moderation measures in place. Let’s take a closer look at the cost of bad content and at ways to keep it off your online marketplace.
The cost of fraudulent ads
Even though we live in a world in which very sophisticated hackers can deploy some of the most daring and devastating viruses and malware out there – from spearphishing to zero-day attacks – there can be little doubt that the most common scams still come from online purchases.
While there are stacks of advice out there for consumers on what to be aware of, marketplace owners can’t solely rely on their customers to take action. Being able to identify the different types of fraudulent ads – as per our previous article – is a great start, but for marketplace owners, awareness goes beyond mere common sense. They too need to take responsibility for their presence – otherwise, it’ll come with a cost.
Having content moderation guidelines or community rules that give your employees clear advice on how to raise the alarm on everything from catfishers to Trojan ads is crucial too. However, beyond any overt deception or threatening user behavior, the very existence of fraudulent content negatively impacts online marketplaces: it gradually erodes the sense of trust they have worked so hard to build, resulting in lowered conversion rates and, ultimately, reduced revenue.
One brand that seems to be at the center of this trust quandary is Facebook. It famously published a public version of its moderation handbook last year, following a leak of the internal version. While the handbook takes a clear stance on issues like hate speech and sexual and violent content, there’s little in the way of guidance on user behavior on its Marketplace feature.
The fact is, classified sites present a unique set of moderation challenges – ones that must be addressed in a way that’s sympathetic to the content forms being used. A one-size-fits-all approach doesn’t work. It’s too easy to assume that common sense and decency prevail where user-generated content is concerned. The only people qualified to determine what’s acceptable – and what isn’t – on a given platform are the owners themselves: whether that relates to ad formats, content types, or the products being sold.
Challenging counterfeit goods
With the holiday season fast approaching, and two of the busiest shopping days of the year – Black Friday and Cyber Monday – just a few weeks away, one of the biggest concerns online marketplaces face is the sale of counterfeit goods.
It’s a massive problem: one that’s projected to cost $1.8 trillion by 2020. And it’s not just dodgy goods that sites should be wary of; there’s a very real threat of being sued by a brand for millions of dollars if a site enables vendors to use its name on counterfeit products – as was the case when Gucci sued Alibaba in 2015.
However, the financial cost is compounded by an even more serious one – particularly where fake electrical items are concerned.
According to a Guardian report, research by the UK charity Electrical Safety First shows that 18 million people have mistakenly purchased a counterfeit electrical item online. As a result, there are hundreds of thousands of faulty products in circulation. Some faults may be minor – glitches in Kodi boxes and game consoles, for example. Others, however, are a potential safety hazard – such as the unbranded mobile phone charger that caused a fire at a London apartment last year.
The main issue is fraudulent third-party providers setting up shop on online marketplaces and advertising counterfeit products as the genuine article.
Staying vigilant on issues affecting consumers
It’s not just counterfeit products that marketplaces need to counter; fake service providers can be just as tough to crack down on.
Wherever there’s misery, there’s opportunity – and you can be sure someone will try to capitalize on it. Consider the collapse of package holiday giant Thomas Cook a couple of months ago, which saw thousands of holidaymakers stranded and thousands more have their vacations canceled.
Knowing consumer compensation would be sought, a fake service calling itself thomascookrefunds.com quickly set to work gathering bank details, promising to reimburse those who’d booked holidays.
While not an online marketplace-related example per se, cases like this demonstrate the power of fake flags planted by those intent on using others’ misfortune to their own advantage.
Similarly, given the dominance of major online marketplaces as trusted brands in their own right, criminals may even pose as company officials to dupe consumers. Case in point: the Amazon Prime phone scam, in which consumers received a call telling them their bank account had been hacked and that they were now paying for Amazon Prime – and were then duped into giving away their bank details to claim a non-existent refund.
While this was an offline incident, Amazon was swift to respond with advice on what consumers should be aware of. In this situation, there was no way that moderating site content alone could have indicated any wrongdoing.
However, it stands to reason that marketplaces should have a broader awareness of the impact of their brand, and a handle on how the issues affecting consumers should be aligned with their moderation efforts.
Curbing illegal activity & extremism
One of the most effective ways of ensuring the wrong kind of content doesn’t end up on an online marketplace or classifieds site is to use a combination of AI moderation and human expertise to accurately detect criminal activity, abuse, or extremism.
However, in some cases, it’s clear that those truly intent on making their point can still find ways around these restrictions. In the worst cases, site owners themselves will unofficially enable users – and advise them on ways – to circumvent their site’s policies for financial gain.
This was precisely what happened at the classifieds site Backpage. It transpired that top executives at the company – including CEO Carl Ferrer – didn’t just turn a blind eye to the advertisement of escort and prostitution services; they actively encouraged the rewording and editing of such ads to give Backpage ‘a veneer of plausible deniability’.
As a result – along with money laundering charges and charges for hosting child sex trafficking ads – not only was the site taken down for good, but officials were jailed, following Ferrer’s admission of guilt on all of these counts.
While this was all conducted knowingly, sites that are totally against these kinds of actions, but don’t police their content effectively enough, are putting themselves at risk too.
Getting the balance right
Given the relative ease with which online marketplaces can be infiltrated, can’t site owners just tackle the problem before it happens? Unfortunately, that’s not the way they were set up. User-generated content has long been regarded as a bastion of free speech, consumer-first commerce, and individual expression. Trying to quell that would completely negate their reason for being. A balance is needed.
The real problem may be that ‘a few users are ruining things for everyone else’, but ultimately marketplaces can only distinguish between intent and context after content has been posted. Creating a moderation backlog when there’s such a huge amount of content isn’t a viable option either.
Combining man & machine in moderation
While solid moderation processes are crucial for marketplace success, relying on human moderation alone is unsustainable. For many sites, it’s simply not possible to review every single piece of user-generated content in real time.
That’s why online content moderation tools and technology are critical in helping marketplace owners identify anything suspicious. By combining AI moderation with human moderation, you’re able to strike an efficient balance between time-to-site and user safety – which is what we offer here at Besedo.
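As a rough illustration of how such a hybrid setup can be structured – the thresholds, model stand-in, and function names below are hypothetical, not a description of Besedo’s actual product – an automated classifier can publish or reject the clear-cut cases immediately and route only the uncertain middle band to human moderators:

```python
# Hypothetical hybrid moderation pipeline: an AI model scores each
# listing for risk, and only ambiguous cases reach the human queue.

AUTO_APPROVE_BELOW = 0.2   # risk score under this -> publish immediately
AUTO_REJECT_ABOVE = 0.9    # risk score over this -> reject immediately

def score_listing(listing: dict) -> float:
    """Stand-in for a real ML model; checks a couple of obvious red flags."""
    score = 0.0
    text = listing.get("text", "").lower()
    if "wire transfer only" in text:   # classic scam payment demand
        score += 0.5
    if listing.get("price", 0) <= 0:   # missing or nonsensical price
        score += 0.5
    return min(score, 1.0)

def route(listing: dict) -> str:
    """Decide what happens to a listing based on its risk score."""
    score = score_listing(listing)
    if score < AUTO_APPROVE_BELOW:
        return "approve"
    if score > AUTO_REJECT_ABOVE:
        return "reject"
    return "human_review"   # ambiguous: a moderator makes the call

print(route({"text": "Barely used bike, pickup downtown", "price": 120}))
# -> approve
```

The design point is the middle band: the cheaper automated layer handles volume and time-to-site, while human expertise is reserved for the cases where context and intent genuinely matter.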
Ultimately, the cost of bad content – or more specifically, not moderating it – isn’t just a loss of trust, customers, and revenue. Nor is it just a product quality or safety issue. It’s also the risk of enabling illegal activity, distributing abusive content, and giving extremists a voice. Playing a part in perpetuating this comes at a much heavier price.
This past September, Global Dating Insights (GDI) – the leading source of news and information for the online dating industry – gathered the international dating industry during an engaging conference in London.
With renowned speakers and insightful presentations from more than twenty leading industry disruptors, the conference tackled some of the industry’s most important questions and most exciting foreseeable trends.
The dating industry may be facing challenges with retention and engagement, but online dating sites are becoming ever more creative at retaining customers and continually improving their user experience.
Here is our recap of the 2019 GDI Conference in London and the trends emerging for the near future.
Bringing the online dating world into the real one
Bringing online and offline together with real-life experiences such as members events and marketing campaigns has become an important strategy for many dating sites.
Online dating sites need to keep up with the latest consumer and demographic trends – that is, with what their users are seeking – to improve retention and user experience. Thus, many sites and apps have gone the route of making their dating apps feel more like a community.
Millennials and Gen Z are authentic-experience seekers. Building on that insight, UK dating app Clikd has found creative ways of engaging its demographic. The company started organizing events to bring its users together: for one night at a venue, attendees meet 20 fellow users in search of their ideal partner, with some picked to jet off on a lavish holiday together.
Clikd has also launched its popular marketing campaign, ‘the Clikd Summer Internship’ – billed as the world’s best internship to find love. The winning applicant is paid to go on 10 dates over 10 weeks and produce content to engage fellow users, with many other perks along the way.
For selective dating site the Inner Circle, according to an interview with co-founder Michael Krayenhoff, offline events help build stronger brand loyalty by fostering deeper connections with users.
Venturing into real-life experiences is a trend that has proven to work for businesses to strengthen their brands and match their users’ needs.
Engaging users through video
Standing out in the crowded dating industry is no small feat. Users often drift between two or three different dating apps daily, so improving user retention is essential for companies looking to overcome fierce competition.
Video is tipped to be the next big thing in the dating industry. Didier Rappaport, CEO of location-based app happn, has emphasized that video is the industry’s most significant upcoming development, and many businesses are going down that route.
He said in an interview: “We need to allow people to hear the voice, to see the mannerisms and understand the person better than just looking at their picture,” adding that happn is working on developments to bring more real life into online dating with video interaction.
Video features are being launched as we speak. The Meet Group’s app MeetMe has recently implemented a one-on-one video chat feature to facilitate confident connections and user safety before meeting. Members can start video-chatting with users they have already exchanged messages with, to get a better sense of the person on the other side of the screen.
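The gating rule described above – video only becomes available once two users have already exchanged messages – is simple to express in code. A sketch with invented names, not MeetMe’s actual API:

```python
# Hypothetical safety gate: permit a video call only between users
# who have each sent the other at least one message.

def can_video_chat(caller: str, callee: str,
                   message_log: list[tuple[str, str]]) -> bool:
    """message_log holds (sender, recipient) pairs; require both directions."""
    pairs = set(message_log)
    return (caller, callee) in pairs and (callee, caller) in pairs

# Example: alice and bob have both replied; carol never answered alice.
log = [("alice", "bob"), ("bob", "alice"), ("alice", "carol")]
print(can_video_chat("alice", "bob", log))    # -> True
print(can_video_chat("alice", "carol", log))  # -> False
```

Requiring a mutual exchange, rather than a single message, is what prevents the feature from becoming another channel for unsolicited contact.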
Expanding platforms’ value propositions – for example, by including video features – is an emerging trend that will most likely improve user engagement.
Making apps more human and creating better interactions
According to a study by eHarmony, 70% of American singles are looking for a serious relationship. It is no surprise then that singles are looking for value in dating, and not just mindless swiping anymore.
Many value-driven, and niche, online dating apps have popped up in the last few years and grown exponentially, privileging quality over quantity. It’s not just about the profile picture anymore.
For instance, Neargroup puts personalities before pictures by matching users before they can see each other’s photos – ending the profile-picture swiping craze.
Another example of a value-driven app is Say Allo, also present at the conference: a self-proclaimed ‘relationship app’ for mature singles, focused on compatible connections without time wasted swiping.
Focusing on quality over quantity, dating app Once follows the slow-dating trend by providing its users with just one match per day.
Dating fatigue and burnout are now so common that they have become a new retention challenge for apps to tackle, and many companies have taken action against certain online behaviors.
Ghosting – the practice of abruptly cutting off contact and leaving messages unanswered after chatting with or dating someone – has been an issue for singles on dating apps for a while. It’s an issue for the apps themselves too, as it drives disillusioned users to delete their accounts. To tackle the problem, some companies have launched anti-ghosting features. Dating app Hinge has rolled out a feature dubbed “Your Turn”, pushing users to answer their abandoned matches.
Similar strategies are used by apps Bumble and Badoo to avoid the ghosting scourge threatening their retention and usage.
Online dating safety comes first
Another challenge for dating companies is increasing the number of women on their platforms; dating sites still have a majority of male users. Among the reasons are fears of bad encounters or inappropriate experiences, such as indecent pictures or sexual harassment. According to one study, 18% of participants across dating services reported having had an issue with another user in the past.
Bumble CEO Whitney Wolfe Herd asked in an interview: “Why is it allowed digitally when it’s not allowed in the streets? People are operating on their phone. We need to keep the internet safe.” To attract women, new ideas are being tested to give them more control over their dating experience, including video interactions.
In contrast to Chatroulette, with its lack of safety measures and the unfortunate experiences recounted online, dating sites are betting on restricted video features to strengthen their communities and make their users – especially women – feel safer before meeting a potential love interest.
The Online Dating Association (ODA) is an international nonprofit organization dedicated to safety and to standardizing best practices in the online connection space. At the GDI conference, the ODA emphasized the importance of the industry setting its own ethical frameworks and standards, as authorities and governments will demand more control and further regulation may follow.
This is something online dating sites will need to pay close attention to. Stay ahead of the game and learn more about liability and moderation regulations in our interview with law professor Eric Goldman.
Dating apps thrive not merely by using technology to enhance user experience and retention, but also by creating safety features and guidelines to protect their users and brand reputation.
As any app (or online service) provider knows – in their quest to hit an all-important network effect – it’s not just downloads and user numbers that indicate success. Revenue generation is ultimately needed to ensure longevity.
Dating apps have established some of the most forward-thinking and innovative monetization methods in technology today. But finding a perfectly matching monetization strategy for your app or dating site means adopting a method that reflects its content, style, and user experience.
Luckily, there are lots of different tried and true monetization strategies out there already. Although they broadly fall into two major categories – in which the user pays or a third party pays – there are many different variations.
Here are some ways dating site owners can monetize their operations or improve their current strategy.
Advertising: Great When There’s Scale
Allowing other brands to advertise on your site has been part of the online world since the first sites went up. A natural extension of the broadcast media commercial model, passing the cost on to third-party advertisers allows dating sites and apps to offer services for free – albeit for the price of the user’s attention.
Ad formats themselves come in all shapes and sizes – from simple PPC text ads to full-page interstitials, as well as native ads (more consistent with a site’s overall inventory), in-line carousel ads, and in-feed content-stream ads, among many others. Revenue is generated via clicks, views, or transactions.
Notably, dating apps offer higher click-through rates and eCPMs (effective cost per thousand impressions) than games or other types of apps. Despite this, brands still need to work hard to make an impact, as consumers have grown weary of – and resistant to – digital advertising.
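For readers unfamiliar with the metric, eCPM simply normalizes ad revenue to a per-thousand-impressions figure, which makes placements of very different sizes comparable. A minimal illustration (the revenue and impression figures are invented):

```python
def ecpm(total_revenue: float, impressions: int) -> float:
    """Effective cost per mille: revenue earned per 1,000 ad impressions."""
    return total_revenue / impressions * 1000

# e.g. $450 earned across 150,000 impressions
print(ecpm(450.0, 150_000))  # -> 3.0, i.e. a $3.00 eCPM
```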
Where online dating apps and sites are concerned, third-party commercial affiliations range from the sublime to the ridiculous. Some pairings – like Tinder’s partnerships with Domino’s and Bud Light – might appear odd at first but, considering the importance of food and drink in the dating and socializing scene, actually make perfect sense.
From a business perspective, campaigns like these are a testament to a dating app’s ability to engage certain demographics (usually millennials) at scale; demonstrating the pulling power of a specific dating platform.
However, advertising isn’t necessarily a technique that can be relied on to monetize a digital dating service from its very inception. Other methods are much more effective at doing that – often by selling features and benefits directly. But this means pushing the cost back onto the user.
Subscriptions: Luring Users Behind The Paywall
Subscriptions ain’t what they used to be. Consumers are a lot more reluctant to part with their cash if they can’t see a genuine benefit from the service from the very outset.
For some, a better user experience is enough to sway them to part with a little cash each month. For others, however, given that so many ‘free’ dating apps exist (admittedly of varying quality), unless they can clearly see what they’ll be getting for their money, they’ll take their chances elsewhere.
To overcome this, dating sites and apps offer varying degrees of ‘membership’ which can seem a little muddled to the uninitiated. So let’s consider the main contenders.
Firstly, there’s the ‘free and paid app versions’ model, in which the free version has limited functionality, meaning the user must upgrade to benefit fully. Stalwarts like OkCupid and Plenty of Fish were among the pioneers here – but many others champion this model too, including EliteSingles, Jaumo, Zoosk, Grindr and HowAboutDating – offering monthly and annual subscriptions.
The ‘freemium’ model offers a similar experience, providing basic functionality – such as finding and communicating with others – for free, with further perks available at an additional cost.
Badoo’s ‘Superpowers’ feature is probably the best known: it lets users see who ‘liked’ them and added them to their favorites, gives access to invisible mode, highlights their messages – and removes ads. In fact, the popularity of Tinder’s ‘Rewind’ feature (taking back your swipe) led the company to start charging for it via its Tinder Plus and Tinder Gold packages. Bumble Boost, Hinge’s Preferred Membership, and Happn’s Premium are other scope-widening freemium services worth mentioning.
A slight variation is the ‘free app with in-app purchases’ model. In addition to greater functionality – like a broader search radius and more daily matches – users can buy virtual and actual gifts and services. For example, Plenty of Fish lets users buy in-app ‘Goldfish’ credits to send virtual gifts to potential dates – essentially an icebreaker.
However, those who don’t want to pay but are keen to test a few additional perks can often complete in-app tasks in exchange for limited-time access to premium features. Users are usually presented with an ‘offerwall’ detailing the tasks to complete and the rewards to be reaped. MeetMe’s rewarded videos are a great example, as are rewarded surveys, which seem to be becoming increasingly common – and were trialed by dating app Matcher (now Dangle) a while back.
Activities like these point to dating sites’ key asset: their audience data. Given that 15% of Americans use dating services and the average user dedicates around 8 minutes to every session, the opportunity is real for those that achieve a certain scale.
But you can’t just sell data – can you?
Data Monetization: Insights For Sale
The sale of user data is a big no-no when specific information is involved (remember Cambridge Analytica?). But when the user grants consent and the data remains anonymous, well, that’s a different story.
Companies operating in EU countries need to abide by GDPR or risk severe penalties, and other international data protection initiatives, such as the EU–US Privacy Shield Framework, are held in high esteem. So how can dating sites use their rich data sources as a revenue-generation tool?
The only kind of data that can be sold is non-personal data – and only with a user’s consent. Even then, the data is restricted to basic parameters: device type, mobile operator, country, and screen size, among others.
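To make the distinction concrete, an export of this kind might whitelist only the coarse device-level parameters listed above and drop everything else. The field names below are illustrative, not any real marketplace’s schema:

```python
# Illustrative anonymization step: keep only coarse, non-personal
# device parameters and drop anything that could identify a user.

SHAREABLE_FIELDS = {"device_type", "mobile_operator", "country", "screen_size"}

def anonymize(event: dict) -> dict:
    """Whitelist approach: unknown fields are dropped by default."""
    return {k: v for k, v in event.items() if k in SHAREABLE_FIELDS}

raw_event = {
    "user_id": "u-48213",          # identifying -> must be dropped
    "email": "jane@example.com",   # identifying -> must be dropped
    "device_type": "tablet",
    "mobile_operator": "Vodafone",
    "country": "SE",
    "screen_size": "1280x800",
}
print(anonymize(raw_event))
```

A whitelist (rather than a blacklist of known-sensitive fields) is the safer design here: any new field added upstream stays private until someone deliberately approves it for sharing.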
The good news is that there’s significant demand for all of this data – from market researchers across many different sectors for a range of purposes; including optimizing user experiences and understanding buying choices.
On another positive note, according to one research survey, 95% of respondents are content to use apps that collect anonymous usage statistics.
However, unless your dating app has more than 50,000 daily active users, it won’t offer a large enough pool to draw from, and finding a buyer will prove difficult.
Which Monetization Strategy Works Best?
All things considered, as with many types of online businesses, the greater the combination of monetization methods, the more profit there is to be had. Perhaps that can explain Tinder’s phenomenal global success.
But in isolation, each method has its drawbacks. Advertising only reaps a reward when a service offers scale; otherwise, where’s the value for brands? Conversely, charging users for a new service can be tricky to justify – unless the cost unlocks some additional never-seen-before feature. And without scale, charging marketers for data insights is pretty much impossible.
What is crucial, however, from the very outset, is that dating platforms establish a strong, dedicated user base. This means doubling down on trust and user safety, and finding ways to keep users engaged.
Despite the many positive things about dating sites, for some, the negative connotations persist. While sites and apps are far more conscious of preventing harm these days, as with any platform that relies on user-generated content, the risk of users being catfished, shown inappropriate content, or defrauded is ever-present.
However, there is a lot that digital dating platforms can do to build trust and boost conversions. Content moderation is just one area – but it’s one that no dating service looking to expand its user base can ignore.
Ultimately, there’s no substitute for getting the service right: knowing your users’ wants and needs (and there are many different dating services!) and developing a safe, secure, and engaging environment for them to interact in. With these established, and once active usage hits a critical mass, monetization becomes a natural next step.
Just like any other business, online marketplaces continually look for new and improved revenue streams to boost their growth. We caught up with marketplace optimization and growth specialist Bec Faye, founder of marketplaceplaybook.com, to hear her take on how marketplaces can monetize their platforms efficiently.
In the interview, we talk about the importance of considering your users’ behavior and experience when deciding on a monetization strategy, and we explore a successful, disruptive monetization strategy to inspire your inner creativity.
Watch the Interview:
Want to read the interview instead?
Emil: Hi, everyone, I’m here with Bec Faye, Marketplace Optimization & Growth Specialist, who is running the marketplaceplaybook.com. Bec, would you like to introduce yourself?
Bec: Hi. It’s really great to be here. As you explained, I specialize in helping marketplaces optimize their growth, working from a UX angle and looking at conversion optimization. I work with a lot of different marketplaces of all shapes, sizes and stages. I’m really looking forward to having a chat today.
Emil: It’s really exciting to have you on. If you haven’t seen it before, Bec and I also did a webinar together on UX design for online marketplaces, which was really valuable. So I’m super happy to have you on board for this interview on monetization as well.
Bec: Really great to be back.
Emil: Awesome. Let’s jump straight into it. If we think about monetization strategies for marketplaces, there are many things that you need to consider. But if we can just start with what is monetization for marketplaces, how would you define it?
Bec: I have a couple of thoughts on monetization. Part of it is obviously how we actually make money from a marketplace – how we create this piece of technology that fulfils its purpose, but in a way that can sustain it as a business, or a not-for-profit, or whatever shape it happens to take. But it’s also an interesting area where we tend to focus on specific trends in how people monetize marketplaces. We see a lot of people going for funding and raising capital, and marketplaces becoming very dependent on that. One of the things I’m really passionate about exploring over the next two years is figuring out how we make marketplaces profitable without relying quite so much on investment. But in a nutshell, monetization is really about how we actually make money – how we make sure we’re building something worthwhile that can be sustainable.
Emil: Yes – and if you’re a young marketplace, to get investors on board there needs to be some plan for how they can make their money back. If you can’t monetize your platform in the right way, the investment isn’t going to happen to begin with. For online marketplaces looking at different mindsets and strategies, there are so many alternative business models out there – commission-based selling fees, advertising, subscriptions. But what would you say, Bec, is important to consider when deciding on a suitable monetization strategy?
Bec: It’s a really good question. No matter what stage your marketplace is at, it always comes back to really understanding your core user group: the value you’re offering them, and what paying for it solves for them. Then you try to distil that value offering and base your monetization around it. That makes it a lot easier to justify – and it means people are willing to pay. It depends on which side of the market you end up charging, and on the different pricing structures you might use, but for me it always comes back to ensuring the monetization sits where you’re providing value. One of the exercises I get a lot of early-stage marketplaces to do, to prove that the monetization really works, is to create five successful transactions without actually using the marketplace as a platform.
That is, to do it manually, offsite. If they can create those five transactions manually – proving they can create transactions without the technology, and that people are willing to pay at that point in time – they learn a lot of really valuable lessons along the way. That lets them refine the monetization strategy without having to build it into the marketplace or worry about technology. But that’s for early-stage marketplaces. For later-stage marketplaces, it’s really a matter of figuring out why they’re looking at monetization strategies. Are they looking to add a new revenue stream to the business? If so, it again goes back to the core users: what value can we offer them, and is there an additional way we can add revenue to help the business? We touched on this in the monetization webinar – how tricky it can be to transform an established marketplace and change its monetization strategy. But finding ways to add new revenue streams into the business can be a way of exploring that.
Emil: You’re right. In the recent webinar we did on marketplace monetization, we spoke about how difficult it can be for mature marketplaces and well-established global players to change their business model, and how they instead look at becoming more of an investor and adding revenue streams that way. Just out of curiosity: do regions and geographies really matter when you look into integrating a new strategy?
Bec: I think they always do, because different regions always have different intricacies – whether it’s the currency, the exchange rate, or the way different aspects are perceived in that particular culture. So it definitely comes into play. That’s where really getting to know your users comes in: understanding what drives them, what they’re comfortable with, and where they see the value in your platform.
Emil: That’s a good point. You should always look at user behaviour – but also at who your user base is. What’s your marketplace? Who is the target audience? Who is using your platform, and how are they using it? Look into those different aspects and you’ll find the different opportunities. Premium listings, for example – that kind of monetization strategy really comes from people’s need to sponsor their content, and those kinds of things.
Bec: Exactly. I am a true believer in the idea that your end users are going to tell you what you need to be doing in your business. If you are not listening to them, you are making a whole heap of assumptions. What I always say to anybody I work with is to always get back to who your users are and listen to them, because they are going to tell you the answers.
Emil: Super. If we look at monetization, it feels like things are trending from year to year; there’s a constant development in the marketplaces industry, moving from classifieds to offering more: the full payment solution and the entire offering. So my question is: what are the trending monetization strategies that work in 2019?
Bec: As you just touched on, one of the big trends that we’re really seeing is the larger marketplaces, those traditional classifieds spaces, trying to get closer and closer to the transaction. In the original days of online classifieds, it was very much about the listing, and then a lot of the transaction happened off the platform, basically, out in the real world. Whereas now, they are again going back to that customer journey, really understanding what the need is that they’re solving for their users, for their customers.
For example, in real estate, the original classifieds might just be selling a house. But if we take a step back from that, there’s a whole lot of stuff that happens in amongst that: things like the need to get financing, getting the phone and Internet and all of those kinds of things connected, and a whole bunch of other stuff that needs to happen around that part of the customer’s journey. So they’re really looking at what that journey looks like and then starting to figure out where they can add more value in, again adding new revenue streams into that. It’s something we’re seeing time and time again across marketplaces. Another good example of this trend is marketplaces looking at the types of users using their platform. They realized that there were a lot of people using it for work, so they started to explore that, and a new offering was born – and another income stream for them. So to me, it really feels like everybody is coming back to this journey and asking: how can we get closer to the transaction, and how can we add more value along the whole journey as well?
Emil: It’s really key, mainly for larger, mature players, to have that opportunity to satisfy their users by adding these value-added services. It could be anything from insurance to delivery and payment solutions. You offer that complete solution, and that gives you not just an improved UX, but actually more revenue streams for yourself. So it’s a win-win in that sense to move in that direction. Out of curiosity, are there any traditional monetization strategies that don’t seem to work anymore or are fading out?
Bec: It’s probably less about fading out, but one of the challenges that I’ve seen marketplaces really struggle with, particularly in the early stage – and I work with a lot of early-stage marketplaces – is that a lot of them fall into the trap of being in a low-frequency, low-value marketplace, which means transactions are only happening very rarely for a user. So, for example, one user might only transact on their platform maybe once a year or every six months, and only at a very low value. So if they’re taking a cut as a percentage of the transaction, they really need huge scale in order for that to work. And because they’re so close to the business, it can sometimes be a little bit hard to see that, even though they’re charging that percentage, neither the lifetime value nor the average order value of that customer is large enough. It’s just a really challenging space to be in, and scale is really the only option. So this is where I think they really need to recognize that that’s the case, and see if they can shift that around and focus on solving a need that might actually bring in a higher frequency from that particular customer, or a higher value per transaction. It’s a matter of looking to see what other income streams they can explore – what other revenue streams can I look at? I think just being mindful of that is something definitely to be considered.
Emil: You touched a little bit on the fact that you work with a lot of younger, early-stage marketplaces, so I’m going to jump into another question feeding off of that. Since you speak to a lot of young, early-stage companies, I assume you have encountered a couple of really disruptive, out-of-the-box monetization strategies. Anything that you want to share?
Bec: I guess one of the really great things about working with early-stage marketplaces is that we can be really inventive and really creative in how we approach monetization. I was working with a client recently who was in a space where they were very concerned about the fact that they were almost in the business of matchmaking. They had users coming in from the supply side and the demand side, but it was very easy for users to form relationships with a supplier, and therefore that relationship could technically be taken off the platform. Charging a commission, for example, wasn’t going to work for them. So we had to take a step back and really understand again what the customer journey was like, looking at the big picture of what the user was trying to achieve in their particular industry. By taking that step backward, we identified that there was actually a big need that needed solving, which meant that instead of being narrowed in by what was done previously, we were able to create more of a disruptive model for that particular marketplace – to basically make ten times the revenue they had been earning from the average order value of a customer coming through the platform. We’re now looking at a new model where, instead of buying a single hour, for example, users will be purchasing bulk hours, and we’re looking into more of a membership type of opportunity down the track as well. So suddenly we’ve gone from being worried about the supply and demand sides going off-platform, where we’d lose the transaction, to actually increasing and multiplying that revenue stream on the marketplace. It’s kind of combining quite a few different techniques together.
But by really understanding what the needs of that user group were, we were able to identify this new opportunity, which means we’re basically solving the problem from all angles. So it’s been a really interesting experiment; we’re still running it, but so far the tests have been really positive.
Emil: That’s very cool. I know it can be a big struggle losing out on the actual conversion from your marketplace when you have your buyers and sellers – especially the sellers, I would assume – incentivized to leave the marketplace. But this is also where, from a moderation standpoint, it gets dangerous for the user, because you can’t control the conversation and you can’t protect your users – the buyer, in that scenario. So if the seller manages to get the conversation off the site, the chance of a scam increases significantly. Keeping users on your platform is also a way of protecting them in that sense. It’s very interesting. Another way of preventing that is adding value-added services, like we spoke about, as a monetization strategy. If you add enough value – like OpenTable or Uber, where you can actually make reservations and bookings – the platform itself becomes so useful for the seller that it brings down the incentive to leave, because it adds services that they need.
Bec: Exactly. That’s a really great technique.
Emil: I think that brings us to the end of today’s interview. Thank you so much, Bec, for taking the time. It’s been really helpful, and I think all the listeners are really happy to hear your tips, tricks, and ideas for monetization strategies. If you want to get in touch with Bec, you can reach her by email at firstname.lastname@example.org. If you want to learn more about monetization strategies, check out the webinar that we did on September 17th. And finally, if you want to learn more about content moderation and how we at Besedo can help you improve your content quality – and, from that side of things, boost your revenue generation – don’t hesitate to reach out to me at email@example.com. Thank you, Bec. Thank you very much.
Bec: Thank you for having me. Take care, guys. Bye.
It might sound obvious, but there’s a direct correlation between online marketplace trust and conversion rates.
It’s not hard to see why – after all, people do business with those they know, like, and relate to. But how, as an online marketplace owner, can you establish, build, and maintain user trust?
Let’s take a closer look at the nature of trust online, how it helps improve conversions within online marketplaces and classified sites, and how these businesses can increase user trust.
Defining trust for Online Marketplaces
Trust is complex. It’s part-rational and part-emotional. It’s something that can be earned, lost, reinforced, and rebuilt. It’s an enabler; an empowering force. It’s also tough to define; so let’s be clear about what trust means in relation to online marketplaces and classified sites.
Respected economist, author, and online trust expert, Rachel Botsman, put forward the idea – in one of her TED Talks – that technology and the sharing economy encourages us to place an enormous amount of trust in complete strangers and marketplaces.
While on the surface this may sound counter-intuitive, her assertion that humans constantly take ‘trust leaps’ in the digital world is persuasive. In order to take these leaps, Botsman argues, we need certain reassurances – which is where her concept of a ‘Trust Stack’ comes into play (watch our webinar for more on this).
There are three parts at work in Botsman’s Trust Stack: trust in an idea; trust in a platform; and trust in the other user. Now, it’s easy to presume that online marketplace owners should just be concerned with the second point. However, all three are equally important:
- Trust in an idea is essentially social proof – that the marketplace has a legitimate purpose and regular user base; and that there’s evidence elsewhere that this is the case – testimonials, good PR, social media links etc.
- Trust in the platform is the reassurance that the marketplace is accountable; that the information provided is true, accurate, and designed to benefit the customer; and that there are fail-safes in place to protect their interests.
- Trust in the other user is the confidence that the content others post on a marketplace reflects the actual product they’re selling. They need to know they’re transacting with a real person – and that that person has an equally vested interest.
Ultimately, all three of these layers can be addressed by marketplaces and classified sites – and need to be aligned in order to establish this complete Trust Stack.
But, this is easier said than done when your business operates across multiple countries, has listings in various languages, and has a business model that’s reliant on user-generated content from a myriad of users.
However, there are numerous proactive steps that marketplaces can take to ‘trustproof’ the way they do business – which we’ll now consider.
Building a better basis for Marketplace Trust
Building a solid trust foundation for an online marketplace or classified site involves addressing the entire user experience. Some aspects are more obvious than others, so let’s begin with the most evident.
Secure payments
This is probably what springs to mind immediately when the idea of ‘online trust’ is mentioned, and it is an important consideration. But it’s not just a question of choosing a reputable third party payment partner.
Payment preferences differ across national borders. Digital wallets, credit and debit cards, and cash on delivery are among the many different payment types used by a variety of marketplaces. Providing the secure payment option that your user base feels most comfortable with is a great trust builder – and plays into the wider user experience (more on that below).
Ease of payment is just as important where trust is concerned. According to a survey, some 70% of users abandon their online shopping carts, and over a quarter (26%) do so because of complicated payment processes – directly hurting marketplace conversion rates.
Policies & Guarantees
Most marketplace users won’t read the small print or the terms and conditions you set. But that doesn’t mean they can be neglected. When something goes wrong, users will immediately start looking to find out where they stand – so make sure your Service Level Agreements, moneyback guarantees, and data privacy policies are clearly labeled.
From a trust perspective, offering total transparency on consumer rights is very important. For example, guarantees – like eBay’s money-back policy – give customers the confidence to purchase on your marketplace; like a safety net.
Another great example is online shoe retailer Zappos, which offers a ‘no questions asked’ returns policy – for unused items – up to a year after purchase. This (and other aspects of Zappos’ exemplary customer service) has become something of a ‘feather in the cap’ for the business. Putting the customer first is a key component of building a solid trust foundation.
User experience
User experience – or UX as it’s more commonly known – might be a bit of a technology buzzword at the moment, but for a very good reason. Ensuring that users can easily and safely find the information, products, and services they’re searching for has to be a priority for any marketplace owner.
Everything from the way a marketplace is designed to its content, the icons deployed, and the ease of navigation from one page to the next should be considered a core part of a site’s overall user experience too.
But how does UX relate to trust? Thinking back to Botsman’s Trust Stack, accessibility, user-friendliness, and online safety are crucial components in both the first (trust in the idea) and second (trust in the platform) layers.
Overall, UX is about reassuring users that they’ve made the right decision – at every point in their journey. Any UX action that enhances user trust helps smooth the user journey towards conversion.
Safety & security
Safety and security don’t just mean making sure the right payment processes are in place. Sure, money is a huge part of the security equation; but even more important is ensuring users aren’t subjected to fraudulent, criminal, or unwanted sexual, racist, hateful, or violent content on your online marketplace.
Sadly, scams are all too prevalent online (see our previous blog): from catfishing to identity theft; as well as fake ads selling everything from high-value items at low prices, to seasoned classifieds scammers posting genuine ads in bulk to disguise fake ones.
This is where a clear content moderation policy and strategy can pay dividends. For example, setting clear rules for what ‘good’ and ‘bad’ content look like on your platform is a great place to start – and that information should always be clearly communicated to your users.
In addition to preventing criminal activities and inappropriate material, content moderation is also crucial for ensuring overall site hygiene. For example, it’s important to ensure that listings are placed in the correct category and that imagery used accurately reflects the descriptions given. In a study on classifieds listings we conducted a few years ago, we discovered that on average 15% of the listings are posted in the wrong category every month.
Similarly, employing dedicated moderators to ensure consistency and identify anomalies is now standard practice for many online marketplace owners. However, when the scale of UGC becomes too much for a manual moderation team to manage alone, there are moderation technologies out there (like ours!) that use AI and machine learning to moderate and approve content. By implementing automated moderation, be it filters or AI moderation, from the very beginning, your marketplace is equipped to handle the increased volumes as you grow.
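As a minimal sketch of what such automated rules can look like in practice – with hypothetical category keywords and scam phrases; real filters are tuned per marketplace and usually combined with machine-learning models – a simple listing check might flag both suspected miscategorization and known scam wording for manual review:

```python
# Hypothetical keyword lists for illustration; a real marketplace would
# maintain far larger, per-category and per-language rule sets.
CATEGORY_KEYWORDS = {
    "cars": {"sedan", "mileage", "engine", "hatchback"},
    "phones": {"iphone", "android", "smartphone", "unlocked"},
}

SCAM_PHRASES = ("western union", "payment upfront", "wire transfer only")


def moderate_listing(title: str, description: str, category: str) -> list[str]:
    """Return the reasons a listing should be routed to manual review."""
    text = f"{title} {description}".lower()
    reasons = []

    # Flag listings whose text matches none of the chosen category's keywords
    # (the kind of anomaly behind the "15% wrong category" finding above).
    keywords = CATEGORY_KEYWORDS.get(category, set())
    if keywords and not any(word in text for word in keywords):
        reasons.append(f"possible wrong category: {category}")

    # Flag wording associated with common payment scams.
    for phrase in SCAM_PHRASES:
        if phrase in text:
            reasons.append(f"scam phrase: {phrase}")

    return reasons
```

An empty result means the listing can be auto-approved; anything flagged goes to a human moderator rather than being rejected outright, which keeps false positives from hurting genuine sellers.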
Marketplace Trust & Conversions
As you have discovered, a great deal needs to be addressed in order to get trust right. That’s why it’s important to keep the end result in mind: creating safe online experiences that increase conversions and sales. But what overall impact does trust have on a marketplace’s success?
Again, looking at our user study on classifieds, several key stats give us an indication:
- 100% of the participants found irrelevant items in their search results
- 75% of participants found duplicate posts
As mentioned above, anomalies don’t play out well from a trust perspective. They not only cause confusion, but also demonstrate to users that a site isn’t what it claims to be, which can diminish their trust levels.
We also discovered that:
- Only 20% would buy the product in an ad with poor description and 73% were unlikely to return to the site: compared to 56% and 37% for a good listing.
In a similar way, a lack of clarity in the listing itself does little to inspire buyer confidence. However, the most telling stat from our study nicely summarizes the link between trust and conversions:
- 75% who saw a scam on a site said they would not return
It stands to reason that you can do a great deal to ensure your marketplace works well, is fast, reliable, and well-designed. You could also spend hours perfecting your UX, onboarding the right payment partners, and ensuring your customer service is top-notch. But if your site doesn’t guard carefully enough against fraudulent and poor-quality content, the trust foundations you’ve worked so hard to build can quickly crumble.
This is the risk you take when the majority of your sales activity is driven by peer-to-peer interactions and user-generated content (UGC). It’s a delicate balance and can be tricky to get right. But done correctly, online marketplaces can propel their businesses to huge success.
However – it’s always worth remembering that while trust can take a long time to build, it can be lost incredibly quickly.
Want to find out more about building trust in your online marketplace? Take a closer look at our ‘Trust Stack Checklist’. If you have any queries about online safety and content moderation, please get in touch with our team.
The fraud landscape is continually shifting with fraudsters’ methods growing more sophisticated by the day. With cybercrime profits rising to $1.5 trillion last year, scammers are getting increasingly creative. Fraudsters are always improving on their deceptive and lucrative scamming methods alongside the continuous development of the Internet.
Let’s have a closer look at some of the most sophisticated scams surfacing on the Internet currently.
Fake website scams
Scammers might be getting more elaborate in their techniques, yet they take more precautions than ever before to avoid being exposed and caught by the authorities.
A current trend making the headlines involves scammers setting up fake websites to replicate reputable online stores or marketplaces while carefully executing their scams.
A news story that broke in Spain earlier this year is considered to be the largest cyber-scam in the history of the country. A 23-year-old, along with his accomplices, managed more than 30 fraudulent websites generating an estimated income of €300,000 per month.
The fraudsters sold mainly consumer electronics by directing potential buyers to their own fraudulent pages which looked like replicas of reputable websites using companies’ logos and names to dupe users.
To avoid suspicion from the authorities, these sites would only stay live for a few days while being intensely advertised on search engines and social media before disappearing into thin air. The aim was to attract as many potential customers as possible in the shortest amount of time.
Victims were asked to send bank transfers for their orders, but of course, the goods would never arrive. The money received would be deposited into bank accounts set up by mules recruited to transfer money illegally on behalf of others. The scammers then only needed to withdraw the money at cashpoints.
Another technique used by fraudsters plays on the fact that most of us have been warned to check for secure websites when browsing the web. The presence of “https” and the lock icon are supposed to indicate that the network is secure, and users can share their data safely. However, this can be easily replicated. By using those security symbols, scammers make it easier for them to fool potential victims.
Fake website scams – also popping up on sharing-economy websites such as Airbnb – show that as people have become warier of online scams, fraudsters have taken their methods to the next level of sophistication.
Romance mule scams
Online dating sites have been one of romance scammers’ preferred hunting grounds to find their next victim.
The FBI’s online crime division recently flagged an alarmingly growing trend in the romance scam department. Online dating scammers have expanded their romance scam strategies by adding a new dark twist to it and taking advantage of their victims in a whole new way. This dating scam turns love-seekers into unwittingly recruited criminals, also known as ‘money mules’.
Fraudsters, under the guise of a relationship, are using online dating sites to recruit their next victim and dupe them into laundering stolen money.
Here’s how they work. The scammers, pretending to be American or European citizens living or working abroad, start grooming their victims for several months by establishing a supposedly trustworthy relationship with them. They then proceed to have their victim act as a financial middleperson in a variety of fraudulent activities. There are many ways victims can unwittingly help fraudsters. For instance, they can print and post packages or letters often containing counterfeit checks, or in other cases pick up money at Western Union and forward it to another place. They could even open a bank account under the pretense of sending and receiving payment directly, except that these accounts, unknowingly, are then used to aid criminal activities.
A study estimates that 30% of romance scam victims in 2018 were used as money mules.
Indeed, romance scam victims are the perfect mules for these sorts of frauds. Blindly in love, victims will not doubt their lovers’ intentions and likely never steal the money or ask for anything in exchange for their services, unlike professional fraudsters.
This romance mule scam makes it complicated for local authorities to follow the money, and even if the victims get caught, they do not know the actual identity and location of the scammers. Tracking the fraudsters’ movements becomes an intricate or impossible task.
Romance mule victims often do not know they are a part of these fraud schemes or criminal activities until it’s too late. Despite the victims not losing any money, the FBI warns that they might face legal or financial consequences for participating in such a scheme.
The Telegraph reported a story about a 61-year-old British man who was tricked by fraudsters to fund terrorists in this exact way. He believed he was corresponding with a wealthy businesswoman he met on an online dating site. The woman claimed she needed her money in the UK so she could pay her European employees. The man would then send checks on her behalf, becoming inadvertently and unknowingly part of a fraud scheme.
Impersonation scams
Phishing scams are very well known. However, a variation – the ‘impersonation scam’ – has boomed over the past few years, impacting both users’ online safety and companies’ reputations.
These phishing emails might seem completely genuine as they look nearly identical to those of reputable and reliable websites, including Google or Apple, and often end up bypassing spam filters. Like many fraud schemes, impersonation scams are based on trust.
Last year, the popular streaming service Netflix was the target of an email phishing scam in Ireland, sent to thousands of subscribers on the pretext of a maintenance and verification issue. In a similar way to the fake website scams mentioned previously, these malicious emails – looking like perfect replicas of Netflix emails – featured a link to update credit card information or login credentials. However, the link did not direct users to the Netflix website, but to a site managed by scammers.
Scams often target popular brands with a large user-base to lure subscribers into giving out personal information. Not only is this dangerous for the customer, but it also threatens brands’ reputation.
Staying ahead of the scammers
With fraudsters persistently refining their scamming techniques, companies must always be one step ahead in the prevention of these scams to protect their users and keep them safe on their site.
Marketplace leakage, which we wrote about in a previous article, refers to users leaving your platform to continue their conversations beyond your site’s security measures. This technique of luring users away from your site is used in the scam examples mentioned in this article, and it dramatically increases the risk of your users being scammed. To ensure the safety of your users, online marketplaces need to keep interactions on their platforms by preventing the display of personal details and hidden URLs.
Marketplace leakage can be avoided by improving your content moderation efforts. With accurate automated moderation, you can instantly spot and prevent any content aimed at drawing users away from your platform.
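One common building block for this is masking contact details in user messages before they are displayed. The sketch below uses simple regular expressions as an illustration only; production systems also need to catch deliberate obfuscations (such as ‘name at mail dot com’ or digits spelled out as words):

```python
import re

# Hypothetical patterns for emails, URLs, and phone numbers.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
URL_RE = re.compile(r"(?:https?://|www\.)\S+", re.IGNORECASE)
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,}\d")


def mask_contact_details(message: str) -> str:
    """Replace contact details with a placeholder so that conversations
    (and transactions) stay on the platform."""
    for pattern in (EMAIL_RE, URL_RE, PHONE_RE):
        message = pattern.sub("[removed]", message)
    return message
```

In practice a message that trips these patterns might also be queued for review rather than silently masked, since repeated attempts to share contact details are themselves a strong scam signal.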
To learn more about setting up efficient automated moderation filters to protect your users, check out our Filter Creation Masterclass.
Could you tell us a bit about yourself?
My name is Kevin Ducón from Bogotá, Colombia. I hold an MSc in Computer Science from Universidad Politecnica de Madrid and a BSc in Computer Science from Universidad Distrital de Bogotá.
I have been working in information and communications technology for more than fifteen years and began working at Besedo five years ago, specializing in IT Service Management and Information Security. I started as a local ICT Administrator in our Colombian center, then as an ICT Supervisor, and currently, I am the Global Head of ICT-IS (information and communications technology – information security).
Over the past five years, I have applied my knowledge and skills to this ever-changing industry by creating policies and processes aligned with the industry’s best practices, supporting our clients, and continuously improving our on-going projects.
What are your responsibilities as Global Head of ICT-IS?
As the Global Head of ICT-IS at Besedo, I’m in charge of all levels of support in information technology and communications.
I oversee the Global ICT work, and together with my ICT team, I make sure that we fulfill our most important metrics – availability, service-level agreement, and customer satisfaction.
On top of that, I manage and provide insights into our security guidelines and develop strategic and operational plans for the ICT department to ensure that all necessary tools and processes are fully functional to achieve the company’s overarching goals and ambitions.
I also have hands-on technical responsibilities in supporting and developing mission-critical systems, which are running 24/7, to make sure our moderation services are successfully delivered to our customers worldwide.
From an ICT point of view, what are the key elements that must go right when running a content moderation operation?
The essential part from an ICT standpoint when running a content moderation operation is to truly understand the priorities and needs specific to the operation. Having an IT strategy to translate business needs into functioning IT operations is vital for a successful moderation setup.
Furthermore, ensuring good practices in network infrastructure and server setup, device management, and IT support is key to achieve a solid moderation operation. Finally, it’s crucial to have a knowledgeable and committed IT staff behind the scenes.
What are the common things that can go wrong?
When running a moderation operation, many potential issues can occur. Some of the most common hazards include the Internet connection, network, or servers going down; power outages; and failed infrastructure deployments.
For instance, content moderation relies heavily on a stable Internet connection, and you cannot blindly trust that it will just work. Instead, you need to make sure that your Internet service always works to its full capacity.
What safety measures are needed to make sure the moderation operation runs smoothly?
It’s important to have proactive safety measures in place to guarantee that the moderation operation is always carried out correctly. A good first step is to plan the implementation of the moderation services thoroughly before putting disaster mitigation plans in place.
For example, at Besedo, we work with several Internet service providers in case one of those fails to deliver correctly. We also work with fault-tolerant networks, a resilient infrastructure, third-party support, etc., to ensure that our IT operations remain stable when potential risks materialize.
On top of this, we run daily IT checklists and use monitoring systems that allow us to prevent potential challenges during IT ops. Also, we have backup routines to avoid any information loss or damage and use UPS to keep our critical devices turned on.
All in all, for anyone looking to run a successful moderation operation, many countermeasures must be put in place to make sure that IT operations run smoothly.
What’s the best thing about your job?
My job allows me to work in the different areas of the ICT Function and with all the disciplines that contribute to the business. For some people, ICT only assists with end-user tickets because that’s what’s visible to them. However, IT is not just a commodity but a strategic ally for us to deliver the highest level of services to our customers.
I’m proud to apply my skill-set and knowledge to Besedo’s purpose and values, which I genuinely believe in. When I took on the role of Global Head of ICT-IS, I set out to implement our promise, ‘grow with trust’, into everything we do in our team. This has shaped the ICT team’s goal: to help all functions grow with trust through efficient processes, guaranteed quality of service, and high customer satisfaction.
At Besedo, we have an excellent ICT team of committed and ambitious individuals who love what they do and work hard to improve the company every day.
Kevin Ducón is Besedo’s Global Head of ICT-IS. He has been working in information and communications technology for more than fifteen years. Over the past five years at Besedo, he has applied his knowledge and skills to the ever-changing content moderation industry.
Internet fraud. It’s pretty sophisticated. And for online marketplaces – or any other platform that relies on User Generated Content – it’s often well-hidden or undetectable.
Scammers are an increasingly resourceful bunch, so if there’s a system to be gamed, you can bet they’ll find a way to work it.
However, with the right insight, awareness, and detection processes in place, site owners can keep their users safe – and put a stop to scams before they endanger anyone.
Let’s have a look at some of the most common online scam types to be aware of on your online marketplace, how to stay on top of them, and ultimately how to prevent them.
Online Shopping Scams
One of the most common types of fraudsters plaguing digital marketplaces, online shopping scammers usually advertise high-ticket items for sale at low prices. Typically, these include mobile phones, video game consoles, laptops, jewelry, and cars – with commercial vehicles and heavy equipment on the rise.
These items may be advertised along with a believable yet fabricated story – for example, a claim that the seller is offloading ‘excess stock’ or that the goods have ‘minor damage’.
The reason scammers do this is simply to give some degree of credibility to their request for partial payment upfront. Of course, they have no intention of selling any goods at all. They simply aim to dupe users.
As a marketplace owner, it’s important to advise your users that if something sounds too good to be true, it usually is. It’s also vital to warn them against sending any form of payment before receiving the goods. They should be wary, too, of paying by direct transfer, using prepaid cards, or any request to pay for goods with cryptocurrency.
Dating & Romance Scams
Dating scams are probably the best-known kinds of online fraud – a topic we’ve covered before in our blog.
While many of us have used flattering photos of ourselves in online dating profiles, there’s a big difference between presenting ourselves in our best light and creating a fake online identity.
While TV shows and high-profile cases of this practice – known as ‘catfishing’ – have raised awareness, it remains a common issue on many dating sites.
Essentially, romance scams work when a scammer (posing as an attractive man or woman) reaches out to a user and builds a relationship with them exclusively online – sometimes over a period of months – before proceeding to ask them for money, or even for favors: activities that could well be criminal in nature.
Why does the scam work so well? Catfishers do whatever it takes to win their targets’ trust. And once that trust is established, the target is too emotionally invested to question the scammer’s motives.
While different official organizations – like the Online Dating Association – are doing more to raise awareness, dating sites themselves need to do more to highlight the dangers and behavior patterns fake users follow.
For example, there are many keywords and phrases catfishers use to make themselves sound more credible (as we outline here). They may claim to be religious or work in a trustworthy job – like the police or military.
A common struggle for many sites is that they’re not quick enough to remove scammers. Dating and romance scammers are quick to move the conversation away from the site to avoid detection – so sites need to prevent that from happening from the outset. Learn how you can create filters to detect and block the sharing of personal details automatically.
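One lightweight approach – purely an illustrative sketch, not Implio’s actual filter syntax – is to pattern-match messages for contact details before they go live:

```python
import re

# Hypothetical patterns for common personal details shared in messages.
EMAIL_RE = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")
PHONE_RE = re.compile(r"(?<!\d)(?:\+?\d[\s().-]?){7,14}\d(?!\d)")

def contains_personal_details(message: str) -> bool:
    """Return True if the message appears to share an email or phone number."""
    return bool(EMAIL_RE.search(message) or PHONE_RE.search(message))
```

A message that trips the check can then be held back or sent to a moderator, keeping the conversation on-platform where it can be monitored.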
Fake Charity Scams
Many of us are wary of so-called ‘chuggers’ (charity + muggers) approaching us on the street asking for donations and we’d be right to – given the recent news that one scam in London was so well-orchestrated that even those collecting cash didn’t know it was a shady operation.
However, online – where donation platforms are becoming increasingly popular owing to their ease of use – how can those donating be sure their money ends up where it’s supposed to?
Transparency is key. The more information a site offers about the charities they’re working with; how much (if anything) they take as commission; and how long donations take to reach each charity, the more trustworthy they’re likely to be.
But what about online marketplaces and classified sites? Charity scams are just as likely here – particularly in the wake of high profile disasters.
As a result, site owners need to advise their users to exercise caution when those requesting funds…
- say they’re from a charity they’ve never heard of
- won’t/can’t give all the details about who they’re collecting for
- seem to be pushing users to donate quickly
- say they only want cash or wire transfers (credit card is much safer)
- claim donations are tax-deductible
- offer sweepstakes prizes for donations
When working with charities, online marketplaces and classified sites should ensure that rigorous security checks are in place. For example, as phishing is a common fake charity scam, it’s crucial that any relevant in-platform messages that provide a link to an external ‘charity site’ are detected early on.
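For instance, a simple allow-list check can flag in-platform messages that link out to external ‘charity sites’; the domain names below are hypothetical placeholders, not a real platform’s configuration:

```python
import re
from urllib.parse import urlparse

# Hypothetical allow-list of the platform's own domains.
TRUSTED_DOMAINS = {"example-marketplace.com", "pay.example-marketplace.com"}

URL_RE = re.compile(r"https?://[^\s<>\"']+")

def flag_external_links(message: str) -> list[str]:
    """Return any URLs in the message that point outside the platform."""
    flagged = []
    for url in URL_RE.findall(message):
        domain = urlparse(url).netloc.lower()
        if domain not in TRUSTED_DOMAINS:
            flagged.append(url)
    return flagged
```

Flagged links don’t have to be blocked outright – routing them to a human reviewer is often enough to catch phishing attempts early.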
Employment Scams
Online fraud and employment may sound like a fairly unlikely pairing, but in fact, employment scams are a lot more sophisticated than many might think.
There are numerous ways in which scammers abuse online marketplaces and classified sites, and most of the time they’re looking to either extract money or steal your identity (more on that below).
One of the most frequent employment-related scams is a fake job posting looking for people to handle ‘payment processing’. The scammer may find CVs and resumes online, or they may post on credible boards – such as Craigslist.
The trick being played out here is one where the proceeds of crime are handled by the user (for a small commission) and transferred back to the fraudster – who is essentially using the ‘employee’ to launder money.
Another common job-related scam is one where ‘recruiters’ coax candidates into paying for additional job training or career development courses – or when an ‘employer’ asks candidates to cover the costs of a credit record check.
In all cases, employment-focused marketplace owners need to be acutely aware of anyone asking users to impart finance-related information or money.
However, these requests may not materialize until the conversation has been moved to email – away from the site – so it’s critical for those operating job boards to put some form of prevention and moderation effort in place.
Identity Theft
Recent news that a young journalist had a job application withdrawn by someone pretending to be him – via email – is alarming but not uncommon. However, impersonation takes on a whole new meaning when linked to identity theft.
While the most likely scenario in which identity theft occurs is an online data breach, internet shopping also puts users at risk. According to Experian, 43% of online shopping identity theft happens during the peak holiday shopping season (Black Friday onward).
Many scammers use familiar tricks – like phishing – to steal personal details, debit and credit card details, and social security numbers; using them to buy goods (often high priced items in bulk), to claim refunds from ‘faulty’ items, or to open accounts in other peoples’ names to mask other fraudulent activities.
Scammers can buy stolen identities on the dark web very cheaply. And it’s not uncommon for fraudsters to advertise normally high-priced items at low prices for a quick sale on marketplaces… and then steal shoppers’ credit card details.
While general advice is routinely given to consumers – such as vigilance over website security, visiting preferred stores directly rather than clicking search engine links, and not to store card details online – online marketplaces need to prioritize monitoring and prevention too.
Preventing Scams on Online Marketplaces
With so many ways in which scammers can benefit, it’s clear that they’re not going to stop anytime soon.
This means that in an environment where trust is a limited commodity, the pressure increases on e-commerce sites, online marketplaces, and classified sites, to maintain it.
While official bodies, governments, and consumer rights groups – as well as Facebook (as reported in TechCrunch this week) and other tech champions with considerable clout – are informing and empowering users to recognize and take an active stance against suspicious activity, online marketplaces also have a responsibility to detect and eliminate fraud.
As marketplaces scale and begin to achieve a network effect, they need to adopt more stringent cybersecurity protocols to protect their users – multi-factor authentication, for example. Similarly, mapping user behavior can help site owners to identify how genuine customers navigate it – giving them the intelligence they can use to benchmark suspicious activity.
Essentially, the better you know your users and the way they behave, and the more emphasis you put on transparency as a prerequisite for joining your community, the greater the deterrent. But as discussed, there are ways for scammers to mask their behavior.
Being a step ahead of scammers is important (as our trust and security expert explains in this article). Therefore, it’s essential to anticipate the different times of the year when certain scams manifest – as outlined in our Scam Spikes Awareness Calendar.
However, by far the most effective way to prevent fraudulent activity on online marketplaces is to have a solid content moderation setup in place. While this could be a team or a single person manually monitoring the behaviors most likely to be scams, as a marketplace grows, this process needs to function and be maintained at scale.
Enter machine learning AI – a moderation solution trained to detect scammers before fraudulent content is posted. Essentially, this works by ‘feeding’ the AI data so that it learns to recognize suspicious behavioral patterns, allowing it to identify a number of possible fraud threats simultaneously.
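As a toy illustration of the ‘feeding’ idea – nothing like Besedo’s production models, which train on vast volumes of moderated items – here is a tiny Naive Bayes classifier learning scam patterns from a handful of invented labeled listings:

```python
import math
from collections import Counter

# Toy training data: listing texts labeled as scam (1) or genuine (0).
TRAIN = [
    ("brand new iphone half price wire transfer only", 1),
    ("urgent sale western union payment upfront", 1),
    ("gift card payment required before shipping", 1),
    ("used mountain bike good condition pickup only", 0),
    ("wooden dining table four chairs local collection", 0),
    ("second hand laptop minor scratches cash on delivery", 0),
]

def train_naive_bayes(data):
    """Count word frequencies per class for a tiny Naive Bayes model."""
    counts = {0: Counter(), 1: Counter()}
    labels = Counter()
    for text, label in data:
        labels[label] += 1
        counts[label].update(text.split())
    return counts, labels

def scam_probability(text, counts, labels):
    """Score a listing in log space with Laplace smoothing."""
    vocab = set(counts[0]) | set(counts[1])
    scores = {}
    for label in (0, 1):
        total = sum(counts[label].values())
        score = math.log(labels[label] / sum(labels.values()))
        for word in text.split():
            score += math.log((counts[label][word] + 1) / (total + len(vocab)))
        scores[label] = score
    # Convert the two log scores into a probability of the "scam" class.
    odds = math.exp(scores[1] - scores[0])
    return odds / (1 + odds)
```

With enough real training data, the same principle lets a model score incoming content before it’s published.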
At Besedo, we fight fraud by giving marketplace owners the tools – not just the advice – they need to stop it before it is published.
All things considered, scammers are merely opportunists looking for an easy way to make money. The harder it becomes to do this on online marketplaces, the less inclined they’ll be to target them.
Keen to learn more about content moderation? Let’s talk.
What is a content moderator? Why not ask one? We sat down with Michele Panarosa, Online Content Moderator Level 1 at Besedo, to learn more about a content moderator’s daily work, how to become one, and much more.
Hi Michele! Thank you for taking the time to sit down with us. Could you tell us a bit about yourself?
My name is Michele Panarosa, I’m 27 years old and I come from Bari, Puglia, Italy. I’ve been an online content moderator for nine months now, formerly an IT technician with a passion for technology and videogames. In my spare time, I like to sing and listen to music. I’m a shy person at first, but then I turn into an entertainer because I like to have a happy environment around me. They call me “Diva” for a good reason!
What is a content moderator?
A content moderator is responsible for reviewing user-generated content submitted to an online platform. The content moderator’s job is to make sure that items are placed in the right category, are free from scams, don’t include anything illegal, and much more.
How did you become a content moderator?
I became an online content moderator by training with a specialist during my first weeks of work, but it’s a never-ending learning curve. At first, I was scared of accidentally accepting fraudulent content, or of not doing my job properly. My teammates, along with my manager and team leaders, were nice and helped me throughout the entire process. As I kept on learning, I started to understand fraud trends and patterns. That helped me spot fraudulent content with ease, and I could confidently escalate items to second-line moderation agents who made sure they got refused.
Communication is essential in this job. There are so many items I didn’t even know existed, which makes it an enriching experience. The world of content moderation is very dynamic, and there are so many interesting things to learn.
What’s great about working with content moderation?
The great part of content moderation is the mission behind it. The internet can sometimes seem like a big, unsafe place where scammers are the rulers. I love this job because I get to make the world a better place by blocking content that’s not supposed to be online. It’s a blessing to be part of a mission where I can help others and feel good about what I do. Besides, it makes you feel important and adds that undercover aspect of being a 007 agent.
How do you moderate content accurately and fast?
Speed and accuracy can go hand in hand, but you need to stay focused and keep your eyes on the important parts of a listing. Even a small piece of information in a listing can be very revealing and tell you what your next step should be. On top of that, it’s crucial to stay updated on the latest fraud trends so you don’t fall into any traps. Some listings and users may appear very innocent, but it’s important to take each listing seriously – it’s always better to slow down a bit before moving on to the next one.
What’s the most common type of content you refuse?
The most common type of item I refuse must be weapons – any kind of weapon. Some users try to make them seem harmless, but in reality, they’re not. It’s important to look at the listing images, and if the weapon isn’t shown in the image, we’ll simply gather more information about the item. Usually, users who want to sell weapons try to hide it by not using images and keeping their descriptions very short (sometimes there’s no description at all). It’s our task, as content moderators, to collect more details and refuse the item if it turns out to be a weapon – even if it’s an airsoft gun or used for sports.
What are the most important personal qualities needed to become a good content moderator?
The most important personal qualities needed to become a good content moderator are patience, integrity, and curiosity.
Moderating content is not always easy and sometimes it can be challenging to maintain a high pace while not jeopardizing accuracy. When faced with factors that might slow you down, it’s necessary to stay patient and not get distracted.
It’s all about work ethic, staying true to who you are and what you do. Always remember why you are moderating content, and don’t lose track of the final objective.
As a content moderator, you’re guaranteed to stumble onto items you didn’t even know existed. It’s important to stay curious and research those items to make sure they’re in the right category – or refuse them if they don’t meet the platform’s rules and guidelines.
Michele is an Online Content Moderator Level 1 and has worked within this role for nine months. Previously he worked as an IT technician. Michele is passionate about technology and videogames, and in his spare time, he enjoys music both to sing and listen.
What is content moderation?
Content moderation is when an online platform screens and monitors user-generated content, based on platform-specific rules and guidelines, to determine whether the content should be published on the platform or not.
In other words, when content is submitted by a user to a website, that piece of content will go through a screening process (the moderation process) to make sure that the content upholds the regulations of the website, is not illegal, inappropriate, or harassing, etc.
Content moderation as a practice is common across online platforms that heavily rely on user-generated content, such as social media platforms, online marketplaces, sharing economy, dating sites, communities and forums, etc.
There are a number of different forms of content moderation: pre-moderation, post-moderation, reactive moderation, distributed moderation, and automated moderation. In this article, we’re looking closer at human moderation and automated moderation, but if you’re curious to learn more, here’s an article featuring the 5 moderation methods.
What is human moderation?
Human moderation, or manual moderation, is the practice of humans manually monitoring and screening user-generated content submitted to an online platform. The human moderator follows platform-specific rules and guidelines to protect online users by keeping unwanted content – such as illegal items, scams, inappropriate material, and harassment – off the site.
What is automated moderation?
Automated moderation means that any user-generated content submitted to an online platform is accepted, refused, or sent to human moderation automatically – based on the platform’s specific rules and guidelines. Automated moderation is the ideal solution for online platforms that want to make sure that quality user-generated content goes live instantly and that users are safe when interacting on their site.
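In practice, the accept/refuse/escalate routing often reduces to thresholds on a model’s risk score. The cut-off values below are illustrative assumptions, not recommended settings:

```python
def route_content(scam_score: float, accept_below: float = 0.2,
                  refuse_above: float = 0.9) -> str:
    """Route an item based on a model's scam score.

    Scores in the grey zone between the two thresholds go to a
    human moderator; the threshold values are illustrative only.
    """
    if scam_score < accept_below:
        return "accept"
    if scam_score > refuse_above:
        return "refuse"
    return "escalate_to_human"
```

Tightening or loosening the thresholds is how a platform trades off automation rate against the volume of items its human team must review.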
According to a study by Microsoft, the average human attention span is just eight seconds. Online platforms therefore cannot afford slow time-to-site for user-generated content, or they risk losing their users. At the same time, users who encounter poor-quality content, spam, scams, inappropriate content, etc., are likely to leave the site instantly. So, where does that leave us? To jeopardize neither quality nor time-to-site, online platforms need to consider automated moderation.
When talking about automated moderation, we often refer to machine learning AI (AI moderation) and automated filters. But what are they really?
What is AI moderation?
AI moderation, or tailored AI moderation, consists of machine learning models built from online platform-specific data to efficiently and accurately catch unwanted user-generated content. An AI moderation solution makes highly accurate automated moderation decisions – refusing, approving, or escalating content automatically.
One example that showcases the power of AI moderation is the Swiss online marketplace Anibis, which successfully automated 94% of its moderation whilst achieving 99.8% accuracy.
It should also be mentioned that AI moderation can be built on generic data. These models can be very effective but are in most cases not as accurate as a tailored AI solution.
What is Automated filter moderation?
Automated filter moderation uses a set of rules to automatically highlight and catch unwanted content. The filters (or rules) are efficient at finding content that can’t be misinterpreted, such as obvious scams. This makes them a solid complementary automation tool for your moderation setup. Automated filters can easily be created, edited, and deleted in our all-in-one content moderation tool, Implio – learn how to create filters here.
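Conceptually, such filters are just field/pattern/action rules. A minimal sketch – the rule set and syntax here are invented for illustration, not Implio’s actual format – might look like this:

```python
import re

# Illustrative filter rules: which field to inspect, what pattern to
# look for, and what action to take when it matches.
FILTER_RULES = [
    {"field": "body", "pattern": r"western\s+union", "action": "refuse"},
    {"field": "body", "pattern": r"\bwire\s+transfer\b", "action": "escalate"},
    {"field": "price", "pattern": r"^0+$", "action": "escalate"},
]

def apply_filters(listing: dict) -> str:
    """Return the first matching rule's action, or 'accept' if none match."""
    for rule in FILTER_RULES:
        value = str(listing.get(rule["field"], ""))
        if re.search(rule["pattern"], value, re.IGNORECASE):
            return rule["action"]
    return "accept"
```

Because each rule is explicit, filters are easy to audit and adjust – which is exactly why they pair well with a statistical model handling the ambiguous cases.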
Do’s and don’ts of content moderation
What to do and what not to do in content moderation may vary from site to site. There are many elements and factors to consider in order to get the moderation setup best suited to your specific needs.
However, whether you’re running an online marketplace, social media platform, or sharing economy site, some do’s and don’ts of content moderation hold true across the board.
Do’s of content moderation
Do: Select the moderation method that’s right for your needs
Start off by looking at what kind of content your site hosts and who your users are. This will help you create a clear picture of what’s required from your moderation method and setup. For example, the type of user-generated content found on Medium versus Facebook is very different, and so is their users’ behavior. As a result, their moderation methods and setups look different in order to fit each platform’s specific needs.
Do: Create clear rules and guidelines
Your content moderation rules and guidelines need to be clear to everyone directly involved in your online platform’s content moderation – from the data scientist developing your AI moderation to the human moderator reviewing content, whether they sit in-house or with outsourced partners. Uncertainty in your rulebook can set your moderation efforts back, both financially and from a user experience perspective.
Do: Moderate all types of content
Whether you’re running an online marketplace, dating site, or social media platform, your users are key contributors to your platform. Making sure they enjoy a pleasant experience and are met with quality content on your site should be in your interest. To achieve this, you need to make sure your content moderation is done right.
In a perfect world, moderating all types of content on your site – from text and images to videos and 1-to-1 messages – would be ideal. The reality, though, is that this approach isn’t possible for all online platforms, for financial and technical reasons. If that’s your case, at a minimum make sure to identify your high-risk categories and content, and start your moderation efforts there.
Don’ts of content moderation
Don’t: misinterpret what good content is
Quality content is key to building user trust and achieving a splendid user experience on your online platform, but it’s important to understand what good content actually is. Don’t make the mistake of misinterpreting good content and end up rejecting user-generated content simply because it’s negative in nature.
For example, a negative comment or review following a transaction can still be good content – as long as no harsh language is used, of course. Genuine content is what you want, as it enhances quality and user trust.
Don’t: wait too long before you get started with moderation
If you’re in the early stages of establishing your online platform, getting started with content moderation might feel like it’s miles away. It’s not.
Don’t get us wrong – perhaps it shouldn’t be your main priority right out of the gate, but you need a plan for how to handle user-generated content, from a moderation perspective, when you scale. As you grow and the network effect kicks in, you’ll often see a rapid increase in content flooding into your site. You need to be prepared to handle that; if not, your big break might actually end up hurting you in the long run.
Don’t: waste resources
Don’t reinvent the wheel. With multiple content moderation tools and solutions, like Implio, available on the market, it’s important to prioritize your resources carefully. Innovation and growth are what will boost your online platform to success, and this is where your dev resources will give you the most competitive advantage. Find ways to free up your resources for innovation without falling behind on your moderation efforts.