The restrictions put in place to combat the global Covid-19 pandemic have had a devastating effect on many businesses. Social distancing, restrictions on physical services, and a downturn in spending have also hurt most marketplaces and sharing economy sites despite their digital nature.
After months of locked-down societies and harsh restrictions, nations are slowly and carefully opening up again, but the world is forever changed. Businesses that understand and adapt quickly to the new reality will be successful. To do so, they’ll need to understand the challenges and opportunities arising in the post-corona business landscape.
We’ve asked 8 online marketplace experts to share their thoughts and predictions to help you prepare and adapt to the new reality.
User safety is key for all online platforms, particularly when you’re dealing with vulnerable youngsters. Moderating can be challenging and getting the balance between censorship and safety right can be hard.
We sat down with industry veteran and founder of Friendbase, Deborah Lygonis, to discuss the experience she’s gained from developing and running a virtual world for teens.
Interviewer: Hi Deborah. Could you please give us a short introduction to yourself?
Deborah: My name is Deborah Lygonis and I am a serial entrepreneur. I have started and run several businesses over the years, mainly within the software and gaming sector, but also e-health and other tech. I love tech and I’m passionate about startups and entrepreneurship. I also work as a coach and mentor for entrepreneurs within the European Space Agency Business Incubator (ESA BIC) and for a foundation called Entrepreneurs Without Borders.
Interviewer: Wow! That’s an impressive background. One of the things you’ve started as an entrepreneur is Friendbase, right? Could you tell us a bit more about that?
Deborah: Yes. Friendbase is a company that I founded with my brother and a third guy called Andreas. We’ve known each other for many years. Well, obviously, I’ve known my brother for many years, but Andreas as well, has been part of our group of friends and acquaintances for many, many years. We decided to found Friendbase in 2013. We saw that the whole idea of virtual worlds hadn’t really migrated over to smartphones and we wanted to see if it was possible to create a complete cross-platform version.
So, we put together a mockup of Android, iOS and web versions and put it out there to see if it was something that today’s young people would like.
Friendbase is a virtual world for teens where they can chat, play games and also design their looks and spaces. Now we’re also moving towards edtech, in that we’ll be introducing quizzes that are both for fun and have learning elements in them.
Interviewer: That sounds awesome. What would you say is the main challenge when it comes to running a cross-platform online community, and specifically one that caters to teens?
Deborah: There are a lot of challenges with startups in general, but also, of course, with running an online community. One challenge is when you have people who meet each other in the form of avatars and written chat; their different personalities and backgrounds can cause them to clash. The thing is that when you write in a chat, the nuances in the language don’t come through as they do in a face-to-face conversation. It’s really very hard to judge the small subtleties in language, and that can lead to misunderstandings.
Add to that as well that there are lots of different nationalities online. That in itself can lead to misunderstandings because they don’t speak the same language.
What starts off as a friendly conversation can actually rapidly deteriorate and end up in a conflict just because of these misunderstandings. That is a challenge, but that’s a general challenge, I think, with written social interactions.
Interviewer: Just so we understand how Friendbase works: do you have one-to-one chats, one-to-many chats or group chats? How does it work?
Deborah: The setup is that we can have up to 20 avatars in one space. No more, because then it would get too cluttered on small phone screens. So, you can have group chats. You see the avatars, and they each have a text bubble as they write, so several people can take part in one conversation.
Interviewer: Do you have the opportunity for groups of friends to form and join the same kind of space together?
Deborah: Yes. Each member has their own space. They can also invite other friends and open up their space to them.
Interviewer: And in that regard: what you often see in the real world with group dynamics is that there is a group of friends, and there are the popular people in that group, and then one person who is maybe a little bit of an outsider, who will at times be bullied by the rest of the group. Do you see people ganging up on each other sometimes?
Deborah: I haven’t seen groups of people ganging up on one individual. It’s more the other way around. There are individuals that are out to cause havoc and who are just online to be toxic.
Interviewer: That means that, in general, you have a really nice and good user base, but then there are rotten fruits that come in from time to time.
Deborah: That is what it is like today. We are still at a fairly early stage, though, when it comes to the number of users, so I would expect this to change over time. And this is something that we’re prepared for. We added safety tools at a really early stage to learn how to handle issues like this and how to moderate the platform when incidents occur. So, even though we don’t have that type of ganging up on each other at the moment, I would expect it to happen in the future.
Interviewer: But it sounds like you’re prepared for it. Now you’ve made a really nice segue into my next question: what are the main moderation challenges you’ve experienced running Friendbase? What are the main challenges right now, and what do you expect you’ll have to handle later on?
Deborah: I think that a challenge in itself for all social platforms is to set the bar on what is acceptable and what is not.
Our target group is mid-teens and up, so we don’t expect young children to be on Friendbase. We feel that if we made a social world for young children, we’d need a completely different, more controlled set of regulations than when it’s teenagers and upwards.
However, that demographic is also very vulnerable, so of course there have to be some measures in place. The challenge is to determine at what level you want to put the safety bar, and also how you can tell the difference between banter between friends and the point where it flips over into actual toxicity or bullying. That’s something that is really, really hard to distinguish. And I think that if you work with chat filters, then you have to have some sort of additional reporting system for when the filters don’t manage this challenge; a filter is only a filter and can’t tell the two apart. So that’s one challenge. It’s also complex to enforce the rules that are in place to protect the users without being perceived as controlling or patronizing.
At the moment, we also have a challenge in that we have users who come back solely to cause havoc and create a toxic environment. We track them down and ban their accounts, but it’s a continuous process.
Should that escalate over time, it will become increasingly time-consuming. That’s why it’s really, really important for us to have tools in place so that moderation doesn’t have to be done manually. That would just take too many resources and too much time.
Of course, you have the even darker side of the internet: sexual predators who are out to groom vulnerable youngsters and get them to move over to a different platform where they can be exploited in extremely harmful ways.
That’s something that is difficult to handle. But today, thanks to artificial intelligence and the amazing toolsets out there, there are attempts to analyze speech patterns and identify that sort of behavior. And it’s also really valuable to have your own toolsets where users can report someone if they feel threatened or if they feel that someone’s really creepy.
Interviewer: When you have returning users who have made it their goal to attack the platform in a malicious way, do you see that it’s the same people returning, based on their IP or the way they talk?
Deborah: It’s not always possible to see it based on their IP, because they use different ways of logging in. However, given their behavior, we can quickly identify them. We also have a group of ambassadors online on Friendbase who help us. On top of that, we have a chat filter which can red-flag certain behavior, so that helps as well.
There is a group that comes back over and over again, and for some mysterious reason they always use the same username, so they’re not that hard to identify. That group is actually easier to control than those with a different motive for being online and targeting youngsters. The toxic ones who are just there because they think it’s fun to behave badly are easy to find and shut down.
Interviewer: We already touched upon this, but what would you say is the hardest moderation challenge to solve for you right now?
Deborah: The hardest moderation challenge to solve is, of course, finding the people who are deliberately out to target lonely youngsters that hunger for social contact. The whole grooming issue online is a problem. We are constantly trying to find new toolsets and encourage our users to contact us if there’s something that doesn’t feel right. So grooming is something that we’re very, very much aware of. If we happen to shut down someone’s account by mistake for a couple of hours, they’re most welcome to come to us and ask why. But we’d rather be safe than sorry when it comes to this kind of behavior. However, it is hard to track because it can be so very, very subtle in the beginning.
Interviewer: Friendbase has been around for a while now. Are there any challenges that have changed or increased in occurrence over the years? And if so, how?
Deborah: Actually, not really. I think the difference is in our own behavior as we are so much more aware of how we can solve different problems.
Bullying has been around for years, since long before the Internet. Sexual harassment, of youngsters and between adults, has of course also been around for years. It’s nothing new. I mean, the Internet is a fantastic place to be. It democratizes learning. You have access to the world, knowledge and entertainment.
But there is a dark side to it. From a bullying perspective you have the fact that previously, if you were bullied at school, you could go home or you could go to your social group somewhere else and you would have somewhere where you would feel safe.
When it’s online, it’s 24/7.
And it is relentless when it comes to the whole child abuse part. Of course, it existed before as well. But now, with the Internet, perpetrators can find groups that share the same desires, and somehow, together, they can convince themselves as a group that it’s more acceptable. Which is awful. So that is the bad part of the net.
So, when you ask: have the challenges changed or increased since we started Friendbase? No, not really. But what has changed is the attitude towards how important it is to actually address these issues. When we started the company in 2013, we didn’t really talk that much about safety tools. I mean, we talked about whether we should have a whitelist or a blacklist for words. It was more on that level. But today most social platforms have moderation, toolsets, guidelines, policies and so forth.
So, I think that we who work with online communities as a whole have evolved a lot over the past years.
Interviewer: Yeah, I would say that today, in 2020, you probably wouldn’t be able to launch a social community or platform without some sort of moderation tools and well-defined guidelines in place.
Deborah: I think you’re right. Several years ago, I did a pitch where we talked about online safety and moderation tools, and we were completely slaughtered. What we were told was that being good online, this whole “it’s cool to be kind” thing, was going to stop our growth; it would be much better to let it all run rampant, and then it would grow much faster. I don’t think anyone would say something like that today. So that’s a huge shift in mindset. Which is great. We welcome it.
Interviewer: That’s a fantastic story. You’ve been in this industry so long; you’ve seen this change. I find it fascinating that just seven years ago, when you said you wanted to protect your users, people laughed at you. And now people would laugh at you if you said you were going to go live without it.
Deborah: I know. Can you imagine going on stage today saying that I don’t care about safety? I mean, people would be so shocked.
Interviewer: You said before, when we talked about the main challenges, that if you experienced growth, you’d need to change your approach to moderation and automate more in order to keep up?
Deborah: Yes, definitely. We try and stay on top of what toolsets are out there.
We build in our own functionality, such as muting users. So, if someone is harassing you, you can mute them so that you can’t see what they’re writing. Small changes like that, which we can do ourselves, are helpful.
Something I’d like to see more of, and that we’ve actually designed a research project around, is to not only detect and ban bad behavior, but to encourage good behavior.
Because that in itself will also create a more positive environment.
That’s something that we’re really excited about: working with people who are experts in gamification and natural language processing to see how we can create toolsets that encourage good behavior. Maybe we can start deflecting a conversation that is obviously on its way to going seriously wrong. It could be as simple as a small time delay when somebody writes something really toxic, with a pop-up saying: “Do you really want to say this?”. Just to make someone think once more.
This is something that we’re looking into. It’s super interesting. And I hear there are a couple of companies, just in the last few months, that are also talking about creating toolsets for something like this. So, I think it’s going to be a really, really interesting development over the coming years.
Interviewer: It sounds like safety is very important to Friendbase. Why is that?
Deborah: Why is that? Quite early on, we who work in the company discussed what our core values should be. And one of the core values we decided upon is inclusion. Everybody is welcome. And for everyone to feel welcome, you have to have a welcoming atmosphere.
When you continue along that line of thought, you obviously come to the point where, OK, if everyone’s going to be welcome and you want it to be a friendly space, then somewhere you’re going to have to stop toxic behavior. So, for us, safety is just part of our core values.
And also, I have a teenage daughter who loves gaming. I’ve seen how platforms behave. She’s part of groups that interact with each other online. I just feel that there must be a way of doing things better. It’s as simple as that. We can do better than this, letting it be super toxic. And there are some amazing people out there working with fantastic toolsets. There are some fantastic platforms and social games out there that also work in the same sort of direction as we do. It’s really great.
And you know what? To be quite honest, there have been several case studies proving that, from a business perspective, you get longer retention and higher profitability when you keep your users online for a longer time. So it also makes perfect business sense to work in a way that keeps your users as long as possible.
Interviewer: You obviously have tons of experience with startups and social platforms. If you were to give a piece of advice to someone who is running a similar service to Friendbase, or who is even thinking about starting one, what would it be?
Deborah: It would be, first of all, to determine what level of safety you want to have, depending on your user group. Obviously, the younger your demographic, the more safety tools you must ensure you have in place. Also, don’t build everything yourself, especially if you’re working in an international market with many languages. Just being able to filter many languages in a decent way is a huge undertaking. If you think you’re going to be able to hack something together yourself: it’s not that easy. It’s better to work with a tool or a company that has this as their core business, because they will constantly be working with state-of-the-art solutions.
So it’s better to liaise with switched-on companies that already work with this as their main reason for being. I think that’s important. And then, of course, add your own easy-to-use reporting system and easy ways for users to communicate with you, so that you have a sort of double layer.
I mean, I’ve seen several different companies that now work with different moderation tools, chat filters and so forth. Many of them do stellar work. And it’s important, because at the end of the day, if anything really, really bad were to happen, you’re just finished as a business. It’s as simple as that. The last thing you want is someone knocking on your door and shutting you down because something happened on your platform.
Interviewer: Definitely! What’s in the future for Friendbase? Where are you in two years?
Deborah: Where are we now? We’re currently raising funds, because what we’ve seen is that we have a very, very loyal member base who want to invite more of their friends. And I think that with very little work, we can get the platform onto a really interesting growth path.
Deborah: So, yeah, our aim is to become one of the big global players. It’s exciting times ahead.
Interviewer: For sure. Any closing remarks? Any statements you want to get out there from a personal point of view or from Friendbase?
Deborah: The Internet is a great place to be because there’s so much you can learn, and you can meet so many interesting people. But there is a dark side as well, and you have to be aware of it. Just by being a little bit street smart online, people can keep themselves safe. And we’re getting there. People are learning. Schools have it in their curriculum, and social platforms try to teach users how to behave. So slowly but surely, we’re getting there.
The outbreak of COVID-19, or Coronavirus, has thrown people all over the world into fear and panic over their health and economic situation. Many have been flocking to stores to stock up on essentials, emptying the shelves one by one. Scammers are taking advantage of the situation by maliciously playing on people’s fear. They’re targeting items that are hard to find in stores and making the internet – and especially online marketplaces – their hunting ground, to exploit desperate and vulnerable individuals and businesses. Price gouging (charging unfairly high prices), fake medicine and non-existent loans are all ways scammers try to exploit marketplace users.
In this worldwide crisis, now is the time for marketplaces to step up and show social responsibility by making sure that vulnerable individuals don’t fall victim to corona-related scams, and that malicious actors can’t profit from stockpiling and selling medical equipment sorely needed by nurses and doctors fighting to save lives.
Since the start of the Covid-19 epidemic we’ve worked closely with our clients to update moderation coverage to include Coronavirus related scams and have helped them put in place new rules and policies.
We know that all marketplaces are currently struggling to get on top of the situation, and to help, we’ve decided to share some best practices for handling moderation during the epidemic.
Here are our recommendations on how to tackle the Covid-19 crisis to protect your users, your brand and retain the trust users have in your platform.
Refusal of coronavirus-related items
Ever since the outbreak started, ill-intentioned individuals have made the prices of some items spike to unusually high levels. Many brands have already taken the responsible step of refusing certain items they wouldn’t usually reject, and some have set bulk-buying restrictions (just as some supermarkets have done) on ethical and integrity grounds.
Google stopped allowing ads for masks, and many other businesses have restricted the sale or price of certain items. Amazon removed thousands of listings for hand sanitizer, wipes and face masks and has suspended hundreds of sellers for price gouging. Similarly, eBay banned all sales of hand sanitizer, disinfecting wipes and healthcare masks on its US platform and announced it would remove any listings mentioning Covid-19 or the Coronavirus except for books.
In our day to day work with moderation for clients all over the world we’ve seen a surge of Coronavirus related scams and we’ve developed guidelines based on the examples we’ve seen.
To protect your customers from being scammed or falling victim to price gouging, and to preserve user trust, we recommend you refuse ads for, or set up anti-stockpiling measures around, the following items.
- Surgical masks and face masks (types FFP1, FFP2, FFP3, etc.) have been scarce and have seen their prices spike dramatically. Overall, advertisements for all kinds of medical equipment associated with Covid-19 should be refused.
- Hand sanitizer and disposable gloves are also very prone to being sold by scammers at incredibly high prices. We suggest either banning these ads altogether or enforcing regular prices on these items.
- Empty supermarket shelves have caused toilet paper, a usually cheap item, to be sold online at extortionate prices. We suggest you monitor and ban these ads accordingly.
- Any ads mentioning Coronavirus or Covid-19 in the text should be manually checked to ensure that they weren’t created with malicious intent.
- The sale of “magic” medicines claiming to miraculously cure the virus should be refused.
- Depending on the country and its physical distancing measures, ads for home services such as hairdressers, nail technicians and beauticians should be refused.
- In these uncertain times, scammers have been selling loans or cash online, preying on the most vulnerable. Make sure to look for these scams on your platform.
- Similarly, scammers have been targeting students with claims that loan interest rates are being adjusted.
Optimize your filters
Ever since the crisis started, scammers have grown more sophisticated by the day, finding loopholes to circumvent security measures. To promote their scams, they use alternative wordings such as Sars-CoV-2, or describe masks by standard reference numbers such as EN 149:2001+A1:2009. Make sure your filters are optimized and your moderators are continuously briefed and educated to catch all coronavirus-related ads.
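As an illustration only, a first-pass keyword filter for flagging such listings for manual review might look like the sketch below. The patterns, function name and review threshold are hypothetical examples, not production rules; a real filter would need per-language pattern lists and continuous updates as scammers change their wording.

```python
import re

# Hypothetical example patterns for flagging Covid-19-related listings.
# These are illustrative only; real rule sets change daily.
SUSPECT_PATTERNS = [
    r"\bcovid[\s-]?19\b",
    r"\bcorona\s*virus\b",
    r"\bsars[\s-]?cov[\s-]?2\b",          # alternative naming used by scammers
    r"\bffp[123]\b",                      # mask classes
    r"\b149:2001\b",                      # mask standard reference number
    r"\bhand\s+saniti[sz]er\b",
]

FLAG_RE = re.compile("|".join(SUSPECT_PATTERNS), re.IGNORECASE)

def flag_listing(title: str, description: str) -> bool:
    """Return True if the listing should be queued for manual review."""
    text = f"{title} {description}"
    return FLAG_RE.search(text) is not None

# Example usage
print(flag_listing("FFP2 masks, bulk", "Protect against Sars-CoV-2"))  # True
print(flag_listing("Garden chair", "Barely used"))                     # False
```

Note that a keyword filter like this only routes suspicious ads to human moderators; deciding whether an ad is actually malicious still requires the manual checks described above.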
Right now, we suggest that you tweak your policies and moderation measures daily to stay ahead of the scammers. As the crisis evolves, malicious actors will no doubt continue to find new ways to exploit the situation. As such, it’s vital that you pay extra attention to your moderation efforts over the coming weeks.
If you need help tackling coronavirus-related scams on your platform, get in touch with us.
The biggest challenge facing technology today isn’t adoption, it’s regulation. Innovation is moving at such a rapid pace that the legal and regulatory implications are lagging behind what’s possible.
Artificial Intelligence (AI) is one particularly tricky area for regulators to reach consensus on, as is content moderation.
With the two becoming increasingly crucial to all kinds of businesses – especially to online marketplaces, sharing economy and dating sites – it’s clear that more needs to be done to ensure the safety of users.
But to what extent are regulations stifling progress? Are they justified in doing so? Let’s consider the current situation.
AI + Moderation: A Perfect Pairing
Wherever there’s User Generated Content (UGC), there’s a need to moderate it; whether we’re talking about upholding YouTube censorship or netting catfish on Tinder.
Given the vast amount of content uploaded daily and the volume of usage on a popular platform like eBay, it’s clear that while action needs to be taken, it’s unsustainable to rely on human moderation alone.
Enter AI – but not necessarily as most people know it (we’re still a long way from sapient androids). Where content moderation is concerned, AI mainly involves machine learning algorithms, which platform owners can configure to filter out words, images, and video content that contravene policies, laws, and best practices.
AI not only offers the scale, capacity, and speed needed to moderate huge volumes of content; it also limits the often-cited psychological effects many people suffer from viewing and moderating harmful content.
Understanding The Wider Issue
So what’s the problem? Issues arise when we consider content moderation on a global scale. Laws governing online censorship (and the extent to which they’re enforced) vary significantly between continents, nations, and regions.
What constitutes ‘harmful’, ‘illicit’ or ‘bad taste’ isn’t always as clear cut as one might think. And from a sales perspective, items that are illegal in one nation aren’t always illegal in another. A lot needs to be taken into account.
But what about the role of AI? What objections could there be for software that’s able to provide huge economies of scale, operational efficiency, and protect people from harm – both users and moderators?
The broader context of AI as a technology needs to be better understood. AI itself presents several key ethical questions over its use and deployment, and attitudes towards it vary country by country, much like efforts to regulate content moderation.
To understand this better, we need to look at ways in which the different nations are addressing the challenges of digitalisation – and what their attitudes are towards both online moderation and AI.
The EU: Apply Pressure To Platforms
As an individual region, the EU is arguably leading the global debate on online safety. However, the European Commission continues to voice concerns over the (lack of) effort made by large technology platforms to prevent the spread of offensive and misleading content.
Following the introduction of its Code Of Practice on Disinformation in 2018, numerous high profile tech companies – including Google, Facebook, Twitter, Microsoft and Mozilla – voluntarily provided the Commission with self-assessment reports in early 2019.
These reports document the policies and processes these organisations have undertaken to prevent the spread of harmful content and fake news online.
While a thorough analysis is currently underway (with findings to be reported in 2020), initial responses show significant dissatisfaction relating to the progress being made – and with the fact that no additional tech companies have signed up to the initiative.
AI In The EU
In short, expectations continue to be very high – as evidenced by the European Parliament’s vote (covered in a previous blog) to give online businesses one hour to remove terrorist-related content.
Given the immediacy, frequency, and scale that these regulations require, it’s clear that AI has a critical and central role to play in meeting these moderation demands. But, as an emerging technology itself, the regulations around AI are still being formalised in Europe.
However, the proposed Digital Services Act (set to replace the now outdated eCommerce Directive) goes a long way to address issues relating to online marketplaces and classified sites – and AI is given significant consideration as part of these efforts.
Last year the EU published its guidelines on ethics in Artificial Intelligence, citing a ‘human-centric approach’ as one of its key concerns, as it deems that ‘AI poses risks to the right to personal data protection and privacy’, as well as a ‘risk of discrimination when algorithms are used for purposes such as to profile people’.
While these developments are promising, in that they demonstrate the depth and seriousness with which the EU is tackling these issues, problems will no doubt arise when adoption and enforcement by 27 different member states are required.
Britain Online Post-Brexit
One nation that no longer needs to participate in EU-centric discussions is the UK – following its departure in January this year. However, rather than deviate from regulation, Britain’s stance on online safety continues to set a high bar.
An ‘Online Harms’ whitepaper produced last year (pre-Brexit) sets out Britain’s ambition to be ‘the safest place in the world to go online’ and proposes a revised system of accountability that moves beyond self-regulation, including the establishment of a new independent regulator.
Included in this is a commitment to uphold GDPR and Data Protection laws – including a promise to ‘inspect’ AI and penalise those who exploit data security. The whitepaper also acknowledges the ‘complex, fast-moving and far-reaching ethical and economic issues that cannot be addressed by data-protection laws alone’.
To this end, a Centre for Data Ethics and Innovation has been established in the UK – complete with a two-year strategy setting out its aims and ambitions, which largely involves cross-industry collaboration, greater transparency, and continuous governance.
Numerous other countries – from Canada to Australia – have expressed a formal commitment to addressing the challenges facing AI, data protection, and content moderation. However, on a broader international level, the Organisation for Economic Co-operation and Development (OECD) has established some well-respected Principles on Artificial Intelligence.
Set out in May 2019 as five simple tenets designed to encourage successful ‘stewardship’ of AI, these principles have since been adopted by the G20 in their stance on AI.
They are defined as:
- AI should benefit people and the planet by driving inclusive growth, sustainable development and well-being.
- AI systems should be designed in a way that respects the rule of law, human rights, democratic values and diversity, and they should include appropriate safeguards – for example, enabling human intervention where necessary – to ensure a fair and just society.
- There should be transparency and responsible disclosure around AI systems to ensure that people understand AI-based outcomes and can challenge them.
- AI systems must function in a robust, secure and safe way throughout their life cycles and potential risks should be continually assessed and managed.
- Organisations and individuals developing, deploying or operating AI systems should be held accountable for their proper functioning in line with the above principles.
While not legally binding, the hope is that the level of influence and reach these principles have on a global scale will eventually encourage wider adoption. However, given the myriad cultural and legal differences the tech sector faces, international standardisation remains a massive challenge.
The Right Approach – Hurt By Overt Complexity
All things considered, while the right strategic measures are no doubt in place for the most part – helping perpetuate discussion around the key issues – the effectiveness of many of these regulations largely remains to be seen.
Outwardly, many nations seem to share the same top-line attitudes towards AI and content moderation – and their necessity in reducing harmful content. However, applying policies from specific countries to global content is challenging and adds to the overall complexity, as content may be created in one country and viewed in another.
This is why machine learning is so critical in moderation – algorithms can be trained to do the hard work at scale. But it seems the biggest stumbling block in all of this is a lack of clarity around what artificial intelligence truly is.
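To make the idea concrete, here’s a toy sketch of the kind of statistical text classification that underpins automated moderation – a minimal naive Bayes filter in Python. The training listings and labels below are invented purely for illustration; production systems learn from millions of moderated items and far richer signals than bag-of-words.

```python
import math
from collections import Counter, defaultdict

def tokenize(text):
    return text.lower().split()

def train(examples):
    """examples: list of (text, label) pairs. Returns per-label word counts
    and label frequencies."""
    counts = defaultdict(Counter)
    labels = Counter()
    for text, label in examples:
        labels[label] += 1
        counts[label].update(tokenize(text))
    return counts, labels

def classify(text, counts, labels):
    """Multinomial naive Bayes with add-one smoothing."""
    vocab = {word for c in counts.values() for word in c}
    best_label, best_score = None, float("-inf")
    for label in labels:
        total = sum(counts[label].values())
        score = math.log(labels[label] / sum(labels.values()))
        for word in tokenize(text):
            score += math.log((counts[label][word] + 1) / (total + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Invented training data – real systems learn from millions of moderated listings.
training = [
    ("brand new iphone pickup in person", "ok"),
    ("vintage sofa good condition collection only", "ok"),
    ("wire transfer western union urgent payment", "scam"),
    ("send payment upfront via western union", "scam"),
]
counts, labels = train(training)
print(classify("urgent western union transfer", counts, labels))  # scam
```

The point of the sketch is scale: once trained, a model like this scores every new listing in microseconds, flagging likely scams for human review rather than replacing it.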
As one piece of Ofcom research notes, there’s a need to develop ‘explainable systems’ as so few people (except for computer scientists) can legitimately grasp the complexities of these technologies.
The problem posed in this research is that some aspects of AI – namely neural networks which are designed to replicate how the human brain learns – are so advanced that even the AI developers who create them cannot understand how or why the algorithm outputs what it does.
While machine learning moderation doesn’t delve as far into the ‘unknowable’ as neural networks, it’s clear to see why discussions around regulation persist at great length.
But, as is the case with most technologies themselves, staying ahead of the curve from a regulatory and commercial standpoint is a continuous improvement process. That’s something that won’t change anytime soon.
New laws and legislation can be hard to navigate. Besedo helps businesses like yours get everything in place quickly and efficiently to adhere to new legislation. Get in touch with us!
Scammers are unrelenting. And smart. They’re active all year round, which means there’s no particular season when online marketplace and classified site owners need to be extra vigilant – the pressure’s always on to maintain user safety.
However, scammers know when and how to tailor their activities to maximise opportunities. That’s why they’ll often latch onto different events, trends, seasons, sales, and other activities throughout the year – using a variety of techniques to lure in users, under the guise of an offer or piece of information.
With so much going on in 2020 – from the Tokyo Olympics to the US election – scammers will almost certainly be more active than usual. Here’s what consumers and marketplaces need to be aware of this year.
If you want to learn more about the specific scam spikes, visit our scam awareness calendar where we predict spikes on a month-by-month basis.
When the nights draw in and temperatures drop, many begin to dream of sunnier climes and set about searching for their next holiday.
But whether it’s a summer booking or winter getaway, price is always an issue. Cue thousands of holiday comparison sites, booking portals, and savings sites. While many of these are legitimate outfits, often the convoluted online booking experience – as a consequence of using aggregation sites – can confuse would-be travellers.
They’re right to be cautious. As with buying any other goods or services online, even the most reputable travel sites can fall victim to fraud – with scammers advertising cheap flights, luxury lodgings at two-star prices, and ‘free’ trips (which lure victims into attending a pressured timeshare sales pitch).
If in doubt, customers should always book through the best-known travel sites and pay using their verified portal (rather than via a link sent by email or a direct bank transfer) to ensure that the company they actually pay for their holiday is accredited by an industry body (such as ATOL in the UK).
From Valentine’s Day to Easter; Halloween to Hanukkah – seasonal scams return with perennial menace year-after-year. Designed to capitalise on themed web searches and impulse purchases, fraudsters play the same old tricks – and consumers keep falling for them.
Charity scams tend to materialise around gift-focused holidays, like Thanksgiving in the US, as well as at Christmas. Anyone can fall victim to them – take the recent case of NFL player Kyle Rudolph, who gave away his gloves after a high-scoring game for what he thought was a charity auction, only to discover they were being sold on eBay a few days later.
Another popular seasonal scam is the phishing email offering limited-time discounts from well-known retailers. Then there are romance scams, in which catfishers are prepared to cultivate entire relationships online simply to extract money from their victims.
The general rule with any of these is to be wary of anyone offering something that seems too good to be true – whether it’s a 75% off discount or unconditional love. Scammers prey on the vulnerable.
A whole summer of soccer is scheduled for June and July this year – thanks to the upcoming UEFA European Football Championship (Euro 2020) and the Copa America, both of which will run at the same time on opposite sides of the world.
You’d expect fake tournament tickets and counterfeit merchandise to be par for the course where events like these are concerned – and easily detectable. But the reality is that many fraudulent third-party sites are so convincing that buyers keep falling for the same scams seen in previous years.
If in doubt, customers should always purchase from official websites — such as UEFA online and Copa America. While Euro 2020 tickets are sold out for now (over 19 million people applied for tickets), they’ll become available to buy again in April for those whose teams qualified during the playoffs.
While third-party sites are the biggest culprits, marketplace owners should be extra vigilant where users are offering surplus or cheap tickets to any games at all. Although given the prices at which the tickets sell, you’d be forgiven for thinking that the real scammers are the official vendors themselves.
The Summer Olympic Games is no stranger to scandals – of the sporting variety. In the same way as the soccer tournaments referenced above, fake tickets tend to surface in the run-up to the games themselves – on ‘pop-up’ sites as well as marketplaces.
Telltale signs of a scam include vendors asking to be paid in cryptocurrencies (such as Bitcoin), official-sounding domain names (that are far from official), as well as phishing emails, malware, and ransomware – all designed by scammers looking to cash in on the surrounding media hype and immediate public interest that high-profile events bring.
In addition to scams preceding the games, advice issued prior to the 2016 Rio Olympics urged visitors to be wary of free public WiFi – at venues, hotels, cafes, and restaurants – and to take other online security precautions, such as using a Virtual Private Network (VPN) alongside antivirus software.
Lessons learned from the 2018 Winter Olympics in Pyeongchang shouldn’t be ignored either. Remember the ‘Olympic Destroyer’ cyberattack that shut down the event’s entire IT infrastructure during the opening ceremony? There was little anyone could do to prevent it (so advanced was the attack and so slick its coordination), but it raised a lot of questions around cybersecurity generally – which have no doubt informed best practice elsewhere.
Also, visitors should avoid downloading unofficial apps or opening emails relating to Olympics information – unless they’re from an official news outlet, such as NBC, the BBC, or the Olympic Committee itself.
Probing Political Powers
While those in the public eye may seem to be the most at risk, ordinary citizens are too. We have Facebook and Cambridge Analytica to thank for that.
Despite this high-profile case, and even though political parties must abide by campaigning rules and data security laws – such as GDPR – exist to protect our data, it seems more work needs to be done by social media companies and governments alike.
But what can people do? There are ways to limit the reach political parties have, such as opting out of practices like micro-targeting and being more stringent with social media privacy settings. Beyond that, good old-fashioned caution and data hygiene are encouraged.
To help spread this message, marketplaces and classified sites should continue to remind users to change their passwords routinely, exercise caution when dealing with strangers, and advocate not sharing personal data off-platform with other users – regardless of their assumed intent.
Sale Of The Century?
From Black Friday to the New Year Sales – the end of one year and the early part of the next is a time when brands of all kinds slash the prices of excess stock – clearing inventory or paving the way for the coming season’s collection. It’s also a time when scammers prey upon online shoppers’ frenzied search for a bargain or last-minute gift purchase.
As we’ve talked about in previous blogs, scammers operate in online marketplaces with ever more creative techniques – from posting multiple listings for the same items and changing their IP addresses, to advertising typically expensive items at low prices to dupe those looking to save.
Prioritising Content Moderation
The worrying truth is that scammers are becoming increasingly sophisticated with the techniques they use. For online marketplace owners, not addressing these problems can directly impact their site’s credibility, user experience, safety, and the amount of trust that their users have for their service.
Most marketplaces are only too well aware of all of these issues, and many are doing a great deal to inform customers of what to look out for and how to conduct more secure transactions online.
However, actions speak louder than words – which is why many are now actively exploring content moderation, using dedicated expert teams and machine-learning AI; the latter adds particular value for larger marketplaces.
Keeping customers informed around significant events and holidays – like those set out above – ensures that marketplaces are seen as transparent and active in combating fraud online.
This also paints sites in a favourable light when it comes to attracting new users, who may stumble upon a new listing in their search for seasonal goods and services.
Ultimately, the more a site does to keep its users safe, the more trustworthy it’ll be seen as.
Online marketplaces are built on the connection between buyers and sellers and the ability to bring the two sides together to transact. However, many users stumble upon poor content or idle buyers or sellers.
In a survey conducted by Besedo, we found that 50% of online marketplace users encountered content they believed to be a scam. On top of that, 75% of them said they’d never return to the site.
Ensuring a smooth, safe and scam-free experience on your online marketplace is a must if you wish to stay ahead of fierce competition.
What if your buyers and sellers only encountered high-quality users and content on your online marketplace? How would that impact your overall conversion and retention rates?
A significant first step is to qualify your sellers – ensuring the high quality of both the sellers and their goods on your marketplace.
In this complimentary white paper, we share advice that will help you build a solid setup for qualifying users and show how this process can positively impact your retention and conversion rates.
You’d be forgiven for thinking that ensuring users aren’t subjected to bad content on dating sites and online marketplaces means waging war on trolling, nudity, and unsavoury content.
Sure, that’s a large part of it, but the fact is bad content has a broader meaning – it’s anything designed to harm or deceive users: images that can negatively impact their user experience, break their trust, or even – worst-case scenario – put them at risk of theft or abuse.
As a result, marketplaces and dating site owners need to ensure they’re aware of the potential outcomes bad images pose. Let’s take a look at the most common types of bad images and consider the impact on both dating app and marketplace users.
The trouble with watermarks is that they often look bad. Sure, they can be positioned more subtly on an image, but overall they detract from the image’s focus. Still, they’re used by many vendors – often to avoid paying seller fees to use an online marketplace.
For example, someone selling a high-priced item (or numerous items of the same price, like a TV, computer, tablet, or phone) may try to circumvent site policies by including their email address, website URL, or phone/WhatsApp number in the watermark itself.
The likelihood is, however, that those who try to lure users away are scammers – compromising user safety (and user trust too) as they’re directed away from legitimate marketplaces.
On marketplaces, watermarks on profile pictures are mainly used in the same way as on product photos. On dating sites, however, their use is much more frequent and, more often than not, promotes escort services and prostitution. Watermarks are used similarly in 1-to-1 chats – to send contact information in a way that can’t be detected by text filters.
eBay banned watermarks a couple of years ago, initially stating they would monitor pictures – before quickly reneging on this to simply condemn and discourage watermarks rather than police the site itself.
Presumably, this change of heart was prompted by the seller community – or more specifically, the image creators. From a photographer or designer’s point of view, the argument for watermarks is to prevent the misuse of their work and to preserve copyright. And while watermarks don’t stop images from being copied, creators can use services like Google Image Search and TinEye to monitor misuse.
Instead of watermarks, an alternative is only to provide low-resolution images – particularly where product photography’s concerned. Another way is to put copyright information in image metadata.
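To illustrate the metadata route, here’s a stdlib-only Python sketch that splices a ‘Copyright’ text chunk into a PNG file. The function names are our own, and real-world workflows would more likely lean on a dedicated tool or image library – but the mechanics of a PNG tEXt chunk really are this simple.

```python
import struct
import zlib

def png_text_chunk(keyword, value):
    """Build a PNG tEXt chunk (keyword, NUL separator, text) with its CRC."""
    data = keyword.encode("latin-1") + b"\x00" + value.encode("latin-1")
    chunk_type = b"tEXt"
    crc = zlib.crc32(chunk_type + data) & 0xFFFFFFFF
    return struct.pack(">I", len(data)) + chunk_type + data + struct.pack(">I", crc)

def add_copyright(png_bytes, notice):
    """Splice a 'Copyright' tEXt chunk in right after the IHDR chunk
    (8-byte signature, then IHDR's length/type/data/CRC)."""
    ihdr_len = struct.unpack(">I", png_bytes[8:12])[0]
    ihdr_end = 8 + 4 + 4 + ihdr_len + 4
    return png_bytes[:ihdr_end] + png_text_chunk("Copyright", notice) + png_bytes[ihdr_end:]
```

PNG viewers ignore text chunks they don’t recognise, so the image renders unchanged while the notice travels with the file. For JPEGs, the equivalent would be writing EXIF/IPTC fields, which is best left to an established library.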
Ultimately, watermarks can make images look clumsy and inauthentic – even though they’re designed to make them look more ‘official’. From a user experience perspective, they disrupt the overall image, masking the complete picture. They also hurt the marketplaces and dating sites themselves: by luring users away from the platform (via embedded contact information), they cut out associated fees – and if everyone did that, there’d be a major problem.
An ongoing problem for many marketplaces, duplicate listings are something of a grey area in terms of their legitimacy. While many vendors mean no malice – other than to double their chances of a decent sale – the wider impact is that duplicate ads and images degrade the browsing and search experience.
But their troublesome nature doesn’t end there, unfortunately. The use of duplicate images is a tactic long practised by scammers – appearing as product images on online marketplaces and as profile pictures on dating sites.
Duplicate images are commonly used by those selling counterfeit items – often featuring ‘real’ product images taken from genuine sources and repurposed on public marketplaces. Similarly, dating sites are all too aware of ‘catfishing’ and of profiles using images taken from stock photo libraries or even modelling agencies. It’s not just false advertising that users need to be aware of – but romance scammers too.
While duplicate listing scanners exist, most of these are text-based. More sophisticated image detectors do exist, and big marketplaces offer their own detection algorithms too. But many of these are still relatively rudimentary and open to misuse (case in point: Facebook Marketplace which is notoriously easy to ‘hack’).
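To illustrate the principle behind image-based duplicate detection, here’s a toy average-hash (‘aHash’) sketch in Python. Real systems (such as the ImageHash library) first resize the image to an 8x8 grayscale thumbnail before hashing; the tiny 2x2 ‘images’ below are purely illustrative.

```python
def average_hash(pixels):
    """Toy average hash: 1 where a pixel is brighter than the mean, else 0.
    (Real implementations first resize the image to 8x8 grayscale.)"""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    return [1 if p > avg else 0 for p in flat]

def hamming(h1, h2):
    """Number of differing bits – small distances suggest near-duplicates."""
    return sum(a != b for a, b in zip(h1, h2))

# Two near-identical "images" (e.g. the same listing photo re-uploaded
# with slight compression noise) and one genuinely different image.
img_a = [[10, 200], [200, 10]]
img_b = [[12, 198], [199, 11]]   # near-duplicate of img_a
img_c = [[200, 10], [10, 200]]   # different image

print(hamming(average_hash(img_a), average_hash(img_b)))  # 0 -> likely duplicate
print(hamming(average_hash(img_a), average_hash(img_c)))  # 4 -> distinct
```

Because the hash survives small edits (recompression, slight crops, minor colour shifts), it catches re-uploads that an exact byte-for-byte comparison would miss – which is exactly why it’s harder to evade than text-based duplicate scanners.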
As a result, many marketplaces are offering greater clarity in terms of how they define duplicates and won’t allow vendors to list the same item in different categories. eBay sternly warns that offenders will see a loss of visibility for their listings. After all, their site’s trust is at stake – which ultimately leads to fewer conversions.
In a similar way to watermarks, duplicate images can also ruin the user experience – confusing customers as to which is the ‘real’ or original listing of the product they’re considering buying. And on dating sites, well, there can’t be more than one person with the same profile picture, can there? Honesty counts for a great deal – and people can look very different from one picture to the next – so including a variety of different images is essential for this reason too.
Love or hate the idea, facial recognition technology is growing in sophistication. It’s already used in security tech – for everything from unlocking phones to crossing borders. But while Facebook might be making leaps and strides in facial recognition, on marketplaces and dating sites images of faces remain problematic from a content perspective.
As we’ve discussed already, where images of people – especially faces – are concerned, honesty is always the best policy. On dating sites, in particular, users often use images that make them look more attractive – often using different filters to enhance their appearance.
However, when there are a lot of people in a photograph, it’s often hard to tell who the profile owner is. This has obvious complications on dating sites – where users could easily be misled. They could begin contact with one person thinking they’re another – something that could be disastrous for both the user and the dating site, because misconceptions break the bond of trust.
Couple this with the proliferation of deepfakes and face/profile image searches, and the problem gains another, more complex layer – there’s not just a threat to a user’s experience; their safety is at risk too.
In online marketplaces, this isn’t as big a problem – except that people’s faces distract from the product itself, so vendors should include as few of them as possible in product photography, or none at all if they can help it.
Wherever users can upload their own content, there can be no denying that pornography, nudity, and sex-related images will appear – in both online marketplaces and dating sites.
Where affairs of the heart (or libido) are concerned, consenting adults are free to share pictures of whatever body parts they like best. But for the most part – on public forums and in private chats – such content is unwanted. And when that’s the case, it’s user harassment.
Harassment (of the pictorial and verbal variety) has become entrenched in dating app culture, largely as a result of male behaviour toward women (check the Instagram account ‘ByeFelipe’ for some prime examples). Efforts to get rid of it have spawned a whole new wave of female-initiated dating services, such as Bumble.
However, even this doesn’t prevent lewd images from being shared, which is why additional safeguards are needed. Bumble’s Private Detector, for instance, detects nudity, blurs it, and warns users that an incoming picture or video message may be pornographic before it lands in their chat feed.
Anything nudity related is naturally more common on dating sites than marketplaces, but that doesn’t preclude them. Profile photos can often be revealing (which may or may not be ok depending on the site) and of course, as mentioned above, ‘escort’ services may advertise using images that push the boundaries.
The effect? Failing to keep users safe from overtly sexual images is a big problem. As mentioned before, it breaks the trust established between user and site. Unsolicited nudity may now be frequent on dating sites, but that doesn’t make it acceptable. And where online marketplaces are concerned, user-generated content that contains nudity damages the site’s reputation.
However, it’s also essential to maintain a balanced view and offer a specific definition of what constitutes nudity on your own site – which might vary depending on the nature of your website.
Picture Of Success
All in all, you’re never going to be able to completely stop your users from encountering bad content. When users innocently browse a marketplace or look at dating profiles, there’s no guarantee that the images they see will be legitimate, tasteful, or even legal.
What you can do, though, as a site owner is to ensure your site offers the right policies, definitions, and appropriate courses of action. Moderation is crucial to avoid the proliferation of bad images on your site. But it’s no easy task when it relies on user-generated content.
That’s why online content moderation tools are critical to helping online marketplaces and dating sites detect unwanted images and remove them instantly. At Besedo, we combine AI image moderation with human moderation to efficiently tackle the propagation of inappropriate or undesirable images you don’t want on your site.
As any app (or online service) provider knows – in their quest to hit an all-important network effect – it’s not just downloads and user numbers that indicate success. Revenue generation is ultimately needed to ensure longevity.
Dating apps have established some of the most forward-thinking and innovative monetization methods in technology today. But finding a perfectly matching monetization strategy for your app or dating site means adopting a method that reflects its content, style, and user experience.
Luckily, there are lots of tried-and-true monetization strategies out there already. Although they broadly fall into two major categories – the user pays, or a third party pays – there are many different variations.
Here are some ways dating site owners can monetize their operations or improve their current strategy.
Advertising: Great When There’s Scale
Allowing other brands to advertise on your site has been part of the online world since the first sites went up. A natural extension of the broadcast media commercial model, passing the cost onto third party advertisers allows dating sites and apps to offer services for free: albeit for the price of the user’s attention.
Ad formats themselves come in all shapes and sizes – from simple PPC text ads to full-page interstitials, as well as native ads (more consistent with a site’s overall inventory), in-line carousel ads, in-feed content stream ads; among many others. Revenue is either generated via clicks, views, or transactions.
Notably, dating apps offer higher click-through rates and eCPMs (effective cost per thousand impressions) than games or other types of apps. Despite this, brands still need to work hard to make an impact, as consumers have grown weary of – and resistant to – digital advertising.
Where online dating apps and sites are concerned, third-party commercial affiliations range from the sublime to the ridiculous. Some pairings – like Tinder’s Domino’s and Bud Light partnerships – might appear odd at first but, considering the importance of food and drink in the dating/socializing scene, actually make perfect sense.
From a business perspective, campaigns like these are a testament to a dating app’s ability to engage certain demographics (usually millennials) at scale; demonstrating the pulling power of a specific dating platform.
However, it’s not necessarily a technique that can be relied on to monetize a digital dating service from its very inception. Other methods are much more effective at doing that – often by selling features and benefits directly. But this means pushing the cost back onto the user.
Subscriptions: Luring Users Behind The Paywall
Subscriptions ain’t what they used to be. Consumers are a lot more reluctant to part with their cash if they can’t see a genuine benefit to the service they’re paying for from the very outset.
For some, a better user experience is enough to sway them to part with a little cash each month. For others, however, given that so many ‘free’ dating apps exist (admittedly of varying quality), unless they can clearly see what they’ll be getting for their money, they’ll take their chances elsewhere.
To overcome this, dating sites and apps offer varying degrees of ‘membership’ which can seem a little muddled to the uninitiated. So let’s consider the main contenders.
Firstly, there’s the ‘free and paid app versions’ model, in which the free version has limited functionality and the user must upgrade to benefit fully. Stalwarts like OkCupid and Plenty of Fish were among the first pioneers here – but many others champion this model too, including EliteSingles, Jaumo, Zoosk, Grindr and HowAboutDating – offering monthly and annual subscriptions.
The ‘Freemium’ model offers a similar experience – providing basic functionality for free – such as finding and communicating with others. However, other perks are available for an additional cost.
Badoo’s ‘Superpowers’ feature is probably the best known: it lets users see who ‘liked’ them and added them to their favorites, gives access to invisible mode, highlights their messages – and removes ads. In fact, the popularity of Tinder’s ‘Rewind’ feature (taking back your swipe) led the company to start charging for it via its Tinder Plus and Tinder Gold packages. Bumble Boost, Hinge’s Preferred Membership, and Happn’s Premium are other scope-widening freemium services worth mentioning too.
A slight variation is the ‘free app with in-app purchases’ model. In addition to greater functionality – like a broader search radius and more daily matches – users can buy virtual and actual gifts and services. For example, Plenty Of Fish lets users buy in-app ‘Goldfish’ credits to send virtual gifts to their potential dates – essentially a small folly to break the ice.
However, those that don’t want to pay, but are keen to test a few additional dating app perks, can often complete in-app tasks for limited-time access to premium accounts. Users are usually presented with an ‘offerwall’ detailing tasks to complete and the rewards to be reaped. MeetMe’s rewarded videos are a great example of this, as are rewarded surveys, which seem to be becoming increasingly common – and were trialed by dating app Matcher (now Dangle) a while back.
Activities like these point to dating sites’ key asset: their audience data. Given that 15% of Americans use dating services and that the average user dedicates around 8 minutes to every session, the opportunity is real for those that achieve a certain scale.
But you can’t just sell data – can you?
Data Monetization: Insights For Sale
The sale of user data is a big no-no when specific information is involved (remember Cambridge Analytica?). But when the user grants consent and the data remains anonymous, well, that’s a different story.
Companies operating in EU countries need to abide by GDPR or risk severe penalties, and other international data security initiatives, such as the EU-US Privacy Shield Framework, are held in high esteem. So how can dating sites use their rich data sources as a revenue generation tool?
The only kind of data that can be sold is non-personal data – with a user’s consent. Even then, the type of data source is restricted to basic parameters: device types, mobile operator, country, screen size – among others.
The good news is that there’s significant demand for all of this data – from market researchers across many different sectors for a range of purposes; including optimizing user experiences and understanding buying choices.
On another positive note, according to one research survey, 95% of respondents are content to use apps that collect anonymous usage statistics.
However, unless your dating app has more than 50,000 daily active users, it won’t offer a large enough pool to draw from – and it will prove difficult to find a buyer.
Which Monetization Strategy Works Best?
All things considered, as with many types of online businesses, the greater the combination of monetization methods, the more profit there is to be had. Perhaps that can explain Tinder’s phenomenal global success.
But in isolation, each method has its drawbacks. Advertising only reaps a reward when a service offers scale; otherwise, where’s the value for brands? Conversely, charging users for a new service can be tricky to justify – unless the cost unlocks some additional never-seen-before feature. And without scale, charging marketers for data insights is pretty much impossible.
What is crucial, however, from the very outset, is that dating platforms establish a strong, dedicated user base. This means doubling down on trust and user safety, and finding ways to keep users engaged.
Despite the many positive things about dating sites, for some, the negative connotations persist. And while sites and apps are a lot more conscious of preventing harm, as with any platform that relies on user-generated content, the risk of users being catfished, shown inappropriate content, or defrauded is ever-present.
However, there is a lot that digital dating platforms can do to build trust and boost conversions. Content moderation is just one area – but it’s one that no dating service looking to expand its user base can ignore.
Ultimately, there’s no substitute for getting the service right: knowing your users’ wants and needs (and there are many different dating services!) and developing a safe, secure and engaging environment for them to interact in. With these established, and once active usage hits a critical mass, monetization becomes a natural next step.
Just like any other business, online marketplaces continuously look for new and improved revenue streams to boost their growth. We caught up with marketplace optimization and growth specialist and founder of marketplaceplaybook.com, Bec Faye, to hear her take on how marketplaces can monetize their platforms efficiently.
In the interview, we speak about the importance of considering your users’ behavior and experiences when deciding on your monetization strategy, and we explore a successful disruptive monetization strategy that will inspire your inner creativity.
Watch the Interview:
Want to read the interview instead?
Emil: Hi, everyone, I’m here with Bec Faye, Marketplace Optimization & Growth Specialist, who is running the marketplaceplaybook.com. Bec, would you like to introduce yourself?
Bec: Hi. It’s really great to be here. As you explained, I specialize in helping marketplaces really optimize their growth, working from a UX angle and looking at conversion optimization. I work with a lot of different marketplaces of all different shapes, sizes and stages. I’m really looking forward to having a chat today.
Emil: It’s really exciting to have you on. If you haven’t seen before, me and Bec did a webinar as well together around UX design for online marketplaces, which was really valuable. So I’m super happy to have you on board for this interview, as well, around monetization.
Bec: Really great to be back.
Emil: Awesome. Let’s jump straight into it. If we think about monetization strategies for marketplaces, there are many things that you need to consider. But if we can just start with what is monetization for marketplaces, how would you define it?
Bec: I have a couple of thoughts on monetization. Part of it is obviously how we actually make money from a marketplace – how we create this piece of technology that’s going to fulfil its purpose, but in a way that can sustain it as a business, or not-for-profit, or whatever shape it happens to be in. But I also think it’s an interesting area where we tend to focus in on specific trends in the way people do monetization in marketplaces. We see a lot of trends of people going for funding and raising capital, and the marketplace being very much dependent on that. One of the things I’m really passionate about exploring over the next two years is trying to figure out how we make marketplaces profitable without relying on investment quite so much. That’s one of the areas I’m quite passionate about. But in a nutshell, monetization is really about how do we actually make money? How do we make sure that we’re making something worthwhile that can be sustainable?
Emil: Yes, and if you’re a young marketplace, to get investors on board there needs to be some plan for how they can make their money back. If you can’t monetize your platform in the right way, then the investments won’t happen to begin with. So for online marketplaces, looking at the different mindsets and strategies to implement, there are so many alternative business models out there: commission-based selling fees, advertising, subscriptions. But what would you say, Bec, is important to consider when deciding on a suitable monetization strategy?
Bec: It’s a really good question. I think, no matter what stage your marketplace is at, it always comes back to really understanding your core user group: the value that you’re offering to that user group, what paying for it is solving for them, and what you are providing for them. Then you try to really distil that value offering and base your monetization around it. That’s going to make it a lot easier to justify, and it also means that people are willing to pay for it. Beyond that, it depends on which side of the market we end up charging and the different structures we might use. But for me, it always comes back to ensuring that the monetization sits where we’re providing that value. One of the exercises I get a lot of early-stage marketplaces to do, to really prove that the marketplace monetization works, is trying to create five transactions successfully without actually using the marketplace platform.
So, do it manually, off-site. If they can create those five transactions manually, they prove they can create these transactions without the technology, and that people are willing to pay at that point in time. You also get a lot of really valuable lessons along that journey that allow you to iterate on the monetization strategy without having to build out the marketplace or worry about technology and things like that. But that’s obviously for early-stage marketplaces. For later-stage marketplaces, I think it’s really a matter of figuring out why they’re looking at monetization strategies. Is it that they’re looking to add new revenue streams to the business? If that’s the case, then again it goes back to the core users: what value can we offer them, and is there an additional way we can add revenue to help the business? I think we touched in the webinar on how tricky it can be to transform an established marketplace and change its monetization strategy. But if you find ways to add new revenue streams into the business, that can be a way of exploring it.
Emil: You’re right. In the recent webinar that we did on marketplace monetization, we spoke about mature marketplaces and well-established global players, and how it can be very difficult for them to change their business model, so they instead look at becoming more of an investor and adding revenue streams that way. Just out of curiosity: do regions and geographies really matter when you look into integrating a new strategy?
Bec: I think it always does, because different regions are always going to have different intricacies about them, whether it’s the currency, the exchange rate, or the culture and the way different aspects are perceived in that particular culture. So I think it definitely comes into play. That’s where really getting to know your users comes in: understanding what drives them, what they’re comfortable with, and where they would see the value in your platform.
Emil: That’s a good point. You should always look at user behaviour: what is the user base that you have, what is your marketplace, who is the target audience, who is using your platform, and how are they using it? Look into those different aspects and you will find the different opportunities. I think monetization strategies like premium listings really come from that need of people wanting to be able to sponsor their content, and those kinds of things.
Bec: Exactly. I’m a true believer that your end users are going to tell you what you need to be doing in your business. If you are not listening to them, you are making a whole heap of assumptions. What I always say to anybody I work with is: always get back to who your users are and listen to them, because they are going to tell you the answers.
Emil: Super. If we look at monetization, things seem to trend from year to year. There’s constant development in the marketplace industry, moving from classifieds to offering more: the full payment solution and the entire offering. So my question is: what are the trending monetization strategies that work in 2019?
Bec: As you just touched on, one of the big trends we’re really seeing is the larger marketplaces, those traditional classifieds spaces, trying to get closer and closer to the transaction. In the early days of online classifieds, it was very much about the listing, and a lot of the transaction happened off the technology, in the real world. Now they are going back to that customer journey, really understanding what the need is that they’re solving for their users, for their customers.
For example, in real estate, the original classifieds might just be selling a house. But if we take a step back from that, there’s a whole lot of stuff that happens in amongst it: the need to get financing, getting the phone and internet and all of those kinds of things connected, and a whole bunch of other things that need to happen around that part of the customer’s life. So it’s really about looking at what that journey looks like and then figuring out where they can add more value, again adding new revenue streams into it. And it’s something we’re seeing time and time again. Another good example of this trend is Airbnb. They realized that a lot of people were using the platform for work, so they started to explore that, and Airbnb for Work was born: another income stream for them. So to me, it really feels like everybody is coming back to this journey, figuring out how to get closer to the transaction, but also how to add more value along the whole journey.
Emil: It’s really key, mainly for larger, mature players, to have that opportunity to satisfy their users by adding these value-added services. It could be anything from insurance to delivery and payment solutions. By offering that complete solution, you not only improve the user experience, you also actually create more revenue streams for yourself. So it’s a win-win in that sense to move in that direction. Out of curiosity, are there any traditional monetization strategies that don’t seem to work anymore or are fading out?
Bec: It’s probably less that they’re fading out, but one of the challenges I’ve seen marketplaces really struggle with, particularly in the early stage (and I work with a lot of early-stage marketplaces), is falling into the trap of being in a low-frequency, low-value marketplace, which means transactions only happen very rarely for a user. For example, one user might only transact on the platform maybe once a year or every six months, and only at a very low value. So if you’re taking a cut as a percentage of the transaction, you really need huge scale in order for that to work. And because they’re so close to the business, it can sometimes be a little hard to see that even though they’re charging that percentage, neither the lifetime value nor the average order value of that customer is very large. It’s just a really challenging space to be in, and scale is really the only option. So this is where I think they need to recognize that that’s the case, and perhaps shift things around and focus on solving a need that might bring in a higher frequency from that particular customer, or a higher value per transaction. It’s a matter of looking at what other income streams, what other revenue streams, they can explore. I think just being mindful of that is definitely something to consider.
Emil: You touched on it a little bit here, that you work with a lot of younger, early-stage marketplaces, so I’m going to jump to another question feeding off of that. Since you speak to a lot of young, early-stage companies, I assume you’ve encountered a couple of really disruptive, out-of-the-box monetization strategies. Anything you want to share?
Bec: I guess one of the really great things about working with early-stage marketplaces is that we can be really inventive and really creative in how we approach monetization. I was working with a client recently who was very concerned about the fact that they were almost in the business of matchmaking. They had users coming in on the supply side and the demand side, and it was very easy for those users to form a relationship with a supplier, and therefore that relationship could technically be taken off the platform. Charging a commission, for example, wasn’t going to work for them. So we needed to take a step back and really understand again what the customer journey was like, looking at the big picture of what the user was trying to achieve in their particular industry. By taking that step backward, we identified that there was actually a big need that needed solving. Rather than being narrowed in by what was done previously, we were able to create a more disruptive model for that particular marketplace, basically earning ten times the revenue they had been earning from the average order value of that customer coming through the platform. Instead of buying a single hour, for example, customers will now be purchasing bulk hours, and we’re looking into more of a membership type of opportunity coming down the track as well. So suddenly we’ve gone from being worried about the supply and demand sides going off-platform and losing the transaction, to actually increasing and multiplying that revenue stream for the marketplace. It’s a matter of combining quite a few different techniques together.
But by really understanding what the needs of that user group were, we were able to identify this new opportunity, which means we’re basically solving the problem from all angles. It’s been a really interesting experiment that we’re still running, but so far the tests have been really positive.
Emil: That’s very cool. I know it can be a big struggle losing out on the actual conversion from your marketplace when you have buyers and sellers, especially the sellers, I would assume, who are incentivized to leave the marketplace. But from a moderation standpoint, this is also where it gets dangerous for the user, because you can’t control the conversation and you can’t protect your users, the buyer in that scenario. If the seller manages to get the conversation off the site, then the chance of a scam increases significantly. So keeping users on your platform is also a way to protect them. It’s very interesting. Another way of preventing that is adding, like we spoke about, value-added services as a monetization strategy. If you add enough value, like OpenTable or Uber, where you can actually make reservations and book, the platform itself becomes so useful for the seller that it brings down the incentive to leave the platform, because it provides services that you need.
Bec: Exactly. That’s a really great technique.
Emil: I think that brings us to the end of today’s interview. Thank you so much, Bec, for taking the time. It’s been really helpful, and I think all the listeners are really happy to hear your tips, tricks and ideas for monetization strategies. If you want to get in touch with Bec, you can reach her by email at firstname.lastname@example.org. If you want to learn more about monetization strategies, check out the webinar we did on September 17th. And finally, if you want to learn more about content moderation and how we at Besedo can help you improve your content quality, and from that side of things boost your revenue generation, don’t hesitate to reach out to me at email@example.com. Thank you, Bec. Thank you very much.
Bec: Thank you for having me. Take care, guys. Bye.
Choosing and implementing the right monetization strategy for your marketplace is a process that can often feel somewhat complicated and challenging.
You wouldn’t think of starting a marketplace without a strong concept or a target audience in mind. The same goes for your monetization strategy. Having no clear monetization plan in place can easily lead to a disaster even if your marketplace has a great unique selling point.
Finding an adequate and suitable strategy is critical for your marketplace to thrive and be profitable. After all, growing a sustainable platform comes down to generating steady revenue.
Yet, many marketplaces continually struggle to find the optimal monetization strategy for their platform, simply because there is no secret formula that fits all marketplaces perfectly. You just need to find what works and what doesn’t for your business.
One piece of advice often shared by experts is to try out different revenue models to find the best option for your platform and value offering. Ultimately, building a lucrative marketplace is a process more than a goal, one which involves continuous monitoring and tweaking of the strategy already in place. Yet, with the deluge of information on monetization strategies available, you might wonder where to start.
The good news is: we have you covered! Check out our latest webinar featuring marketplace experts Anton Koval, Jeroen Arts and Martin Boss, who share their best tips and tricks, current marketplace monetization trends, and their methods for efficiently integrating your monetization strategy.
Alongside the webinar, you’ll also find our handy guide to popular monetization strategies currently used by renowned marketplaces, to help you along your marketplace growth journey.