From controversial presidential social media posts to an increased number of scams relating to the COVID-19 pandemic, 2020 has been a challenging year for everyone – not least content moderation professionals and online marketplace owners.
Let’s take a look back at some of the major industry stories of 2020 (so far – and who knows what December may yet bring…).
After a number of ill-fated decisions regarding content moderation, social media giant Facebook recorded its first fall in profits in five years.
In previous years, the company faced mounting criticism for its data sharing with Cambridge Analytica, as well as its failure to moderate political adverts for false content, and its handling of fake news during the 2016 elections.
However, in 2018, Facebook announced that efforts to toughen its privacy protections and increase content moderation would negatively impact profits – which was in fact the case. In January 2020, Facebook reported that the company had seen a 16% drop in profits across 2019 – despite significant increases in advertising revenue.
By the end of January, Facebook had named British human rights expert Thomas Hughes as the administrative leader of its new oversight board, set up to review user-generated and other content removed from its site. Hughes was quoted as saying: “The job aligns with what I’ve been doing over the last couple of decades – which is promoting the rights of users and freedom of expression”.
The following month, the Ninth Circuit Court of Appeals in San Francisco ruled that YouTube had not breached the First Amendment of the US Constitution when it decided to censor a right-wing channel.
The court ruled that YouTube, the world’s biggest video-sharing platform, is a private company and not a “public forum”, and is therefore not subject to the First Amendment. The US Bill of Rights (1791) declares that the government will not abridge the freedom of speech in law.
However, this guarantee is between the US Government and its people – not private companies (except when they perform a public function) – meaning the ruling could have huge ramifications for future cases of freedom of speech online.
Back in March, we ran an article about protecting users from emerging coronavirus scams. As the pandemic took hold globally and lockdowns were put in place around the world, online scammers deliberately exploited vulnerable individuals. Scammers were charging exorbitant prices – for everything from hand sanitizer to fake medicine – and even offering non-existent loans via online marketplaces and advertising.
European regulators rallied by calling upon digital platforms, social media platforms, and search engines to unite against coronavirus-related fraud.
In an effort to coordinate these efforts, the European Commissioner for Justice and Consumers, Didier Reynders, sent a letter to Facebook, Google, Amazon, and other digital platforms.
Despite these efforts, some third-party merchants managed to find a loophole on Amazon which enabled them to claim that products prevented coronavirus. The scammers managed to evade automated detection by inserting claims into product images. After being contacted by the Washington Post, Amazon removed the product listings.
Another casualty of the global pandemic was the short-term/holiday rentals sector. In April, bookings made through global property rental giant, Airbnb, were down by 85%, with cancellation rates at almost 90% – an estimated cost to the company of $1 billion.
In retail, there were inevitable winners and losers as a result of lockdown. Comscore reported that whilst the increase in remote working prompted rises for the home furnishings and grocery categories, the tickets and events sector understandably plummeted.
Following calls to step up its content moderation, as part of efforts to combat hate speech, Facebook partnered with 60 fact-checking organizations.
According to a company blog: “AI now proactively detects 88.8 percent of the hate speech content we remove, up from 80.2 percent the previous quarter. In the first quarter of 2020, we took action on 9.6 million pieces of content for violating our hate speech policies – an increase of 3.9 million.”
Facebook even took it one step further by targeting hate-speech memes, creating a database of multi-modal examples with which to train future AI moderation software.
Despite the negative impacts of coronavirus and lockdown on multiple sectors, Besedo reported on two e-marketplace success stories this month.
Germany’s number one classifieds site, eBayK, revealed the strategies it employed to reach a record 40 million live ads in the middle of the pandemic. Over in Norway, FINN.no told us how it managed to grow traffic during COVID – simply by supporting its users.
Halfway through the year, content moderation issues were at the forefront of the news again with Facebook forced to remove more than 80 posts by President Trump’s campaign team – which contained imagery linked to Nazism. The inverted red triangle, which was used to identify political prisoners in Nazi death camps, appeared in the posts without context.
In France, proposals for tough reforms on hate speech were reduced to a few moderate measures after a group of 60 senators from Les Républicains mounted a challenge through the French Constitutional Council.
The tougher reforms would have mandated platforms to remove certain types of illegal content within 24 hours of a user flagging it.
Facebook found itself in the spotlight again after a study was published by The Institute For Strategic Dialogue (ISD). The study revealed that Facebook accounts linked to the Islamic State group (ISIS) were exploiting loopholes in content moderation.
Using a variety of tactics, the terrorist group was able to exploit gaps in manual and automated moderation systems – and consequently gain thousands of views. It hacked Facebook accounts and posted tutorial videos, as well as blending in content from news outlets – including real TV news footage and theme music.
Planned raids on other high-profile Facebook pages were also revealed. Facebook removed all of the accounts identified.
Targeting two of China’s biggest apps, President Trump signed special executive orders to stop US businesses from working with TikTok and WeChat – amid fears that the social networking services posed a threat to national security.
The President claimed that parent company, ByteDance, would give the Chinese government access to user data. He gave ByteDance 90 days to sell up (to American stakeholders) or face shutdown.
In August, Microsoft was mooted as the front-runner for the buyout, but eventually dropped out of the race. Although set up as a fun video-sharing platform, TikTok has unwittingly become embroiled in conspiracy theories and hate content. As other platforms have discovered, trying to moderate this content can be exceptionally complicated.
After a summer of lockdown, the world witnessed widespread calls for reforms to regulate online speech. Brookings noted a discernible shift from the protection of innovation to the safeguarding of citizens. Countries such as France, Germany, Brazil, and the US explored options for legislating content moderation.
Also this month, YouTube revealed it was bringing back teams of human moderators after its AI systems were found to be over-censoring, with incorrect takedowns doubling. The company also conceded that AI alone failed to match the accuracy of human moderators.
Over a year after a US data scientist raised concerns about the way in which Instagram handles children’s data, Ireland’s Data Protection Commission (DPC) opened two further investigations, following fears that the contact details of minors were being leaked in exchange for free analytics. It was also revealed that users who changed their account settings to ‘business’ had their contact details revealed.
October also saw eBay launch a sneaker authentication scheme in a bid to tackle counterfeits. Limited edition sneakers can be produced in batches of just a few thousand and sometimes only a dozen, driving up the resale value on online marketplaces.
Unfortunately, this has created a market for counterfeits, with one US Customs and Border Protection operation last year alone yielding over 14,000 fake Nikes. Under the scheme, products must pass through an authentication facility before being passed on to the buyer.
In November, Zoom became the latest online platform to come under fire for its content moderation practices.
This happened after the platform blocked public and politically sensitive events planned for its service – where it felt that users had broken local laws or its rules, which require users “to not break the law, promote violence, display nudity or commit other infractions”.
Zoom was accused of censorship in the debate surrounding Section 230, which gives online companies immunity from legal liability for user-generated content.
While December’s content moderation events are still underway, what’s become clear in recent months is that, given how much we all rely on online platforms – for everything from shopping to study, work, rest, and play – companies of all kinds continue to struggle with moderation.
Continued uncertainty doesn’t help – in fact, it highlights vulnerabilities and loopholes. But at the very least, knowing where potential pitfalls lie enables companies to better protect their users, which ultimately is at the heart of all good content moderation efforts.
This starts with having the right systems and processes in place.
The Christmas season is here, and as the festivities kick off, online retailers hold their breath and wait to see whether all of the preparations they have diligently made will pay off in revenue and sales during this ‘Golden Quarter’. Will the website be able to handle extra demand? Will all orders be shipped before Christmas?
Yet the National Cyber Security Centre (NCSC) has highlighted another pressing concern, one which can have a lasting impact on revenue. Last week it launched a major awareness campaign called Cyber Aware, advising potential customers to be alert to an increase in fraud on online platforms this year. Millions of pounds are stolen from customers through fraud every year – including a loss of £13.5m from November 2019 to the end of January 2020 – according to the National Fraud Intelligence Bureau.
Fraud is a major concern for marketplaces, which are aware of the trust and reputational damage that nefarious characters on their platforms can create. While consumer awareness and education can help, marketplaces know that keeping only one eye on the ball when it comes to fraud, especially within user-generated content (UGC), is not enough. Fraudulent activity deserves full attention and careful monitoring. Tackling fraud is not a one-off activity but a dedication to constant, consistent, rigorous, and quality moderation in which learnings are continuously applied for the ongoing safety of the community.
With that in mind, our certified moderators investigated nearly three thousand listings of popular items on six popular UK online marketplaces, in order to understand whether marketplaces have content moderation pinned down, or, whether fraudulent activity is still slipping through the net. After conducting the analysis during the month of November, including the busy Black Friday and Cyber Monday shopping weekend, we found that:
· 15% of items reviewed showed signs of being fraudulent or dangerous; this rose to 19% on Black Friday and Cyber Monday
· Pets and popular consumer electronics are particular areas of concern, with 22% of PlayStation 5 listings likely to be scams, rising to more than a third of PS5 listings being flagged over the Black Friday weekend
· 19% of listings on marketplaces for the iPhone 12 were also found to show signs of being scams
· Counterfeit fashion items are also rife on popular UK marketplaces, with 15% of listings found to be counterfeits.
The research demonstrates that, even after any filtering and user-protection measures marketplaces have in place, a significant number of the products for sale still leave customers open to having their personal details stolen or receiving counterfeit goods. We know that many large marketplaces already have a solution in place but are still allowing scams to slip through the net, while smaller marketplaces may not have thought about putting robust content moderation practices and processes in place at all.
Both situations are potentially dangerous if not tackled. While it is certainly challenging to quickly identify and remove problematic listings, it is deeply concerning that we are seeing such high rates of scams and counterfeiting in this data. Powerful technological approaches, using AI in conjunction with human analysts, can very effectively counter these criminals. Ultimately, the safety of the user should be placed at the heart of every marketplace’s priorities. The notion that fail-safe content moderation is too expensive a problem to deal with is a false economy – in the longer term, addressing even the small amount of fraud that slips through the net can have a large, positive long-term impact on the financial health of the marketplace through increased customer trust, acquisition, and retention.
2020 was a year we would not want to repeat from a fraud perspective – we have not yet won the battle against the criminals. As we move into 2021, we’ll be helping the industry work towards a zero-scam future – one where we take the lessons of 2020 and apply them together, providing a better, safer community for users and customers, both for their safety and for the long-term, sustainable financial health of marketplaces.
2020 was a year no one could have predicted. E-shopping has been accelerated by years, and whole new segments of customers have been forced online.
Meanwhile, marketplaces have had to deal with new, unpredictable moderation challenges – illustrated very poignantly by the spike in this Google Trends graph showing the past year’s search volumes for face masks.
The entire world has had to get used to a market that was very volatile and extremely sensitive to any new development in the pandemic, good as well as bad.
Now, however, at the end of 2020, there’s light at the end of the tunnel. The vaccine is finally within reach, and hopefully social distancing and face masks will soon be curious terms in a bizarre chapter of our history books.
A return to “normality” will, however, not erase the incredible jump the world has made towards a fully digitized society.
What will the new world order mean for marketplaces going forward? What can eCommerce, sharing economy and online platforms in general expect from 2021?
We’ve asked 8 industry experts to give their trend predictions for the new year.
7% of reviewed listings for popular items across six UK marketplaces were found to be scams.
This Christmas, British shoppers will be turning to online shopping more than ever before as Internet retailers offer quick and convenient services. Consumers increasingly rely upon these services as the UK faces its second lockdown of the year, with many shoppers preferring not to visit physical stores. The opportunity is for online marketplaces to gain new customers, especially as many consumers turn to shopping for goods online for the first time. In fact, according to research by O2 Business and Retail Economics, 44% of consumers believe their shift to online shopping during the last COVID peak will stay the same going forward.
The good news does not stop there. In a saturated industry where monoliths such as Amazon dominate, consumers have become much more aware of where they shop and are much more willing to try a new service – over a third (39%) of respondents to a survey undertaken by Bazaarvoice said that they had bought from a new brand during the first quarantine. For small to medium sized marketplaces, this shift in shopping habits has opened up an opportunity to appeal to these newly adventurous shoppers, willing to discover a new platform.
Yet, while there is a huge opportunity for marketplaces, there is also considerable threat. The Association of Certified Fraud Examiners found that, in August this year, 77% of anti-fraud experts said that the pandemic has created an increase in fraud, while the New York Times reports that more than 200,000 coronavirus-related scam complaints have already been filed in the US this year.
To uncover the scale of this problem in the UK and help online marketplaces to understand how to keep consumers safe in the run up to Christmas, our certified content moderators started tracking six popular UK marketplaces over the peak shopping period. By reviewing listings of the items most associated with fraudulent selling, they are identifying how many show tell-tale signs of risk despite having made it through the marketplaces’ safety measures.
Having so far reviewed over 1000 listings during the first two weeks of November, we have found that:
- 7% of reviewed listings are likely to be scams
- Puppies are particularly risky, with 23% being found to be scams
- 14% of fashion listings are for counterfeit items
- Smaller marketplaces are particularly rife with scam products
The data demonstrates that marketplaces of every size need to take precautions to prevent fraud on their platforms in the lead-up to Christmas and onwards – these scams and counterfeit posts are certainly not one-offs. Marketplaces need to rid their platforms of fraud. Larger marketplaces may see a smaller percentage of fraudulent posts, but considering their much larger user bases, even small percentages can lead to thousands of users becoming victims. If marketplaces – of all sizes – do not act now, they risk long-term reputational damage and the potential for fraud on their platforms to spiral out of control.
Recent research undertaken by Next Consulting on our behalf also demonstrates that:
- Up to 73% of users would never return to a platform after seeing fraudulent content
- 76% of users would not recommend a marketplace after having seen fraudulent content
- 80% would not buy from a brand that they had seen fraudulent posts for previously
Tackling fraudulent activity on marketplace platforms is not an easy challenge. The scam posts and counterfeit goods we found listed above were post-reviewed, meaning they had either slipped through a content moderation process or there was no process or analysis in place to start with. Marketplaces need to review their content moderation strategy and explore a holistic approach – one that layers AI and filter automation with manual moderation in order to provide high-quality review and prevention. AI and automation are effective as a first line of defense against harmful content. Alone, however, they are not a panacea. AI is fantastic for routine health checks – such as catching suspicious content that follows a consistent pattern. But some scam posts venture outside the ‘routine’ and break the patterns typically picked up by AI. That is when human moderation needs to step in to ensure that rigorous analysis of the context is applied.
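To make the layered approach concrete, here is a minimal sketch in Python. The filter rule, the stand-in risk scorer, and the thresholds are all invented for illustration – a real setup would use a trained model and a dedicated moderation tool rather than these toy functions.

```python
import re

def triggers_filter(listing: str) -> bool:
    """Layer 1: an automated filter rule for unambiguous cases (hypothetical rule)."""
    return bool(re.search(r"\bfree\b.*\bgift card\b", listing, re.IGNORECASE))

def model_risk_score(listing: str) -> float:
    """Layer 2: stand-in for an AI model's fraud-risk score in [0, 1]."""
    suspicious = {"wire", "transfer", "deposit", "urgent"}
    words = set(listing.lower().split())
    return min(1.0, 0.4 * len(words & suspicious))

def moderate(listing: str) -> str:
    if triggers_filter(listing):        # filters catch the obvious
        return "refused"
    score = model_risk_score(listing)   # AI handles the routine
    if score >= 0.8:
        return "refused"
    if score == 0.0:
        return "approved"
    return "manual review"              # humans judge the ambiguous rest

print(moderate("Vintage oak dining table"))       # approved
print(moderate("Urgent wire transfer required"))  # refused
print(moderate("Urgent sale, must go today"))     # manual review
```

The point of the layering is that each stage only handles what it is good at: filters remove content that can never be legitimate, the model automates the routine bulk, and everything uncertain falls through to a human.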
Working with a trusted partner who can provide guidance, best practice approaches and support for content moderation will be crucial for many marketplaces in the lead up to Christmas and beyond.
Do you need help with your content moderation setup? Get in touch.
What is content moderation?
Content moderation is the practice whereby an online platform screens and monitors user-generated content, based on platform-specific rules and guidelines, to determine whether the content should be published on the platform or not.
In other words, when content is submitted by a user to a website, that piece of content will go through a screening process (the moderation process) to make sure that the content upholds the regulations of the website, is not illegal, inappropriate, or harassing, etc.
Content moderation as a practice is common across online platforms that rely heavily on user-generated content, such as social media platforms, online marketplaces, sharing economy services, dating sites, communities and forums, etc.
There are a number of different forms of content moderation; pre-moderation, post-moderation, reactive moderation, distributed moderation, and automated moderation. In this article we’re looking closer at human moderation and automated moderation, but if you’re curious to learn more, here’s an article featuring the 5 moderation methods.
What is human moderation?
Human moderation, or manual moderation, is the practice whereby humans manually monitor and screen user-generated content which has been submitted to an online platform. The human moderator follows platform-specific rules and guidelines to protect online users by keeping unwanted, illegal, and inappropriate content – as well as scams and harassment – off the site.
What is automated moderation?
Automated moderation means that any user-generated content submitted to an online platform will be accepted, refused, or sent to human moderation, automatically – based on the platform’s specific rules and guidelines. Automated moderation is the ideal solution for online platforms who want to make sure that quality user-generated content goes live instantly and that users are safe when interacting on their site.
According to a study done by Microsoft, humans only stay attentive for 8 seconds on average. Online platforms therefore cannot afford a slow time-to-site for user-generated content, or they risk losing their users. At the same time, users who encounter poor-quality content, spam, scams, inappropriate content, etc., are likely to leave the site instantly. So, where does that leave us? In order not to jeopardize quality or time-to-site, online platforms need to consider automated moderation.
When talking about automated moderation, we often refer to machine learning AI (AI moderation) and automated filters. But what are they really?
What is AI moderation?
AI moderation, or tailored AI moderation, uses machine learning models built from platform-specific data to efficiently and accurately catch unwanted user-generated content. An AI moderation solution will make highly accurate automated moderation decisions – refusing, approving, or escalating content automatically.
One example that showcases the power of AI moderation is the Swiss online marketplace Anibis, which successfully automated 94% of its moderation whilst achieving 99.8% accuracy.
As long as you have a high-quality dataset that models can be built on, AI moderation is going to be great for routine decisions. It excels at dealing with cases that almost always look the same or very similar. This usually covers the vast majority of items posted to online marketplaces, and as such most platforms can benefit from using AI moderation.
It should also be mentioned that AI moderation can be built on generic data. These models can be effective, but they are not as accurate as a tailored AI solution, as they don’t take site-specific rules and circumstances into account.
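As a simple illustration of how these automated decisions can work, consider a model that outputs the probability that a piece of content is acceptable. The thresholds below are invented for the example; in practice they are tuned against each platform’s own data and risk tolerance.

```python
# Hypothetical confidence thresholds – tuned per platform in practice.
APPROVE_THRESHOLD = 0.95   # how confident the model must be to auto-approve
REFUSE_THRESHOLD = 0.95    # how confident it must be to auto-refuse

def decide(p_acceptable: float) -> str:
    """Map the model's probability that content is acceptable to a decision."""
    if p_acceptable >= APPROVE_THRESHOLD:
        return "approve"
    if (1.0 - p_acceptable) >= REFUSE_THRESHOLD:
        return "refuse"
    return "escalate"  # uncertain cases go to human moderators

print(decide(0.99))  # approve
print(decide(0.02))  # refuse
print(decide(0.60))  # escalate
```

Raising the thresholds sends more borderline content to human review; lowering them increases automation at the cost of more mistakes – essentially the trade-off behind figures like Anibis’s 94% automation at 99.8% accuracy.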
What is Automated filter moderation?
Automated filter moderation is a set of rules used to automatically highlight and catch unwanted content. The filters (or rules) are efficient at finding content that can’t be misinterpreted or is an obvious scam.
Filters are also great for covering sudden rule changes, where the AI has not yet gotten up to speed (training takes time and a quality dataset). This was well illustrated when the coronavirus pandemic suddenly made masks and toilet paper problematic. This makes filters a solid complementary automation tool for your moderation setup. Automated filters can easily be created, edited, and deleted in our all-in-one content moderation tool, Implio – learn how to create filters here.
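A filter rule can be as simple as a keyword or pattern match. The sketch below is purely illustrative – Implio has its own filter syntax, and these patterns are invented examples of pandemic-era rules:

```python
import re

# Hypothetical patterns for items that suddenly became problematic in 2020.
FLAGGED_PATTERNS = [
    re.compile(r"\bface\s*masks?\b", re.IGNORECASE),
    re.compile(r"\btoilet\s*paper\b", re.IGNORECASE),
    re.compile(r"\bhand\s*saniti[sz]er\b", re.IGNORECASE),
]

def apply_filters(listing: str) -> str:
    """Escalate any listing matching a flagged pattern; approve the rest."""
    for pattern in FLAGGED_PATTERNS:
        if pattern.search(listing):
            return "escalate"  # send to human moderation for review
    return "approve"

print(apply_filters("Box of 50 face masks, sealed"))  # escalate
print(apply_filters("Hand sanitiser, 5 litres"))      # escalate
print(apply_filters("Mountain bike, barely used"))    # approve
```

Rules like these can be added the moment a new scam pattern appears, then retired once a retrained AI model has caught up.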
Do’s and don’ts of content moderation
Determining what to do and not to do in content moderation, may vary from site to site. There are many elements and factors that need consideration to get the moderation set up best suited for your specific needs.
However, regardless of whether you’re running an online marketplace, social media platform, or sharing economy site, there are some things that hold true about what to do and not to do when it comes to content moderation.
Do’s of content moderation
Do: Select the moderation method that’s right for your needs
Start off by looking at what kind of content your site hosts and who your users are. This will help you create a clear picture of what’s required from your moderation method and setup. For example, the type of user-generated content found on Medium versus Facebook is very different – and so is their users’ behavior. This makes their moderation methods and setups look different, each fitting their platform’s specific needs.
Do: Create clear rules and guidelines
Your content moderation rules and guidelines need to be clear to everyone who is directly involved with your online platform’s content moderation – from the data scientist developing your AI moderation to the human moderator reviewing content, regardless of whether they sit in-house or are outsourced to partners. Uncertainty in your rulebook can set your moderation efforts back, both from a financial and from a user experience perspective.
Do: Moderate all types of content
Regardless of whether you’re running an online marketplace, dating site, or social media platform, your users are key contributors to your platform. Making sure they enjoy pleasant experiences and are met with quality content on your site should be in your interest. To achieve this, you need to make sure your content moderation is done right.
In a perfect world, moderating all types of content on your site – from text and images to videos and 1-to-1 messages – would be ideal. The reality, though, is that this approach is not possible for all online platforms, for financial and technical reasons. If that’s your case, as a minimum make sure to identify your high-risk categories and content, and start your moderation efforts there.
Don’ts of content moderation
Don’t: misinterpret what good content is
Quality content is key to building user trust and achieving a splendid user experience on your online platform, but it’s important to understand what good content is. Don’t make the mistake of misinterpreting good content and end up rejecting user-generated content simply because it’s negative in nature.
For example, a negative comment or review following a transaction can still be good content, as long as no harsh language is used of course. Genuine content is what you want, as it enhances quality and user trust.
Don’t: wait too long before you get started with moderation
If you’re in the early stages of establishing your online platform, getting started with content moderation might feel like it’s miles away. It’s not.
Don’t get us wrong – perhaps it shouldn’t be your main priority right out of the gate, but you need a plan for how to handle user-generated content, from a moderation perspective, as you scale. As you grow and the network effect kicks in, you’ll often see a rapid increase of content flooding into your site. You need to be prepared to handle that; if not, your big break might actually end up hurting you in the long run.
Don’t: waste resources
Don’t reinvent the wheel. With multiple content moderation tools and solutions, like Implio, available in the market, it’s important that you prioritize your resources carefully. Innovation and growth are what will boost your online platform to success, and this is where your dev resources will give you the most competitive advantage. Find your way to free up your resources for innovation, without risking falling behind with your moderation efforts.
We can’t deny that 2020 hasn’t gone according to plan. It’s smashed through any notion of normality and has pushed companies all over the world to the very limit – in many different ways.
Yes, the events of 2020 have been a massive catalyst for change – particularly for online marketplaces, classified sites, and all kinds of digitally-native businesses. However, it’s not just companies that are changing. We’re also seeing a significant behavioral shift from customers.
It appears that a greater reliance on digital services is bringing about a need for more transparency and trust; as well as a desire for more personalized products and services. More accurate data is also helping to instill more empathy between customers and online vendors; and it appears as though quality is now more important than ever to shoppers.
Is this lasting change? How is it manifesting? How are online marketplaces and classified sites turning these challenges into opportunities? Let’s take a closer look at why empathy, personalization, and quality are just as highly prized by consumers as availability, cost, and convenience.
First let’s consider how things look from a technology standpoint. It’s clear that there has been a massive shift here too. The adoption of digital services rose by nearly a decade’s worth of growth in just a few short weeks – across retail, grocery shopping, deliveries, payments, and much more besides.
According to the UK Office for National Statistics, between February and May 2020, internet sales jumped nearly 14 percentage points (from 19% to 32.8%) – a rise which had previously taken some 9 years.
Now, we’d be remiss if we were to ignore the reasons behind this dramatic increase. It took place during peak lockdown season. While the usual suspects (remember Amazon’s $11,000-per-second earnings?) undoubtedly profited, so too did peer-to-peer oriented online marketplaces that were able to mobilize themselves in the right way.
But how were they able to do so? Competing with the likes of Amazon when it comes to cost, choice, and availability is seemingly impossible.
It seems that the challenges we collectively faced brought about a renewed focus on not only what we wanted as shoppers, but also the way in which we perceived value.
Empathy is one thing that’s been in short supply for too long, but in recent years, it has started to permeate company and buyer behaviors in line with wider awareness of CSR (corporate social responsibility).
To the cynical shopper, corporate altruism is easy to pick holes in. But looking more closely at shopping trends, it’s clear that the initiatives being put in place by many online marketplaces and classified sites are largely customer-driven. In short, they make good business sense as well.
An Accenture report conducted in April 2020 shows that consumers are now factoring health and environmental considerations into their shopping choices – with 45% of consumers saying they’re making more sustainable choices when shopping and will likely continue to do so.
Earlier this year we interviewed senior representatives from both eBay K (a dedicated Germany-focused classifieds version of the online platform) and Norwegian marketplace, Finn.no – both of which managed to grow during the pandemic by focusing on helping their communities.
Other digital companies with the capacity to be of broader service are adapting their business models too – at a time when overall demand for their original service has suffered setbacks.
Take ride-sharing app BlaBlaCar, which created a volunteer service during the pandemic – BlaBlaHelp – where drivers could sign up to help deliver groceries and medicine to those in need. Competitor mobility services could have elected to do the same, at a time when the entire transportation sector faced the same challenges, but didn’t.
The upshot is that the business was able to double down on the community aspect of its service – something that will no doubt continue to positively impact BlaBlaCar’s brand moving forward.
Ultimately, empathy impacts loyalty – but it also has a positive impact on a company’s bottom line. A 2016 study on the link between empathy and profitability showed that the top 10 empathy-driven companies increased in value more than twice as much as the bottom 10, and generated 50% more earnings.
Personalization Makes A Play
Deeply understanding your customer’s journey – across all touch points – is also critical to ongoing success for online marketplaces. This understanding can be translated into personalization which in turn drives loyalty and conversions.
With all of the digital technology at our fingertips, personalization is nothing new to digitally-native businesses. In fact, it’s arguably one of the cornerstones of online commerce. After all, the more user data is available, the easier it becomes to provide and suggest products and services to customers based on their buyer behavior.
But why has this become of particular importance in 2020? Well, the more online retail options there are, the harder it becomes for shoppers to differentiate between similar products and services.
Competition for attention is fiercer than ever. But rather than simply bombard shoppers with yet more ads, companies are focusing on personalizing their overall customer experience – opting for attraction over interruption as their calling card.
For example, take Spotify’s Discover Weekly option – an algorithm-curated playlist based on user preferences – or app-based Atom Bank’s customization and naming options; even Starbucks’ much hackneyed practice of baristas scrawling customer names across their morning cup of coffee. When businesses – both digital and analogue – take the time to show that customers are more than just a dollar sign, they open the doors to more meaningful interactions.
The stats concur. According to a personalization development study conducted by customer optimization experts Monetate, 93% of companies with an “advanced personalization strategy” saw revenue growth. Similarly, companies that spent at least 20% of their marketing budget on personalization saw double the ROI (return on investment).
The study shows that personalization has a positive impact on loyalty too – which will become more essential as companies that have experienced growth look to maintain their new customers.
In a similar way to how digital businesses are personalizing their customer experiences, more and more online marketplaces are focusing on improving the quality of their overall digital offering.
With potentially more customers coming to different sites, the need for a welcoming, user-friendly experience is crucial. If a site or marketplace is difficult to navigate, doesn’t offer different languages or even accessibility options, companies could be losing out on much needed revenue.
After all, if customers can’t find what they’re looking for, they have plenty of other options out there. However, this need for better quality extends to the actual products on offer too.
The ‘finding discounts’ strategy many shoppers have favored in the past has been replaced by a search for the ‘best possible option’. This is true across many different categories – from electrical goods to cars and furniture – according to a survey carried out by First Insight.
The study shows that up to 98% of shoppers (at least where furniture is concerned) stated that price had no bearing on their purchase decisions.
But what does this all tell us? Well, if price has less impact than quality, it’s clear that online marketplaces and classified sites that specialize in a particular niche should look more closely at their customer group’s needs and provide a product range and customer experience that matches higher expectations.
Building Long-Term Trust Is Crucial
All things considered, although many businesses face an uncertain future, as more shopping and interaction takes place remotely, there’s never been a better time to be an online marketplace or classifieds site.
To truly thrive, however, companies need to continually focus on customers’ needs and work on building communities rather than empires. Trust is more important than ever before – which is why, among other things, online content moderation should be a priority for sites that rely on user-generated content.
Knowing that a site or marketplace is a safe, well-maintained place to transact will become even more important. Effective moderation not only reassures customers that your site is safe; it’s a broader indication that you genuinely care about your community, that you’re committed to providing the best possible user experience, and that you understand why these things matter to your customers.
The pandemic will hopefully prove to be a short-term concern. But there are a lot of long-term lessons we can learn from how businesses and consumers are behaving. The challenge now is to commit to implementing the positive ones here and now. For everyone’s sake.
The lockdown’s lifting – at least in some parts of the world. But it’ll be some time yet before the wheels of global commerce begin to turn with any degree of regularity once again.
While it’s easy to assume that the shift to remote working, online shopping, and video-socializing has positively impacted most digital businesses and online marketplaces, that’s not necessarily the case.
However, while many digital services are undoubtedly thriving, this surge in demand continues to highlight different issues for many others – from a security, capacity, and scalability perspective.
In a similar way, companies that use technology to facilitate offline services – such as socializing, dating, or the exchange of services – are having to pivot to find new ways to stay relevant and active.
Let’s take a closer look at how many different digitally-driven companies in different sectors are addressing and overcoming the challenges they face.
Loud & Clear
One area that’s seen huge expansion during the lockdown is videoconferencing. It’s easy to see why.
Prior to the pandemic, one particular platform called Zoom was growing steadily, mostly among business customers. With 10 million active daily users back in December 2019, expectations were moderately ambitious. But fast-forward to April 2020, and user numbers had grown to an astonishing 300 million.
We all know what happened there. But then something else became apparent – Zoom wasn’t as secure as many users first thought. Cue an onslaught of privacy issues, such as ‘Zoom bombers’ and other uninvited video chat guests intent on password and identity theft.
To counter the issues the platform faced, the team has now rolled out end-to-end encryption – for its paid users. But despite these issues, Zoom continues to make massive profits – $27m between February and April 2020: a sharp increase compared with its $198,000 profit over the same period a year earlier.
So what’s Zoom’s secret? People need it right now. Not just businesses intent on maintaining contact between usually office-based staff, but everyone else too – from those looking to connect with family and friends, to the global events industry, which has moved talks, seminars, and other discussion-based happenings to the digital realm (as exemplified by the recent Global Online Classifieds Summit).
But is its success sustainable? While it’s clear that ‘encryption-for-some’ must become ‘encryption-for-all’ in the long-term, right now it seems need outweighs any particular risk.
In short, it’s become an essential utility for many.
Eking Out A Living From eCommerce
In a similar way, the lockdown has sparked a massive upturn in online shopping. Given that over a third of shoppers are apparently unwilling to return to bricks-and-mortar stores until a COVID-19 vaccine is available, it’s not surprising that many large online retailers, fulfillment services, and manufacturers are reporting demand outstripping anything they could have prepared for.
Of course, Amazon, the global eCommerce giant, is leading the way, as we’d assume – with Q1 2020 results 26% up year-on-year. In fact, given the increased demand for its services, Amazon has recruited an additional 175,000 people during the COVID-19 crisis.
Before its financial results were announced, the company was reportedly making $11,000 per second back in April. However, it transpires that Amazon is actually making a loss right now – all of the extra revenue is being used to pay workers and increase capacity.
Looking ahead, the mighty online retailer is unlikely to be toppled anytime soon, though this clearly demonstrates that even Amazon has had to prioritize meeting demand over doubling down on profitability.
But, while Amazon’s offline order fulfillment service may be suffering, it’s not hard to believe that losses are being offset by its purely digital services – TV, music, eBooks, cloud computing services. Diversification has presumably been its saving grace.
Beyond The Ban
However, many other businesses who essentially use digital services to enhance the customer experience – and automate backend processes such as data collation, CRM functionality, and order processing – are facing tough times.
Take the travel sector, for instance, which has probably taken the hardest hit of all, given the restrictions put in place to stop the spread of coronavirus.
In the absence of being able to guarantee immediate bookings, many companies are asking customers to book for 2021 already, in an attempt to maintain cash flow and remain operational.
However, companies that would usually generate smaller profits from multiple bookings and casual stays could lose out if things don’t recover quickly. In cases like these, it really is a case of the strongest surviving.
But that said, some well-placed creativity and innovation can go a long way.
Take Airbnb, for example, which has recently rolled out its new Online Experiences initiative to not only boost revenues, but to give customers a taste of what everyone’s missing out on, and to help bring people closer together – in a way that picks up where Airbnb’s popular in-person experiences left off.
Using the service, customers can learn and interact with experts and enthusiasts from all over the world; doing everything from family baking sessions to taking part in history quizzes – both for fun and educational purposes.
Could it be that the service that started life as a couch-surfing app becomes a bona fide education platform? Only time will tell. But Airbnb’s well-timed pivot certainly plays to its strengths.
In Sweden, travel company Apollo Tours has started focusing on the domestic market rather than far-flung destinations. Anticipating that international travel will take a while to return to full operation, Apollo is offering and organizing local activities and training sessions – for everything from mountain biking to yoga – to give customers something proactive to do during the summer vacation, either alone or in small groups.
Love In A COVID-19 Climate
Interaction is just as important as stimulation. We’re social creatures after all. And while many of us have learned to deal with being distanced from our loved ones, what about those looking for love? The countless singletons and lonely hearts out there unable to meet with prospective partners in person.
Well, dating apps and platforms open doors to new matches. They provide a safe space to interact, message, and meet new people who share the same interests and outlook.
While Tinder’s going all out encouraging users to go on virtual dates – co-watching Netflix shows and movies, ordering takeout from the same place and dining by FaceTime – the stark advice to maintain distance and avoid sneaky visits to your intended’s sleeping quarters remains in place.
Up-and-coming app Hinge is attempting to bridge the lockdown divide with its own bespoke ‘date from home’ feature – connecting matched users to those ready to video chat there and then.
While these efforts are admirable attempts to capture spontaneity, meaningful connections, and quality time, in effect they haven’t deviated too far from the apps’ original offerings.
These features might well be kept long-term for those keen to maintain physical distance before meeting someone new in person, or where busy schedules don’t allow. But let’s be honest: there’s no substitute for face-to-face meetings where affairs of the heart are concerned.
Focus On Users: Nothing Else
Ultimately, what can you do when the very nature of your business model is under threat? You find ways to give your customers what they want.
As the companies mentioned are realizing, supporting users is what counts – offering real value in the most authentic, meaningful way possible.
It’s about putting them first – not just to keep them engaged and subscribed to your service or platform – but to genuinely offer help and support during a difficult time.
This was a sentiment echoed when we recently spoke with Geir Petter Gjefsen, fraud manager at online marketplace FINN.no. By focusing on its users, and actively encouraging them to ask for help or to help others during the crisis, not only did the initial dip in traffic recover, but deeper customer bonds were formed.
Similarly, eBayK (eBay Kleinanzeigen), a free online classifieds market committed to sustainable trade, created a ‘Neighborhood Help’ category where customers could offer their services – from dog walking to tuition – as the world faced COVID-19 uncertainty. The result? A peak in traffic and 40 million live ads.
All things considered, to stay afloat, maintain customer loyalty, and come out the other side of the crisis intact, digital businesses need to be agile. They need to adapt by focusing on their own strengths and tailoring even more closely to what their customers need. And as teams focus on delivering the best possible service, machine learning, AI, and outsourced agents can play a part in helping moderate content itself.
Now is the time for action and innovation. After all, what have you got to lose?
While many businesses are sadly struggling due to the global Covid-19 pandemic, a few have managed to grow. eBayK, Germany’s No. 1 classifieds site, is one of them. They’ve kindly offered to share how they’ve approached the crisis, the steps they’ve taken, and the strategies they’ve applied to avoid the negative impact caused by social distancing rules, lockdowns, and the general unease among the world’s population caused by the uncertainty of the situation.
We hope that other marketplaces can benefit from learning about eBayK’s approach, and that it can help those that are struggling turn the negative trend around.
Interviewer Please introduce yourself and eBayK.
Stefanie Pritzkow I’m Stefanie Pritzkow, Head of Customer Support at eBay Kleinanzeigen. eBay Kleinanzeigen is a free online classifieds market that brings the joy of sustainable trade to everyone. Already today users buy and sell on Germany’s No. 1 for classifieds mainly second-hand. In this way they make an active contribution to more sustainability. On average, more than 40 million ads are available in numerous categories – from children’s supplies to electronics and real estate. eBay Kleinanzeigen also offers small and medium-sized businesses the opportunity to present their services online. Around 32 million users per month make eBay Kleinanzeigen one of the most widely used websites in Germany. The online classifieds market was launched by eBay in September 2009.
Interviewer When did you start preparing for the Corona crisis?
Stefanie Pritzkow We began to take action with the announcement of national measures to contain the spread of COVID-19 in the middle of March.
Interviewer What were your main concerns in terms of impact due to the Corona crisis?
Stefanie Pritzkow As in many other areas, the impact of the spread of the virus itself and the measures taken by the government on our business was difficult to assess. In particular, there was the question of how the fear of contagion would affect our local business, which is characterized by personal contact of buyers and sellers. Furthermore, it was not possible to predict whether people would continue to buy and sell at all.
Interviewer Which actions did you take to lessen the impact of the Corona crisis?
Stefanie Pritzkow At eBay Kleinanzeigen, simplicity is one of our core principles. Short-term tactics do not fit in with this principle. Nevertheless, it was necessary to adapt to the new situation. Among other things, we made it easier to find items that could be shipped. And, of course, our iron principle, which obliges our sellers to offer pick-up of items, was also put to the test. We have suspended this until further notice. We have also temporarily restricted trade in certain products, including respiratory masks and disinfectants. But the crisis also revealed positive aspects. We were particularly pleased with the willingness of many users to help. We wanted to support this commitment, which is so important these days, which is why we created the “Neighbourhood Help” category. In this category, users can offer their services to neighbours in need of help – for example, shopping assistance, walks with the dog or tutoring for schoolchildren. Within a few days, more than 10,000 ads were available. And even now, months after the crisis began, many users continue to offer their help.
Interviewer What challenges did you face in handling the Corona crisis?
Stefanie Pritzkow Uncertainty. And dependence. We did not know what measures would be taken next by the national and local governments. We’ve had to digitalize a part of our corporate culture – at eBay Kleinanzeigen, we work together in cross-functional teams, so personal exchange is particularly important, and video conferences cannot replace spontaneous exchange. At the same time, for us “New Work” is not zeitgeist, but part of everyday life. The processes could therefore be adapted to the new circumstances without too much challenge. Furthermore, it has once again been shown that eBay Kleinanzeigen is relevant for many people: around one in two German internet users visits eBay Kleinanzeigen each month. We would like to build on this.
Interviewer How did traffic/user engagement look in March/April? Why do you think it looked like that?
Stefanie Pritzkow We initially saw a slight drop with the announcement of governmental measures. People initially had other worries and had to come to terms with the new situation. But just one week later we were back at the previous year’s level. Since then, use has increased significantly. Many people obviously use the “extra free time” to get rid of unused things. The number of new ads even tripled on some days compared to the previous year. We have reached an important milestone for us, 40 million live ads, much earlier than initially expected. We have also set new records for weekly app downloads and monthly visits.
Interviewer Why do you think you were successful in deflecting the negative impact from the pandemic when other marketplaces are getting hit quite hard?
Stefanie Pritzkow We have taken measures to adapt to the needs of users in the current situation, e.g. with new features which make it easier to find items to ship and the new category “Neighbourhood Help”. As market leader for classifieds in Germany, eBay Kleinanzeigen was naturally the first place to go for many users. Our platform has a high reach and people know that it is very likely to find a match, even in uncertain times.
Interviewer What’s the number one step you’d recommend taking in a similar situation in the future or to those struggling right now?
Stefanie Pritzkow In times of crisis, it is especially important for a brand to stay true to their core principles. For us, this means enabling our users to trade in a safe, convenient and sustainable way by providing a stable and secure platform. People have other things to worry about. That is why we wanted to make sure that they have the best possible experience on eBay Kleinanzeigen.
Interviewer Are there any actions you took due to the pandemic that you intend to carry forward even after the crisis is over?
Stefanie Pritzkow We have seen that sustainable action also – or even especially – played an important role during the crisis. A recent study by Accenture has shown that 45% of consumers make more sustainable choices when shopping and are likely to continue to do so after the crisis. We want to let people know that by trading on eBay Kleinanzeigen, users make an active contribution to more sustainability. We have recently launched a campaign that highlights this aspect in a special way. The new features already mentioned have met with a very positive response. These make trading on eBay Kleinanzeigen more efficient and easier – even after the crisis. In addition, we will evaluate in due course whether we keep our new category “Neighbourhood Help” and how it can create value in regular times.
Interviewer If you saw an increase in traffic due to the pandemic, do you think that increase will last even after the crisis is over?
Stefanie Pritzkow We assume that the strong growth, as we have seen in the past few weeks, will decrease somewhat sooner or later, but will remain at a high level. As mentioned, we have reached an important milestone for us, 40 million live ads, much earlier than initially expected. It was only in October 2019 that the number of ads available at the same time rose to 35 million. The year before, in October 2018, we reached the number of 30 million live ads. Use has increased continuously over the last years and we expect this development to go on.
Stefanie Pritzkow, Head of Customer Support,
As Head of Customer Support at eBay Kleinanzeigen, Stefanie Pritzkow is responsible for all customer service activities and also manages the continuous improvement process for end customers.
Before taking up this position, she was responsible for various projects in the area of customer service at eBay Kleinanzeigen. Before joining eBay Kleinanzeigen in September 2011, she worked for Deutsche Lufthansa AG, where she held several positions – first as a recruiter and finally as customer team consultant for Lufthansa e-Commerce GmbH in Frankfurt am Main. She then moved on to the Online Sales department and was responsible for customer service at lufthansa.com.
Stefanie Pritzkow studied business administration at the University of Applied Sciences in Berlin with a focus on human resources, sales and marketing. She lives in Berlin.
The restrictions put in place to combat the global Covid-19 pandemic have had a devastating effect on many businesses. Social distancing, restrictions on physical services, and a downturn in spending have also hurt most marketplaces and sharing economy sites, despite their digital nature.
After months of closed-down societies and harsh restrictions, nations are slowly and carefully opening up again, but the world is forever changed. Businesses that understand and adapt quickly to the new reality will be successful. To do so, they’ll need to understand the challenges and opportunities arising in the post-corona business landscape.
We’ve asked 8 online marketplace experts to share their thoughts and predictions to help you prepare and adapt to the new reality.
User safety is key for all online platforms, particularly when you’re dealing with vulnerable youngsters. Moderating can be challenging and getting the balance between censorship and safety right can be hard.
We sat down with industry veteran and founder of Friendbase, Deborah Lygonis, to discuss the experience she’s gained from developing and running a virtual world for teens.
Interviewer: Hi Deborah. Could you please give us a short introduction to yourself?
Deborah: My name is Deborah Lygonis and I am a serial entrepreneur. I have started and run several businesses over the years, mainly within the software and gaming sector, but also e-health and other tech. I love tech and I’m passionate about startups and entrepreneurship. I also work as a coach and mentor for entrepreneurs within the European Space Agency Business Incubator (ESA BIC), and for a foundation called Entrepreneurs Without Borders.
Interviewer: Wow! That’s an impressive background. One of the things you’ve started as an entrepreneur is Friendbase, right? Could you tell us a bit more about that?
Deborah: Yes. Friendbase is a company that I founded with my brother and a third guy called Andreas. We’ve known each other for many years. Well, obviously, I’ve known my brother for many years, but Andreas as well, has been part of our group of friends and acquaintances for many, many years. We decided to found Friendbase in 2013. We saw that the whole idea of virtual worlds hadn’t really migrated over to smartphones and we wanted to see if it was possible to create a complete cross-platform version.
So, we put together a mockup of an Android, iOS, and web version and put it out there to see if it was something that today’s young people would like.
Friendbase is a virtual world for teens where they can chat, play games and also design their looks and spaces. Now we’re also moving towards edtech, in that we’ll be introducing quizzes that are fun but also have learning elements in them.
Interviewer: That sounds awesome. What would you say is the main challenge when it comes to running a cross-platform online community – and specifically one that caters to teens?
Deborah: There are a lot of challenges with startups in general, but also, of course, with running an online community. One challenge is when you have people that meet each other in the form of avatars and written chat, and their different personalities and different backgrounds can cause them to clash. The thing is that when you write in a chat, the nuances in the language don’t come through the way they do in a face-to-face conversation. It’s really very hard to judge the small subtleties in language, and that can lead to misunderstandings.
Add to that as well that there are lots of different nationalities online. That in itself can lead to misunderstandings because they don’t speak the same language.
What starts off as a friendly conversation can actually rapidly deteriorate and end up in a conflict just because of these misunderstandings. That is a challenge, but that’s a general challenge, I think, with written social interactions.
Interviewer: Just so we understand how Friendbase works – do you have one-to-one chat, one-to-many chat, or group chats? How does it work?
Deborah: The setup is that we can have up to 20 avatars in one space – no more, because then it would get too cluttered on small phone screens. So, you can have group chats. I mean, you see the avatars, and they have a text bubble as they write, so there can be several people in one conversation.
Interviewer: Do you have the opportunity for groups of friends to form and join the same kind of space together?
Deborah: Yes. Each member has their own space. They can also invite and open up their space to other friends.
Interviewer: And in that regard – what you often see in the real world with team dynamics is that there’s a group of friends, there are the popular people in that group, and then one person who is maybe a little bit of an outsider, who will at times be bullied by the rest of the group. Do you see people ganging up on each other sometimes?
Deborah: I haven’t seen groups of people ganging up on one individual. It’s more the other way around. There are individuals that are out to cause havoc and who are just online to be toxic.
Interviewer: That means that, in general, you have a really nice and good user base – but then there are the rotten fruits that come in from time to time.
Deborah: That is what it is like today. We are still fairly early stage, though, when it comes to the number of users, so I would expect this to change over time. And this is something that we’re prepared for. We added safety tools at a really early stage to be able to learn how to handle issues like this, and also how to moderate the platform when incidents occur. So, I think that even though we don’t have that type of ganging up on each other at the moment, I would expect it to happen in the future.
Interviewer: But it sounds like you’re prepared for it. Now, you’ve made a really nice segue into my next question: what are the main moderation challenges you’ve experienced running Friendbase? What are the main challenges right now, and what do you expect you’ll have to handle later on?
Deborah: I think that a challenge in itself for all social platforms is to set the bar on what is acceptable and not.
Our target group is mid-teens and up, so we don’t expect young children to be on Friendbase. We feel that if we made a social world for young children, we’d need a completely different, more controlled set of regulations than for teenagers and upwards.
However, that demographic is also very vulnerable. So, of course, there have to be some sorts of measures in place. The challenge is to determine at what level you want to put the safety bar, and also how you can tell the difference between banter between friends and when it flips over into actually being toxic or bullying. That’s something that is really, really hard to differentiate between. And I think that if you work with chat filters, then you have to have some sort of additional reporting system for when the filters don’t manage this challenge – a filter is only a filter and can’t distinguish between the two. So that’s one challenge. It’s also complex to enforce the rules that are in place to protect users without being perceived as controlling or patronizing.
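Deborah’s point – that a keyword filter can’t tell banter from bullying, so it needs a human review queue behind it – can be sketched in a few lines of code. This is a purely hypothetical, minimal illustration; the patterns, tiers, and function names below are invented for the example and are not Friendbase’s actual system:

```python
import re

# Hypothetical severity tiers. A real moderation system would use far
# richer signals: user history, ML classifiers, and human reviewers.
BLOCK_PATTERNS = [r"\bkill yourself\b"]        # auto-remove, never delivered
FLAG_PATTERNS = [r"\bstupid\b", r"\bloser\b"]  # deliver, but queue for review

def moderate_message(text: str) -> str:
    """Return 'block', 'flag', or 'allow' for a chat message."""
    lowered = text.lower()
    if any(re.search(p, lowered) for p in BLOCK_PATTERNS):
        return "block"
    if any(re.search(p, lowered) for p in FLAG_PATTERNS):
        # The filter can't tell banter from bullying, so flagged
        # messages go to a human moderation queue instead of being
        # silently removed.
        return "flag"
    return "allow"

review_queue: list[str] = []

def handle_message(text: str) -> bool:
    """Deliver a message unless blocked; flagged messages are also queued."""
    verdict = moderate_message(text)
    if verdict == "block":
        return False  # not delivered
    if verdict == "flag":
        review_queue.append(text)
    return True  # delivered
```

In practice, user reports would feed into the same review queue, giving human moderators (or a downstream classifier) the context the filter lacks.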
At the moment, we also have a challenge in that we have users that come back solely for the purpose to cause havoc and create a toxic environment. We track them down and we ban their accounts, but it’s a continuous process.
That is something that, should it escalate over time, will become increasingly time-consuming. That’s why it’s really, really important for us to have tools in place so that everything doesn’t have to be moderated manually. That would just take too much resource and time.
Of course, you have the even darker side of the internet; sexual predators that are out to groom vulnerable youngsters and to get them to maybe move over to a different platform where they can be used in a way that is extremely negative.
That’s something that is difficult to handle. But today, thanks to artificial intelligence and, again, the amazing toolsets out there, there are attempts to look at speech patterns and try to identify that sort of behavior. And it’s also really great to have your own toolset where users can actually report someone if they feel threatened or if they feel that someone’s really creepy.
Interviewer: When you have returning users who have made it their goal to attack the platform, in a malicious way, do you see that it’s the same people returning based on their IP or the way that they talk?
Deborah: It’s not always possible to see it based on their IP because they use different ways of logging in. However, given their behavior, we can quickly identify them. And we have a group of ambassadors as well online on Friendbase that help us. On top of that we have a chat filter which can red flag certain behavior. So that helps as well.
There is one group that comes back over and over again, and for some mysterious reason they always use the same username, so they’re not that hard to identify. That group is actually easier to control than a group with a different motive for being online and targeting youngsters. The toxic ones that are just there because they think it’s fun to behave badly – it’s easy to find them and close down their accounts.
Interviewer: We already touched upon this, but what would you say is the hardest moderation challenge to solve for you right now?
Deborah: The hardest moderation challenge to solve is, of course, finding the people who are deliberately out to target lonely youngsters who hunger for social contact. The whole grooming issue online is a problem. We are constantly trying to find new toolsets, and we encourage our users to contact us if something doesn't feel right. So grooming is something we're very, very much aware of. If we happen to shut down someone's account by mistake for a couple of hours, they're most welcome to come to us and ask why. But we'd rather be safe than sorry when it comes to this kind of behavior. It is hard to track, however, because it can be so very, very subtle in the beginning.
Interviewer: Friendbase has been around for a while now. Are there any challenges that have changed or increased in frequency over the years? And if so, how?
Deborah: Actually, not really. I think the difference is in our own behavior, as we are now so much more aware of how we can solve different problems.
Bullying has been around for years, since long before the Internet. Sexual harassment, of youngsters and between adults, has of course also been around for years. It's nothing new. I mean, the Internet is a fantastic place to be. It democratizes learning. You have access to the world, to knowledge and entertainment.
But there is a dark side to it. From a bullying perspective, previously, if you were bullied at school, you could go home or go to your social group somewhere else, and you would have somewhere you felt safe.
When it’s online, it’s 24/7.
And it is relentless. The same goes for the whole child abuse part. Of course, it existed before as well. But now, with the Internet, perpetrators can find groups that have the same desires as themselves, and somehow, together, they can convince themselves as a group that it's more acceptable. Which is awful. So that is the bad part of the net.
So, when you ask whether the challenges have changed or increased since we started Friendbase: no, not really. But what has changed is the attitude towards how important it is to actually address these issues. When we started the company in 2013, we didn't really talk that much about safety tools. I mean, we talked about whether we should have a whitelist or a blacklist of words; it was more on that level. But today, most social platforms have moderation, they have toolsets, they have guidelines and policies and so forth.
So, I think that we who work with online communities as a whole have evolved a lot over the past years.
Interviewer: Yeah, I would say today in 2020, you probably wouldn’t be able to launch a social community or platform without launching with some sort of moderation tools and well-defined guidelines.
Deborah: I think you're right. Several years ago, I did a pitch where we talked about online safety and moderation tools, and we were completely slaughtered. What we were told was that being good online, this whole "be cool to be kind" thing, was going to stop our growth; it would be much better to let it all run rampant, and then it would grow much faster. I don't think anyone would say something like that today. So that's a huge shift in mindset. Which is great. We welcome it.
Interviewer: That’s a fantastic story. You’ve been in this industry so long; you’ve seen this change. I find it fascinating that just seven years ago when you said I want to protect my users, people laughed at you. And now people would laugh at you if you said, I’m gonna go live without it.
Deborah: I know. Can you imagine going on stage today saying that I don’t care about safety? I mean, people would be so shocked.
Interviewer: You said before, when we talked about the main challenges, that if you experienced growth, you'd need to change your approach to moderation and automate more in order to keep up?
Deborah: Yes, definitely. We try and stay on top of what toolsets are out there.
We build in our own functionality, such as muting users: if someone is harassing you, you can mute them so that you don't see what they're writing. Small changes like that we can do ourselves, which is helpful.
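A mute feature like the one Deborah describes can be sketched in a few lines. This is a hypothetical illustration, not Friendbase's actual implementation; the names (`ChatUser`, `Message`, `visible_messages`) are made up for the example. The idea is simply a per-user block list applied as a filter over the chat feed:

```python
# Hypothetical sketch of a per-user mute list for a chat platform.
# ChatUser and Message are illustrative types, not a real API.

from dataclasses import dataclass, field


@dataclass
class Message:
    sender: str
    text: str


@dataclass
class ChatUser:
    name: str
    muted: set = field(default_factory=set)

    def mute(self, other: str) -> None:
        self.muted.add(other)

    def unmute(self, other: str) -> None:
        self.muted.discard(other)

    def visible_messages(self, feed: list) -> list:
        # Hide anything written by users this person has muted.
        return [m for m in feed if m.sender not in self.muted]


feed = [Message("troll", "you are awful"), Message("friend", "hi!")]
me = ChatUser("me")
me.mute("troll")
print([m.text for m in me.visible_messages(feed)])  # ['hi!']
```

Note that the mute only affects what the muting user sees; the muted user's messages still exist on the platform, which is why Deborah pairs this with server-side filters and bans.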
Something I'd like to see more of, and that we've actually designed a research project around, is to not only detect and ban bad behavior, but also to encourage good behavior.
Because that in itself will also create a more positive environment.
That's something we're really excited about: working with people who are experts in gamification and natural language processing to see how we can create toolsets that encourage good behavior, and to see what we can do. Maybe we can start deflecting a conversation that is obviously on its way to going seriously wrong. It could be as simple as a small time delay when somebody writes something really toxic, with a pop-up saying: "Do you really want to say this?", just to make them think once more.
This is something that we're looking into, and it's super interesting. I also hear there are a couple of companies, just in the last few months, that have started talking about creating toolsets like this. So I think it's going to be a really, really interesting development over the coming years.
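The "think before you post" friction Deborah describes can be sketched as a submit hook: if a message trips a toxicity check, the platform asks for confirmation before sending. This is a minimal sketch, assuming a naive word-list check as a stand-in for the real NLP classifier she mentions, and a `confirm` callback standing in for the UI pop-up — all names here are hypothetical:

```python
# Hypothetical sketch of a "Do you really want to say this?" nudge.
# TOXIC_WORDS is a toy stand-in for a real NLP toxicity classifier;
# the confirm() callback represents the pop-up shown to the user.

TOXIC_WORDS = {"idiot", "loser"}  # illustrative word list only


def looks_toxic(text: str) -> bool:
    # Naive check: does any flagged word appear in the message?
    return any(word in text.lower().split() for word in TOXIC_WORDS)


def submit_message(text: str, confirm) -> bool:
    """Return True if the message is actually sent."""
    if looks_toxic(text):
        # Add friction: make the user pause and reconsider.
        return confirm("Do you really want to say this?")
    return True


# A user who reconsiders after the prompt:
print(submit_message("you idiot", lambda prompt: False))  # False
# A harmless message goes straight through:
print(submit_message("nice game!", lambda prompt: False))  # True
```

The design point is that the nudge does not ban anything; it just inserts a pause, which is exactly the "make someone think once more" effect described above.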
Interviewer: It sounds like safety is very important to Friendbase. Why is that?
Deborah: Why is that? Quite early on, we who work in the company discussed what our core values should be, and one of the core values we decided upon is inclusion. Everybody is welcome. And for everyone to feel welcome, you have to have a welcoming atmosphere.
When you continue along that line of thought, you obviously come to the point where, OK, if everyone's going to be welcome and you want it to be a friendly space, then at some point you're going to have to stop toxic behavior. So for us, safety is just part of our core values.
And also, I have a teenage daughter who loves gaming. She's part of groups that interact with each other online, and I've seen how platforms behave. I just feel that there must be a way of doing things better. It's as simple as that: we can do better than letting it be super toxic. And there are some amazing people out there working with fantastic toolsets, and some fantastic platforms and social games that work in the same direction as we do. It's really great.
And you know what? To be quite honest, I think there have been several case studies proving, from a business perspective as well, that you get longer retention and higher profitability when you can keep your users online for a longer time. So in a business sense, it also makes perfect sense to work in a way that keeps your users as long as possible.
Interviewer: You have tons and tons of experience, obviously, with startups and social platforms. If you were to give a piece of advice to someone who is running a similar service to Friendbase, or who is even thinking about starting one, what would it be?
Deborah: It would be, first of all, to determine what level of safety you want to have, depending on your user group. Obviously, the younger your demographic, the more safety tools you must ensure you have in place. Also, don't build everything yourself, especially if you're working in an international market with many languages. Just being able to filter many languages in a decent way is a huge undertaking. If you think you're going to be able to hack something together yourself: it's not that easy. It's better to work with a tool or a company that has this as their core business, because they will constantly be working with state-of-the-art solutions.
So it's better to liaise with switched-on companies that already have this as their main reason for being. I think that's important. And then, of course, add your own easy-to-use reporting system and easy ways for users to communicate with you, so that you have a sort of double layer.
I mean, I've seen several different companies that now work with different moderation tools, chat filters and so forth, and many of them do stellar work. And it's important, because at the end of the day, if anything really, really bad were to happen, you're just finished as a business. It's as simple as that. The last thing you would want is someone knocking on your door and shutting you down because of something that happened on your platform.
Interviewer: Definitely! What's in the future for Friendbase? Where will you be in two years?
Deborah: Where are we now? We're raising funds, because what we've seen is that we have a very, very loyal member base who want to invite more of their friends. And I think that with very, very little work, we can get the platform on a really interesting growth path.
So, yeah, our aim is to become one of the big global players. It's exciting times ahead.
Interviewer: For sure. Any closing remarks? Any statements you want to get out there from a personal point of view or from Friendbase?
Deborah: The Internet is a great place to be because there's so much you can learn and so many interesting people you can meet. But there is a dark side as well, and you have to be aware of it. Just by being a little bit street-smart online, people can keep themselves safe. And we're getting there. People are learning, schools have it in their curriculum, and social platforms try to teach users how to behave. So slowly but surely, we're getting there.