Behind every successful company is a team of hardworking tech people diligently keeping the IT infrastructure safe, optimized, and resilient to unexpected events. They are the unseen and sometimes unsung heroes of digital operations. At Besedo, the head of that team is Kevin E. Ducón Pardey. We sat down with him to understand what it takes to ensure that we can deliver high-quality services to our clients 24/7, all year long.
Interviewer: Could you introduce yourself?
Kevin: I’m just a guy who dreamt about doing what he loves and who worked and studied hard to achieve it. I’m Colombian, 34 years old, married with one kid and a dog. I hold an MSc in Computer Science from Universidad Politécnica de Madrid and a BSc in Computer Science from Universidad Distrital de Bogotá, and I have various certifications in IT service management, IT security, and cloud computing. I have been working in ICT for more than fifteen years, and at Besedo for almost seven. I started as a local ICT Administrator in our Colombian branch office, then I was promoted to ICT Supervisor, and currently, I am the Global Head of ICT-IS, which means I’m in charge of all levels of ICT support. Together with my amazing team, we make sure that we fulfill our most important metrics, service-level agreements, and customer satisfaction.
I have applied my knowledge and skills to this ever-changing industry by creating policies and processes aligned with the industry’s best practices of IT service management. I’ve developed strategic and operational plans aligned with security guidelines for the ICT department to ensure that all necessary tools and processes are fully functional to achieve the company’s goals. Our commitment is to keep our mission-critical systems alive and running 24/7, to ensure that our moderation services are successfully delivered to our customers worldwide.
Interviewer: How does your team manage such a variety of clients’ ICT needs and setups?
Kevin: We need to be aligned with the business guidelines to properly onboard the clients. So, with the help of Sales, Customer Success, and Service delivery teams, we are able to translate business and operational needs to ICT needs. It all starts with a cross-functional plan which is the key to understanding the whole process, IT requirements, and compliance aspects.
Once this plan is clear, from the ICT side we need to provide all necessary tools and fulfill all requirements. Our local ICT teams manage the local implementation according to the plan and ensure that everything required is ready to go live within the defined time frames. They also tackle any post-implementation issues immediately.
When you have a capable, knowledgeable, and committed team like I do, things are always easier.
Interviewer: How did you make the transition from office to home office so efficiently?
Kevin: We’ve actually had a remote workforce project running since 2017. We were quite early in identifying a remote workforce as our future and as a necessity in order to explore new markets and cope with some specific requirements. Working from home also boosts the team’s morale, and it is a real benefit in countries where some have commutes as long as two hours. COVID was a challenge for all ICT departments, but it also made our work visible. Talking with different colleagues, the likelihood of a pandemic was low, but the impact, given the government measures taken in each country, was high; it’s what we refer to as a black swan. So, COVID gave a boost to the investment in the required resources. It also changed the business culture: we now focus on identifying and covering areas of vulnerability, we’re even more resilient, we’re managing internal resources more efficiently, and we’re taking a people-centric approach to IT security. As we were already working with cloud and virtualization, it was relatively painless to scale up. But I won’t lie, it was not easy. As we moved from 30% to 100% of the operation working from home, we had to strengthen some processes, provide resources, and tackle challenges fast, since we were operating under tight deadlines. However, we made it. Of course, there were challenges at the beginning (as there always are when a new technology is implemented) in terms of support requests and provision of services, but now we are in a stable situation. The next challenge is to improve our delivery and make it more efficient. That’s an ongoing process.
Interviewer: How do we manage disaster recovery in Besedo?
Kevin: It’s important to have proactive safety measures in place to guarantee that the moderation operation is always carried out correctly. A good first step is to plan the implementation of the moderation services before putting disaster mitigation plans in place. As I mentioned earlier, a good onboarding process is essential, but so is a fault-tolerant and highly available infrastructure.
For example, at Besedo offices, we work with different Internet service providers in case one fails to deliver correctly. We also work with fault-tolerant networks, a resilient infrastructure, third-party support, etc., to ensure that our IT operations remain stable when potential risks materialize. We also believe that a remote workforce can complement on-premises delivery, so a mixed model (on-premises and remote) makes sense.
To complement proactive measures, we run IT checklists, backup routines and we have monitoring systems that allow us to prevent potential challenges during IT ops.
Interviewer: What is the next big ICT project you have planned in your team?
Kevin: We plan to go live in a new office in Germany this year. We are preparing everything required to have an operational center that is compliant with certain specific guidelines, which are imperative to Besedo’s worldwide commitment to client satisfaction. We are also doing exhaustive research into a Desktop-as-a-Service (DaaS) solution, which could improve our current remote workforce delivery. That will be beneficial when we explore new markets and will make it easier to scale up or down according to business demands.
From ancient Greek merchants attempting to claim insurance by sinking their ships, to Roman armies ‘selling the Emperor’s throne’(!): since time began, whenever there’s been a system open to exploitation, there have been fraudsters willing to try their luck. And succeeding.
However, most historical crimes were essentially isolated events. While a single fraudster could be a repeat offender, compared to the sheer number of people who can be duped at scale across different digital channels – by largely ‘invisible’ cyber criminals – it’s clear that fraud has become much more of an everyday concern for all of us.
But how did we get to this point? What risks do we need to be aware of right now? What can we do about it?
Let’s consider the issues in more detail.
A History Of Digital Deviance
In a similar way to other forms of fraud, digital scams date back further than you might think. Email phishing allegedly first took place in the early 1970s, although it’s generally accepted that the term was coined and the practice became commonplace in the mid-1990s.
Since then, the online world has seen con artists try their hand at everything from fake email addresses to using information gleaned from massive data breaches; with $47 million being the largest amount of money a single person has lost to an email scam.
(Incidentally, the most famous email scam, the ‘419’, aka ‘Advance-fee’, aka ‘the Nigerian Prince’ scam, surfaced as mail fraud some 100 years ago.)
But email isn’t the only digital channel that’s been hijacked. The very first mobile phone scams came about during the high-flying 80s, when the devices first became available – way before they were popular.
Given the high cost of these now-infamous brick-sized devices, the wealthy were pretty much the only people in possession of them (so it makes sense that they quickly became fraud targets too).
SMS messages requesting funds be ‘quickly sent’ to a specific account by a ‘family member’ began to appear soon after, though again didn’t surge in number until well into the 90s when uptake soared.
Of course, these aren’t the only forms of online fraud that surfaced at the start of the Internet’s popularity. Password theft, website hacks, and spyware – among others – proliferated at an alarming rate around the world at a similar time.
So, if we take it that digital scams have been around for some 25 years, why do they persist – especially when awareness is so high? One of the biggest problems we face today is the ease with which online fraud can take place.
Hackers, of course, continue to evolve their skills in line with advances in tech. But when you consider the number of sites that anyone can access – the marketplaces and classified sites/apps that rely on user-generated content – pretty much anyone can find a way to cheat these systems and those that use them.
Fraud Follows Trends
As we’ve explored previously, scammers operate with an alarming regularity all year round. However, they’re much more active around specific retail events – such as peak online shopping periods like Black Friday, the January sales, back-to-school accommodation searches, and the Chinese New Year.
However, while 2020 was shaping up to be a landmark year for fraudsters, given the many different sporting and cultural events – such as the Euro 2020 and Copa Americas football tournaments, and of course, the Summer Olympics – it seems that fate had very different plans for all of us: in the form of the COVID-19 pandemic.
But true to form, scammers are not above using an international healthcare crisis to cheat others. COVID-19 has given rise to different challenges and opportunities for online businesses. For example, video conferencing services, delivery apps, dating websites, and marketplaces themselves have largely been in an advantageous position financially, given the fact they’re digital services.
However, given the knock-on economic factors of coronavirus and the danger of furlough drifting into long-term unemployment – among other things – there may be wider-reaching behavioral shifts to consider.
That said, fraudulent behavior simply seems to adapt to any environment we find ourselves in. In the UK, research shows that over a third (36%) of people have been the target of scammers during lockdown – with nearly two thirds stating that they were concerned that someone they knew could be targeted.
Examples playing on fear of contamination include the sale of home protection products, while other more finance-focused scams include fake government grants (requesting personal information), help with credit applications (for a fee), and even investment opportunities promising recession-proof returns for those with enough Bitcoin to put into such schemes.
The Gap Is Closing
It’s clear that online fraud is closely aligned with wider trends. In fact, the ‘newer’ something is, the more likely scammers are to step in. Look at the timelines of many of these scams against when the technology itself was first invented, and it’s clear that as time progresses, the gap between the two is closing.
There’s a very good reason for this: the pace of adoption. Basically, the more people there are using a particular device or piece of software, the more prolific the scams that target it become.
Consider the release of the iPhone 5S back in 2013 and the popularity of the mobile game Pokémon Go three years later. Both events provided fraudsters with enough of an incentive to target innocent users.
As with most (if not all) eCommerce scams, those around the iPhone 5S launch played upon consumer desire and ‘FOMO’ (fear of missing out), manipulating Apple enthusiasts with the promise of getting their hands on the latest technology ahead of the official launch date.
Not only that, but the accompanying hype around the launch proved the perfect time for fraudsters to offer other mobile phone-orientated scams.
In general, new tech and trends lead to new scams. For instance, Pokémon Go gave rise to scams such as the Pokémon Taxi (‘expert’ drivers literally taking users for a ride to locations that rare Pokémon were said to frequent) and the advanced user profile (basically paying for accounts that other players had already leveled up).
The fact that it was so new, and that its popularity surged in such a short period of time, made it a whole lot easier for fraud to materialize. Essentially, there was no precedent set – no history of usage. No one knew exactly what to anticipate. As a result, scams were materializing as quickly as new users signed up.
In one case, players were tricked into paying for access because ‘new server space was needed’. Not paying the $12.99 requested would supposedly result in their hard-fought Pokémon Go accounts being frozen. While that might be a small price to pay for one person, at scale it would amount to a significant sum.
Regardless of which methods they use, fraudsters are ultimately focused on one thing. Whether they’re using ransomware to lock out users from their data, posting ‘too-good-to-be-true’ offers on an online marketplace, or manipulating lonely and vulnerable people on a dating app – the end result is cold hard cash through scalable methods.
However, while hackers will simply barge their way into digital environments, those using online marketplaces and classifieds sites essentially need to worm their way into their chosen environment. In doing so they often leave behind a lot of clues that experienced moderators can detect.
For example, the practice of ad modification or the use of Trojan ads on public marketplaces follows particular patterns of user behavior that can cause alarm bells to ring.
So what can marketplace and classified site owners do to stay ahead of fraudsters? A lot, in fact. Awareness is undoubtedly the first step to countering scams, but alone it will only raise suspicion rather than act as a preventative measure.
Data analysis is another important step. But, again, the biggest issue is reviewing and moderating at scale. When you have an international platform, how can a small moderation team police every single post or interaction when thousands are created every day?
This is where moderation technology – such as filters – can help weed out suspicious activity and flag possible fraud.
In order to stay ahead of fraudsters, you need a combination of human expertise, AI, and filters. While it’s possible for marketplace owners to train AI to recognize these patterns at scale, completely new scams won’t be picked up by AI (as it relies on being trained on a dataset). This is where experienced and informed moderators can really add value.
People who follow scam trends and spot new instances of fraud quickly are on full alert during big global and local events. They can very quickly create and apply the right filters and begin building the dataset for the AI to be trained on.
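As a rough sketch of how this combination might work (the patterns below are invented for illustration; they are not Besedo’s actual filters), a simple rule-based pass can flag suspicious listings for human review, and the flagged examples can then double as labeled data for training an AI model:

```python
import re

# Hypothetical scam patterns for illustration only; real moderation filters
# are far more extensive and are tuned per platform and per scam trend.
SCAM_PATTERNS = [
    re.compile(r"wire\s+transfer\s+only", re.IGNORECASE),
    re.compile(r"act\s+now|limited\s+time", re.IGNORECASE),
    re.compile(r"fee\s+to\s+unlock", re.IGNORECASE),
]

def flag_listing(text: str) -> list[str]:
    """Return the patterns a listing matches; any hit routes it to human review."""
    return [p.pattern for p in SCAM_PATTERNS if p.search(text)]

# Flagged examples become labeled training data for a future AI model.
training_examples = []
listing = "Brand new phone, half price. Wire transfer only, act now!"
hits = flag_listing(listing)
if hits:
    training_examples.append((listing, hits))
```

The point of the sketch is the workflow, not the patterns: moderators encode what they see during big events as filters immediately, and the resulting labeled examples feed the slower process of training a model.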
Ultimately, as tech advances, so too will scams. And while we can’t predict what’s around the corner, adopting an approach to digital moderation that’s agile enough to move with the demands of your customers – and with fast intervention when new scam trends appear – is the only way to future proof your site.
Prevention, after all, is much better than a cure. But where fraud is concerned, a blend of speed, awareness, and action are just as critical.
Why is the quality of content important?
Every marketplace professional knows that quality content is the key to success. Without good user generated content, you won’t drive traffic to your platform.
The quality of your content helps build trust (a vital element to marketplace success) and directly impacts conversions, engagement and retention. In fact, visitors are more than twice as likely to return to your site if they encounter high quality content on their first visit.
Now, what constitutes good content can differ wildly depending on your audience, the services your platform facilitates, and the overall competition in your specific niche.
However, we’ve collected some general rules of thumb that rarely fail.
What’s a good online marketplace profile?
A good online marketplace profile helps build trust in the owner, makes it easy to identify and re-identify them and provides the reader with enough relevant information for them to decide whether it makes sense to engage or not.
Polina Yelkina, Manager Russia & Ukraine, Community Relations at Blablacar, breaks down their view of a good profile into the following:
- Maximized useful info (bio, preferences, photo, certified phone/email, car details).
- Real information.
- Relevant content (no gibberish).
- No surname mentioned.
- No contact details publicly visible.
- No advertising/spam.
- No offensive/insulting content in the profile.
- Content which complies with the rules of the platform.
Expanding on this and based on our experience, it’s clear that there are some universal criteria quality profiles meet:
Good profiles contain relevant information that helps the reader understand whether the profile matches their current needs. Which information is relevant will depend on the platform and the service exchanged between the profile owner and the person viewing it.
For a profile on a car sharing platform for instance, it might be relevant to include reviews or information such as whether the driver accepts smokers or not.
Spam and advertising are of course completely unacceptable and links that lead off the site are inadvisable due to the risk of fraud and platform leakage.
Profiles should be streamlined to communicate the exact message needed for the reader to understand who they are about to engage with. It’s important that it’s clear who the profile owner is so pictures of multiple people are usually a no go. While the profile should be informative, it also shouldn’t contain irrelevant information. Short and to the point is better than rambling and incoherent.
We live in a visual age. Regardless of whether the profile is for a classifieds site, a gamer account or a car sharing platform, having a visual representation of the user is important. In general, this should be in the form of a picture of the user but could also be an avatar for platforms that are more lenient on anonymity (like community sites for kids or gaming platforms). The image should however be recognizable, so the representation is always easy to identify as the profile owner.
We’ll get further into this in the next paragraph, but make sure the image is clear, sharp, properly formatted and inviting.
What’s a good image for an online marketplace?
What constitutes a good image, is highly reliant on context. However, usually the following is true:
Blurry images communicate low quality and don’t instill trust in the person looking at them. Make sure users upload sharp images where it’s easy to see the subject.
Images that are formatted wrong, or are too dark or too bright, signal low value and should generally be avoided.
Photos that look genuine are preferable so try to get users to stay away from the stock photo look. There is such a thing as too perfect. In general, if your users are trying to sell an item it’s good to have multiple images. A couple that show the item, some that show details of the item in closeup and one or two that show the item in action (preferably in a natural, yet non-home-video-style fashion).
For profile pictures, a good image ensures that the face is easily distinguishable. After all, you’re trying to simulate a face to face encounter. Sunglasses, shots from behind and other images that remove information should be discouraged.
Depending on your site policies you may also want to remove inappropriate images where people have gone in the opposite direction and reveal too much.
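Some of these checks can be partially automated. As one illustration (a minimal sketch, not a production quality gate), the variance of a discrete Laplacian is a common sharpness heuristic: blurry images have little high-frequency detail and score low. The thresholds would need tuning per platform, and images are assumed to arrive as grayscale NumPy arrays:

```python
import numpy as np

def blur_score(img: np.ndarray) -> float:
    """Variance of the discrete Laplacian; low values suggest a blurry image."""
    lap = (
        -4 * img[1:-1, 1:-1]
        + img[:-2, 1:-1] + img[2:, 1:-1]
        + img[1:-1, :-2] + img[1:-1, 2:]
    )
    return float(lap.var())

rng = np.random.default_rng(0)
sharp = rng.random((64, 64))    # lots of high-frequency detail
flat = np.full((64, 64), 0.5)   # no detail at all, like an extreme blur
# sharp scores well above flat, so a tuned threshold can separate the two
```

A heuristic like this only catches the obvious cases; judging whether an image is inviting, genuine, or appropriate still needs human or model-based review.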
In a qualitative survey on user search experience, 100% of participants highlighted good images as a reason to click on an advert – underlining that great pictures drive engagement and conversions.
“The pictures are showing that it actually belongs to a real person. It looks less blurry, and like the seller really cares to sell the item” – Louise
Anibis.ch’s Product Support Lead, Ana Castro, and Anti-Fraud Specialist Jelena Moncilli reveal that their internal guidelines echo this call for genuine and clear images.
“A good quality image is an authentic image captured by the user, directly related to the listing, without contacts or web links, and where the product or service offered can be clearly visualized, without being blurred.”
The same message of clarity is repeated by Polina Yelkina, Manager Russia & Ukraine, Community Relations at Blablacar.
“In pictures of cars we want the following to be true:
- The whole car should be visible.
- The car from the picture should be easily recognized by passengers.
- No people in the picture.
- No contact details/advertising visible in the picture or on the car itself.”
What’s a good marketplace listing?
What is the purpose of listing text? It’s to provide the potential buyer or user with enough information for them to decide whether to engage with the seller.
If it takes too much effort on the buyer’s side to decode the item or service specifications, it’s likely the buyer will give up and move on to the next listing – or worse, to the next platform.
As such, the goal of listing text should be to clearly and concisely describe the service or item for sale. Misinformation and spammy, scammy, or inappropriate content should obviously be disallowed and immediately removed.
Detailed descriptions aren’t just good for the browsing experience; they also help with SEO, driving traffic to the listing and the platform as a whole.
Listing text is also very important in building long-term trust in the platform brand. Since the buyer can’t physically examine the item, they have to rely on the description. If this turns out to be dishonest or lacking, trust will quickly deteriorate. As such, an honest and detailed description is what makes a good listing text.
Ana Castro and Jelena Moncilli back the need for detailed descriptions, stating:
“A good quality listing is a listing in compliance with our insertion rules, offering only one service or product, placed in the right category at a fair price, with a clear title with good key words which help to find the service or product offered by going through a list of results. Description should contain all necessary information like the size, the color etc. A good listing is completed by a picture illustrating the item or the service which is sold. All necessary contact information should be correctly filled in the specific field.
For the three points above, all of them must respect our insertion rules as well as the Swiss law.”
The perils of low-quality content
We’ve listed a lot of benefits to high quality content, but what issues does low-quality content cause?
Imagine walking by your local clothing store and seeing dirty, tattered dresses tossed around in the window. You’d be unlikely to enter the store and even less likely to purchase from there.
The page a visitor lands on when first visiting your site is your storefront. It should be inviting and beautiful. You want the user generated content to show your platform in the best light possible. Blurry images, lacking descriptions and user profiles that don’t give insight into the owner will deter visitors. Guide users to create and submit better content and take care of the bad pieces that inevitably slip through. That way you’ll ensure a more successful platform with more engagement, traffic and higher retention.
Need help moderating your content? Get in touch and we’ll have a look at how you can optimize your current content moderation setup.
The lockdown’s lifting – at least in some parts of the world. But it’ll be some time yet before the wheels of global commerce begin to turn with any degree of regularity once again.
While it’s easy to assume that the shift to remote working, online shopping, and video-socializing has positively impacted most digital businesses and online marketplaces, that’s not necessarily the case.
However, while many digital services are undoubtedly thriving, this surge in demand continues to highlight different issues for many others – from a security, capacity, and scalability perspective.
In a similar way, companies that use technology to facilitate offline services – such as socializing, dating, or the exchange of services – are having to pivot to find new ways to stay relevant and active.
Let’s take a closer look at how many different digitally-driven companies in different sectors are addressing and overcoming the challenges they face.
Loud & Clear
One area that’s seen huge expansion during the lockdown is videoconferencing. It’s easy to see why.
Prior to the pandemic, one particular platform called Zoom was growing steadily, mostly among business customers. With 10 million active daily users back in December 2019, expectations were moderately ambitious. But fast-forward to April 2020, and user numbers had grown to an astonishing 300 million.
We all know what happened there. But then something else became apparent – Zoom wasn’t as secure as many users first thought. Cue an onslaught of privacy issues, such as ‘Zoom bombers’ and other uninvited video chat guests intent on password and identity theft.
To counter the issues the platform faced, the team has now rolled out end-to-end encryption – for its paid users. But despite these issues, Zoom continues to make massive profits, making $27m between February and April 2020: a sharp increase compared with its $198,000 profit just 12 months earlier.
So what’s Zoom’s secret? People need it right now. Not just businesses intent on maintaining contact between usually office-based staff, but everyone else too – from those looking to connect with families and friends, to the global events industry which has literally moved talks, seminars, and other discussion-based happenings to the digital realm (as for instance exemplified with the recent Global Online Classifieds Summit).
But is its success sustainable? While it’s clear that ‘encryption-for-some’ must become ‘encryption-for-all’ in the long-term, right now it seems need outweighs any particular risk.
In short, it’s become an essential utility for many.
Eking Out A Living From eCommerce
In a similar way, the lockdown has sparked a massive upturn for online shopping. Given that over a third of shoppers are apparently unwilling to return to bricks and mortar stores until a COVID-19 vaccine is available, it’s not surprising that many large online retailers, fulfillment services, and manufacturers are reporting demand outstripping anything they could have been prepared for.
Of course, Amazon, the global eCommerce giant, is leading the way, as we’d assume – with Q1 2020 results 26% up year-on-year. In fact, given the increased demand for its services, Amazon has recruited an additional 175,000 people during the COVID-19 crisis.
Pre-financial announcements, the company was reportedly making $11,000 per second back in April. However, it in fact transpires that Amazon’s actually making a loss right now. All of the extra revenues are being used to pay workers and increase capacity.
Looking ahead, the mighty online retailer is unlikely to be toppled anytime soon; though it clearly demonstrates that they too have had to prioritize meeting demand rather than doubling down on profitability.
But, while Amazon’s offline order fulfillment service may be suffering, it’s not hard to believe that losses are being offset by its purely digital services – TV, music, eBooks, cloud computing services. Diversification has presumably been its saving grace.
Beyond The Ban
However, many other businesses who essentially use digital services to enhance the customer experience – and automate backend processes such as data collation, CRM functionality, and order processing – are facing tough times.
Take the travel sector, for instance, which has probably taken the hardest hit of all, given the restrictions that were put in place to stop the spread of coronavirus.
In the absence of being able to guarantee immediate bookings, many companies are already asking customers to book for 2021, in an attempt to maintain cash flow and remain operational.
However, companies that would usually generate smaller profits from multiple bookings and casual stays could lose out if things don’t recover quickly. In cases like these, it really is a case of the strongest surviving.
But that said, some well-placed creativity and innovation can go a long way.
Take Airbnb, for example, which has recently rolled out its new Online Experiences initiative to not only boost revenues, but to give customers a taste of what everyone’s missing out on, and to help bring people closer together – in a way that picks up where Airbnb’s popular in-person experiences left off.
Using the service, customers can learn and interact with experts and enthusiasts from all over the world; doing everything from family baking sessions to taking part in history quizzes – both for fun and educational purposes.
Could it be that the service that started life as a couch surfing app becomes a bonafide education platform? Only time will tell. But Airbnb’s well-timed pivot certainly plays to its strengths.
In Sweden, travel company Apollo Tours has started focusing on the domestic market rather than far-flung destinations. Anticipating that international travel will take a while to be fully operational again, Apollo is offering and organizing local activities and training sessions – for everything from mountain biking to yoga – to give customers something proactive to do during the summer vacation, either alone or in small groups.
Love In A COVID 19 Climate
Interaction is just as important as stimulation. We’re social creatures after all. And while many of us have learned to deal with being distanced from our loved ones, what about those looking for love? The countless singletons and lonely hearts out there unable to meet with prospective partners in person.
Well, dating apps and platforms open doors to new matches. They provide a safe space to interact, message, and meet new people who share the same interests and outlook.
While Tinder’s going all out encouraging users to go on virtual dates – co-watching Netflix shows and movies, ordering takeout from the same place and dining by FaceTime – the stark advice to maintain distance and avoid sneaky visits to your intended’s sleeping quarters remains in place.
Up-and-coming app Hinge is attempting to bridge the lockdown divide with its own bespoke ‘date from home’ feature – connecting matched users to those ready to video chat there and then.
While these efforts may be admirable in their attempts to capture spontaneity, meaningful connections, and quality time, in effect they haven’t deviated too far from the original offerings.
These features might well be kept long-term for those keen to maintain physical distance before meeting someone new in person – or where busy schedules don’t allow it – but let’s be honest: there’s no substitute for face-to-face meetings where affairs of the heart are concerned.
Focus On Users: Nothing Else
Ultimately, what can you do when the very nature of your business model is under threat? You find ways to give your customers what they want.
As the companies mentioned are realizing, supporting users is what counts – offering real value, in the most authentic, meaningful way possible.
It’s about putting them first – not just to keep them engaged and subscribed to your service or platform – but to genuinely offer help and support during a difficult time.
This was a sentiment echoed when we recently spoke with Geir Petter Gjefsen, fraud manager at online marketplace FINN.no. By focusing on its users, and actively encouraging them to ask for help or help others during the crisis, not only did the initial dip in traffic recover, but deeper customer bonds were formed.
Similarly, eBayK (eBay Kleinanzeigen), a free online classifieds market that’s committed to sustainable trade, created a ‘Neighborhood Help’ category where customers could offer their services – from dog walking to tuition – as the world faced COVID-19 uncertainty. The result? A peak in traffic and 40 million live ads.
All things considered, to stay afloat, maintain customer loyalty, and come out the other side of the crisis intact, digital businesses need to be agile. They need to adapt by focusing on their own strengths and tailoring their offering even closer to what their customers need. And as teams focus on delivering the best possible service, machine learning, AI, and outsourced agents can play a part in moderating the content itself.
Now is the time for action and innovation. After all, what have you got to lose?
For a long time, we’ve been told that data is the holy grail of growth. Everything must be data-driven, and no decision should be taken without being supported by data. There’s just one problem. As data collection has become easier, the amount of data to dig into has grown radically. That sounds like a luxury problem (and it is), but it makes it very time-consuming to approach data-driven actions in a meaningful way.
In the past we had to mine for data. Getting any insight into user behavior was hard work and all data that could be gathered was treasured. Now data gathering is more often than not automated and much of it happens without us even making a conscious decision on what to gather and why.
This results in a ton of data and much of it is just sitting there untapped. In fact, according to Splunk, half of all data collected by companies is unused, also known as dark data.
The solution isn’t to stop collecting data, of course. It’s always good to build your databank; one day you might need the insight.
But to stop data apathy and actually work in a data-driven manner, it’s a good idea to stop trying to find meaning in your unsorted piles of data and instead turn the process on its head.
Start by setting growth goals, pick a couple of data points that could support you in investigating potentially untapped avenues of growth, and decide on the KPIs that will help you measure success. Once that’s done, you’ll have a pretty good picture of what data you should track.
This will limit the scope of data you have to dig through and help you get a much better overview of the success of your actions.
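To make the goal-first approach concrete, here is a minimal sketch using entirely hypothetical event data and metric names: the KPI is defined up front, and only the fields that KPI actually needs are kept, shrinking the pile of dark data before analysis even starts.

```python
# Hypothetical raw events, as a marketplace might log them.
events = [
    {"user": "a", "action": "list_item", "category": "furniture", "device": "ios"},
    {"user": "a", "action": "list_item", "category": "books", "device": "ios"},
    {"user": "b", "action": "list_item", "category": "toys", "device": "web"},
    {"user": "b", "action": "view_item", "category": "toys", "device": "web"},
]

# KPI decided first: share of sellers who list more than once.
def repeat_seller_rate(events):
    listings = {}
    for e in events:
        if e["action"] == "list_item":
            listings[e["user"]] = listings.get(e["user"], 0) + 1
    sellers = len(listings)
    repeat = sum(1 for n in listings.values() if n > 1)
    return repeat / sellers if sellers else 0.0

# Only the fields the KPI needs are tracked; "device" and "category"
# are dropped instead of sitting unused as dark data.
TRACKED_FIELDS = {"user", "action"}
slim_events = [{k: v for k, v in e.items() if k in TRACKED_FIELDS} for e in events]

print(repeat_seller_rate(slim_events))  # 0.5: one of two sellers listed twice
```

The point isn’t the specific metric – it’s that deciding the KPI first tells you exactly which fields to collect.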
Want some tangible examples of how data can be used to grow online marketplaces? Watch as Natalia Cieslak, Director of AppJobs Institute and Sigrid Zeuthen, Global Marketing Manager at Besedo discuss real life applications of data.
Webinar: Applying a data driven approach to help your marketplace grow
COVID-19 continues to create new challenges for all. We’re seeing businesses and consumers spend an increasing amount of time online – using different chat and video conferencing platforms to stay connected and combat social distancing and self-isolation.
We’ve also seen the resurgence of interaction via video games during the lockdown, as we explore new ways to entertain ourselves and connect with others. However, a sudden influx of gamers also brings a new set of content moderation issues – for platform owners, games developers, and gamers alike.
Let’s take a closer look.
The video game industry was already in good shape before the global pandemic. In 2019, ISFE (Interactive Software Federation of Europe) reported a 15% rise in revenue between 2017 and 2018, with the industry turning over a combined €21bn. Another ISFE report shows that over half of the EU’s population played video games in 2018 – some 250 million players, gaming for an average of nearly 9 hours per week, with a pretty even gender split.
It’s not surprising that the fastest growing demographic was the 25-34 age group – the generation who grew up alongside Nintendo, Sony, and Microsoft consoles. However, gaming has broader demographic appeal too. A 2019 survey conducted by AARP (American Association of Retired Persons) revealed that 44% of Americans aged 50+ enjoyed video games at least once a month.
According to GSD (Games Sales Data) in the week commencing 16th March 2020, right at the start of the lockdown, video games sales increased by 63% on the previous week. Digital sales have outstripped physical sales too, and console sales rose by 155% to 259,169 units in the same period.
But stats aside, when you consider the level of engagement possible, it’s clear that gaming is more than just ‘playing’. In April, the popular game Fortnite held a virtual concert with rapper Travis Scott, attended by no fewer than 12.3 million gamers around the world – a record audience for an in-game event.
Clearly, for gaming the only way is up right now. But given the sharp increases, and the increasingly creative and innovative ways gaming platforms are being used as social networks – how can developers ensure every gamer remains safe from bullying, harassment, and unwanted content?
Ready Player One?
If all games have one thing in common, it’s rules. The influx of new gamers presents new content moderation challenges in a number of ways. Firstly, uninitiated gamers (often referred to as noobs/newbies/nubs) are likely to be unfamiliar with the established, pre-existing rules of online multiplayer games, or with the accepted social niceties and jargon of different platforms.
From a new user’s perspective, there’s often a tendency to carry offline behaviours over into the online environment – without consideration or a full understanding of the consequences. The Gamer has an extensive list of etiquette guidelines that are frequently broken by online multiplayer gamers, from common courtesies such as not swearing in front of younger users on voice-chat and not spamming chat-boxes, to not ‘rage-quitting’ a co-operative game out of frustration.
However, when playing in a global arena, gamers might also encounter subtle cultural differences and behave in a way which is considered offensive to certain other groups of people.
Another major concern, outlined by Otis Burris, Besedo’s Vice President of Partnerships, in a recent interview – one which affects all online platforms – is the need to “stay ahead of the next creative idea in scams and frauds or outright abuse, bullying and even grooming to protect all users” because “fraudsters, scammers and predators are always evolving.”
Multiplayer online gaming is open to exploitation by individuals with malicious or grooming intent, simply because of the potential anonymity and the sheer number of gamers taking part simultaneously around the globe.
While The Gamer list spells out that kids (in particular) should never use someone else’s credit card to pay for in-game items, when you consider just how open gaming can be from an interaction perspective, the fact that these details could easily be obtained by deception or coercion needs to be tackled.
A New Challenger Has Entered
In terms of multiplayer online gaming, cyberbullying and its regulation continue to be a prevalent issue. Some of the potential ways in which users can manipulate gaming environments in order to bully others include:
- Ganging up on other players
- Sending or posting negative or hurtful messages (using in-game chat-boxes for example)
- Swearing or making negative remarks about other players that turn into bullying
- Excluding the other person from playing in a particular group
- Anonymously harassing strangers
- Duping more vulnerable gamers into revealing personal information (such as passwords)
- Using peer pressure to push others into performing acts they normally wouldn’t
Whilst cyberbullying amongst children is fairly well researched, negative online interactions between adults are less well documented and studied. The 2019 report ‘Adult Online Harms’ (commissioned by the UK Council for Internet Safety Evidence Group) investigated internet safety issues amongst UK adults and acknowledges the lack of research into the effect of cyberbullying on adults.
With so much to be on the lookout for, how can online gaming become a safer space to play in for children, teenagers, and adults alike?
According to a 2019 report for the UK’s converged communications regulator Ofcom: “The fast-paced, highly-competitive nature of online platforms can drive businesses to prioritize growing an active user base over the moderation of online content.
“Developing and implementing an effective content moderation system takes time, effort and finance, each of which may be a constraint on a rapidly growing platform in a competitive marketplace.”
The stats show that 13% of people have stopped using an online service after observing harassment of others. Clearly, targeted harassment, hate speech, and social bullying need to stop if games manufacturers want to minimize churn and avoid losing gamers to competitors.
So how can effective content moderation help?
Let’s look at a case study cited in the Ofcom report. As an example of effective content moderation, they refer to the online multiplayer game ‘League Of Legends’ which has approximately 80 million active players. The publishers, Riot Games, explored a new way of promoting positive interactions.
Users who logged frequent negative interactions were sanctioned with an interaction ‘budget’ or ‘limited chat mode’. Players who then modified their behavior and logged positive interactions gained release from the restrictions.
As a result of these sanctions, the developers noted a 7% drop in bad language in general and an overall increase in positive interactions.
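Riot Games hasn’t published its implementation, so the following is only an illustrative sketch of the mechanic the report describes: repeated negative interactions shrink a player’s chat budget (‘limited chat mode’), and a sustained run of positive interactions lifts the restriction. The class name, thresholds, and budget sizes are all assumptions for the sake of the example.

```python
class ChatBudget:
    """Hypothetical per-player chat allowance, shrunk by bad behavior."""

    def __init__(self, full_budget=50):
        self.full_budget = full_budget
        self.budget = full_budget   # messages allowed per match
        self.positive_streak = 0

    def log_interaction(self, positive):
        if positive:
            self.positive_streak += 1
            if self.positive_streak >= 3:        # assumed release threshold
                self.budget = self.full_budget   # restrictions lifted
        else:
            self.positive_streak = 0
            self.budget = max(5, self.budget // 2)  # enter limited chat mode

    def can_send(self, messages_sent):
        return messages_sent < self.budget

player = ChatBudget()
player.log_interaction(positive=False)   # budget halved to 25
player.log_interaction(positive=False)   # budget halved again to 12
for _ in range(3):
    player.log_interaction(positive=True)
print(player.budget)  # 50: restored after sustained positive play
```

The design choice worth noting is that sanctions are reversible: the incentive to behave comes from the path back to a full budget, not just from the punishment.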
Taking ‘League Of Legends’ as an example, a combination of human and AI (Artificial Intelligence) content moderation can encourage more socially positive content.
For example, a number of social media platforms have recently introduced ways of helpfully offering users alternatives to UGC (user generated content) which is potentially harmful or offensive, giving users a chance to self-regulate and make better choices before posting. In addition, offensive language within a post can be translated into non-offensive forms and users are presented with an optional ‘clean version’.
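A toy sketch of that ‘clean version’ idea: a small (entirely hypothetical) mapping of offensive terms to milder alternatives is applied to a draft message, and the user is shown the rewrite before posting. Real systems use far richer models than a word list, but the self-regulation flow is the same.

```python
import re

# Assumed, illustrative mappings - a real platform would use a much
# larger lexicon or an ML model rather than a hand-written dict.
CLEAN_FORMS = {"stupid": "silly", "idiot": "goof"}

def clean_version(message):
    """Return a suggested non-offensive rewrite of the message."""
    def swap(match):
        word = match.group(0)
        repl = CLEAN_FORMS[word.lower()]
        # Preserve the original capitalization of the replaced word.
        return repl.capitalize() if word[0].isupper() else repl

    pattern = re.compile("|".join(re.escape(w) for w in CLEAN_FORMS),
                         re.IGNORECASE)
    return pattern.sub(swap, message)

original = "You idiot, that move was stupid"
print(clean_version(original))  # "You goof, that move was silly"
# The platform would then offer a choice: post the original,
# post the suggestion, or edit before sending.
```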
Nudging is also another technique which can be employed to encourage users to question and delay posting something potentially offensive by creating subtle incentives to make the right choice and thereby help to reduce the overall number of negative posts.
Chatbots, disguised as real users, can also be deployed to make interventions in response to specific negative comments posted by users, such as challenging racist or homophobic remarks and prompting an improvement in the user’s online behavior.
Finally, applying a layer of content moderation to ensure that inappropriate content is caught before it reaches other gamers will help keep communities positive and healthy, ensuring higher engagement and less user churn.
Game Over: Retry?
Making the best of a bad situation, the current restrictions on social interaction offer a great opportunity for the gaming industry to draw in a new audience and broaden the market.
It also continues to inspire creative innovations in artistry and immersive storytelling, offering new and exciting forms of entertainment, pushing the boundaries of technological possibility, and generating new business models.
But the gaming industry also needs to ensure it takes greater responsibility for the safety of gamers online by ensuring it incorporates robust content management strategies. Even if doing so at scale, especially when audience numbers are so great, takes a lot more than manual player intervention or reactive strategies alone.
This is a challenge we remain committed to at Besedo – using technology to meet the moderation needs of all digital platforms. Through a combination of machine learning, artificial intelligence, and manual moderation techniques we can build a bespoke set of solutions that can operate at scale.
To find out more about content moderation and gaming, or to arrange a product demonstration, contact our team!
While many businesses are sadly struggling due to the global Covid-19 pandemic, a few have managed to grow. eBayK, Germany’s No. 1 classifieds site, is one of them. They’ve kindly offered to share how they’ve approached the crisis, the steps they’ve taken, and the strategies they’ve applied to avoid the negative impact of social distancing rules, lockdowns, and the general unease among the world’s population caused by the uncertainty of the situation.
We hope that other marketplaces can benefit from learning about eBayK’s approach, and that it can help those that are struggling to turn the negative trend around.
Interviewer Please introduce yourself and eBayK.
Stefanie Pritzkow I’m Stefanie Pritzkow, Head of Customer Support at eBay Kleinanzeigen. eBay Kleinanzeigen is a free online classifieds market that brings the joy of sustainable trade to everyone. Already today, users on Germany’s No. 1 classifieds site mainly buy and sell second-hand. In this way, they make an active contribution to greater sustainability. On average, more than 40 million ads are available in numerous categories – from children’s supplies to electronics and real estate. eBay Kleinanzeigen also offers small and medium-sized businesses the opportunity to present their services online. Around 32 million users per month make eBay Kleinanzeigen one of the most widely used websites in Germany. The online classifieds market was launched by eBay in September 2009.
Interviewer When did you start preparing for the Corona crisis?
Stefanie Pritzkow We began taking action with the announcement of national measures to contain the spread of COVID-19 in the middle of March.
Interviewer What were your main concerns in terms of impact due to the Corona crisis?
Stefanie Pritzkow As in many other areas, the impact of the spread of the virus itself and the measures taken by the government on our business was difficult to assess. In particular, there was the question of how the fear of contagion would affect our local business, which is characterized by personal contact of buyers and sellers. Furthermore, it was not possible to predict whether people would continue to buy and sell at all.
Interviewer Which actions did you take to lighten the impact of the Corona crisis?
Stefanie Pritzkow At eBay Kleinanzeigen, simplicity is one of our core principles. Short-term tactics do not fit in with this principle. Nevertheless, it was necessary to adapt to the new situation. Among other things, we made it easier to find items that could be shipped. And, of course, our iron principle, which obliges our sellers to offer pick-up of items, was also put to the test. We have suspended this until further notice. We have also temporarily restricted trade in certain products, including respiratory masks and disinfectants. But the crisis also revealed positive aspects. We were particularly pleased with the willingness of many users to help. We wanted to support this commitment, which is so important these days, which is why we created the “Neighbourhood Help” category. In this category, users can offer their services to neighbours in need of help – for example, shopping assistance, dog walking, or tutoring for schoolchildren. Within a few days, more than 10,000 ads were available. And even now, months after the crisis began, many users continue to offer their help.
Interviewer What challenges did you face in handling the Corona crisis?
Stefanie Pritzkow Uncertainty. And dependence. We did not know what measures would be taken next by the national and local governments. We’ve had to digitalize a part of our corporate culture – at eBay Kleinanzeigen, we work together in cross-functional teams. Personal exchange is therefore particularly important. Video conferences cannot replace spontaneous exchange. At the same time, for us “New Work” is not zeitgeist, but part of everyday life. The processes could therefore be adapted to the new circumstances without too much challenge. Furthermore, it has once again been shown that eBay Kleinanzeigen is relevant for many people. Around one in two onliners uses eBay Kleinanzeigen – each month. We would like to build on this.
Interviewer How did traffic/user engagement look in March/April? Why do you think it looked like that?
Stefanie Pritzkow We initially saw a slight drop with the announcement of governmental measures. People initially had other worries and had to come to terms with the new situation. But just one week later we were back at the previous year’s level. Since then, use has increased significantly. Many people obviously use the “extra free time” to get rid of unused things. The number of new ads even tripled on some days compared to the previous year. We have reached an important milestone for us, 40 million live ads, much earlier than initially expected. We have also set new records for weekly app downloads and monthly visits.
Interviewer Why do you think you were successful in deflecting the negative impact from the pandemic when other marketplaces are getting hit quite hard?
Stefanie Pritzkow We have taken measures to adapt to the needs of users in the current situation, e.g. with new features which make it easier to find items to ship and the new category “Neighbourhood Help”. As market leader for classifieds in Germany, eBay Kleinanzeigen was naturally the first place to go for many users. Our platform has a high reach and people know that it is very likely to find a match, even in uncertain times.
Interviewer What’s the number one step you’d recommend taking in a similar situation in the future or to those struggling right now?
Stefanie Pritzkow In times of crisis, it is especially important for a brand to stay true to their core principles. For us, this means enabling our users to trade in a safe, convenient and sustainable way by providing a stable and secure platform. People have other things to worry about. That is why we wanted to make sure that they have the best possible experience on eBay Kleinanzeigen.
Interviewer Are there any actions you took due to the pandemic that you intend to carry forward even after the crisis is over?
Stefanie Pritzkow We have seen that sustainable action also – or even especially – played an important role during the crisis. A recent study by Accenture has shown that 45% of consumers make more sustainable choices when shopping and are likely to continue to do so after the crisis. We want to let people know that by trading on eBay Kleinanzeigen, users make an active contribution to more sustainability. We have recently launched a campaign that highlights this aspect in a special way. The new features already mentioned have met with a very positive response. These make trading on eBay Kleinanzeigen more efficient and easier – even after the crisis. In addition, we will evaluate in due course whether we keep our new category “Neighbourhood Help” and how it can create value in regular times.
Interviewer If you saw an increase in traffic due to the pandemic, do you think that increase will last even after the crisis is over?
Stefanie Pritzkow We assume that the strong growth, as we have seen in the past few weeks, will decrease somewhat sooner or later, but will remain at a high level. As mentioned, we have reached an important milestone for us, 40 million live ads, much earlier than initially expected. It was only in October 2019 that the number of ads available at the same time rose to 35 million. The year before, in October 2018, we reached the number of 30 million live ads. Use has increased continuously over the last years and we expect this development to go on.
Stefanie Pritzkow, Head of Customer Support, eBay Kleinanzeigen
As Head of Customer Support at eBay Kleinanzeigen, Stefanie Pritzkow is responsible for all customer service activities and also manages the continuous improvement process for end customers.
Before taking up this position, she was responsible for various projects in the area of customer service at eBay Kleinanzeigen. Before joining eBay Kleinanzeigen in September 2011, she worked for Deutsche Lufthansa AG. There she held several positions, first as a recruiter and finally as a customer team consultant for Lufthansa e-Commerce GmbH in Frankfurt am Main. She then moved to the Online Sales department and was responsible for customer service at lufthansa.com.
Stefanie Pritzkow studied business administration at the University of Applied Sciences in Berlin with a focus on human resources, sales and marketing. She lives in Berlin.
At the end of February 2020, the Norwegian government made it clear that Covid-19 might turn into a global crisis. At that time, FINN.no started taking active measures to protect their business and employees.
FINN.no initially saw a downturn in traffic as users were caught up in the uncertainty of the situation. However, the quick implementation of trust-building decisions, like bans on face masks and disinfectant products, quickly reversed the trend.
Other initiatives, like creating two new categories – one for people wanting to help others and one for those needing help due to the Corona pandemic – cemented the loyalty of FINN.no’s user base.
Creating these categories wasn’t completely without issues, though, as some users abused the pro bono initiative and tried to take advantage of it.
Watch the full interview with Geir Petter Gjefsen to learn more about how FINN.no grew their traffic during the Corona pandemic.
The restrictions put in place to combat the global Covid-19 pandemic have had a devastating effect on many businesses. Social distancing, restrictions on physical services, and a downturn in spending have also hurt most marketplaces and sharing economy sites, despite their digital nature.
After months of closed-down societies and harsh restrictions, nations are slowly and carefully opening up again, but the world is forever changed. Businesses that understand and adapt quickly to the new reality will be successful. To do so, they’ll need to understand the challenges and opportunities arising in the post-corona business landscape.
We’ve asked 8 online marketplace experts to share their thoughts and predictions to help you prepare and adapt to the new reality.
User safety is key for all online platforms, particularly when you’re dealing with vulnerable youngsters. Moderation can be challenging, and getting the balance between censorship and safety right is hard.
We sat down with industry veteran and founder of Friendbase, Deborah Lygonis, to discuss the experience she’s gained from developing and running a virtual world for teens.
Interviewer: Hi Deborah. Could you please give us a short introduction to yourself?
Deborah: My name is Deborah Lygonis and I am a serial entrepreneur. I have started and run several businesses over the years, mainly within the software and gaming sector, but also e-health and other tech. I love tech and I’m passionate about startups and entrepreneurship. I also work as a coach and mentor for entrepreneurs within the European Space Agency Business Incubator (the ESA BIC) and for a foundation called Entrepreneurs Without Borders.
Interviewer: Wow! That’s an impressive background. One of the things you’ve started as an entrepreneur is Friendbase, right? Could you tell us a bit more about that?
Deborah: Yes. Friendbase is a company that I founded with my brother and a third guy called Andreas. We’ve known each other for many years. Well, obviously, I’ve known my brother for many years, but Andreas as well, has been part of our group of friends and acquaintances for many, many years. We decided to found Friendbase in 2013. We saw that the whole idea of virtual worlds hadn’t really migrated over to smartphones and we wanted to see if it was possible to create a complete cross-platform version.
So, we put together a mockup of Android, iOS, and web versions and put it out there to see if it was something that today’s young people would like.
Friendbase is a virtual world for teens where they can chat, play games, and also design their looks and spaces. Now we’re also moving towards edtech, in that we’ll be introducing quizzes that are fun but also have learning elements in them.
Interviewer: That sounds awesome. What would you say is the main challenge when it comes to running a cross-platform online community, and specifically one that caters to teens?
Deborah: There are a lot of challenges with startups in general, but also, of course, with running an online community. One challenge is when you have people who meet each other in the form of avatars and written chat; they have different personalities and different backgrounds that can cause them to clash. The thing is that when you write in a chat, the nuances in the language don’t come through as they do when you have a conversation face to face. It’s really very hard to judge the small subtleties in language, and that can lead to misunderstandings.
Add to that as well that there are lots of different nationalities online. That in itself can lead to misunderstandings because they don’t speak the same language.
What starts off as a friendly conversation can actually rapidly deteriorate and end up in a conflict just because of these misunderstandings. That is a challenge, but that’s a general challenge, I think, with written social interactions.
Interviewer: Just so we understand how Friendbase works. Do you have one-to-one chats, one-to-many chats, or group chats? How does it work?
Deborah: The setup is that we can have up to 20 avatars in one space. No more, because then it will get too cluttered on the small phone screens. So, you can have group chats. I mean, you see the avatars and then they have a text bubble as they write so that it can be several people in one conversation.
Interviewer: Do you have the opportunity for groups of friends to form and join the same kind of space together?
Deborah: Yes. Each member has their own space. They can also invite and open up their space to other friends.
Interviewer: And in that regard, what you often see in the real world with teen dynamics is that there is a group of friends, and there are the popular people in that group. And then there’s one person who maybe is a little bit of an outsider, who will at times be bullied by the rest of the group. Do you see people ganging up on each other sometimes?
Deborah: I haven’t seen groups of people ganging up on one individual. It’s more the other way around. There are individuals that are out to cause havoc and who are just online to be toxic.
Interviewer: That means that, in general, you have a really nice and good user base, but then there are the rotten fruits that come in from time to time.
Deborah: That is what it is like today. We are still at a fairly early stage, though, when it comes to the number of users, so I would expect this to change over time. And this is something that we’re prepared for. We added safety tools at a really early stage to be able to learn how to handle issues like this, and also how to moderate the platform when incidents occur. So, I think that even though we don’t have that type of ganging up on each other at the moment, I would expect it to happen in the future.
Interviewer: But it sounds like you’re prepared for it. Now you’ve made a really nice segue into my next question: What are the main moderation challenges you’ve experienced running Friendbase? What are the main challenges right now, and what do you expect you will have to handle later on?
Deborah: I think that a challenge in itself for all social platforms is to set the bar on what is acceptable and not.
Our target group is mid-teens and up, so we don’t expect young children to be on Friendbase. We feel that if we made a social world for young children, then we’d need a completely different, more controlled set of regulations than for teenagers and upwards.
However, that demographic is also very vulnerable, so, of course, there have to be some measures in place. The challenge is to determine at what level you want to put the safety bar, and also how you can tell the difference between banter between friends and the point where it flips over into actual toxicity or bullying. That’s something that is really, really hard to distinguish. And I think that if you work with chat filters, then you have to have some sort of additional reporting system for when the filters don’t manage this challenge. The filter is only a filter and can’t determine between the two. So that’s one challenge. It’s also complex to enforce the rules that are in place to protect the users without being perceived as controlling or patronizing.
At the moment, we also have a challenge in that we have users that come back solely for the purpose to cause havoc and create a toxic environment. We track them down and we ban their accounts, but it’s a continuous process.
That is something that, should it escalate over time, will become increasingly time-consuming. That’s why it’s really, really important for us to have tools in place so that it doesn’t have to be moderated manually. That would just take too much time and too many resources.
Of course, you have the even darker side of the internet: sexual predators that are out to groom vulnerable youngsters and to get them to move over to a different platform where they can be exploited in an extremely negative way.
That’s something that is difficult to handle. But today, thanks to artificial intelligence and the amazing toolsets out there, there are attempts to look at speech patterns and try to identify that sort of behavior. And it’s also really great to have your own toolset where users can report someone if they feel threatened or if they feel that someone’s really creepy.
Interviewer: When you have returning users who have made it their goal to attack the platform, in a malicious way, do you see that it’s the same people returning based on their IP or the way that they talk?
Deborah: It’s not always possible to see it based on their IP because they use different ways of logging in. However, given their behavior, we can quickly identify them. And we have a group of ambassadors as well online on Friendbase that help us. On top of that we have a chat filter which can red flag certain behavior. So that helps as well.
There is a group that comes back over and over again, and for some mysterious reason they always use the same username, so they’re not that hard to identify. That group is actually easier to control than a group with a different motive for being online and targeting youngsters. The toxic ones that are just there because they think it’s fun to behave badly – it’s easy to find them and close down their accounts.
Interviewer: We already touched upon this, but what would you say is the hardest moderation challenge to solve for you right now?
Deborah: The hardest moderation challenge to solve is, of course, finding the people who are deliberately out to target lonely youngsters that hunger for social contact. The whole grooming issue online is a problem. We are constantly trying to find new toolsets and encourage our users to contact us if there’s something that doesn’t feel right. So grooming is something that we’re very, very much aware of. If we happen to shut down someone’s account by mistake for a couple of hours, they’re most welcome to come to us and ask why. But we’d rather be safe than sorry when it comes to this kind of behavior. However, it is hard to track because it can be so very, very subtle in the beginning.
Interviewer: Friendbase has been around for a while now. Are there any challenges that have changed or increased in occurrence over the years? And if so, how?
Deborah: Actually, not really. I think the difference is in our own behavior as we are so much more aware of how we can solve different problems.
Bullying has been around for years. So has the free Internet. Sexual harassment of youngsters, and between adults, has of course also been around for years. It’s nothing new. I mean, the Internet is a fantastic place to be. It democratizes learning. You have access to the world, to knowledge and entertainment.
But there is a dark side to it. From a bullying perspective you have the fact that previously, if you were bullied at school, you could go home or you could go to your social group somewhere else and you would have somewhere where you would feel safe.
When it’s online, it’s 24/7.
And it is relentless when it comes to the whole child abuse part. Of course, it existed before as well. But now, with the Internet, perpetrators can find groups that have the same desires as themselves, and somehow, together, they can convince themselves as a group that it’s more acceptable. Which is awful. So that is the bad part of the net.
So, when you ask: Have the challenges changed or increased since we started Friendbase? No, not really. But what has changed is the attitude toward how important it is to actually address these issues. When we started the company in 2013, we didn’t really talk that much about safety tools. I mean, we talked about whether we should have a whitelist or a blacklist of words. It was more on that level. But today, most social platforms have moderation, they have toolsets, they have guidelines and policies and so forth.
So, I think that we who work with online communities as a whole have evolved a lot over the past years.
Interviewer: Yeah, I would say today in 2020, you probably wouldn’t be able to launch a social community or platform without launching with some sort of moderation tools and well-defined guidelines.
Deborah: I think you’re right. Several years ago, I did a pitch where we were talking about online safety and moderation tools, and I was completely slaughtered. What we were told was that being good online, this whole “be cool to be kind” thing, was going to stop our growth. It would be much better to let it all run rampant, and then the platform would grow much faster. I don’t think anyone would say something like that today. So that’s a huge shift in mindset. Which is great. We welcome it.
Interviewer: That’s a fantastic story. You’ve been in this industry so long; you’ve seen this change. I find it fascinating that just seven years ago when you said I want to protect my users, people laughed at you. And now people would laugh at you if you said, I’m gonna go live without it.
Deborah: I know. Can you imagine going on stage today saying that I don’t care about safety? I mean, people would be so shocked.
Interviewer: You said before, when we talked about the main challenges, that if you experienced growth, you’d need to change your approach to moderation and automate more in order to keep up?
Deborah: Yes, definitely. We try and stay on top of what toolsets are out there.
We build in our own functionality, such as muting users. So, if someone is harassing you, you can mute them so that you no longer see what they’re writing. Small changes like that we can do ourselves, which is helpful.
Something I’d like to see more and that we’ve actually designed a research project around is to not only detect and ban bad behavior, but to encourage good behavior.
Because that in itself will also create a more positive environment.
That’s something that we’re really excited about: working with people who are experts in gamification and natural language processing to see how we can create toolsets that encourage good behavior, and see what we can do. Maybe we can start deflecting a conversation that is obviously on its way to going seriously wrong. It could be as simple as a small time delay when somebody writes something really toxic, with a pop-up saying: “Do you really want to say this?”. Just to make someone think once more.
This is something that we’re looking into. It’s super interesting. And just in the last few months I’ve heard of a couple of companies that are also talking about creating toolsets for something like this. So, I think it’s going to be a really, really interesting development over the coming years.
Interviewer: It sounds like safety is very important to Friendbase. Why is that?
Deborah: Why is that? Quite early on, we who work in the company discussed what our core values should be. And one of the core values we decided upon is inclusion. Everybody is welcome. And for everyone to feel welcome, you have to have a welcoming atmosphere.
When you continue along that line of thought, then obviously you come to the point where, OK, if everyone’s going to be welcome and you want it to be a friendly space, then somewhere you’re going to have to stop toxic behavior. So, for us, safety is just part of our core values.
And also, I have a teenage daughter who loves gaming. I’ve seen how platforms behave. She’s part of groups that interact with each other online. I just feel that there must be a way of doing things better. It’s as simple as that. We can do better than this, letting it be super toxic. And there are some amazing people out there working with fantastic toolsets. There are some fantastic platforms and social games out there that also work in the same sort of direction as we do. It’s really great.
And you know what? To be quite honest, I think there have been several case studies proving it from a business perspective as well: you get longer retention and higher profitability when you can keep your users online for a longer time. So, in itself, it also makes perfect business sense to work in a way that keeps your users as long as possible.
Interviewer: You have tons and tons of experience, obviously, with startups and social platforms. If you were to give a piece of advice to someone who is running a service similar to Friendbase, or who is even thinking about starting one, what would that be?
Deborah: It would be, first of all, to determine what level of safety you want to have, depending on your user group. Obviously, the younger your demographic, the more safety tools you must ensure you have in place. Also, don’t build everything yourself. Especially if you’re working in an international market with many languages. Just being able to filter many languages in a decent way is a huge undertaking. If you think you’re going to be able to hack together something yourself, it’s not that easy. It’s better to work with a tool or a company that has that as their core business, because they will constantly be working with state-of-the-art solutions.
So it’s better to liaise with switched-on companies that already have this as their main reason for being. I think that’s important. And then, of course, add your own systems that make it easy to report and easy to communicate with your users, so that you have sort of a double layer.
I mean, I’ve seen several different companies that now work with different moderation tools and chat filters and so forth. Many of them do stellar work. And it’s important, because at the end of the day, if anything really, really bad were to happen, then you’re just finished as a business. It’s as simple as that. The last thing you would want is to have someone knock on your door and shut you down because something happened on your platform.
Interviewer: Definitely! What’s in the future for Friendbase? Where are you in two years?
Deborah: Where are we now? We’re now raising funds, because what we’ve seen is that we have a very, very loyal member base, and they want to invite more of their friends. And I think that with very, very little work, we can get the platform on a really interesting growth path.
So, yeah, our aim is to become one of the big global players. It’s exciting times ahead.
Interviewer: For sure. Any closing remarks? Any statements you want to get out there from a personal point of view or from Friendbase?
Deborah: The Internet is a great place to be because there’s so much you can learn. You can meet so many interesting people. But there is a dark side as well, and you have to be aware of it. Just by being a little bit street smart online, people can keep themselves safe. And we’re getting there. People are learning. Schools have it in their curriculum, and social platforms try to teach users how to behave. So slowly but surely, we’re getting there.