Could you tell us a bit about yourself?
My name is Kevin Ducón, and I'm from Bogotá, Colombia. I hold an MSc in Computer Science from Universidad Politécnica de Madrid and a BSc in Computer Science from Universidad Distrital de Bogotá.
I have been working in information and communications technology for more than fifteen years and began working at Besedo five years ago, specializing in IT Service Management and Information Security. I started as a local ICT Administrator in our Colombian center, then worked as an ICT Supervisor, and currently I am the Global Head of ICT-IS.
Over the past five years, I have applied my knowledge and skills to this ever-changing industry by creating policies and processes aligned with the industry’s best practices, supporting our clients, and continuously improving our ongoing projects.
What are your responsibilities as Global Head of ICT-IS?
As the Global Head of ICT-IS at Besedo, I’m in charge of all levels of support in information technology and communications.
I oversee the Global ICT work, and together with my ICT team, I make sure that we fulfill our most important metrics – availability, service-level agreement, and customer satisfaction.
On top of that, I manage and provide insights into our security guidelines and develop strategic and operational plans for the ICT department to ensure that all necessary tools and processes are fully functional to achieve the company’s overarching goals and ambitions.
I also have hands-on technical responsibilities in supporting and developing mission-critical systems, which are running 24/7, to make sure our moderation services are successfully delivered to our customers worldwide.
From an ICT point of view, what are the key elements that must go right when running a content moderation operation?
The essential part from an ICT standpoint when running a content moderation operation is to truly understand the priorities and needs specific to the operation. Having an IT strategy to translate business needs into functioning IT operations is vital for a successful moderation setup.
Furthermore, ensuring good practices in network infrastructure and server setup, device management, and IT support is key to achieve a solid moderation operation. Finally, it’s crucial to have a knowledgeable and committed IT staff behind the scenes.
What are the common things that can go wrong?
When running a moderation operation, many potential issues can occur. Some of the most common hazards include the Internet connection, networks, or servers going down, power outages, and failed infrastructure deployments.
For instance, content moderation relies heavily on a stable Internet connection, and you cannot blindly trust that it will just work. Instead, you need to make sure that your Internet service always works to its full capacity.
What safety measures are needed to make sure the moderation operation runs smoothly?
It’s important to have proactive safety measures in place to guarantee that the moderation operation is always carried out correctly. A good first step is to plan the implementation of the moderation services thoroughly before putting disaster mitigation plans in place.
For example, at Besedo, we work with several Internet service providers in case one of those fails to deliver correctly. We also work with fault-tolerant networks, a resilient infrastructure, third-party support, etc., to ensure that our IT operations remain stable when potential risks materialize.
On top of this, we run daily IT checklists and use monitoring systems that allow us to prevent potential challenges during IT operations. We also have backup routines to avoid any information loss or damage, and use UPS units to keep our critical devices powered.
All in all, for anyone looking to run a successful moderation operation, many countermeasures must be put in place to make sure that IT operations run smoothly.
What’s the best thing about your job?
My job allows me to work in the different areas of the ICT function and with all the disciplines that contribute to the business. For some people, ICT is only about assisting with end-user tickets, because that’s what’s visible to them. However, IT is not just a commodity but a strategic ally for us to deliver the highest level of services to our customers.
I’m proud to apply my skill set and knowledge to Besedo’s purpose and values, which I genuinely believe in. When I took the role of Global Head of ICT-IS, I set out to implement our promise ‘grow with trust’ in everything we do in our team. This has shaped the ICT team’s goal: to help all functions grow with trust through efficient processes, guaranteed quality of services, and high customer satisfaction.
At Besedo, we have an excellent ICT team of committed and ambitious individuals who love what they do and work hard to improve the company every day.
Kevin Ducón is Besedo’s Global Head of ICT-IS. He has been working in information and communications technology for more than fifteen years. Over the past five years at Besedo, he has applied his knowledge and skills to the ever-changing content moderation industry.
Internet fraud. It’s pretty sophisticated. And for online marketplaces – or any other platform that relies on User Generated Content – it’s often well-hidden or undetectable.
Scammers are an increasingly resourceful bunch, so if there’s a system to be gamed, you can bet they’ll find a way to work it.
However, with the right insight, awareness, and detection processes in place, site owners can keep their users safe – and put a stop to scams before they endanger anyone.
Let’s have a look at some of the most common online scam types to be aware of on your online marketplace, how you stay on top of them, and ultimately how to prevent them.
Online Shopping Scams
One of the most common types of fraudsters plaguing digital marketplaces, online shopping scammers usually advertise high-ticket items for sale at low prices. Typically, these include mobile phones, video game consoles, laptops, jewelry, and cars – with commercial vehicles and heavy equipment on the rise.
They may be advertised along with a believable yet fabricated story – a claim that they’re selling ‘excess stock’ or that the goods have ‘minor damage’, for example.
The reason scammers do this is simply to give some degree of credibility to their request for partial payment upfront. Of course, they have no intention of selling any goods at all. They simply aim to dupe users.
As a marketplace owner, it’s important to advise your users that if something sounds too good to be true, it usually is. It is also vital to warn them against sending any form of payment before obtaining any goods. They should also be wary of paying by direct transfer, using prepaid cards, or any requests to pay for goods using cryptocurrencies.
Dating & Romance Scams
Dating scams are probably the best-known kinds of online fraud – a topic we’ve covered before in our blog.
While many of us have used flattering photos of ourselves in online dating profiles, there’s a big difference between presenting ourselves in our best light and creating a fake online identity.
While TV shows and high-profile cases of this practice – known as ‘catfishing’ – have raised awareness, it remains a common issue on a lot of dating sites.
Essentially, romance scams work when a scammer (posing as an attractive man or woman) reaches out to a user and builds a relationship with them exclusively online – sometimes over a period of months – before proceeding to ask them for money, or even for favors: activities that could well be criminal in nature.
Why does the scam work so well? Catfishers do whatever it takes to win their targets’ trust. And once that trust is established, the target is too emotionally invested to question the scammer’s motives.
While different official organizations – like the Online Dating Association – are doing more to raise awareness, dating sites themselves need to do more to highlight the dangers and behavior patterns fake users follow.
For example, there are many keywords and phrases catfishers use to make themselves sound more credible (as we outline here). They may claim to be religious or work in a trustworthy job – like the police or military.
A common struggle for many sites is that they’re not quick enough to remove scammers. Dating and romance scammers are quick to move the conversation away from the site to avoid detection – sites need to prevent that from happening from the outset. Learn how you can create filters to detect and block the sharing of personal details automatically.
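To give a flavor of how such a filter can work, here is a minimal sketch in Python. The regular expressions and function names are invented for illustration – real filters (such as those built in Implio) are more robust and use the tool’s own rule syntax – but the core idea is the same: flag messages that appear to share contact details before they are delivered.

```python
import re

# Illustrative patterns only -- production filters handle many more
# obfuscations (spelled-out digits, "at"/"dot" tricks, etc.).
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE_RE = re.compile(r"(?:\+?\d[\s().-]?){7,15}")

def contains_personal_details(message: str) -> bool:
    """Return True if the message appears to share an email or phone number."""
    return bool(EMAIL_RE.search(message) or PHONE_RE.search(message))

def moderate_message(message: str) -> str:
    # Hold suspicious messages for human review instead of delivering them.
    return "escalate" if contains_personal_details(message) else "deliver"
```

A message like “Email me at joe@example.com” would be escalated, while an ordinary “Is this still available?” passes straight through – keeping the conversation on the platform where it can be monitored.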
Fake Charity Scams
Many of us are wary of so-called ‘chuggers’ (charity + muggers) approaching us on the street asking for donations and we’d be right to – given the recent news that one scam in London was so well-orchestrated that even those collecting cash didn’t know it was a shady operation.
However, online – where donation platforms are becoming increasingly popular owing to their ease of use – how can those donating be sure their money ends up where it’s supposed to?
Transparency is key. The more information a site offers about the charities they’re working with; how much (if anything) they take as commission; and how long donations take to reach each charity, the more trustworthy they’re likely to be.
But what about online marketplaces and classified sites? Charity scams are just as likely here – particularly in the wake of high profile disasters.
As a result, site owners need to advise their users to exercise caution when those requesting funds…
- say they’re from a charity they’ve never heard of
- won’t/can’t give all the details about who they’re collecting for
- seem to be pushing users to donate quickly
- say they only want cash or wire transfers (credit card is much safer)
- claim donations are tax-deductible
- offer sweepstakes prizes for donations
When working with charities, online marketplaces and classified sites should ensure that rigorous security checks are in place. For example, as phishing is a common fake charity scam, it’s crucial that any relevant in-platform messages that provide a link to an external ‘charity site’ are detected early on.
Employment Scams
Online fraud and employment may sound like a fairly unlikely pairing, but in fact, this type of fraud is a lot more sophisticated than many might think.
There are numerous ways in which scammers abuse online marketplaces and classified sites, and most of the time they’re looking to either extract money or steal your identity (more on that below too).
One of the most frequent employment-related scams is a fake job posting looking for people to handle ‘payment processing’. The scammer may find CVs/resumes online, or they may post on credible boards – such as Craigslist.
The trick being played out here is one where the proceeds of crime are handled by the user (for a small commission) and transferred back to the fraudster – who is essentially using the ‘employee’ to launder money.
Another common job-related scam is one where ‘recruiters’ coax candidates into paying for additional job training or career development courses – or when an ‘employer’ asks candidates to cover the costs of a credit record check.
In all cases, employment-focused marketplace owners need to be acutely aware of anyone asking users to impart finance-related information or money.
However, these requests may not materialize until the conversation has been moved to email – away from the site – so it’s critical for those operating job boards to put some form of prevention and moderation effort in place.
Identity Theft
Recent news that a young journalist had a job application withdrawn by someone pretending to be him – via email – is alarming but not uncommon. However, impersonation takes on a whole new meaning when linked to identity theft.
While the most likely scenario in which identity theft occurs is an online data breach, internet shopping also puts users at risk. According to Experian, 43% of online shopping identity theft happens during the peak holiday shopping season (Black Friday onward).
Many scammers use familiar tricks – like phishing – to steal personal details, debit and credit card details, and social security numbers; using them to buy goods (often high priced items in bulk), to claim refunds from ‘faulty’ items, or to open accounts in other peoples’ names to mask other fraudulent activities.
Scammers can buy stolen identities on the dark web very cheaply. And it’s not uncommon for fraudsters to advertise usually high-priced items at low prices for quick sale on marketplaces… and then steal shoppers’ credit card details.
While general advice is routinely given to consumers – such as vigilance over website security, visiting preferred stores directly rather than clicking search engine links, and not to store card details online – online marketplaces need to prioritize monitoring and prevention too.
Preventing Scams on Online Marketplaces
With so many ways in which scammers can benefit, it’s clear that they’re not going to stop anytime soon.
This means that in an environment where trust is a limited commodity, the pressure increases on e-commerce sites, online marketplaces, and classified sites, to maintain it.
While official bodies, governments, and consumer rights groups – as well as Facebook (as reported in TechCrunch this week) and other tech champions with considerable clout – are informing and empowering users to recognize and take an active stance against suspicious activity, online marketplaces also have a responsibility to detect and eliminate fraud.
As marketplaces scale and begin to achieve a network effect, they need to adopt more stringent cybersecurity protocols to protect their users – multi-factor authentication, for example. Similarly, mapping user behavior can help site owners identify how genuine customers navigate their site – giving them intelligence they can use to benchmark suspicious activity.
Essentially, the better you know your users and the way they behave, and the more emphasis you put on transparency as a prerequisite for joining your community, the greater the deterrent. But as discussed, there are ways for scammers to mask their behavior.
Being a step ahead of scammers is important (as our trust and security expert explains in this article). Therefore, it’s essential to anticipate the different times of the year when certain scams manifest – as outlined in our Scam Spikes Awareness Calendar.
However, by far the most effective way to prevent fraudulent activity on online marketplaces is to have a solid content moderation setup in place. While this could be a team or person manually monitoring the behaviors most likely to be scams, as a marketplace grows, this process needs to function and be maintained at scale.
Enter machine learning AI – a moderation solution trained to detect scammers before fraudulent content is posted. Essentially, this works by ‘feeding’ the AI with data so it learns to recognize suspicious behavioral patterns, enabling it to identify a number of possible fraud threats simultaneously.
At Besedo, we fight fraud by giving marketplace owners the tools – not just the advice – they need to stop it before it is published.
All things considered, scammers are merely opportunists looking for an easy way to make money. The harder it becomes to do this on online marketplaces, the less inclined they’ll be to target them.
Keen to learn more about content moderation? Let’s talk.
What is a content moderator? Why not ask one? We sat down with Michele Panarosa, Online Content Moderator Level 1 at Besedo, to learn more about a content moderator’s daily work, how to become one, and much more.
Hi Michele! Thank you for taking the time to sit down with us. Could you tell us a bit about yourself?
My name is Michele Panarosa, I’m 27 years old and I come from Bari, Puglia, Italy. I’ve been an online content moderator for nine months now, formerly an IT technician with a passion for technology and videogames. In my spare time, I like to sing and listen to music. I’m a shy person at first, but then I turn into an entertainer because I like to have a happy environment around me. They call me “Diva” for a good reason!
What is a content moderator?
A content moderator is responsible for user-generated content submitted to an online platform. The content moderator’s job is to make sure that items are placed in the right category, are free from scams, don’t include any illegal items, and much more.
How did you become a content moderator?
I became an online content moderator by training with a specialist during the first weeks of work, but it’s a never-ending learning curve. At first, I was scared of accidentally accepting fraudulent content or not doing my job properly. My teammates, along with my manager and team leaders, were nice and helped me throughout the entire process. As I kept on learning, I started to understand fraud trends and patterns. It helped me spot fraudulent content with ease, and I could confidently escalate items to second-line moderation agents who made sure they got refused.
Communication is essential in this case. There are so many items I didn’t even know existed, which makes it an enriching experience. The world of content moderation is very dynamic, and there are so many interesting things to learn.
What’s great about working with content moderation?
The great part of content moderation is the mission behind it. The Internet can sometimes seem like a big and unsafe place where scammers are the rulers. I love this job because I get to make the world a better place by blocking content that’s not supposed to be online. It’s a blessing to be part of a mission where I can help others and feel good about what I do. Besides, it makes you feel important and adds that undercover aspect of a 007 agent.
How do you moderate content accurately and fast?
Speed and accuracy can go hand in hand, but you need to stay focused and keep your eyes on the important parts of a listing. A small piece of information in a listing can be very revealing and tell you what your next step should be. On top of that, it’s crucial to stay updated on the latest fraud trends so you don’t fall into any traps. Some listings and users may appear very innocent, but it’s important to take each listing seriously – it’s always better to slow down a bit before moving on to the next listing.
What’s the most common type of content you refuse?
The most common type of item I refuse must be weapons – any kind of weapon. Some users try to make them seem harmless, but in reality, they’re not. It’s important to look at the listing images, and if the weapon is not shown in the image, we’ll simply gather more information about the item. Usually, users who want to sell weapons try to hide it by not using images and keeping their descriptions very short (sometimes no description at all). It’s our task, as content moderators, to collect more details and refuse the item if it turns out to be a weapon – even if it’s a soft air gun or used for sports.
What are the most important personal qualities needed to become a good content moderator?
The most important personal qualities needed to become a good content moderator are patience, integrity, and curiosity.
Moderating content is not always easy and sometimes it can be challenging to maintain a high pace while not jeopardizing accuracy. When faced with factors that might slow you down, it’s necessary to stay patient and not get distracted.
It’s all about work ethic, staying true to who you are and what you do. Always remember why you are moderating content, and don’t lose track of the final objective.
As a content moderator, you’re guaranteed to stumble onto items you didn’t even know existed. It’s important to stay curious and research the items, to make sure they’re in the right category, or should be refused – if the item doesn’t meet the platform’s rules and guidelines.
Michele is an Online Content Moderator Level 1 and has worked in this role for nine months. Previously, he worked as an IT technician. Michele is passionate about technology and video games, and in his spare time, he enjoys both singing and listening to music.
What is content moderation?
Content moderation is when an online platform screens and monitors user-generated content, based on platform-specific rules and guidelines, to determine whether the content should be published on the platform or not.
In other words, when content is submitted by a user to a website, that piece of content goes through a screening process (the moderation process) to make sure that it upholds the regulations of the website and is not illegal, inappropriate, or harassing.
Content moderation as a practice is common across online platforms that rely heavily on user-generated content, such as social media platforms, online marketplaces, sharing economy services, dating sites, communities and forums, etc.
There are a number of different forms of content moderation: pre-moderation, post-moderation, reactive moderation, distributed moderation, and automated moderation. In this article, we’re taking a closer look at human moderation and automated moderation, but if you’re curious to learn more, here’s an article featuring the 5 moderation methods.
What is human moderation?
Human moderation, or manual moderation, is the practice in which humans manually monitor and screen user-generated content submitted to an online platform. The human moderator follows platform-specific rules and guidelines to protect online users by keeping unwanted content – illegal items, scams, inappropriate material, harassment, and the like – off the site.
What is automated moderation?
Automated moderation means that any user-generated content submitted to an online platform will be accepted, refused, or sent to human moderation automatically – based on the platform’s specific rules and guidelines. Automated moderation is the ideal solution for online platforms that want to make sure that quality user-generated content goes live instantly and that users are safe when interacting on their site.
According to a study done by Microsoft, humans only stay attentive for 8 seconds on average. Therefore, online platforms cannot afford slow time-to-site for user-generated content, or they risk losing their users. At the same time, users who encounter poor-quality content, spam, scams, inappropriate content, etc., are likely to leave the site instantly. So, where does that leave us? In order not to jeopardize quality or time-to-site, online platforms need to consider automated moderation.
When talking about automated moderation, we often refer to machine learning AI (AI moderation) and automated filters. But what are they really?
What is AI moderation?
AI moderation, or tailored AI moderation, uses machine learning models built from platform-specific data to efficiently and accurately catch unwanted user-generated content. An AI moderation solution makes highly accurate automated moderation decisions – refusing, approving, or escalating content automatically.
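In practice, refuse/approve/escalate decisions are often driven by the model’s confidence. The sketch below illustrates that routing logic in plain Python; the threshold values and function name are hypothetical examples, not any specific vendor’s actual configuration.

```python
def route_listing(scam_probability: float,
                  refuse_above: float = 0.9,
                  approve_below: float = 0.1) -> str:
    """Map a model's estimated scam probability to a moderation decision.

    High-confidence predictions are automated in both directions;
    uncertain cases in the middle are escalated to human moderators.
    """
    if scam_probability >= refuse_above:
        return "refuse"
    if scam_probability <= approve_below:
        return "approve"
    return "escalate"
```

Tightening the thresholds automates more decisions but sends fewer borderline cases to humans – which is exactly the trade-off behind automation-rate versus accuracy figures.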
One example that showcases the power of AI moderation is the Swiss online marketplace Anibis, which successfully automated 94% of its moderation whilst achieving 99.8% accuracy.
It should also be mentioned that AI moderation can be built on generic data. These models can be very effective but are in most cases not as accurate as a tailored AI solution.
What is automated filter moderation?
Automated filter moderation is a set of rules that automatically highlight and catch unwanted content. The filters (or rules) are efficient at finding content that can’t be misinterpreted, such as obvious scams. This makes them a solid complementary automation tool for your moderation setup. Automated filters can easily be created, edited, and deleted in our all-in-one content moderation tool, Implio – learn how to create filters here.
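Conceptually, a filter is just a pattern tied to an action. The plain-Python sketch below shows the idea; the rule names, patterns, and actions are invented for illustration and do not reflect Implio’s actual filter syntax.

```python
import re
from dataclasses import dataclass

@dataclass
class Filter:
    name: str
    pattern: re.Pattern  # what the rule looks for
    action: str          # "refuse" or "escalate"

FILTERS = [
    # Unambiguous scam phrasing -> refuse outright.
    Filter("wire-transfer scam",
           re.compile(r"western union|wire transfer", re.I), "refuse"),
    # Risky but ambiguous -> send to a human moderator.
    Filter("off-site payment",
           re.compile(r"pay.*outside the site", re.I), "escalate"),
]

def apply_filters(text: str) -> str:
    """Return the action of the first matching filter, or approve."""
    for f in FILTERS:
        if f.pattern.search(text):
            return f.action
    return "approve"
```

Because each rule is explicit, filters are easy to audit and tune, which is why they pair well with statistical AI models: the filters catch the unmistakable cases, and the model handles the gray zone.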
Do’s and don’ts of content moderation
What to do and not to do in content moderation may vary from site to site. There are many elements and factors that need consideration to get the moderation setup best suited to your specific needs.
However, regardless of whether you’re running an online marketplace, social media platform, or sharing economy site, some do’s and don’ts hold true for content moderation.
Do’s of content moderation
Do: Select the moderation method that’s right for your needs
Start off by looking at what kind of content your site hosts and who your users are. This will help you create a clear picture of what’s required from your moderation method and setup. For example, the type of user-generated content found on Medium versus Facebook is very different, and so is their users’ behavior. This makes their moderation methods and setups look different in order to fit each platform’s specific needs.
Do: Create clear rules and guidelines
Your content moderation rules and guidelines need to be clear to everyone who is directly involved with your online platform’s content moderation – from the data scientist developing your AI moderation to the human moderator reviewing content, regardless of whether they sit in-house or with outsourced partners. Uncertainty in your rulebook can set your moderation efforts back, both from a financial and from a user experience perspective.
Do: Moderate all types of content
Regardless of whether you’re running an online marketplace, dating site, or social media platform, your users are key contributors to your platform. Making sure they enjoy pleasant experiences and are met with quality content on your site should be in your interest. To achieve this, you need to make sure your content moderation is done right.
In a perfect world, moderating all types of content on your site – from text and images to videos and 1-to-1 messages – would be ideal. The reality, though, is that this approach isn’t possible for all online platforms, for financial and technical reasons. If that’s your case, at a minimum identify your high-risk categories and content, and start your moderation efforts there.
Don’ts of content moderation
Don’t: misinterpret what good content is
Quality content is key to building user trust and achieving a splendid user experience on your online platform, but it’s important to understand what good content is. Don’t make the mistake of misinterpreting good content and end up rejecting user-generated content simply because it’s negative in nature.
For example, a negative comment or review following a transaction can still be good content, as long as no harsh language is used of course. Genuine content is what you want, as it enhances quality and user trust.
Don’t: wait too long before you get started with moderation
If you’re in the early stages of establishing your online platform, getting started with content moderation might feel like it’s miles away. It’s not.
Don’t get us wrong, perhaps it shouldn’t be your main priority right out of the gate, but you need to have a plan for how to handle user-generated content, from a moderation perspective, when you scale. As you’re growing, and the network effect kicks in, you often see a rapid increase of content flooding into your site. You need to be prepared to handle that; if not, your big break might actually end up hurting you in the long run.
Don’t: waste resources
Don’t reinvent the wheel. With multiple content moderation tools and solutions, like Implio, available on the market, it’s important that you prioritize your resources carefully. Innovation and growth are what will boost your online platform to success, and this is where your dev resources will give you the most competitive advantage. Find a way to free up your resources for innovation without risking falling behind on your moderation efforts.
This is a guest post by Martin Boss, Founder of MultiMerch. MultiMerch helps online marketplace owners build and grow successful businesses.
Starting a regular, run-of-the-mill online store is a piece of cake nowadays. Get a Shopify account and upload a few products, sell and dispatch them – and you’re done.
Two-sided marketplace platforms with vendors are a different beast. Not only do you need more complex software to run a marketplace, but the operations are also more difficult when you’re dealing with two sides at once.
Here are some of the marketplace operations you’ll have to master to start a successful online marketplace business.
Setting up and maintaining your marketplace platform
First and foremost, a regular e-commerce platform like Shopify won’t cut it if you want to run a marketplace.
So, you will be looking for multi-vendor software (or two-sided marketplace solutions) to power your new marketplace.
There are three main types of marketplace software:
- SaaS or cloud-based platforms
- self-hosted marketplace software
- multi-vendor e-commerce extensions
Cloud marketplace solutions like Sharetribe, Arcadier, and Marketplacer are the easiest to get started with and require little to no technical knowledge. In most cases, they also offer free trials so you can spend a while playing with the platform to make sure the solution fits your requirements.
Self-hosted marketplace software like MultiMerch or CS-Cart gives you much more customization and modification possibilities since you have complete access to the source code and own it outright. However, you will need at least some technical skill to build a marketplace platform using self-hosted software.
Finally, you can take a regular CMS like WordPress, Magento or OpenCart and slap a multi-vendor extension on top of it. In this case, you absolutely want to have a developer on your team who has experience working both with the underlying system and with two-sided marketplaces. We never recommend this approach to new marketplace owners with no technical team.
Depending on the software type you go with, setting up a new marketplace platform will take you anywhere from a few hours to a few months (I’ve seen years, but I don’t recommend it).
Finding, onboarding, and managing vendors
It’s not a multi-vendor marketplace if you don’t have any vendors. So, you will need to find vendors and convince them to join your platform.
We recommend doing market research and lining up at least a few vendors early on – even before you start building the platform. Not only will this allow you to validate your idea and generate a little awareness, but it will create a pool of early adopters who are willing to test your platform and provide feedback.
Here are a few ideas for finding your initial vendors:
- join a bunch of Facebook groups for store owners
- reach out to Etsy/Amazon sellers directly or via communities
- create a few paid Google/Facebook/Instagram ads
- go door-to-door the old school way if you’re starting a local marketplace
This article by Shopify describes the process of finding manufacturers and wholesale suppliers. You’ll find it most helpful for a B2B marketplace platform, but the process will be similar if you’re building a B2C marketplace.
Now you will need to convince your vendors to join your platform, especially when you’re just starting out. What’s in it for them to join a marketplace that has barely any users? Here, you will need a landing page or a slide deck showcasing your value proposition and why it makes sense for a vendor to join you. Consider offering an incentive or two to your early adopters, such as lower (or waived) selling fees or a free onboarding service.
To reduce friction for new vendors, make sure your sign-up flow is as simple and straightforward as possible. Simplify your vendor registration forms – don’t ask for too many details right off the bat if you can collect them later, after the vendor signs up. Describe the process of signing up and be available to support new vendors.
Here’s how Etsy makes the onboarding process for new vendors a breeze:
First, they really go out of their way to answer all the questions a new vendor may have on their “Sell on Etsy” landing page. This includes a few sections covering the selling features Etsy offers, a clear overview of the fee structure, a FAQ section, and even a few highlights of existing sellers.
To sign up as an Etsy seller, you first need to create an Etsy account (if you don’t have one), then set up your shop. Creating an account is simple enough – you’ll only need your first name, email address and your password:
Then, you’ll go through 5 simple steps of setting up your store – from specifying your regional settings and naming your store to uploading your first items and filling out the payment details.
Note that Etsy will require you to upload at least one item before you can complete your setup process. While you’ll want to avoid this extra step to reduce friction for your new vendors when you’re just starting out, this helps Etsy weed out dummy signups and only accept sellers who are serious about opening their stores.
Creating and managing the product catalog
Now, it’s time for your vendors to start listing their products on your marketplace.
There are two common ways your marketplace can tie products to individual vendors:
- Etsy-style, where every product is unique and belongs to one vendor only
- Amazon-style, where a single product can be sold by multiple vendors at different prices
Both have their benefits and drawbacks, but you’ll find that most ready-made marketplace solutions out there will offer Etsy-style listings due to a somewhat simpler system architecture.
In any case, you’ll want to keep the process of listing new products simple – your vendors are busy people. This means simplifying your product listing forms, asking for the crucial information only and tailoring them to product categories you plan to carry.
Consider offering your vendors a way to upload products in bulk or import their catalog from a third-party marketplace platform. If they already sell on Amazon (or elsewhere), they’ll appreciate a quick way of syncing their existing catalog with your marketplace instead of having to list all products manually. You could even do it for them as an onboarding service.
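For illustration, a bulk importer can start out as little more than CSV parsing plus validation. This is a minimal sketch under assumed column names (`title`, `price`, `quantity`) – every source marketplace exports a different format, so a real importer would need per-source mappings:

```python
import csv
import io

REQUIRED = ("title", "price", "quantity")  # assumed column names

def import_catalog(csv_text):
    """Parse a vendor's CSV export into product dicts, collecting row errors."""
    products, errors = [], []
    # start=2: row 1 of the file is the header
    for line_no, row in enumerate(csv.DictReader(io.StringIO(csv_text)), start=2):
        missing = [f for f in REQUIRED if not (row.get(f) or "").strip()]
        if missing:
            errors.append((line_no, "missing " + ", ".join(missing)))
            continue
        products.append({
            "title": row["title"].strip(),
            "price": float(row["price"]),
            "quantity": int(row["quantity"]),
        })
    return products, errors

sample = "title,price,quantity\nHandmade mug,9.99,25\n,4.50,10\n"
products, errors = import_catalog(sample)
# one valid product; the second data row is rejected for its missing title
```

Reporting rejected rows back to the vendor, rather than failing silently, is what keeps bulk upload from turning into a support burden.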
Decide whether or not you want to moderate products published by your vendors to keep your product catalog clean (the answer is probably yes, especially for a larger marketplace). If you do, make sure you have a moderation system in place to prevent delays. The last thing you want is for your vendors to wait a week to get their product published because you’re moderating thousands of products by hand.
Processing payments and managing orders
As your marketplace starts getting traction, you will be processing payments and helping your vendors deal with customer orders.
There are two common ways your marketplace can process customer order payments:
- aggregated, where your platform collects buyers’ payments and later redistributes them to vendors via payouts
- split or parallel, where the payment is instantly split at checkout and distributed between the vendors (and your platform, if you charge a selling fee) by the payment processor
Aggregated payments make a single checkout possible and allow you to own the transaction, but they place the extra burden of tracking your sellers’ finances and making regular payouts on your platform. And with the payment industry becoming increasingly regulated, payment aggregation as a marketplace payment flow will only get more complex.
Split payment processing eliminates these issues and shifts the liability and the compliance burden from your business to the payment processing company. The biggest remaining problems of parallel payment processing are implementation complexity and the lack of marketplace processor availability.
However, the trend is changing. At MultiMerch, we’ve recently discovered at least 26 different marketplace payment solutions that cater to two-sided platforms.
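Whichever model you choose, the arithmetic of a split itself is simple: the platform’s selling fee is carved out of the order total and the remainder goes to the vendor. A rough sketch (the function name and the 10% fee are illustrative assumptions; a real processor also handles its own transaction fees, currencies, and refunds):

```python
from decimal import Decimal, ROUND_HALF_UP

def split_payment(order_total, platform_fee_pct):
    """Split a buyer's payment between the platform and the vendor at checkout."""
    total = Decimal(str(order_total))
    # round the fee to cents so fee + payout always equals the charged total
    fee = (total * Decimal(str(platform_fee_pct)) / 100).quantize(
        Decimal("0.01"), rounding=ROUND_HALF_UP)
    return {"platform_fee": fee, "vendor_payout": total - fee}

print(split_payment("40.00", 10))
# {'platform_fee': Decimal('4.00'), 'vendor_payout': Decimal('36.00')}
```

Using `Decimal` instead of floats avoids the cent-level rounding drift that would otherwise show up in vendor payout reports.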
Now, while you won’t be processing orders yourself, your vendors will need a way to receive, process and dispatch products to their customers. Since the shipping cost and speed directly affect conversion rates, the sooner your vendor gets notified about the new order, the sooner they can dispatch it and keep the customer happy and willing to purchase again.
To keep the customer informed about the status of their order, offer your vendors a way to specify the tracking number after they’ve dispatched the order. Consider connecting your order management system to a third party shipping service like ShipStation or AfterShip to let the buyer track the order without leaving your marketplace and get notified about the status changes automatically.
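As a sketch of the order side of this, attaching a tracking number and notifying the buyer can be a single small step in the order workflow. All names here (the field names, the `notify` callback) are hypothetical, not any particular carrier or shipping API:

```python
def add_tracking(order, carrier, tracking_number, notify):
    """Mark an order dispatched, store its tracking info, and notify the buyer."""
    order["status"] = "dispatched"
    order["tracking"] = {"carrier": carrier, "number": tracking_number}
    notify(order["buyer_email"],
           f"Order #{order['id']} shipped via {carrier}, tracking {tracking_number}")
    return order

sent = []  # capture notifications instead of sending real emails
order = {"id": 1042, "buyer_email": "buyer@example.com", "status": "paid"}
add_tracking(order, "DHL", "JD014600003RU", lambda to, msg: sent.append((to, msg)))
```

A service like AfterShip would then consume the carrier/number pair and push status updates to the buyer automatically.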
Handling customer and vendor relations
Now, simply attracting new vendors and buyers and hoping it will all work out by itself won’t do in most cases – you will need to actively keep your users happy, loyal and using your platform on a regular basis. As your platform grows, you will be supporting your users, answering their questions and helping them solve various problems on a regular basis. Do have a plan for that in place.
These are just a few of the challenges you’ll be dealing with when running your platform:
- helping vendors sign up, customize their stores and list products
- resolving technical issues and software glitches that users report
- mediating order disputes between the vendor and the buyer
- enforcing the rules and moderating user-generated content
To reduce your own involvement, you’ll want to allow the buyer to communicate with the vendor directly and ask questions about products and orders, both before and after the sale. The more options to reach out you offer, the more likely the issue is to be resolved without your intervention.
Delays, questions, and disputes are inevitable, so you’ll need a plan (and a system) to handle them. As you grow, you may consider starting a user community and creating your own protection programs for buyers and sellers, like eBay’s Money Back Guarantee or Airbnb’s Host Protection Insurance.
Growing your marketplace
“If you build it, they will come” sounds great in theory, but will probably not happen in practice. You’ll get some traction due to the nature of two-sided platforms but will still need to actively promote your platform, especially in the early stages.
The unique challenge online marketplace owners face is growing both the supply and the demand side at once – getting vendors while you have zero customers and finding buyers with only a small product catalog.
Earlier, I outlined a few ways of getting those first few vendors to join your platform. When it comes to customer acquisition and growth, paid, social, and organic search will be your main channels – alongside other business development initiatives.
Set up a presence on Instagram and Facebook. While organic social media reach is declining, it’s still an important channel for an e-commerce brand. Also have a paid promotion strategy for your marketplace in place to attract new users and create brand awareness – the more content your users create every day, the more value you’ll get out of paid campaigns. Set aside an initial paid advertising budget and test a few different channels to see what works best. Instagram, for example, is a great channel for highly visual ads.
In the long run, search engine optimization will be crucial for your marketplace since over 50% of trackable traffic on average still comes from organic search. To minimize the amount of SEO work you’ll need to do, make sure your marketplace platform is built with search engines in mind. This means search engine friendly, optimized and shareable product listings, seller stores and the rest of your marketplace. Keep in mind that most sellers won’t bother optimizing every listing (if they’ve heard about SEO at all), so make it a no-brainer for them and automate as much as possible.
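One way to automate listing SEO is to generate schema.org structured data for every product page, so sellers never have to think about it. A minimal sketch assuming a simple product record (the function and field names are illustrative; schema.org’s Product type defines the full vocabulary):

```python
import json

def product_jsonld(name, description, price, currency, seller, url):
    """Build a schema.org Product snippet to embed as
    <script type="application/ld+json"> on the listing page."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "description": description,
        "url": url,
        "offers": {
            "@type": "Offer",
            "price": str(price),
            "priceCurrency": currency,
            "seller": {"@type": "Organization", "name": seller},
        },
    }, indent=2)

snippet = product_jsonld("Handmade mug", "Stoneware, 350 ml", "19.99",
                         "USD", "Mia's Ceramics", "https://example.com/p/1042")
```

Because the snippet is generated from data the seller already entered, every listing gets machine-readable markup for rich search results with zero extra effort from the vendor.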
And SEO isn’t only about products – there’s way more user-generated content on a typical marketplace platform:
- product and vendor reviews
- public questions and answers
- vendor stories, blogs, and other things
This is great for you, the marketplace owner. Encourage your users to create great content on your platform and feature it prominently for better SEO and social proof.
And that is how you start and manage an online marketplace platform in a nutshell. If it sounds difficult, it’s because it is. One thing I learned here at MultiMerch is that marketplace businesses are more challenging than regular online stores, but also so much more rewarding.
For a more in-depth guide on starting a two-sided marketplace, check out my Beginner’s Guide to Starting an Online Marketplace Business.
No online marketplace founder or entrepreneur sets out to fail. The world loves a romantic success story, where a disruptive idea changes how we look at an entire industry. Two examples that immediately come to mind are Airbnb and Uber.
Yet, 90% of startups fail, and that is something we don’t talk about enough.
Failure in itself may not be glorious, but it’s an important ingredient of success. From failure come learnings, and hearing about the mistakes of other marketplaces can be very useful to founders looking to avoid the same pitfalls.
Learnings from a failed online marketplace
Anton Koval is the founder of Brainjobs.pl – a failed online marketplace. Today he’s moved on and is helping founders and companies build and grow their own online marketplaces, through his agency Braincode.
We caught up with him to hear the story of his failed online marketplace, what went wrong, and the lessons he learned from the experience.
In the first part of the interview, Anton shares the business idea, USP, and operational setup.
In part two, Anton shares what went wrong, the actions they took to turn it around, and the main lessons learned he took with him from the experience.
Anton Koval is the founder of Braincode, an agency that works with founders and companies to help them build their own online marketplaces. Previously, Anton bootstrapped his own marketplace in the HR tech space. Anton is a big advocate of the platform economy and remote work.
Trust is a key component of a successful marketplace and there are many small parts that help achieve it. One element that plays a major role in trust-building is of course how you present your platform to your users and the experience they have while using it. But how can you use UX design to build trust in your marketplace?
UX design is often described as the process of enhancing user satisfaction by improving the usability, efficiency, and accessibility of a website.
This definition holds true when designing for online marketplaces too. A marketplace’s UX design should be viewed as, and function as, the spine of the platform. Its task is to efficiently guide users through the site to the desired end destination (oftentimes transaction completion).
What’s different for online marketplaces is that most of them rely heavily on user-generated content. This dependency limits the level of control you have over a vast majority of the user experience. Since you are not the one choosing the images and writing the text, it’s harder to ensure that it aligns with your brand, tone of voice, and messaging. A marketplace’s role is to help strangers find and transact with each other. Without the important physical cues we’d normally use to establish trust, and with the added challenge of limited content control, it can be a struggle to achieve trust levels high enough for strangers to engage.
That’s why it’s vital for online marketplaces to include trust-building elements in their UX design. It’s also imperative that this is combined with a highly selective content curation and reviewing strategy since low quality and irrelevant content can quickly destroy any trust gained from trust-inducing UX design.
Keep in mind that trust building isn’t a one-off effort. In order to achieve a truly trustworthy marketplace, your trust-building elements need to become an integral part of your marketplace’s UX design, from pre-acquisition and throughout the entire user journey. On top of that, you need to continuously deliver on the trust promise you make with your UX design. This means following through and actually keeping your users safe, for instance by offering great and timely customer support, curating and reviewing content diligently, and providing secure payment channels.
How do I build trust through UX design?
Make sure to design and develop the user journey for trust. Whether it’s keeping your top listings on the home page, ensuring quality suppliers, presenting honest reviews, or offering easy support, UX elements like these will help build trust in both your platform and its users.
Want more detailed info on how you can build trust into your UX design? We invited Bec Faye, Marketplace Optimization & Growth Specialist, for a webinar to share her knowledge and expertise. Watch the full webinar recording here.
We are seeing an increase in legislation aimed at the digital world across the globe. What does that mean for online marketplaces, are there any trends we can see already now and what can we expect from the future? We’ve taken a deep dive into the legal pool to see if we can make sense of it all.
The line between digital and offline society gradually gets blurrier as human interaction increasingly happens on, and jumps between, online platforms and digital spaces.
Unsurprisingly, this merging of the tech-driven and the traditional doesn’t always happen smoothly. Governments have been particularly slow at catching up to the new world order, leaving the digital society to its own devices when it comes to upholding law and order.
Recent events – like election meddling, the increased suicide rate attributed to cyberbullying, and clashes between online and offline workforces – have, however, kickstarted government involvement across the globe, and we are starting to see an increased interest in, and legislation aimed at, taming the digital world.
For those of us who operate in the space and must navigate the legislative jungle, it can be a challenge as politicians scramble to catch up and implement regulations.
With so much going on, it can also be hard for site owners to keep track of these different developments. But doing so is critical to stay compliant and it’s especially important for digital businesses looking to scale or expand into new markets.
It’s increasingly clear that there’s going to be a conflict of interest as user privacy, business goals, and government interests clash. Because of the complexity of the digital landscape, and because many politicians don’t really understand the inner workings of the Internet and the businesses that operate through it, many laws come out vague, impossible to fulfill, or drawn up without a true understanding of the full impact they will have. This means that many of the recent legislative initiatives are hard to interpret and often highly controversial. Operating in this environment while making sure your business adheres to all relevant laws can be a legal minefield.
It also raises the question of just how effective the different regulations are. Do they really tackle the problems they mean to solve? Are they too fixated on holding online marketplaces and other digital players accountable for harmful user-generated content (UGC)? To what extent do they curb users’ rights rather than empower them? Can there ever be a ‘one size fits all’ solution that works both at a global and a local level?
Let’s take a closer look at some regulatory developments from around the world, consider the most prominent global trends in online safety legislation, and speculate what’s coming next.
Online regulations around the world
The following are synopses of some of the most interesting safety-related stories from the last year or so. They all impact online marketplaces and classifieds sites in different ways, evidencing the complexities associated with featuring and curating UGC.
India: Banning sales of exotic animals
Online marketplaces in India have cracked down on attempts by users to disguise the illegal sale of rare and exotic animals (and their parts).
This comes after sites such as Amazon India, eBay, OLX, and Snapdeal were revealed to be among over 100 marketplaces where such items can be bought (an issue we covered in a blog post a while back).
Many items are listed under code names – such as ‘Australian Teddy Bear’ for koalas and the Hindi term for ‘Striped Sheet’ in place of tiger skin – but bigger sites are now actively working with government and wildlife protection officials to weed out offending posts.
EU: Take down terror content sooner or face fines
In April this year, the European Parliament voted in favor of a law that would give online businesses one hour (from being contacted by law enforcement authorities) to remove terrorist-related content, which becomes more dangerous the longer it stays live online.
Failure to comply with the proposed ruling could see businesses fined up to 4% of their global revenue. For smaller sites, however, a 12-hour grace period could be put in place.
US: Safeguarding Children’s Data From Commercial Availability
In America, online shopping giant, Amazon, recently attracted scrutiny over the launch of its brightly-colored kids’ Echo Dot Alexa device – and the use and storage of children’s data.
Despite the company’s assertion that its services comply with child protection legislation, privacy advocates and children’s rights groups are now urging the US Federal Trade Commission to investigate.
Canada: Illegal Online Sales Of Legal Marijuana Sparks Cybersecurity Worries
America’s northern neighbor made medical and recreational cannabis completely legal last year. Since then, the Canadian government has taken significant steps to regulate the sale and distribution of marijuana – restricting it to licensed on- and offline dispensaries.
However, unlicensed black market Mail Order Marijuana services (MOMs) still dominate online sales – given their ability to undercut regulated sales on price, as well as their broader product variety and availability.
While many lawmakers are content to dismiss this gray area as ‘teething issues’, law enforcement agencies are taking it more seriously, citing cybersecurity concerns: in many cases, buyers are essentially financing, and handing their data to, organized crime syndicates.
Britain: An online safety paradise?
In the UK, there have been several interesting developments in the online safety space. Firstly, in a bid to prevent youngsters from accessing sexual content online, Britain is banning access to online pornography for those who can’t legitimately verify that they’re of adult age.
In addition, a government whitepaper issued in April aims to make Britain the safest place to be online and calls for an independent regulator to ‘set clear safety standards, backed up by reporting requirements and effective enforcement powers’.
The paper, titled ‘Online Harms’ sets out plans to take tech companies beyond self-regulation to develop ‘a new system of accountability’. This would see a number of key developments take shape, including social media transparency reports; greater scrutiny checks to prevent fake news from spreading, and a new framework to help companies incorporate online safety features into apps and other online platforms.
Which trends will impact online marketplaces & classifieds sites the most?
It’s clear that there’s a lot of hype around online safety. But reading between the lines, it’s crucial to keep in mind the issues that are most likely to have a bearing on UGC-focused companies operating online.
Safety first, but liability still a grey area
Safeguarding users seems to be a prominent issue in all of this. However, there’s also an overwhelming need to protect the innocent victims featured in malicious and harmful user-generated content – as is the case with sex trafficking, revenge porn, and even exotic animals being sold.
However, there’s a strong argument that unless there’s clear evidence of a crime, the true perpetrator cannot be punished. A piece of UGC provides proof that could hold criminals accountable.
But should facilitation and curation of harmful content be punishable? As we discussed in our recent video interview with Eric Goldman, law professor at Santa Clara University School of Law and co-founder of four ‘Content Moderation at Scale’ conferences, there’s a marked difference between how moderation, liability, and activity are treated, which has a number of bearings on how companies operating online should behave.
For example, in the US, the Communications Decency Act (aka Section 230) shields users and site owners from liability for content posted by others. The clause here is that the site itself remains free to remove ‘obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable’ content.
In other countries, as in the UK and EU, where governments are setting their own frameworks, online marketplaces face prosecution in the event of a breach. The danger here is that companies instead focus on compliance rather than the needs of their customers and communities.
Limits on personal freedoms drive dangerous workarounds
Although the safety and liability message is being heard loud and clear, the need to balance personal freedoms with the eradication of harmful content is a key concern. While the intent is protection, the notion of ‘enforcement’ remains at odds with individual freedom.
Britain’s online porn ban, for example, could arguably push youngsters into more nefarious ways of circumventing the restrictions: the easiest way for users to bypass online blocks is to use TOR browsers and virtual private networks.
As demonstrated in Canada, forcing users to explore darker, more unregulated areas of the web potentially makes users more vulnerable to attack by cybercriminals.
International enforcement remains problematic
Perhaps one of the biggest trends we can see which is of particular concern to online marketplaces is being able to monitor, regulate, and abide by laws across different areas.
Laws pertaining to the sale of weapons, drugs, and other restricted items differ between countries, regions, and states. In addition, age restrictions can vary too.
For example, in Canada, edible marijuana products aren’t yet legal – and therefore cannot be sold online. However, in the US states where recreational cannabis is legal, so too are ‘edibles’.
While it’s not hard to imagine that an online age/ISP/location verification (or a simple ‘Where We Deliver’ policy) would solve such issues, the fact remains that these factors have major ramifications for sites that operate internationally.
And given that there’s rife speculation that Amazon could soon sell cannabis, it’s only a matter of time before these issues take center stage – which can ultimately only be positive for governments and marketplaces alike.
One size doesn’t fit all
Scale is also an important factor to keep in mind. Laws and regulations that are designed to curb the huge amount of data that larger marketplaces curate can’t be deployed in the same way by smaller outfits. And vice versa.
Governments are holding online businesses to task for failing to police their sites appropriately. And while they’re maybe right to do so, it can be tough for marketplaces of all sizes to employ enough resources to professionally cover content moderation needs.
Ultimately, we’re still in the ‘Wild West era’ of online regulation. What’s acceptable is very much culture-led; which is why we continue to see such diverse developments at a global and local level.
For example, in Thailand, where the King is held in utmost regard, any content pertaining to him must be strictly moderated and often removed – unthinkable even in another ‘royal’ nation like the UK. General common sense can’t prevail in such a disparate regulatory environment where user attitudes are so polarized.
In addition, the involvement of governments in setting a best practice framework all too often means that those championing issues like censorship, privacy, and accessibility online aren’t the experts in these matters.
What we’d hope is that moving forward, governments continue to work proactively alongside large and small industry players to understand the true nature of the challenges they face, and foster better relationships with them, in order to create an effective, lasting, best practice solution that benefits users but is also realistically achievable by the online businesses.
We saw this recently at a European Parliament-run content moderation conference, where leading lights from some of the world’s best-known technology companies gathered to share their ideas and challenges with politicians.
However, variety (as they say) is the spice of life. Standardizing the international regulatory environment wouldn’t be effective given the rich diversity of content moderation practices and culturally driven needs. What could work, though, is an adaptable set of guidelines that nations could adopt and customize to suit their user base – a framework informed by both users and online marketplace owners themselves to map out the limits of acceptability. The only problem could be that the nature of UGC constantly changes in line with the way technology impacts our lives.
All things considered, going forward online marketplaces and classifieds sites will need to pay even closer attention to the trends, safety regulations, and legislation being set locally and globally.
Otherwise, they may quickly be shut down for being non-compliant.
The new laws can be hard to navigate, and it can be even harder to implement the actions, manpower, and tech needed to be compliant. Companies like Besedo are set up to help businesses like yours get everything in place in a fraction of the time and at a lower cost than going it alone.
If you feel you could use a hand with your content moderation strategy, let us know and we’ll be happy to review your current setup and suggest beneficial alterations.
We recently had the pleasure to interview the former CEO of dubizzle, and founder of Working in Digital, Arto Joensuu. Throughout the interview, Arto shares his knowledge, experiences, and advice on how to establish a successful online marketplace, as we explore numerous key areas vital to marketplace success including lead generation, monetization, expansion, trust building and much more.
Q: Hi Arto, thank you for taking the time from your busy schedule to talk with us. Can you begin by sharing some insights about your career so far?
Arto Joensuu: Hi Emil, and thanks for getting in touch. Sure. I’ve been working on the digital side of marketing for the past 20 years. It all started in the late ’90s with a startup in the mobile services space (we’re talking ringtones, SMS groups, mobile wallpapers, and WAP-based games back then). Iobox was doing some pioneering work in this space, and I was happy to be a part of it until it got sold off to Terra Mobile. I then went on to do a “quick” visit at Nokia, which spanned 12 years and covered pretty much the “A-Z” of the digital customer journey. After Nokia, I spent 6 years in Dubai, UAE, where I had the opportunity to be involved in one of the early-stage startup success stories: dubizzle.com. After the company was sold off to Naspers, I’ve been involved as an investor and advisor with multiple startups, ranging from job-finding solutions for emerging markets and cryptocurrencies for classifieds to once-in-a-lifetime golf trips across the world. Time flies…
Q: Your first steps into the world of marketplaces were as Head of Marketing at dubizzle in the UAE. How did you strategically align the business and expand its growth?
Arto Joensuu: Dubizzle was at an interesting stage of development when I joined. It had become a success in Dubai (one of the emirates within the UAE) and was looking to expand further within the country, as well as across the MENA region. The company was led by 2 smart and driven entrepreneurs who wanted to spread the concept across the region and simultaneously increase the revenue streams within the UAE.
In order to prepare ourselves for expansion, we needed to ensure we had a common understanding of WHY we exist as a company, as well as how we could go about driving this vision forward across markets we had little personal understanding of. This sparked an internal workstream to define our overall purpose/vision (WHY), our common values (WHO), our operational strategy (HOW), as well as the expected outcomes (WHAT). Some would label this a brand exercise; I would call it the formation of our manifesto and the overall red thread for the years to come. The end result of this exercise is best summarized on Mark’s website (http://emmarkjames.com/#/).
Q: What were the most important components in your marketing strategy at the time?
Arto Joensuu: On a broad level, we used to talk a lot about 2 types of marketing functions/skillsets: makers and spreaders. Makers were fundamentally responsible for bringing our brand purpose to life through content. Spreaders mastered distribution and optimization. Our content at large was divided into stock and flow content, signifying larger “stock” content pieces centered around wider themes, whilst the “flow” content was more reactive or trigger based.
Another important dimension to the strategy was a holistic understanding of the customer journey and user segments. The customer journey had to be looked at holistically, spanning overall awareness generation, engagement, conversion, retention, and monetization. The user segments themselves focused on finding the equilibrium between sellers and buyers, as well as between b2c and b2b segments. This, in turn, ensured a proper balance between both sides, resulting in a vibrant marketplace where demand and supply met “eye to eye”.
This is where your analytics can play an important role in helping you navigate the waters across both the acquisition and retention funnels. In addition to our own online metrics tools, I always found it extremely useful to map out our overall communication efforts across all channels. If we were doing PR or other on-ground activations, we could suddenly go back and start seeing patterns between overall awareness generation and actual engagement through organic visits and non-paid media.
It’s by no means rocket science, as long as you’ve established clear end action goals that you are monitoring. Is the registration process simple enough? What is our 30-day retention rate? Are the listers able to list their items at ease but with high quality? Do the items get sold? How often are they coming back?
At dubizzle, one of the key metrics we used to look at was what we called “ruffians” (a way of pronouncing RFNS, which stood for returning, free, non-SEO traffic). This metric was particularly important to us, as it signified a “quality returning visit” and was a good indication of true organic, non-prompted retention. For individual transactions, it was pretty clear that you had to ensure (and reward) the quality of the listing (to get more views and leads) and simultaneously ensure that the listing-to-sale cycle was as rapid as possible. If an item had been listed for several days and wasn’t getting leads, we could trigger automated emails to our users prompting an edit or enhancement to the listing (add a better description or insert additional images, etc.). If the quality score of the listing was good, then perhaps the price was wrong, and we educated the seller with average sales prices for similar items he or she was trying to sell.
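As an illustration of how a metric like RFNS might be computed from session data (the field names and channel labels below are assumptions for the sketch, not dubizzle’s actual analytics schema):

```python
def is_rfns(visit):
    """Returning, Free, Non-SEO: a repeat visitor who arrived neither
    via paid campaigns nor via organic search."""
    return (visit["returning"]
            and visit["medium"] not in ("cpc", "paid", "display")  # free
            and visit["medium"] != "organic")                      # non-SEO

visits = [
    {"returning": True,  "medium": "direct"},   # counts as RFNS
    {"returning": True,  "medium": "organic"},  # SEO: excluded
    {"returning": False, "medium": "direct"},   # new visitor: excluded
    {"returning": True,  "medium": "cpc"},      # paid: excluded
]
rfns_share = sum(map(is_rfns, visits)) / len(visits)
# 0.25 – one quality returning visit out of four sessions
```

Tracking this share over time separates genuine retention from traffic you had to buy or win in search.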
Q: Do you think your strategy can be replicated today?
Arto Joensuu: I believe that these fundamentals continue to remain relevant in today’s time and age. Organizations that are purpose driven and have their minds set across the entire value chain tend to find their way. I guess the important thing is to stay true to your “why” and not let that get diluted along the way.
When it comes to horizontal marketplaces, I think the same rules still apply in terms of getting the critical mass onto your platform and later monetizing and expanding into b2b verticals. Of course, today we are seeing the emergence of more and more niche-driven marketplaces, where user volumes are not necessarily large but engagement and the volume of transactions/retention are extremely high. A big enabler for this has been companies like Sharetribe, which, in a similar manner to Automattic (the creators of WordPress), enable marketplaces around niche interests to become mainstream. We can see how social media is constantly evolving, with niche players emerging and later getting acquired by larger players and becoming mainstream media. The same early adopter audiences move on to new niche communities while the masses flock to the services orchestrated by the big internet players. The evolution is constant, and the overall classifieds industry is not immune to the disruption around the corner. Big players need to find new ways to evolve the classifieds marketplace and the overall core loop involved.
Q: One of your key responsibilities was to expand dubizzle geographically, can you share how you succeeded with the geographical expansions?
Arto Joensuu: Our regional expansion was a combination of sleepless nights, insane turnaround times, 2 political revolutions, a lot of Red Bull and an end result which sparked a nationwide movement. In other words, welcome to Egypt, basha!
In retrospect (it’s always easy to be the Monday morning quarterback), there were a lot of elements that made our Egypt expansion a success. Here are a few things I personally felt made a true difference for us.
1. Decide to win.
We knew that another large classifieds player was also entering the Egyptian market and we had very little time to turn things around. This meant that we needed to put our full weight behind this initiative and our previously crafted brand work really served us well in this context. A highly aligned team can make all the difference in the world when things get tough.
2. Acquire the sellers.
I guess we all know that classifieds marketplaces thrive on large volumes of high-quality content. Content attracts buyers, and buyers mean the successful redistribution of items that people have fallen out of love with. We focused quite strongly on the general items-for-sale segment, meaning everyday household items that people no longer needed. A great way to do this is to introduce an element of lifestyle-driven marketing into the mix, where the seller represents an aspirational target group that in return attracts buyers into the marketplace. For example, a young family selling a baby carriage that their child either outgrew or that was originally given as a “double gift” brings people in similar life stages together and can even result in new friendships being formed.
3. Become the talk of the town.
As dubizzle entered the Egyptian market, we wanted to create and engage in a society-wide conversation about the second-hand economy. In Egypt alone, the value of unused items people had in their homes was equivalent to the entire GDP of Sweden. If people took action and sold the items they had fallen out of love with, more money would be funneled back into circulation, improving the country's overall economy. This meant a paradigm shift in definitions of ownership as well as in the new-versus-second-hand thought process. By tapping into a universal topic that had an impact on the whole society, our dubizzle GM became a frequent visitor to talk shows where larger Egypt-wide topics were discussed. Becoming the talk of the town isn’t about creating a clever marketing campaign; it’s really about creating a movement.
Q: How do you think marketplaces need to approach geographical expansions today?
Arto Joensuu: There’s always been active dialogue around the need to localize vs. going to market with a more globally led brand identity. This topic goes beyond brand identity, however. At dubizzle, we soon realized that our mainly desktop-driven English site for the UAE would not cut it as we planned to enter mainly Arabic-speaking markets. We needed to build our MENA sites from scratch, taking a mobile-first and Arabic-first approach to the whole process. At the time, we found that Arabic font libraries optimized for mobile (or even desktop) were scarce and in many instances illegible on mobile devices. Before thinking about a localized marketing campaign, we needed to fix the basics and develop a user experience that didn’t get in the way of our core loop. We also noted things like email vs. mobile number penetration across emerging markets. It was basically useless to have an email sign-up, so we went directly to mobile number-based registration methods, as these were the common standard across the region. We were fortunate that these changes were made before we entered the Egyptian market with a bang. By having the fundamentals in place, we could shift our focus towards overall activation and awareness building.
Q: In marketing, it’s important to have a consistent tone and imagery. With a marketplace, you heavily rely on user-generated content. How do you ensure that the content submitted by users adheres to, or at least doesn’t break, your tone of voice?
Arto Joensuu: Content quality is a common theme/struggle for any classifieds business. The overall listing process is an obvious area where good content can be encouraged (and incentivized, for example by giving the listing higher visibility within the marketplace). The move to mobile/app-based solutions allows for easier image uploading, but also the potential addition of other metadata that can make the discoverability and look & feel of the listing more attractive. I also think the tone of voice across the overall category structure and content fields can have a big impact on the end quality of the listing itself. In recent years, we’ve seen market entrants into the classifieds space (such as Soma) who have taken the individual listing into a more shareable/interactive product card format. What this does is give the product a life of its own: it can be embedded, promoted, liked, and shared across multiple venues. This forces the content to be good if it wants to have legs to spread (and live beyond one-off transactions).
Q: How do you make sure that your front page, or first search page, is in line with the brand you want to portray?
Arto Joensuu: This is quite a large topic in itself, but obviously one dimension that differentiates a classifieds marketplace from a more traditional e-commerce marketplace is the overall transaction category structure. For example, when you enter an e-commerce site, you're almost certainly already looking for a specific item (or category of items in that segment). The overall search process is more structured, and the items displayed usually start from that user-generated search pattern. With classifieds, the process can be similar to an e-commerce play (you go in and search specifically), but there’s also a profound layer of random discovery. For example, you didn’t necessarily know that a 1979 Darth Vader helmet was for sale, but you discover it by chance. The home page can serve this endless treasure hunt of discoveries by bringing high-quality content to the home page instead of immediately driving your users down the traditional search path. A lot of the mobile/app-driven classifieds spin-offs are leveraging this in quite smart ways, and the discoverability, along with smart geo and metadata, can make the overall user experience a unique one. The brand is the experience, and this touches every aspect of the service.
Q: You’ve also helped marketplaces improve their lead generation. Is SEO still important, and do you have any tips on how marketplaces can improve their lead generation?
Arto Joensuu: I think SEO and SMO both continue to play an important role in overall lead generation. If you think of giants like YouTube, a big part of their content gets consumed outside of their own site/app via embedded links on other social channels and websites. This speaks to the fact that lead generation needs to evolve beyond optimizing what’s on your site and thinking about ways in which your user-generated content gets extra mileage through social recognition and distribution.
Q: Do you know of any new creative ways to improve SEO and lead generation?
Arto Joensuu: With reference to Soma’s interactive product card, if an item has a wider lifespan than an individual interaction, it starts accumulating equity throughout its entire lifespan. This is where I think the next big thing in classifieds and e-commerce could potentially reside. Picture this scenario:
When an e-commerce player sells a new mobile phone, they have the transactional data of the one-off sale, after which the item pretty much disappears off the digital grid until the owner decides to sell it on a classifieds site. When the item is posted online, the classifieds player gets a small piece of the item's lifetime, as they know when the item was sold and for what price. Then again, it goes off the grid until it’s maybe sold for a third time or disposed of for recycling.
What if these items had a digital identity (aka an interactive product card) from the get-go? This would fundamentally bridge the gap between e-commerce and classifieds and could even extend into the whole sustainability piece at the end-of-life stages of the manufactured device. Along the way, item sale and resale value would be tracked, and the item would form “link bait” of its own, as the card could be liked, shared, or promoted by man and machine alike. Manufacturers would get valuable information on their products' resale value, quality, “life expectancy,” and distribution. Classifieds players would have multiple touchpoints along the value chain, as technically the item is never deleted once sold. This is something I believe has immense potential in the future.
Q: When is it important to optimize monetization, and what are the ‘must have components’ in a successful monetization strategy?
Arto Joensuu: Monetization basically has two dimensions: the b2b and b2c sides. With either of the two, it really comes down to a healthy equilibrium of buyers vs. sellers. Traditionally it was about getting the needed b2c sellers and buyers onto the platform, which in return would bring the b2b players onboard, and this would be the first segment you monetize. Once you’ve become the clear market leader, b2c monetization kicks in towards the later stage. The industry has obviously evolved from this, and you now see rapid verticalization of certain segments (e.g. property, cars, jobs) instead of pursuing only a unified horizontal classifieds approach. You can also see early-stage monetization happening with more niche classifieds players, where highly specialized b2c groups form around specific interest areas like fashion, watches, collectibles, etc.
Perhaps one of the toughest transitions among the above-mentioned monetization streams is going from b2b monetization to b2c-side monetization. There’s always an element of fear that by putting up a paywall on a b2c category, you will lose traffic and users to a competitor. When dubizzle decided to monetize its cars section on the b2c side, the team spent a lot of time evaluating the overall transition and, ultimately, the overall used-car ecosystem/landscape within the UAE. What we discovered quickly was that b2c users listing their cars on the marketplace received substantially more leads than on other platforms, and that the end user was (on average) able to sell their car at a higher price than by going through a third party. We also ran a series of A/B tests to identify the right price point for the listing fee and mapped out the various payment solution providers that would fit our user needs. In the end, the launch was successful and paved the road towards monetizing other categories as well. Ultimately, I think it’s really about the perceived value of your offering; if the marketplace works, people are ready to pay a small fee to the marketplace enabler.
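The price-point A/B testing mentioned above boils down to comparing expected revenue per exposed user across fee variants. This is a minimal sketch with invented counts and currency amounts, not dubizzle's actual test data:

```python
def revenue_per_user(price, paid, exposed):
    """Expected listing-fee revenue per user exposed to this price variant."""
    return price * paid / exposed

# Hypothetical A/B test: each variant shown to 10,000 sellers,
# `paid` is how many went through with the paid listing.
variants = {
    "fee_30": revenue_per_user(30, paid=480, exposed=10_000),  # 1.44 per user
    "fee_50": revenue_per_user(50, paid=260, exposed=10_000),  # 1.30 per user
}

# The higher fee converts worse; the lower fee wins on expected revenue here.
best = max(variants, key=variants.get)
print(best, variants[best])
```

A production test would also check statistical significance of the conversion difference before committing to a price, rather than simply picking the larger point estimate.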
Q: Trust is key for a successful marketplace, what’s your view on trust and how do you think marketplaces can build a safe platform?
Arto Joensuu: Trust is important. Of course, the definition of trust is probably universal to a degree, but the ways in which you address it can vary greatly from country to country. There’s always been a debate about whether buyer profiles should also be registered/verified profiles to avoid fraud. Should the facilitator act as an escrow that holds onto the money until the transaction is completed and validated by both parties? Can we increase trust by having seller reviews and ratings? Customer support and overall communication obviously play an important role here, and educating users about potential pitfalls is very important. Companies such as the one you represent play an important role in preventing fraudulent or bad listings from reaching the marketplace. I don’t personally have a silver-bullet answer to the whole equation, to be honest. Maybe you should answer this question instead 😉
Q: How can you differentiate yourself as a marketplace in 2019, when there are a ton of new marketplaces popping up?
Arto Joensuu: I think this comes back to the “WHY” your company exists and what’s the deeper substance behind what you are trying to achieve. People don’t buy what you do, they buy why you do it and having this clear vision filter across everything you do creates differentiation. This might also mean that you need to be willing to sacrifice your current cash cows (and create new ones in the long run) by continuously innovating and finding ways to disrupt existing business models. Perhaps a point to make here is that it’s not about disruption “for the sake of disruption” but instead, finding new ways of bringing your “WHY” to life. Think about Kodak. If their true purpose was to enable people to capture their most precious moments in life and re-live them through pictures, they should have been all over the digital camera (which they actually invented). Instead of embracing this new way of bringing their purpose to life, they never capitalized on this new innovation because (at least in the short run) it would cannibalize their film business.
Arto Joensuu is a digital change agent with over 20 years of professional experience across startups as well as large multinational corporations. His professional expertise lies in a profound understanding of the digital landscape and its impact on companies both small and large. Whether it’s about leading a large corporation into the digital era or helping startups cross the tipping point, Joensuu has been there in the trenches and has the battle scars to prove it. Throughout his career, he has held several leadership positions, ranging from pioneering digital/mobile marketing at iobox/Terra Mobile during the late 90’s to spearheading the company-wide digital strategy and execution at Nokia. At dubizzle.com, Joensuu spearheaded a transformational re-branding initiative, which had profoundly positive implications across the entire company. This initiative led to a streamlined vision, a common set of values, and a cultural transformation that could serve as a platform for growth across the organization. He later continued as dubizzle’s CEO, leading the company’s monetization efforts, regional expansion, and operational alignment with majority owner Naspers, as well as facilitating the early-stage growth of classifieds spin-off Shedd. Today, Arto Joensuu is the founder and CEO of Working in Digital, a network of digital change agents that invest in early-stage startups and actively support these organizations as non-executive directors.
Think the big tech players don’t tackle content moderation in the same way as your classifieds business? Think again! At a recent European Parliament conference, leading lights from some of the world’s best-known technology companies gathered to share their ideas, and challenges. But exactly what are they up against and how do they resolve issues?
No doubt about it: content moderation is a big issue – for classifieds sites, as well as content and social platforms. In fact, anywhere that users generate content online, actions must be taken to ensure compliance.
This applies to small businesses as well as the likes of Facebook, Medium, Wikipedia, Vimeo, Snapchat, and Google – which became quite clear back in February when these tech giants (and a host of others) attended the Digital Agenda Intergroup’s ‘Content Moderation & Removal At Scale’ conference, held at the European Parliament in Brussels on 5 February 2019.
What came out of the meeting was a frank and insightful discussion of free speech, the need to prevent discrimination and abuse, and the need to balance copyright infringement with business sensibilities – discussions that any online platform can easily relate to.
Balancing free speech with best practice
The conference, chaired by Dutch MEP Marietje Schaake of the Digital Agenda Intergroup, was an opportunity to explore how internet companies develop and implement internal content moderation rules and policies.
Key issues included the challenges of moderating and removing illegal and controversial user-generated content – including hate speech, terrorist content, disinformation, and copyright infringing material – whilst ensuring that people’s rights and freedoms are protected and respected.
Or, as Eric Goldman, Professor of Law at the High-Tech Law Institute, Santa Clara University, put it, ‘addressing the culture of silence on the operational consequences of content moderation’.
Addressing the status quo
Given the diverse array of speakers invited, and the sheer difference in the types of platforms they represented, it’s fair to say that their challenges, while inherently similar, manifest in different ways.
For example, Snapchat offers two main modes on its platform. The first is a person-to-person message service, and the other – Discover mode – allows content to be broadcast more widely. Both types of content need to be moderated in very different ways. And even though Snapchat content is ephemeral and the vast majority of it disappears within a 24-hour period, the team aims to remove anything that contravenes its policies within two hours.
By contrast, Medium – an exclusively editorial platform – hosts professional, commissioned, and user-generated content. But though only the latter needs to be moderated, that doesn’t necessarily make the task any easier. Medium relies on community participation as well as its own intelligence to moderate.
A massive resource like Wikipedia, which relies on community efforts to contribute information, relies on those same communities to create the policies by which they abide. And given that its vast wealth of information is available in 300 different language versions, there’s also some local flexibility in how these policies are upheld.
Given the 2 billion users it serves, Facebook takes a well-organized approach to content moderation, tasking several teams with different trust and safety responsibilities. First, there’s the Content Policy team, who develop global policies – the community standards, which outline what is and isn’t allowed on Facebook. Second, the Community Operations team is charged with enforcing those standards. Third, the Engineering & Product team builds the tools needed to identify and remove content quickly.
In a similar way, Google’s moderation efforts are just as wide-reaching as Facebook’s. As you’d expect, Google has a diverse and multilingual team of product and policy specialists – over 10,000 people who work around the clock, tackling everything from malware, financial fraud, and spam to violent extremism, child safety, harassment, and hate speech.
What was interesting here was the range of approaches taken by companies experiencing the same problems. Just as smaller sites each address user-generated content in their own way, each larger platform assumes responsibility for UGC differently, which shapes the stances and actions each one takes.
Agenda item 1: Illegal content – incl. terrorist content & hate speech
One of the key topics the event addressed was the role content moderation plays in deterring and removing illegal and terrorist content, as well as hate speech – issues that are starting to impact classifieds businesses too. However, as discussions unfolded it seemed that often what should be removed is not as clear cut as many might imagine.
All of the representatives spoke of wanting to offer freedom of speech and expression – taking into account the fact that things like irony and satire can mimic something harmful in a subversive way.
Snapchat’s Global Head of Trust, Agatha Baldwin, reinforced this idea by stating that ‘context matters’ where legal content and hate speech are concerned. “Taking into account the context of a situation, when it’s reported and how it’s reported, help you determine what the right action is.”
Interestingly, she also admitted that Snapchat doesn’t tend to be affected greatly by terrorist content – unlike Google which, in one quarter of 2017 alone, removed 160,000 pieces of violent extremist content.
In discussing the many ways in which the internet giant curbs extremist activity, Google’s EMEA Head of Trust & Safety, Jim Gray, referred to Google’s Redirect program – which uses Adwords targeting tools and curated YouTube videos to confront online radicalization by redirecting those looking for this type of content.
Facebook’s stance on hate speech is, again, to exercise caution and interpret context. However, one of the other reasons they’ve gone to such efforts to engage a range of individual country and language experts in their content moderation efforts – by recruiting them to their Content Policy and Community Operations teams – is to ensure they uphold the rule of law within each nation they operate in.
However, as Thomas Myrup Kristensen – Managing Director at Facebook’s Brussels office – explained, the proactive removal of content is another key priority; he cited that in 99% of cases, given the size and expertise of Facebook’s moderation teams, they’re now able to remove content uploaded by groups such as Al-Qaeda and ISIS before it’s even published.
Agenda item 2: Copyright & trademark infringement
The second topic of discussion was the issue of copyright, and again it was particularly interesting to understand how large tech businesses curating very different types of content tackle the inherent challenges in similar ways – both to each other and to smaller sites.
Despite GitHub being a leading software developer community and code repository, the vast majority of copyrighted content on the platform poses no infringement issues, according to Tal Niv, GitHub’s Vice President, Law and Policy. This is largely down to the work developers do to make sure they have the appropriate permissions to build software together.
However, when copyright infringement is identified, a ‘notice and takedown system’ comes into play – meaning the source needs to be verified, which is often a back-and-forth process involving several individuals, mostly developers, who review content. But, as a lot of projects are multilayered, the main difficulty lies in unraveling and understanding each contribution’s individual legal status.
Dimitar Dimitrov, EU Representative, at Wikimedia (Wikipedia’s parent company) outlined a similar way in which his organization relies on its volunteer community to moderate copyright infringement. Giving the example of Wikimedia’s media archive, he explained how the service provides public domain and freely licensed images to Wikipedia and other services.
About a million images are uploaded every six weeks, and they’re moderated by volunteers – patrollers – who can nominate files for deletion if they believe there’s any copyright violation. They can then put a file forward for ‘Speedy Deletion’ in cases of very obvious copyright infringement, or ‘Regular Deletion’, which begins a seven-day open discussion period (which anyone can participate in), after which a decision is made to delete or keep the file.
Citing further examples, Mr. Dimitrov recalled a drawing used on the site that was taken from a public domain book, published in 1926. While the book’s author had died some time ago, it turned out the drawing was made by someone else, who’d died in 1980 – meaning that the specific asset was still under copyright and had to be removed from the site.
Vimeo’s Sean McGilvray – the video platform’s Director of Legal Affairs in its Trust & Safety team – addressed trademark infringement complaints, noting that these often took a lot of time to resolve because there’s no real structured notice and takedown regime for these complaints, and so a lot of analysis is often needed to determine if a claim is valid.
On the subject of copyright specifically, Mr. McGilvray referenced Vimeo’s professional user base – musicians, video editors, film directors, choreographers, and more.
As an ad-free platform, Vimeo relies on premium subscriptions, and one of the major issues is that users often upload work they’ve made for brands and artists as part of their showreel or portfolio without obtaining the necessary licenses allowing them to do so.
He noted how, to help resolve these issues, Vimeo supports users when their content is taken down – explaining how the copyright issues work and walking them through Vimeo’s responsibilities as a user-generated content platform, while giving them all the information they need to ensure their content remains visible and compliant.
Looking ahead to sustainable moderation solutions
There can be no doubt that moderation challenges manifest in different ways and are tackled in numerous ways by tech giants. But the common factor these massively influential businesses share is that they take moderation very seriously and dedicate a lot of time and resources to getting it right for their users.
Ultimately, there continues to be a lack of clarity between what is illegal – according to the law of the land – and what constitutes controversial content. That’s why trying to maintain a balance between free speech, controversial content, and removing anything that’s hateful, radical, or indecent is an ongoing battle.
However, as these discussions demonstrate, no single solution can win in isolation. More and more companies are looking to a combination of machine and human moderation to address their content moderation challenges. And this combined effort is crucial. Machines work quickly and at scale, and people can make decisions based on context and culture.
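One common way to combine the two – a sketch under assumed thresholds, not any specific platform's pipeline – is to let a model score content for policy violations, act automatically only at high confidence, and route the ambiguous middle band to human moderators who can weigh context and culture:

```python
def route(violation_score, remove_at=0.95, review_at=0.60):
    """Route an item based on a model's policy-violation score (0.0 to 1.0).

    High-confidence violations are removed automatically, clear passes are
    approved, and everything in between goes to a human moderator.
    Thresholds are illustrative and would be tuned per policy and market.
    """
    if violation_score >= remove_at:
        return "auto_remove"   # machine is confident: act at scale, instantly
    if violation_score >= review_at:
        return "human_review"  # ambiguous: needs context and cultural judgment
    return "auto_approve"      # machine is confident the content is fine


for score in (0.98, 0.75, 0.10):
    print(score, route(score))
```

The appeal of this split is economic as well as qualitative: the machine handles the bulk of clear-cut cases at scale, so scarce human attention is spent only on the items where context actually changes the outcome.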
Whatever size of business you are – from a niche classifieds site covering a local market to a multinational content platform – no one knows your users better than you. That’s why it’s so critical that companies of all shapes and sizes continue to work towards best-practice goals.
As Kristie Canegallo, Vice President of Trust and Safety at Google, said: “We’ll never claim to have all the answers to these issues. But we are committed to doing our part.”
Want to learn more about liability and the main takeaways from the content moderation at scale conference? Check out our interview with Eric Goldman.