It is no surprise that record numbers of people use the internet during the festive season. Customers turn to e-commerce for all their Christmas shopping needs, dating apps see spikes in users as people swipe right to find new people to talk to, and online gaming proves a popular option for those with extra time off during the holidays, looking to relax and enjoy some online recreational fun.

It is a huge opportunity for online retailers, especially online marketplaces, where people will be looking for the best deals available – but it is also an opportunity too good for scammers and fraudsters to miss, as they look to take advantage of what can only be described as a chaotic season online.

Safety of the user, not profit for the platform

For brands, this is the time of year when the hard work put into creating good experiences to strengthen reputation really pays off. No one wants to be the brand that ruins Christmas or falls short at a critical moment. In online business, it should be the safety of the user, not the profit of the platform, that comes first: putting user experience at the heart of priorities increases customer trust, acquisition, and retention.

Christmas is a high-pressure moment in the calendar; it is also a time when good experiences are more impactful and negative ones have greater consequences. While individuals should naturally take additional care online and remain vigilant, we should also expect platforms – whether shopping, dating, or gaming – to protect users from potential harm, even more so at times of peak traffic.

It’s the most fraudulent time of the year

All marketplaces need to take precautions to prevent fraud on their platforms in the lead-up to Christmas and beyond, keeping user experience front of mind. Larger marketplaces may see a smaller percentage of fraudulent posts, but given their much larger user base, even small percentages can lead to thousands of users becoming victims and associating a negative experience with the site. These harmful experiences risk long-term reputational damage and the potential for fraud on the platform to spiral out of control.

As passions run high, it’s not surprising that bad actors take the opportunity to engage in scams and fraud during the festive period. Last year, we decided to prove it. Our moderators investigated nearly three thousand listings of popular items on six popular UK online marketplaces to understand whether marketplaces have content moderation pinned down or whether fraudulent activity was still slipping through the net. The findings revealed that 15% of items reviewed showed signs of being fraudulent or dangerous.

This year, the risk is set to be higher than ever: a survey from independent UK parcel carrier Yodel suggests that almost a third of people plan to do the entirety of their festive shopping online this year, more than a fourfold increase from last Christmas.

Good user experience, not just for Christmas

While this spike in usage means that brands have to be more vigilant than ever, these trends are also unlikely to reverse. That makes this Christmas an important learning opportunity, one that will stress-test businesses’ moderation systems at activity levels that might soon become the norm.

A positive and seamless customer experience at Christmas will not only drive sales in the short term but will also help to engage customers, building an emotional bond with the brand and, in turn, increasing customer loyalty. But it should be remembered: a positive customer experience is essential not just for the festive season but throughout the year.

With seasonal greetings

Axel Banér

Sales Director – EMEA

The term ‘sharing economy’ is famously difficult to define. For some, it refers to any digital platform that connects people more directly than traditional business models. For others, a business only truly belongs to the sharing economy if it enables people to make money out of things they would buy and own anyway.

What all forms of the sharing economy share, though, is a reliance on trust. Whether you are hailing a ride, staying in someone’s spare room, borrowing a lawn mower, or paying someone to do a small one-off job, you’re entering into a transaction that starts with a decision to trust a stranger.

The difficulty of encouraging that decision is exacerbated by the fact that, from the user’s perspective, getting this wrong is potentially a high-stakes issue: while sometimes it might mean merely getting a dissatisfying product, interactions like borrowing a car or renting a room can pose serious risks to health and wellbeing.

Content’s double-edged sword

The question for platforms, then, is what kinds of structures and tools best encourage both positive outcomes and – almost as importantly – a sense of trust amongst the userbase.

Alongside approaches like strict rules on what can be listed and physical checks of users’ offerings, many sharing economy platforms turn to user-generated content (UGC) for this purpose. User reviews, photos of what is on offer, communication options, and even selfies can all help to humanize the platform, validate that users are real people, and generate a sense of trustworthiness.

At the same time, however, allowing UGC can open the door to specific risks. Low-quality images, for example, can worry people and erode trust, while giving users more control over how listings are presented creates greater potential for scams, fraud, and fake profiles. A permissive approach to content can also lead to users conducting business off-platform, side-stepping both safety and monetization systems.

This is why there is such a variety of approaches to UGC in the sharing economy. Where some platforms, like Airbnb, encourage users to share as much about themselves and their property as possible, others, like Uber, allow only a small selfie and the ability to rate riders and drivers out of five stars. In between these open and defensive approaches, there are any number of combinations of content and sharing permissions a business might choose – but what delivers the best outcome?

Using the carrot, not just the stick

Intuitively, many might assume that the platforms which feel safest will be those with the strictest rules, only allowing interaction between users when absolutely necessary and banning those who engage in damaging behavior. In recent research, a group of organizational psychologists described this as ‘harsh’ regulation, as opposed to the ‘soft’ regulation of supporting users, encouraging interaction, and influencing them to engage in positive behavior.

Perhaps surprisingly, the research found that soft regulation has a stronger positive impact than harsh regulation. The sharing economy, after all, digitalizes something humans have always done in the physical world: try to help one another in mutually beneficial ways. Just as we take our cues on how to behave in everyday life from the people around us, seeing positive engagements on platforms sets a standard for how we treat each other – and trust each other – in digital spaces. Being able to talk, share, and humanize helps people to engage, commit, and trust.

This suggests that we may need to shift how we think about managing content in order to make the most of its potential to drive long-term growth. Content moderation is seen, first and foremost, as a way of blocking unwanted content – and that’s certainly something it achieves. At the same time, though, having clear insight into and control over how, when, and where content is presented gives us a route towards lifting the best of a platform into the spotlight and giving users clear social models of how to behave. In essence, it’s an opportunity to align the individual’s experience with the best a platform has to offer.

Ultimately, creating high-trust sharing economy communities is in everyone’s best interest: users are empowered to pursue new ways of managing their daily lives, and businesses create communities where people want to stay, and promote to their friends and family, for the long term. To get there, we need to focus on tools and approaches which enable and promote positive interactions.

The Christmas season is here, and as the festivities kick off, online retailers hold their breath and wait to see whether all of the preparations they have diligently made will pay off in revenue and sales during this ‘Golden Quarter.’ Will the website be able to handle the extra demand? Will all orders be shipped before Christmas?

Yet the National Cyber Security Centre (NCSC) has highlighted another pressing concern which can have a lasting impact on revenue. Last week it launched a major awareness campaign called Cyber Aware, advising potential customers to be aware of an increase in fraud on online platforms this year. Millions of pounds are stolen from customers through fraud every year – including a loss of £13.5m from November 2019 to the end of January 2020 – according to the National Fraud Intelligence Bureau.

Fraud is a major concern for marketplaces, which are aware of the trust and reputational damage that such nefarious characters on their platform can create. While consumer awareness and education can help, marketplaces know that keeping only one eye on the ball when it comes to fraud, especially within user-generated content (UGC), is not enough. Fraudulent activity deserves full attention and careful monitoring. Tackling fraud is not a one-off activity but a dedication to constant, consistent, rigorous, and quality moderation where learnings are continuously applied, for the ongoing safety of the community.

With that in mind, our certified moderators investigated nearly three thousand listings of popular items on six popular UK online marketplaces to understand whether marketplaces have content moderation pinned down or whether fraudulent activity is still slipping through the net. After conducting the analysis during the month of November, including the busy Black Friday and Cyber Monday shopping weekend, we found that:

· 15% of items reviewed showed signs of being fraudulent or dangerous; this rose to 19% on Black Friday and Cyber Monday

· Pets and popular consumer electronics are particular areas of concern, with 22% of PlayStation 5 listings likely to be scams, rising to more than a third of PS5 listings being flagged over the Black Friday weekend

· 19% of listings on marketplaces for the iPhone 12 were also found to show signs of being scams

· Counterfeit fashion items are also rife on popular UK marketplaces, with 15% of listings found to be counterfeits.

The research demonstrates that, even after whatever filtering and user protection measures marketplaces have in place, a significant number of the products for sale leave customers open to having their personal details stolen or receiving counterfeit goods. We know that many large marketplaces have a solution in place already but are still allowing scams to pass through the net, while smaller marketplaces may not have thought about putting robust content moderation practices and processes in place.

Both situations are potentially dangerous if not tackled. While quickly identifying and removing problematic listings is certainly a challenging process, it is deeply concerning that we are seeing such high rates of scams and counterfeiting in this data. Powerful technological approaches, using AI in conjunction with human analysts, can very effectively mitigate the threat from these criminals. Ultimately, the safety of the user should be placed at the heart of every marketplace’s priorities. It’s a false economy to treat fail-safe content moderation as too expensive a problem to deal with – in the longer term, addressing even the small amounts of fraud that slip through the net can have a large and positive long-term impact on the financial health of a marketplace through increased customer trust, acquisition, and retention.

2020 was a year we would not want to repeat from a fraud perspective – we have not yet won the battle against criminals. As we move into 2021, we’ll be hoping to help the industry work towards a zero-scam future, one where we take the learnings and lessons from 2020 together to provide a better, safer community for users and customers, both for their safety and for the long-term, sustainable financial health of marketplaces.

From ancient Greek merchants attempting to claim insurance by sinking their ships, to Roman armies ‘selling the Emperor’s throne’(!): since time began, whenever there’s been a system open to exploitation, there have been fraudsters willing to try their luck. And succeeding.

However, most historical crimes were essentially isolated events. While a single fraudster could be a repeat offender, compared to the sheer number of people who can be duped at scale across different digital channels – by largely ‘invisible’ cyber criminals – it’s clear that fraud has become much more of an everyday concern for all of us.

But how did we get to this point? What risks do we need to be aware of right now? What can we do about it?

Let’s consider the issues in more detail.

A History Of Digital Deviance

In a similar way to other forms of fraud, digital scams date back further than you might think. Email phishing allegedly first took place in the early 1970s, although it’s generally accepted that the term was coined and the practice became commonplace in the mid-1990s.

Since then, the online world has seen con artists try their hand at everything from fake email addresses to using information gleaned from massive data breaches, with $47 million being the largest sum a single person has lost to an email scam.

(Incidentally, the most famous email scam – the ‘419’, aka ‘advance-fee’, aka ‘Nigerian Prince’ scam – surfaced as mail fraud some 100 years ago.)

But email isn’t the only digital channel that’s been hijacked. The very first mobile phone scams came about during the high-flying 80s, when the devices first became available – way before they were popular.

Given the high cost of these now infamous brick-sized devices, the wealthy were pretty much the only people in possession of them (so it makes sense that they quickly became fraud targets too).

SMS messages requesting funds be ‘quickly sent’ to a specific account by a ‘family member’ began to appear soon after, though again didn’t surge in number until well into the 90s, when uptake soared.

Of course, these aren’t the only forms of online fraud that surfaced at the start of the Internet’s popularity. Password theft, website hacks, and spyware – among others – proliferated at an alarming rate around the world at a similar time.

So, if we take it that digital scams have been around for some 25 years, why do they persist, especially when awareness is so high? One of the biggest problems we face today is the ease with which online fraud can take place.

Hackers, of course, continue to evolve their skills in line with advances in tech. But when you consider the number of sites that anyone can access – the marketplaces and classified sites/apps that rely on user-generated content – pretty much anyone can find a way to cheat these systems and those that use them.

Fraud Follows Trends

As we’ve explored previously, scammers operate with alarming regularity all year round. However, they’re much more active around specific retail events – such as peak online shopping periods like Black Friday, the January sales, back-to-school accommodation searches, and the Chinese New Year.

However, while 2020 was shaping up to be a landmark year for fraudsters, given the many different sporting and cultural events – such as the Euro 2020 and Copa América football tournaments and, of course, the Summer Olympics – it seems that fate had very different plans for all of us: in the form of the COVID-19 pandemic.

But true to form, scammers are not above using an international healthcare crisis to cheat others. COVID-19 has given rise to different challenges and opportunities for online businesses. For example, video conferencing services, delivery apps, dating websites, and marketplaces themselves have largely been in an advantageous position financially, given the fact they’re digital services.

However, given the knock-on economic factors of coronavirus and the danger of furlough drifting into long-term unemployment – among other things – there may be wider-reaching behavioral shifts to consider.

That said, fraudulent behavior simply seems to adapt to any environment we find ourselves in. In the UK, research shows that over a third (36%) of people have been the target of scammers during lockdown – with nearly two thirds stating that they were concerned that someone they knew could be targeted.

Examples playing on fear of contamination include the sale of home protection products, while other more finance-focused scams include fake government grants (requesting personal information), help with credit applications (for a fee), and even investment opportunities promising recession-proof returns for those with enough Bitcoin to put into such schemes.

The Gap Is Closing

It’s clear that online fraud is closely aligned with wider trends. In fact, the ‘newer’ something is, the more likely scammers are to step in. Looking at the timelines of many of these scams against when the technology itself was first invented, it’s clear that as time progresses, the gap between the two is closing.

There’s a very good reason for this: the pace of adoption. Basically, the more people there are using a particular device or piece of software, the more prolific the scams targeting it become.

Consider the release of the iPhone XS back in 2018 and the popularity of the mobile game Pokémon Go two years earlier. Both events provided fraudsters with enough of an incentive to target innocent users.

As with most (if not all) eCommerce scams, those surrounding the iPhone XS launch played upon consumer desire and ‘FOMO’ (fear of missing out), manipulating Apple enthusiasts with the promise of getting their hands on the latest technology ahead of the official launch date.

Not only that, but the accompanying hype around the launch proved the perfect time for fraudsters to offer other mobile phone-orientated scams.

In general, new tech and trends lead to new scams. For instance, Pokémon Go gave rise to scams such as the Pokémon taxi (‘expert’ drivers literally taking users for a ride to locations that rare Pokémon were said to frequent) and the advanced user profile (basically paying for accounts that other players had already leveled up).

Because the game was so new and its popularity surged in such a short period of time, it was a whole lot easier for fraud to materialize. Essentially, there was no precedent set – no history of usage. No one knew exactly what to anticipate. As a result, scams were materializing as quickly as new users signed up.

In one case, players were tricked into paying for access because ‘new server space was needed’. Not paying the $12.99 requested would result in their hard-fought Pokémon Go accounts being frozen. While that might be a small price to pay for one person, at scale it would mean a significant amount.

Moderation Methods

Regardless of which methods they use, fraudsters are ultimately focused on one thing. Whether they’re using ransomware to lock out users from their data, posting ‘too-good-to-be-true’ offers on an online marketplace, or manipulating lonely and vulnerable people on a dating app  – the end result is cold hard cash through scalable methods.

However, while hackers will simply barge their way into digital environments, those using online marketplaces and classifieds sites essentially need to worm their way into their chosen environment. In doing so they often leave behind a lot of clues that experienced moderators can detect.

For example, the practice of ad modification or the use of Trojan ads on public marketplaces follows particular patterns of user behavior that can cause alarm bells to ring.

So what can marketplace and classified site owners do to stay ahead of fraudsters? A lot, in fact. Awareness is undoubtedly the first step to countering scams, but on its own it will only raise suspicion rather than act as a preventative measure.

Data analysis is another important step. But, again, the biggest issue is reviewing and moderating at scale. When you have an international platform, how can a small moderation team police every single post or interaction when thousands are created every day?

This is where moderation technology – such as filters – can help weed out suspicious activity and flag possible fraud.

In order to stay ahead of fraudsters, you need a combination of human expertise, AI, and filters. While it’s possible for marketplace owners to train AI to recognize these patterns at scale, completely new scams won’t be picked up by AI (as it relies on being trained on a dataset). This is where experienced and informed moderators can really add value.

People who follow scam trends and spot new instances of fraud quickly are on full alert during big global and local events. They can very quickly create and apply the right filters and begin building the dataset for the AI to be trained on.
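To make this concrete, here is a minimal, hypothetical sketch of how a handful of keyword filters might route suspicious listings to manual review while the moderators’ verdicts build up a labeled dataset for later AI training. The patterns, function names, and workflow are illustrative assumptions, not a description of any specific moderation tool.

```python
import re

# Illustrative scam markers a moderator might add during a big event.
# These patterns are examples only, not an exhaustive or official list.
SCAM_PATTERNS = [
    re.compile(r"western\s+union", re.IGNORECASE),
    re.compile(r"pay(ment)?\s+in\s+advance", re.IGNORECASE),
    re.compile(r"too\s+good\s+to\s+be\s+true", re.IGNORECASE),
]

training_examples = []  # (listing_text, is_fraud) pairs collected for a future model


def triage(listing_text: str) -> str:
    """Return 'manual_review' if any filter matches, otherwise 'auto_approve'."""
    if any(p.search(listing_text) for p in SCAM_PATTERNS):
        return "manual_review"
    return "auto_approve"


def record_moderator_decision(listing_text: str, is_fraud: bool) -> None:
    """Store the human verdict so the growing dataset can train an AI model later."""
    training_examples.append((listing_text, is_fraud))


if __name__ == "__main__":
    ad = "Brand new console, pay in advance via Western Union for fast shipping!"
    print(triage(ad))  # -> manual_review
```

In practice, filters like these would sit alongside many more signals (price, images, user history), but the flag–review–label loop is the core idea: filters catch the new pattern immediately, and the reviewed examples become the training data the AI needs.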

Ultimately, as tech advances, so too will scams. And while we can’t predict what’s around the corner, adopting an approach to digital moderation that’s agile enough to move with the demands of your customers – and with fast intervention when new scam trends appear – is the only way to future-proof your site.

Prevention, after all, is much better than a cure. But where fraud is concerned, a blend of speed, awareness, and action is just as critical.

The outbreak of COVID-19, or coronavirus, has thrown people all over the world into fear and panic about their health and economic situation. Many have been flocking to stores to stock up on essentials, emptying the shelves one by one. Scammers are taking advantage of the situation by maliciously playing on people’s fear. They’re targeting items that are hard to find in stores and making the internet – and especially online marketplaces – their hunting ground, to exploit desperate and vulnerable individuals and businesses. Price gouging (charging unfairly high prices), fake medicine, and non-existent loans are all ways scammers try to exploit marketplace users.

In this worldwide crisis, now is a great time for marketplaces to step up and show social responsibility by making sure that vulnerable individuals don’t fall victim to corona-related scams and that malicious actors can’t profit from stockpiling and selling medical equipment sorely needed by nurses and doctors fighting to save lives.

Since the start of the Covid-19 epidemic, we’ve worked closely with our clients to update moderation coverage to include coronavirus-related scams and have helped them put new rules and policies in place.

We know that all marketplaces are currently struggling to get on top of the situation, and to help, we’ve decided to share some best practices for handling moderation during the epidemic.

Here are our recommendations on how to tackle the Covid-19 crisis to protect your users and your brand, and to retain the trust users have in your platform.

Refusal of coronavirus related items

Ever since the outbreak started, ill-intentioned individuals have made the price of some items spike to unusually high rates. Many brands have already taken the responsible step of refusing certain items they wouldn’t usually reject, and some have set bulk-buying restrictions (just like some supermarkets have done) on ethical and integrity grounds.

Google stopped allowing ads for masks, and many other businesses have restricted the sale or price of certain items. Amazon removed thousands of listings for hand sanitizer, wipes and face masks and has suspended hundreds of sellers for price gouging. Similarly, eBay banned all sales of hand sanitizer, disinfecting wipes and healthcare masks on its US platform and announced it would remove any listings mentioning Covid-19 or the Coronavirus except for books.

In our day-to-day moderation work for clients all over the world, we’ve seen a surge of coronavirus-related scams and have developed guidelines based on the examples we’ve seen.

To protect your customers from being scammed or falling victim to price gouging, and to preserve user trust, we recommend you refuse ads or set up measures against stockpiling for the following items.

Optimize your filters

Ever since the crisis started, scammers have grown more sophisticated by the day, finding loopholes to circumvent security measures. To promote their scams in alternative ways, they use different wordings, such as ‘Sars-CoV-2’, or describe masks by their reference numbers, such as ‘149:2001’ or ‘A1 2009’. Make sure your filters are optimized and your moderators continuously briefed and educated to catch all coronavirus-related ads.
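As a rough illustration, the regular expressions below show how such alternative wordings and reference numbers might be caught. This is a hypothetical Python sketch covering only the examples mentioned above; real filters would need continuous tuning as new variants appear.

```python
import re

# Patterns covering the alternative wordings mentioned above (illustrative, not exhaustive).
CORONA_PATTERNS = [
    re.compile(r"\bcovid[\s-]?19\b", re.IGNORECASE),
    re.compile(r"\bcorona\s*virus\b", re.IGNORECASE),
    re.compile(r"\bsars[\s-]?cov[\s-]?2\b", re.IGNORECASE),
    # Mask standard reference numbers sometimes used instead of the word "mask",
    # e.g. EN 149:2001+A1:2009 for FFP respirators.
    re.compile(r"\b149\s*:\s*2001\b"),
    re.compile(r"\bA1\s*:?\s*2009\b", re.IGNORECASE),
]


def flag_corona_listing(text: str) -> bool:
    """True if the listing should be routed to a coronavirus-specific moderation queue."""
    return any(p.search(text) for p in CORONA_PATTERNS)


print(flag_corona_listing("Respirator masks, standard 149:2001 A1 2009, bulk available"))  # True
```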

Right now, we suggest that you tweak your policies and moderation measures daily to stay ahead of the scammers. As the crisis evolves, malicious actors will without doubt continue to find new ways to exploit the situation. As such, it’s vital that you pay extra attention to your moderation efforts over the coming weeks.

The fraud landscape is continually shifting, with fraudsters’ methods growing more sophisticated by the day. With cybercrime profits rising to $1.5 trillion last year, scammers are getting increasingly creative, refining their deceptive and lucrative methods alongside the continuous development of the internet.

Let’s have a closer look at some of the most sophisticated scams currently surfacing on the internet.

Fake website scams

Scammers are not only getting more elaborate in their techniques; they are also taking more precautions than ever before to avoid being exposed and caught by the authorities.

A current trend making the headlines involves scammers setting up fake websites to replicate reputable online stores or marketplaces while carefully executing their scams.

A news story that broke in Spain earlier this year is considered the largest cyber-scam in the country’s history. A 23-year-old, along with his accomplices, managed more than 30 fraudulent websites generating an income estimated at €300,000 per month.

The fraudsters sold mainly consumer electronics, directing potential buyers to their own fraudulent pages, which looked like replicas of reputable websites and used the companies’ logos and names to dupe users.

To avoid suspicion from the authorities, these sites would only stay live for a few days while being intensely advertised on search engines and social media before disappearing into thin air. The aim was to attract as many potential customers as possible in the shortest amount of time.

Victims were asked to send bank transfers for their orders, but of course, the goods would never arrive. The money received would be deposited into bank accounts set up by mules recruited to transfer money illegally on behalf of others. The scammers then only needed to withdraw the money at cashpoints.

Another technique used by fraudsters plays on the fact that most of us have been warned to check for secure websites when browsing the web. The presence of “https” and the lock icon are supposed to indicate that the connection is secure and users can share their data safely. However, this can be easily replicated. By displaying those security symbols, scammers make it easier to fool potential victims.

Fake website scams, also popping up on sharing economy websites such as Airbnb, show that as people have become warier of online scams, fraudsters have taken their methods to the next level of sophistication.

Romance mule scams

Online dating sites have been one of romance scammers’ preferred hunting grounds to find their next victim.

The FBI’s online crime division recently flagged an alarming and growing trend in the romance scam department. Online dating scammers have expanded their romance scam strategies by adding a new dark twist and taking advantage of their victims in a whole new way. This dating scam turns love-seekers into unwittingly recruited criminals, also known as ‘money mules’.

Fraudsters, under the guise of a relationship, are using online dating sites to recruit their next victim and dupe them into laundering stolen money.

Here’s how they work. The scammers, pretending to be American or European citizens living or working abroad, spend several months grooming their victims by establishing a supposedly trustworthy relationship with them. They then have their victims act as financial middlepersons in a variety of fraudulent activities. There are many ways victims can unwittingly help fraudsters. For instance, they can print and post packages or letters, often containing counterfeit checks, or in other cases pick up money at Western Union and forward it elsewhere. They could even open a bank account under the pretense of sending and receiving payments directly, except that, unknown to the victims, these accounts are then used to aid criminal activities.

A study estimates that 30% of romance scam victims in 2018 were used as money mules.

Indeed, romance scam victims are the perfect mules for these sorts of frauds. Blindly in love, victims will not doubt their lovers’ intentions and, unlike professional fraudsters, will likely never steal the money or ask for anything in exchange for their services.

This romance mule scam makes it complicated for local authorities to follow the money, and even if the victims get caught, they do not know the actual identity and location of the scammers. Tracking the fraudsters’ movements becomes an intricate or impossible task.

Romance mule victims often do not know they are a part of these fraud schemes or criminal activities until it’s too late. Despite the victims not losing any money, the FBI warns that they might face legal or financial consequences for participating in such a scheme.

The Telegraph reported a story about a 61-year-old British man who was tricked by fraudsters into funding terrorists in exactly this way. He believed he was corresponding with a wealthy businesswoman he had met on an online dating site. The woman claimed she needed her money in the UK so she could pay her European employees. The man would then send checks on her behalf, inadvertently and unknowingly becoming part of a fraud scheme.

Impersonation scams

Phishing scams are very well known; however, a variation, the ‘impersonation scam’, has boomed over the past few years, impacting both users’ online safety and companies’ reputations.

These phishing emails might seem completely genuine as they look nearly identical to those of reputable and reliable websites, including Google or Apple, and often end up bypassing spam filters. Like many fraud schemes, impersonation scams are based on trust.

Last year, the popular streaming service Netflix was the target of an email phishing scam in Ireland, sent to thousands of subscribers on the pretext of a maintenance and verification issue. In a similar way to the fake website scams mentioned above, these malicious emails, looking like perfect replicas of Netflix emails, featured a link to update credit card information or login credentials. However, the link did not direct users to the Netflix website but to a site managed by scammers.

Scams often target popular brands with a large user base to lure subscribers into giving out personal information. Not only is this dangerous for the customer, but it also threatens brands’ reputations.

Staying ahead of the scammers

With fraudsters persistently refining their scamming techniques, companies must always be one step ahead in the prevention of these scams to protect their users and keep them safe on their site.

Marketplace leakage, which we wrote about in a previous article, refers to users leaving your platform and continuing their conversations beyond your site’s security measures. This technique of luring users away from your site is used in the scam examples mentioned in this article, and it dramatically increases the risk of your users being scammed. To ensure the safety of your users, online marketplaces need to keep interactions on their platforms by preventing the display of personal details and hidden URLs.

Marketplace leakage can be avoided by improving your content moderation efforts. By ensuring accurate automated moderation you can instantly spot and prevent any content aimed at drawing users away from your platform.
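A minimal sketch of what such automated leakage detection might look like is shown below; the patterns and phrases are illustrative examples rather than a complete rule set.

```python
import re

# Illustrative markers of attempts to move the conversation off-platform.
LEAKAGE_PATTERNS = [
    re.compile(r"https?://\S+|www\.\S+", re.IGNORECASE),                        # plain links
    re.compile(r"\b\w+\s*\[?\(?dot\)?\]?\s*(com|net|org)\b", re.IGNORECASE),    # obfuscated URLs like "site dot com"
    re.compile(r"\b(whats\s?app|telegram|signal)\b", re.IGNORECASE),            # popular off-platform channels
    re.compile(r"contact\s+me\s+(outside|off)\s+(the\s+)?(site|platform|app)", re.IGNORECASE),
]


def detect_leakage(message: str) -> bool:
    """True if a message or listing looks like an attempt to draw the user off-platform."""
    return any(p.search(message) for p in LEAKAGE_PATTERNS)


print(detect_leakage("Great price! Message me on whatsapp or visit dealz dot com"))  # True
```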

To learn more about setting up efficient automated moderation filters to protect your users, check out our Filter Creation Masterclass.

Over the years of helping online marketplaces moderate their real estate sections, we’ve gathered a long list of best practices for content moderation rules and guidelines specific to this industry.

If you work with a real estate site or platform, you know that you need to keep a close eye on the user-generated content published by users. If you don’t, your site quickly turns from a helpful tool for smooth interactions between renters and landlords into a scam-infested spam forum.

Today we share some actionable guidelines on what to direct your content moderation efforts at. You can use them to increase the quality of listings, protect your users and hopefully increase the conversion rates of your real estate site.

Edit or reject ads with contact details

While the whole industry should slowly be moving towards sites that monetize and provide value through value-added services, most sites are not there yet.

Unless your site already has a unique and strong offering of additional value, it’s likely still relying on sellers who use it as a lead generator. If that’s the case, you should remove all mentions of phone numbers, names, and email or physical addresses to prevent platform leakage.
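As a simple illustration, the sketch below redacts email addresses and phone numbers from a listing description before publication. The regular expressions are deliberately rough examples and would need tuning for your markets.

```python
import re

# Simple, illustrative patterns for contact details commonly slipped into listings.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE_RE = re.compile(r"(\+?\d[\d\s().-]{7,}\d)")  # rough international phone-number shape


def redact_contact_details(description: str) -> str:
    """Replace emails and phone numbers with a placeholder before the listing goes live."""
    description = EMAIL_RE.sub("[contact removed]", description)
    description = PHONE_RE.sub("[contact removed]", description)
    return description


print(redact_contact_details("Sunny 2-bed flat, call +44 20 7946 0123 or mail jo@example.com"))
# -> "Sunny 2-bed flat, call [contact removed] or mail [contact removed]"
```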

Unconventional payment methods

Unless it’s your USP, all ads that mention unconventional payment methods should be removed. This is true both for swap and exchange suggestions, such as a car or cellphone in exchange for accommodation.

The rule also applies to unorthodox payment methods such as Bitcoin and other cryptocurrencies.

We advise against allowing such reimbursement options as the value comparison can be hard to get right and there’s a risk one of the two parties will end up dissatisfied. You don’t want those negative feelings associated with your platform and you definitely do not want to get involved in disputes concerning unconventional payment methods.

Finally, there’s also the additional risk that some of the commodities offered in exchange have been acquired illegally, and you don’t want your platform involved in what could essentially be seen as fencing activity.

Remove listings with more than one item

Whether you monetize your real estate platform by charging listing fees or not, you should remove listings with more than one item in them.

If you charge a listing fee, sellers who post multiple items in one go are circumventing the fee and negatively impacting your revenue. If you don’t, listings with many different offerings are still really bad as they make it harder for users to find relevant results when searching for accommodation, decreasing the user experience.

Links or references to competitor sites

It goes without saying that it’s best practice in content moderation to remove or edit any mention of competitors immediately, particularly if it includes outbound links.

It’s bad enough when visitors bounce; it’s even worse if the content on your site is actively referring them to rivals in your space.

Pay attention to listing prices

We have an entire article focused on things to look out for to prevent scams on real estate sites, but one of the things we haven’t discussed in depth is listing prices.

Most areas will have a fairly standard price range for similarly sized accommodations. For cities that are especially prone to being targeted by scammers, it’s a good idea to familiarize yourself with this range. Scammers often offer apartments for rent at too-good-to-be-true prices. If you know the realistic range, it’s easier for you to catch them before they get to your customers.
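A minimal sketch of such a price sanity check is shown below; the baseline figures are invented for illustration and would, in practice, come from your own listing data.

```python
# A hedged sketch: flag listings priced far below a baseline range for the area.
# The baseline figures below are invented for illustration only.
BASELINE_MONTHLY_RENT = {
    ("london", "1-bed"): (1200, 2200),
    ("manchester", "1-bed"): (650, 1100),
}


def looks_too_cheap(city: str, size: str, asking_price: float, tolerance: float = 0.7) -> bool:
    """True if the asking price is well below the low end of the known range."""
    price_range = BASELINE_MONTHLY_RENT.get((city.lower(), size))
    if price_range is None:
        return False  # no baseline for this area, leave it to other checks
    low, _high = price_range
    return asking_price < low * tolerance


print(looks_too_cheap("London", "1-bed", 500))  # True: well below the expected range
```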

We are currently working on building out a pricing database for some of the bigger cities in the world. If this project sounds interesting, be sure to subscribe to our blog and get informed when we have more information available.

Take a hard stance against discrimination

You’re probably already aware of the multiple lawsuits Airbnb has faced due to various instances of discrimination that have occurred through its platform.

To avoid getting into the same legal trouble and the ensuing PR storm, as well as to provide all your users with the best possible experience on your site, we advise taking a hard stance against any discriminatory listings. Reject any listing that singles out people of a specific race, religion, sexual orientation, etc.

Prohibit rentals that enable prostitution

For anyone who has followed the case of backpage.com and how its owners were indicted for earning over $500 million in prostitution-related revenue from the site, it should be second nature to have processes in place for finding and removing any content that facilitates prostitution.

Apart from the moral implications, allowing prostitution is illegal in many countries and could land your company (and you) in both legal and PR troubles.

If your platform isn’t offering hotel rooms or vacation homes, it’s often a good and safe practice to reject room-for-the-night type listings. That type of listing often advertises accommodations used for indecent interactions.

Remove duplicate items

Users will sometimes submit the same listing multiple times. The reasons vary, but the most common one is to try and bump up their ranking on your site. When users try to circumvent rules, it’s never good: it usually hurts the user experience, violates legal commitments, or, as in the case of duplicates, could get your site penalized in Google rankings.

The best course of action is to remove duplicates before they get published; this way you ensure the quality of your site and avoid a messy search experience for other users.
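One simple way to do this is to fingerprint each listing at submission time and reject exact (or trivially edited) resubmissions, as in the hypothetical sketch below; near-duplicate detection would require fuzzier matching on top of this.

```python
import hashlib

seen_fingerprints = set()  # fingerprints of listings already published


def fingerprint(listing_text: str) -> str:
    """Normalize whitespace and case, then hash, so trivially edited copies still match."""
    normalized = " ".join(listing_text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()


def is_duplicate(listing_text: str) -> bool:
    """True if an identical (after normalization) listing was already published."""
    fp = fingerprint(listing_text)
    if fp in seen_fingerprints:
        return True
    seen_fingerprints.add(fp)
    return False


print(is_duplicate("Bright studio near campus, 450/month"))    # False: first submission
print(is_duplicate("  bright STUDIO near campus, 450/month"))  # True: same text resubmitted
```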

We’ve written more about why duplicate content is bad for real-estate sites and how to remove it here.

Re-categorize listings placed in the wrong category

Vacation homes in the permanent residency category or for-sale houses in the for-rent section all contribute to irrelevant search results and negative user experiences.

It’s important to remove or re-categorize misplaced listings quickly to ensure a good experience for users.

Reject listings with poor descriptions

Depending on the category, the required details on the commodity for rent or sale may differ. What’s always true though is that the description needs to be accurate and descriptive. Information like location, price, minimum rent time etc. should be a given. But sellers are often in a hurry and don’t want to spend too much time on the admin work that goes into writing a good listing that converts.

Make sure you educate your users to create proper descriptions and titles for their listings; otherwise, both bounce rates and conversion rates may suffer. In a study we did on user behavior, we found that irrelevant content leads to 73% of users never returning to the site again.

Following the guidelines outlined above will help you eliminate fraudulent listings and improve the quality of content, as well as the overall user experience of your site.

4 most common refusal reasons on real estate sites in 2019

Now that we have gone through the best practices for content moderation let’s also quickly disclose where we find the biggest content moderation challenges lie for real estate sites.

In 2018 these were the top 4 rejection reasons for the real estate sites we help.

As you can see, most of the rejected items affect either user experience (and, as a result, conversion rate) or impact your revenue more directly, as is the case with multiple products, where users circumvent the listing fee.

Curious about which other listings we reject for real estate sites? Check out our presentation 9 listings to remove from your real estate site to optimize for monetization from Property Portal Watch in Bangkok 2018 where we go into more details.

Want to know more about best practices for content moderation or expert advice on how to set up and optimize the content moderation on your site? Get in touch and hear how we can help out.

On a yearly basis, we deliver a scam awareness calendar to help online marketplaces prepare for scam spikes in the year to come. We base the scam calendar on trend observations from previous years and analysis of major happenings in the coming year. Our trust and safety team works day by day analyzing data to find fraudulent behavior and proactively supports our clients with information to help them stay ahead of scammers.

Fraudulent behavior on marketplaces fluctuates constantly, with periods of increased and decreased scam activity. Scam spikes are typically triggered by holiday seasons, festivals, events, and other large happenings during the year.

For you and your moderation team to stay on top of the scam spikes, you need to be aware of when and where scammers might appear. In this article, we will share some of the most common types of scam for 2019 and when you are likely to see them spike. If you want to learn more about the specific scam spikes, visit our scam awareness calendar where we predict spikes on a month-by-month basis.

Tech release scams

As consumers, we are spoiled with new tech releases every year. In so many ways it’s neat that we continue to develop and outdo our previous technical achievements. And often, we witness competing companies triggering each other to step up their game and drive development. One of the most recurring battles between brands is the one between the two phone giants Apple and Samsung. When Samsung releases their phone of the year, Apple can’t wait to release theirs.

These two annual releases are considered some of the most important product launches of the year by tech enthusiasts and consumers alike. Unfortunately, this also attracts scammers looking to deceive eager buyers.

As in previous years, we’re expecting the scam spike in the weeks leading up to the launch of a new iPhone or Samsung device. To protect your users, be on the lookout for pre-order listings, prices that are cheap compared to market price, phrases such as ‘item is new’, ‘as good as new’ or ‘brand new in box’, as well as other deceptive phrases used in the description.

Samsung is rumored to release the Samsung Galaxy S10 on March 8th, with prices starting at $719. Rumors are also floating around online that Samsung will launch the world’s first foldable smartphone in March this year.

Apple, on the other hand, usually hosts its big annual product release in early-to-mid September, and if they stick to tradition, we’re expecting their new iPhone to be launched on September 10th this year. Visit this page to stay on top of the latest news surrounding the next iPhone release.

Holiday booking scams

One of the most common targets for scammers is vacation and holiday bookings. While we’re dreaming ourselves away to various destinations in front of our computers or phones, scammers strategically expose us to exclusive vacation deals that look stunning but in reality don’t exist. At Besedo we witness these types of scams on a daily basis, but April and August are considered peak season for holiday scams – when we book our summer and winter vacations.

Make sure your users stay safe on your site. Be on the lookout for fraudulent holiday rental ads and offers that are ‘too good to be true’. More concretely, your moderation team needs to look out for high-quality or stock pictures, free email domains, suspicious IPs, large group rentals, prices below market rate, requests for full payment in advance, and so on.

Want to learn more about holiday scams?

Check out this article: It’s that time of the year again, the peak season for vacation rental scams.

Shopping scams

Shopping, shopping, shopping. We all do it, and most of us love it. Phenomena like Black Friday, Cyber Monday, after-Christmas sales, Singles’ Day, and so on are periods when consumers rush to get exclusive deals and discounts.

While offline consumers risk being trampled in packed stores, online shoppers need to be wary of scammers trying to capitalize on the shopping frenzy by deceiving consumers with ‘super deals’. Be ready for a period of increased scams during and after the shopping peaks. Your team needs to be on the lookout for things like too-good-to-be-true prices, stock photos, and phishing emails.

Learn more: Holiday shopping moderation guide and Online marketplace owners’ checklist for holiday shopping.

Big events scams

Every year there are multiple events taking place, everything from sports events to concerts and festivals. Unfortunately, most large events also attract a wave of scammers. In 2019 there are two major sports events, the Asian Cup and Copa América. For these kinds of events, your moderation team should pay extra attention to ads with many available tickets for sale, low prices, miscategorized tickets, ultra-cheap airline tickets, addresses and phone numbers that are geographically disconnected, requests for bank transfer payment only, and so on.

Besides the two football tournaments mentioned above, there are plenty of concerts and festivals that are already sold out, which means tickets may be offered for sale on your marketplace. Stay ahead of the scammers: learn more about ticket scams and how to keep your users safe.

Back to school scams

Being a student often comes with a tight budget and a need to find new accommodation, often in very specific and possibly unfamiliar areas. This, naturally, makes them vulnerable to potential fraudulent rental deals and loan offers. Make sure your moderation team pays attention to new users posting flats/flat shares, pricing, emails, stock photos, and dodgy loan offers.

New courses usually start twice a year, in January and September, and it is during these months that we typically see an increased number of scammers trying to trick students out of their money.

Here’s how to automatically reduce student accommodation scams.

Stay ahead of the scammers

Most of the scams we’ve listed will happen throughout the year and your team should always be looking out for them. However, by knowing when a spike is likely you can better prepare your team and you can staff accordingly.

By being aware of scam spikes and adjusting your moderation setup accordingly, you can keep your users safe and reduce both time-to-site and shrinkage. If your team size isn’t flexible, a good way to manage spikes with minimal impact on the end user is to increase your automation levels when volumes grow.

With the right setup, you can automate up to 80% of moderation using filters alone, and with tailored AI you can reach even higher quality and automation levels.

Want to know more? Get in touch with a content moderation solution expert today or test our moderation tool, Implio, for free.

The FIFA World Cup in Russia kicks off on June 14th, 2018. Fans from all around the world have eagerly awaited this event since the last tournament four years ago in Brazil. 

With the tournament only a few months away, ticket sales are red hot on the market, jerseys are in the printers and fans from different corners of the world are planning everything surrounding their trip.  

The high demand creates desperation in the market, resulting in vulnerable buyers as scammers lurk in the waters, ready to take advantage of the situation. In fact, leading up to the FIFA World Cup 2014 in Brazil we saw an increase in scams related to the event. Scammers try to deceive fans in different ways, and they get more creative every year. Let’s have a look at the three main areas scammers will target in relation to the 2018 FIFA World Cup.

Ticket scams

In the middle of April, the official ticket sales enter the last sales phase, which runs on a first-come, first-served basis. This means that tickets might soon be sold out and the only option to get a ticket will be to buy it second-hand or through resellers. FIFA is taking ticket scams very seriously and is doing what it can to prevent them by offering an official ticket resale page. It’s even considered an administrative offense to transfer or resell tickets without FIFA’s consent, according to Russian law (Federal Law No. 13-FZ).

However, it’s still likely that you’ll find tickets sold on your marketplace, and among the group of legitimate sellers, scammers will try to trick buyers. There are many different ticket scams and key fraud markers to look for during these big tournaments. We wrote an article specific to ticket scams prior to the UEFA Euro Championship in 2016, and we highly recommend you read it and prepare your site for ticket scams.

Holiday rental scams

Once the ticket is secured, fans start looking at the logistics of attending the event. Many travelers are booking flights, transportation and accommodation from overseas. This puts them at risk of getting scammed since they’re not familiar with the area and will have a harder time telling whether a rental is genuine or not. Even though the general advice and messaging is “don’t book from sites you cannot trust” or “never pay by any other means than credit cards and never more than a month in advance”, people are still getting scammed. In 2016, nearly US$4 billion was lost by consumers due to misleading bookings. 

In the time leading up to the tournament, expect the number of holiday rental scams to increase making it even more important for you and your moderation team to stay on top of how to spot them. Here is a list of 10 actions to reduce vacation accommodation scams on your site.   

Learn how to moderate without censoring

Why moderating content without censoring users demands consistent, transparent policies.

Untitled(Required)

Counterfeit items

As is usual with big events, there will be a lot of merchandise available for purchase, both online and offline. Jerseys, hats, and scarves, to name a few, are products that are usually highly sought after during sports events.

Counterfeit products are a real issue for online marketplaces, and during the FIFA World Cup we’re expecting to see a significant increase, especially in fake jerseys. It’s important for online marketplaces to be aware of whether counterfeit products are being sold on their site, especially since brands these days are stepping up their initiatives against counterfeits and increasingly filing lawsuits to stop the practice. One example is the claim Chanel won against sellers on Amazon in 2017.

Ending up getting sued by a large brand is not only bad economically; it is also a dent in the reputation and trustworthiness that most online marketplaces spend so much time building.

To help you act and prevent sales of counterfeit products on your site, we’ve put together a counterfeit filter creation checklist. The short guide will teach you how to set up accurate filters to automatically catch counterfeit items on your site: https://www.besedo.com/counterfeit-filter-checklist/

During big events, like the FIFA World Cup, customers from around the world come to your site to purchase their tickets, plan their trips or to buy their national teams’ jersey. It’s in your hands to make their experience on your site safe and smooth. 

Want to learn more about content moderation and how we can help? Get in touch with us today.  

Real estate listings’ sites are just as susceptible to scams as other online marketplaces. Here are some tips on how to prevent fraud and what content moderation teams should look for.

When people start looking for homes to rent or buy, the internet is the first place they begin their search. And, as with other types of classified sites, scammers are also using real estate listings to extract money from potential buyers and renters.

The problem, as with many online marketplace scams, is that they’re not always that easy to detect. Here are some things real estate site owners should be aware of from a content management perspective, and a few ways to keep your users safe.

Fake Photos

It almost goes without saying, but a lack of photography on a property website is completely nuts! Harder to detect, and a lot more common, are instances where scammers list pictures of properties they don’t own. In fact, they’ll often use the same pictures on different sites, listed in several different locations. One way to prevent this is to set up a filter that detects words and phrases commonly used in scams, which triggers a manual review when they’re posted. Moderators can then do a quick Google search to see if the image is also posted elsewhere and confirm the validity of the listing.
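For the image side of this check, a simple perceptual hash can spot when the same or a near-identical photo reappears across listings. The sketch below (using the third-party Pillow library; the file names are hypothetical) is one rough way to implement it, not a description of any particular moderation product.

```python
from PIL import Image  # third-party dependency: Pillow


def average_hash(image_path: str, hash_size: int = 8) -> int:
    """A simple perceptual hash: near-identical photos produce identical or similar hashes."""
    img = Image.open(image_path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes; small distance means similar images."""
    return bin(a ^ b).count("1")


# Flag a new listing photo if it is (almost) identical to one already seen elsewhere.
# The file names here are placeholders for illustration.
known_hashes = {average_hash("existing_listing.jpg")}
new_hash = average_hash("new_listing.jpg")
if any(hamming_distance(new_hash, h) <= 5 for h in known_hashes):
    print("Possible reused photo: send listing to manual review")
```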

Overseas Landlords

While many genuine investors and landlords will own property in other countries, being able to flag discrepancies between where a landlord or seller says they are and where they actually are offers your users an additional layer of security. That’s one of the reasons we introduced geolocation filters to our Implio tool last year. Additionally, scammers operating under this guise will typically claim to be from a trustworthy organization, such as the UN or the military, or may even claim they’re working abroad as missionaries. Trigger terms like these should be on your moderation filter list.
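Conceptually, such a check can be as simple as comparing the poster’s claimed location with the location derived from their IP address and scanning for the trigger terms mentioned above. The sketch below assumes the IP-to-country lookup happens elsewhere (a GeoIP database, your own platform data) and is simply passed in; it is an illustration of the idea, not how Implio itself works.

```python
# A minimal sketch of a geolocation-discrepancy check. Resolving an IP address to a
# country is left out; the resolved country is passed in as an argument.
SUSPICIOUS_CLAIMS = {"united nations", "un official", "military deployment", "missionary"}


def flag_location_mismatch(claimed_country: str, ip_country: str, listing_text: str) -> bool:
    """True if the poster's stated location, apparent location, or story warrants review."""
    if claimed_country.strip().lower() != ip_country.strip().lower():
        return True
    text = listing_text.lower()
    return any(term in text for term in SUSPICIOUS_CLAIMS)


print(flag_location_mismatch("United Kingdom", "Nigeria",
                             "Lovely flat, I am abroad on military deployment"))
# -> True: both the country mismatch and the story are classic fraud markers
```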

Multiple Listings and Duplicate Images

Some landlords or vendors may list the same property more than once. This can be common when they’re particularly keen to promote a specific property using different keywords, and while this should also be handled through good moderation practices, it isn’t strictly deceptive. However, when the same piece of real estate is listed as being available in different locations, then a scammer is most definitely at work.

Requested Payments

While the topics listed above are arguably easier to screen for, less obvious, and perhaps more common, are requests for payment, like an upfront holding fee deposit or a payment demand for a showing. In instances like these, if your site has an in-app messaging service, the right moderation service will let you flag telltale requests for payment via PayPal, Western Union, or MoneyGram, as well as unsolicited payments or overpayments made to the user.

It’s always important to reinforce to your users that extreme caution must be exercised when transferring any sum of money, particularly overseas. But scammers are savvy and many will encourage users to take the conversation over to personal email. In cases like these, the best thing you as the site owner can do is to raise awareness among your users.

Ultimately, the vast majority of fraudsters are trying to get users to send them money for a property that probably doesn’t exist or has nothing to do with them whatsoever. Detection is always the best form of prevention. To find out more about how Besedo can help your real estate business, please get in touch.

Or learn more about other content challenges for real estate sites.
