Scams and frauds are rife across platforms that rely on user-generated content, as is the case for online marketplaces and classifieds sites. While content moderation is an excellent form of prevention, scammers are always evolving their methods – which means extra measures often need to be taken to keep up with them.
Besedo’s Trust and Safety team ensures that clients provide their customers with a safe and secure online environment – supporting those on the moderation front line as they fight fraud and scams. Here’s how they do it.
Combating fraud takes an extra layer of expertise, which is exactly why Besedo’s Trust and Safety team was set up. These dedicated experts work directly with our clients and their moderation teams to ensure users remain safe online. They undertake serious research and analysis to stay one step ahead of scammers and on top of current trends. They also advise clients about immediate threats and suspicious behavior on their site, as well as what’s happening on other online marketplaces.
Think like a scammer
Besedo’s Head of Implementation and Support, Jessica Kubacki, heads up the Trust and Safety team. She makes sure the correct processes are followed and that each client gets the right information at the right moment. Scams are time sensitive, so the longer a client is in the dark, the more damage scammers can do.
“When the Trust and Safety team is assigned to a client, one of the first things we do is find out when the majority of content is posted,” she says. “We then closely monitor activity, making sure that we’re on the website at the right times of day, tracking first-time users and known scammers, and doing whatever’s necessary to prevent fraud.”
Part of prevention involves getting inside a scammer’s mind – trying to understand their motives and any loopholes they might exploit.
“To be successful, you need to start thinking like a scammer,” Jessica explains. “How would you get money from people? What methods would you use? How would you use an online marketplace? Over time, thinking like this helps the team make better, more accurate decisions quickly.”
Offers that are ‘too good to be true’
Taking an approach like this helps the team sharpen their instincts and gives them a broader perspective on fraudsters’ operations. However, it also helps to adopt a user’s mindset; to better understand why these scams work.
“I always advise the team to think about the kinds of offers that appeal to them personally, and to also be aware of what we call ‘fraud markers’,” says Jessica. “A lot of people are intrigued by posts offering things like cheap holidays or investment opportunities. While many online offers are legitimate, with scams you’ll often see the same telltale fraud markers time and again. A very common fraud marker is the IP address an item was posted from: if it differs significantly from what would be expected, that’s a warning sign. There are hundreds of such fraud markers, and the more experienced you become as a safety expert, the better you’ll be able to spot them.
“We also apply the ‘too good to be true’ rule. For example, say you’re looking to rent a flat in Paris for the summer, and spot an ad online offering rock-bottom prices. When you take a closer look at the text, you’ll see words like ‘sea view’ (Paris is a good couple of hours from the coast). The ad may also be oddly worded – as if it’s been cut and pasted from Google Translate – or it may ask for a deposit upfront. And sometimes the person posting will have duplicate ads running… about properties in different cities. These are the kinds of things we’re looking for.”
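The markers Jessica describes – a price far below market rate, implausible claims in the text, an IP address that doesn’t match the listing’s location, duplicate ads across cities – can be sketched as a simple rule-based score. This is a hypothetical illustration, not Besedo’s actual system: the listing fields, phrase list, and marker weights are all assumptions.

```python
# Hypothetical rule-based fraud-marker scoring for a rental listing.
# Field names, phrases, and weights are illustrative assumptions only.

SUSPICIOUS_PHRASES = {"sea view", "wire the deposit", "act now"}

def fraud_score(listing: dict, market_price: float) -> int:
    """Count simple 'too good to be true' markers on a listing."""
    score = 0
    # Marker: price far below the going market rate.
    if listing["price"] < 0.5 * market_price:
        score += 2
    # Marker: text mentions features implausible for the location.
    text = listing["text"].lower()
    score += sum(1 for phrase in SUSPICIOUS_PHRASES if phrase in text)
    # Marker: poster's IP country differs from the listing's country.
    if listing["ip_country"] != listing["listing_country"]:
        score += 2
    # Marker: the same ad duplicated across different cities.
    if listing.get("duplicate_cities", 0) > 1:
        score += 2
    return score

ad = {
    "price": 300.0,
    "text": "Paris flat with sea view, wire the deposit today",
    "ip_country": "NG",
    "listing_country": "FR",
    "duplicate_cities": 3,
}
print(fraud_score(ad, market_price=1200.0))  # high score → escalate for review
```

In practice a score above some threshold would route the ad to a human reviewer rather than reject it outright, since many markers also appear on legitimate ads.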
Clearly, this approach works. On an average day, the Trust and Safety team finds that around 15–20% of the content it analyzes is fraudulent, preventing over a million instances of fraud each month.
There are many types of scams, but the worst are account takeovers and Trojan frauds.
Account takeovers are exactly what they sound like: scammers hacking into user accounts using various methods. While a lot of sites are swift to respond to suspicious activity, the result can be that legitimate users get blocked from the site, which in turn can affect the site’s reputation and churn rate. The best way to deal with a threat like this is to pay extra attention to first-time users, according to Jessica.
“We ask our clients not to let first-time users pass through moderation filters automatically,” she says. “The Trust and Safety team vets all new users where possible, ensuring that they are genuine before allowing them to post anything to the site. Consistency is key to prevention.”
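The routing rule Jessica describes – never auto-approve a first-time user’s ad – could be sketched as below. The field names and the review-queue abstraction are assumptions for illustration, not Besedo’s actual pipeline.

```python
# Illustrative sketch: route first-time users to manual review instead
# of letting them pass moderation filters automatically.
# Field names and the queue abstraction are assumptions.

from collections import deque

manual_review_queue = deque()

def route_ad(ad: dict, user: dict) -> str:
    """Decide how an incoming ad is moderated."""
    if user["approved_ads"] == 0:
        # First-time user: never auto-approve; a human vets the account.
        manual_review_queue.append(ad)
        return "manual-review"
    if user.get("known_scammer", False):
        return "rejected"
    return "automated-filters"

new_user = {"approved_ads": 0}
print(route_ad({"title": "Bike for sale"}, new_user))  # manual-review
```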
However, scammers have started finding ways around new-user vetting, using what we call Trojan frauds. In an online marketplace, a scammer may start by posting an ad offering something small or insignificant – like a pen or a t-shirt – to get an ad accepted and be deemed a genuine user. Little by little they begin to post more ads, and now the ads are fraudulent: they will advertise items and accept payment from buyers, but never actually deliver the goods.
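One simple heuristic against this pattern – an account that builds trust with trivial ads and then suddenly posts a big-ticket item – is to compare each new ad’s price against the account’s history. This is a hypothetical sketch; the jump factor and price-based signal are assumptions, and a real system would combine many markers.

```python
# Hypothetical heuristic for spotting 'Trojan fraud' escalation:
# an account that builds trust with trivial ads, then suddenly posts
# a high-value item. The threshold is an illustrative assumption.

def trojan_fraud_suspect(ad_prices: list, jump_factor: float = 10.0) -> bool:
    """Flag accounts whose latest ad price dwarfs their earlier history."""
    if len(ad_prices) < 2:
        return False  # not enough history to judge
    history, latest = ad_prices[:-1], ad_prices[-1]
    baseline = max(history)
    # A pen-and-t-shirt history followed by a big-ticket item is suspicious.
    return latest > jump_factor * baseline

print(trojan_fraud_suspect([2.0, 5.0, 3.0, 800.0]))  # True
print(trojan_fraud_suspect([450.0, 500.0, 480.0]))   # False
```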
Beware of the ‘Catfish’
On dating sites ‘Catfishing’ – in which a scammer uses a picture of an attractive person to lure someone into a false sense of security – is the main moderation challenge.
“Profile pictures of men in uniform are common; as are images of, erm, women with ‘sizeable attributes’ (!),” Jessica explains. “Some dating websites keep records of these kinds of pictures, which are often stock photos or images of minor celebrities, but scammers have become wiser here as well and started snagging pictures from normal people’s social media accounts to make detection harder. This is why it’s important to also look at other fraud markers like linguistic patterns, IPs, and so on.”
Professional fraud prevention
Besedo’s Trust and Safety agents work tirelessly to prevent these scams from ever reaching the end-user. They do this in a number of ways. They can work with a client’s in-house manual moderation team, providing tools that identify fraud markers in text and images. They can also work as a point of escalation: reviewing ads that users and moderation teams have flagged, monitoring first-time user behavior, and delving into contacts between users within a platform’s messaging system to review any potentially fraudulent activity.
“We have the resources, experience, and expertise needed to significantly reduce fraud in online marketplaces and classifieds sites,” Jessica explains. “Ultimately, we do everything we can to stop fraudulent activity from going live on our clients’ sites, which ensures users remain safe and leads to increased trust, loyalty, and conversion.”