We are seeing an increase in legislation aimed at the digital world across the globe. What does that mean for online marketplaces? Are there any trends we can see already, and what can we expect in the future? We’ve taken a deep dive into the legal pool to see if we can make sense of it all.
The line between the digital and offline society gradually gets blurrier as human interactions increasingly happen on, and jump between, online platforms and digital spaces.
Unsurprisingly, this merging of the tech-driven and the traditional doesn’t always happen smoothly. Governments have been particularly slow to catch up with the new world order, leaving digital society to its own devices when it comes to upholding law and order.
Recent events – like election meddling, the rise in suicides attributed to cyberbullying, and clashes between online and offline workforces – have, however, kickstarted government involvement across the globe, and we are starting to see increased interest in, and legislation aimed at, taming the digital world.
For those of us who operate in the space and must navigate the legislative jungle, it can be a challenge as politicians scramble to catch up and implement regulations.
With so much going on, it can also be hard for site owners to keep track of these different developments. But doing so is critical to stay compliant and it’s especially important for digital businesses looking to scale or expand into new markets.
It’s increasingly clear that there’s going to be a conflict of interest as user privacy, business goals, and government interests clash. Because the digital landscape is so complex, and because many politicians don’t really understand the inner workings of the Internet and the businesses that operate through it, many laws come out vague, impossible to fulfill, or drawn up without a true understanding of their full impact. This means that many of the recent legislative initiatives are hard to interpret and often highly controversial. Operating in this environment while making sure your business adheres to all relevant laws can be a legal minefield.
It also raises the question of just how effective the different regulations are. Do they really tackle the problems they mean to solve? Are they too fixated on holding online marketplaces and other digital players accountable for harmful user-generated content (UGC)? To what extent do they curb users’ rights rather than empower them? Can there ever be a ‘one size fits all’ solution that works both at a global and a local level?
Let’s take a closer look at some regulatory developments from around the world, consider the most prominent global trends in online safety legislation, and speculate what’s coming next.
Online regulations around the world
The following sections feature synopses of some of the most interesting safety-related stories from the last year or so. They all impact online marketplaces and classified sites in different ways, evidencing the complexities of featuring and curating UGC.
India: Banning sales of exotic animals
Online marketplaces in India have cracked down on attempts by users to disguise the illegal sale of rare and exotic animals (and their parts).
This comes after sites such as Amazon India, eBay, OLX, and Snapdeal were revealed to be among over 100 marketplaces where such items can be bought (an issue we covered in a blog post a while back).
Many items are listed under code names – such as ‘Australian Teddy Bear’ for koalas and the Hindi term for ‘Striped Sheet’ in place of tiger skin – but bigger sites are now actively working with government and wildlife protection officials to weed out offending posts.
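To make the detection side concrete, here is a minimal sketch of how a coded-keyword filter for listings might work. The code words come from the article above; the function name, structure, and the idea of mapping codes to real items are purely illustrative, not any marketplace’s actual system.

```python
# Minimal sketch: map known code words to the items they disguise,
# then flag listings whose text contains any of them.
# The two code words are from the article; everything else is invented.

CODED_TERMS = {
    "australian teddy bear": "koala",
    "striped sheet": "tiger skin",  # English gloss of the Hindi term
}

def flag_listing(title: str, description: str) -> list[str]:
    """Return the suspected real items behind any coded terms found."""
    text = f"{title} {description}".lower()
    return [item for code, item in CODED_TERMS.items() if code in text]

print(flag_listing("Australian Teddy Bear for sale", "rare collectible"))
```

In practice a simple substring match like this is only a first line of defense – sellers adapt their vocabulary quickly, which is why the larger sites pair keyword lists with human review and input from wildlife officials.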
EU: Take down terror content sooner or face fines
In April this year, the European Parliament voted in favor of a law that would give online businesses one hour (from being contacted by law enforcement authorities) to remove terrorist-related content, which becomes more dangerous the longer it stays live online.
Failure to comply with the proposed ruling could see businesses fined up to 4% of their global revenue. For smaller sites, however, a 12-hour grace period could be put in place.
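The rule as described above boils down to a deadline and a fine ceiling, which can be sketched in a few lines. This is only an illustration of the proposal’s arithmetic; the function names and the idea of a simple small-site flag are assumptions, not anything in the draft law.

```python
from datetime import datetime, timedelta

# Sketch of the proposed EU takedown rule as described above:
# one hour from notification for most sites, a possible 12-hour
# grace period for smaller ones, and fines of up to 4% of global
# revenue for non-compliance. Names and structure are illustrative.

def removal_deadline(notified_at: datetime, small_site: bool) -> datetime:
    grace = timedelta(hours=12) if small_site else timedelta(hours=1)
    return notified_at + grace

def max_fine(global_revenue: float) -> float:
    return 0.04 * global_revenue

print(removal_deadline(datetime(2019, 4, 1, 12, 0), small_site=False))
```

Even in this toy form, the asymmetry is visible: the compliance clock is measured in hours while the penalty is measured as a share of worldwide revenue.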
US: Safeguarding children’s data from commercial availability
In America, online shopping giant Amazon recently attracted scrutiny over the launch of its brightly-colored kids’ Echo Dot Alexa device – and over its use and storage of children’s data.
Despite the company’s assertion that its services comply with child protection legislation, privacy advocates and children’s rights groups are now urging the US Federal Trade Commission to investigate.
Canada: Illegal online sales of legal marijuana spark cybersecurity worries
America’s northern neighbor made medical and recreational cannabis completely legal last year. Since then, the Canadian government has taken significant steps to regulate the sale and distribution of marijuana – restricting it to licensed on- and offline dispensaries.
However, unlicensed black market Mail Order Marijuana services (MOMs) still dominate online sales – given their ability to undercut regulated sales on price, as well as their broader product variety and availability.
While many lawmakers are content to dismiss this gray area as ‘teething issues’, law enforcement agencies are taking it more seriously, citing cybersecurity concerns: in many cases, buyers are essentially financing, and handing their data over to, organized crime syndicates.
Britain: An online safety paradise?
In the UK, there have been several interesting developments in the online safety space. Firstly, in a bid to prevent youngsters from accessing sexual content online, Britain is banning access to online pornography for those who can’t legitimately verify that they’re of adult age.
In addition, a government whitepaper issued in April aims to make Britain the safest place to be online and calls for an independent regulator to ‘set clear safety standards, backed up by reporting requirements and effective enforcement powers’.
The paper, titled ‘Online Harms’, sets out plans to take tech companies beyond self-regulation and develop ‘a new system of accountability’. This would see a number of key developments take shape, including social media transparency reports, greater scrutiny to prevent fake news from spreading, and a new framework to help companies incorporate online safety features into apps and other online platforms.
Which trends will impact online marketplaces & classifieds sites the most?
It’s clear that there’s a lot of hype around online safety. But reading between the lines, it’s crucial to keep in mind the issues that are most likely to have a bearing on UGC-focused companies operating online.
Safety first, but liability still a grey area
Safeguarding users seems to be a prominent issue in all of this. However, there’s also an overwhelming need to protect the innocent victims featured in malicious and harmful user-generated content – as is the case with sex trafficking, revenge porn, and even exotic animals being sold.
There is, however, a strong argument that unless there’s clear evidence of a crime, the true perpetrator cannot be punished – and a piece of UGC provides exactly the proof that could hold criminals accountable.
But should facilitation and curation of harmful content be punishable? As we discussed in our recent video interview with Eric Goldman, law professor at Santa Clara University School of Law and co-founder of four ‘Content Moderation at Scale’ conferences, there’s a marked difference between how moderation, liability, and activity are treated, which has a number of bearings on how companies operating online should behave.
For example, in the US, Section 230 of the Communications Decency Act shields site owners from liability for content posted by their users. Crucially, it also leaves sites free to remove ‘obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable’ content without losing that protection.
In other countries, as in the UK and EU, where governments are setting their own frameworks, online marketplaces face prosecution in the event of a breach. The danger here is that companies instead focus on compliance rather than the needs of their customers and communities.
Limits on personal freedoms drive dangerous workarounds
Although the safety and liability message is being heard loud and clear, the need to balance personal freedoms with the eradication of harmful content is a key concern. While the intent is protection, ‘enforcement’ remains at odds with individual freedom.
For example, Britain’s online porn ban could arguably push youngsters into more nefarious ways of circumventing the restrictions – the easiest being TOR browsers and Virtual Private Networks.
As demonstrated in Canada, forcing users to explore darker, less regulated corners of the web potentially leaves them more vulnerable to attack by cybercriminals.
International enforcement remains problematic
Perhaps the biggest trend of particular concern to online marketplaces is the need to monitor, regulate, and abide by laws across different jurisdictions.
Laws pertaining to the sale of weapons, drugs, and other restricted items differ between countries, regions, and states. In addition, age restrictions can vary too.
For example, in Canada, edible marijuana products aren’t yet legal – and therefore cannot be sold online. However, in the US states where recreational cannabis is legal, so too are ‘edibles’.
While it’s not hard to imagine that an online age/ISP/location verification (or a simple ‘Where We Deliver’ policy) would solve such issues, the fact remains that these factors have major ramifications for sites that operate internationally.
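As a rough sketch of how such a ‘where we deliver’ gate might work in code: restricted categories carry an allow-list of regions, and anything unlisted is treated as unrestricted. The category names, region codes, and function names here are all invented for illustration – real jurisdiction rules are far more granular.

```python
# Hedged sketch of a "where we deliver" check. Categories, region
# codes, and the allow-list itself are illustrative assumptions.

ALLOWED: dict[str, set[str]] = {
    # Edibles: legal in some US states but (per the article) not
    # yet sellable in Canada, so "CA" is deliberately absent.
    "cannabis-edibles": {"US-CO", "US-CA"},
}

def can_deliver(category: str, region: str) -> bool:
    """Allow unrestricted categories everywhere; gate restricted ones."""
    restricted = ALLOWED.get(category)
    return restricted is None or region in restricted

print(can_deliver("cannabis-edibles", "CA"))     # Canada
print(can_deliver("cannabis-edibles", "US-CO"))  # Colorado
```

The hard part isn’t this lookup – it’s keeping the allow-list itself current as laws shift region by region, which is exactly why these factors have major ramifications for sites operating internationally.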
And given that there’s rife speculation that Amazon could soon sell cannabis, it’s only a matter of time before these issues take center stage – which can ultimately only be positive for governments and marketplaces alike.
One size doesn’t fit all
Scale is also an important factor to keep in mind. Laws and regulations that are designed to curb the huge amount of data that larger marketplaces curate can’t be deployed in the same way by smaller outfits. And vice versa.
Governments are taking online businesses to task for failing to police their sites appropriately. And while they may be right to do so, it can be tough for marketplaces of any size to employ enough resources to professionally cover their content moderation needs.
Ultimately, we’re still in the ‘Wild West era’ of online regulation. What’s acceptable is very much culture-led; which is why we continue to see such diverse developments at a global and local level.
For example, in Thailand, where the King is held in utmost regard, any content pertaining to him must be strictly moderated and often removed – unthinkable even in another ‘royal’ nation like the UK. General common sense can’t prevail in such a disparate regulatory environment where user attitudes are so polarized.
In addition, the involvement of governments in setting a best practice framework all too often means that those championing issues like censorship, privacy, and accessibility online aren’t the experts in these matters.
What we’d hope is that moving forward, governments continue to work proactively alongside large and small industry players to understand the true nature of the challenges they face, and foster better relationships with them, in order to create an effective, lasting, best practice solution that benefits users but is also realistically achievable by the online businesses.
We saw this recently at a European Parliament-run content moderation conference, where leading lights from some of the world’s best-known technology companies gathered to share their ideas and challenges with politicians.
However, variety (as they say) is the spice of life. Standardizing the international regulatory environment wouldn’t be effective given the rich diversity of content moderation practices and culturally driven needs. What could work, though, is an adaptable set of guidelines that nations could adopt and customize to suit their user base – a framework informed by both users and online marketplace owners themselves to map out the limits of acceptability. The only problem could be that the nature of UGC constantly changes in line with the way technology impacts our lives.
All things considered, going forward online marketplaces and classifieds sites will need to pay even closer attention to the trends, safety regulations, and legislation being set locally and globally.
Otherwise, they may quickly be shut down for being non-compliant.
The new laws can be hard to navigate, and it can be even harder to implement the actions, manpower, and tech needed to be compliant. Companies like Besedo are set up to help businesses like yours get everything in place in a fraction of the time and at a lower cost than going it alone.
If you feel you could use a hand with your content moderation strategy let us know and we’ll be happy to review your current setup and suggest beneficial alterations.