
Online safety & legislation trends impacting online marketplaces


    We are seeing an increase in legislation aimed at the digital world, worldwide. What does that mean for online marketplaces? Are there any trends we can already see, and what can we expect in the future? We’ve taken a deep dive into the legal pool to see if we can make sense of it all.

    The line between digital and offline society is gradually blurring as human interaction increasingly happens on, and jumps between, online platforms and digital spaces.

    Unsurprisingly, this merging of the tech-driven and the traditional doesn’t always happen smoothly. Governments have been particularly slow to catch up with the new world order, leaving digital society to its own devices when it comes to upholding law and order.

    Recent events, like election meddling, rising suicide rates attributed to cyberbullying, and clashes between online and offline workforces, have, however, kickstarted government involvement across the globe, and we are starting to see increased interest in, and legislation aimed at, taming the digital world.

    For those of us who operate in the space and must navigate the legislative jungle, it can be challenging as politicians scramble to catch up and implement regulations.

    With so much going on, it can also be hard for site owners to track these developments. But doing so is critical to staying compliant, and it’s especially important for digital businesses looking to scale or expand into new markets.

    It’s increasingly clear that there will be a conflict of interest as user privacy, business goals, and government interests clash. Because of the complexity of the digital landscape, and because many politicians don’t understand the inner workings of the Internet and the businesses that operate through it, many laws come out vague, impossible to fulfill, or drawn up without a true understanding of the full impact they will have.

    Many recent legislative initiatives are hard to interpret and often highly controversial. Operating in this environment and ensuring your business adheres to all relevant laws can be a legal minefield.

    It also raises the question of just how effective the different regulations are. Do they tackle the problems they mean to solve? Are they too fixated on holding online marketplaces and other digital players accountable for harmful user-generated content (UGC)? To what extent do they curb users’ rights rather than empower them? Can there ever be a ‘one size fits all’ solution that works both globally and locally?

    Let’s examine some regulatory developments worldwide, consider the most prominent global trends in online safety legislation, and speculate about what’s next.

    Online regulations around the world

    The following are synopses of some of the most interesting safety-related stories from the last year or so. They all impact online marketplaces and classified sites in different ways, evidencing the complexities associated with featuring and curating UGC.

    India: Banning sales of exotic animals

    Online marketplaces in India have cracked down on users’ attempts to disguise the illegal sale of rare and exotic animals (and their parts).

    This comes after sites such as Amazon India, eBay, OLX, and Snapdeal were revealed to be among over 100 marketplaces where such items can be bought (an issue we covered in a blog post a while back).

    Many items are listed under code names – such as ‘Australian Teddy Bear’ for koalas and the Hindi term for ‘Striped Sheet’ in place of tiger skin – but bigger sites are now actively working with government and wildlife protection officials to weed out offending posts.
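    To make the mechanics of ‘weeding out’ such posts a little more concrete, here is a minimal Python sketch of a code-name filter, assuming a hand-maintained alias list. The alias map, Listing structure, and example listing are hypothetical illustrations, not any marketplace’s actual moderation rules (and the real tiger-skin code word is a Hindi phrase, shown here in translation).

```python
# Minimal sketch of a code-name filter for wildlife listings.
# The alias map and Listing structure are hypothetical examples.
from dataclasses import dataclass

# Known code names mapped to the restricted item they disguise.
# Real lists would be larger, multilingual, and maintained with wildlife experts.
CODE_NAME_ALIASES = {
    "australian teddy bear": "koala",
    "striped sheet": "tiger skin",  # English translation of the Hindi code word
}

@dataclass
class Listing:
    listing_id: str
    title: str
    description: str

def find_restricted_aliases(listing: Listing) -> list[str]:
    """Return the restricted items whose code names appear in the listing text."""
    text = f"{listing.title} {listing.description}".lower()
    return [item for alias, item in CODE_NAME_ALIASES.items() if alias in text]

# A listing like this would be queued for human review rather than auto-removed.
suspect = Listing("123", "Australian Teddy Bear for sale", "Rare find, serious buyers only")
print(find_restricted_aliases(suspect))  # ['koala']
```

    Keyword lists alone are easy to evade (which is precisely why sellers invent code names in the first place), so in practice a filter like this would only be one signal feeding human review, alongside image checks and seller history.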

    EU: Take down terror content sooner

    In April this year, the European Parliament voted in favor of a law that would give online businesses one hour (from being contacted by law enforcement authorities) to remove terrorist-related content, which becomes more dangerous the longer it stays live online.

    Failure to comply with the proposed ruling could see businesses incur a fine of up to 4% of their global revenue. However, a 12-hour grace period could be put in place for smaller sites.
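    As a rough illustration of what complying with such a deadline would involve operationally, here is a small Python sketch that computes a removal deadline from the moment a platform is notified, using the one-hour window described above and the possible 12-hour allowance for smaller sites. The `is_small_platform` flag is a simplification for illustration; who qualifies for any grace period would be defined by the regulation, not by this sketch.

```python
# Sketch of a takedown-deadline calculation under the proposed EU rule.
# The one-hour window and 12-hour grace period follow the description above;
# how "smaller sites" are defined is left to the regulation, not this code.
from datetime import datetime, timedelta

STANDARD_WINDOW = timedelta(hours=1)
SMALL_SITE_WINDOW = timedelta(hours=12)

def removal_deadline(notified_at: datetime, is_small_platform: bool) -> datetime:
    """Deadline by which flagged terrorist content must be taken down."""
    window = SMALL_SITE_WINDOW if is_small_platform else STANDARD_WINDOW
    return notified_at + window

notice = datetime(2019, 4, 17, 9, 30)
print(removal_deadline(notice, is_small_platform=False))  # 2019-04-17 10:30:00
print(removal_deadline(notice, is_small_platform=True))   # 2019-04-17 21:30:00
```

    In practice, a one-hour window effectively forces law-enforcement notices into an around-the-clock, on-call moderation queue rather than a daily review batch.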

    USA: Safeguarding children’s data from commercial availability

    In America, online shopping giant Amazon recently attracted scrutiny over the launch of its brightly-colored kids’ Echo Dot Alexa device – and the use and storage of children’s data.

    Despite the company’s assertion that its services comply with child protection legislation, privacy advocates and children’s rights groups are now urging the US Federal Trade Commission to investigate.

    Bills known as COPPA 2.0 and KOSPA are also making their way through Congress.

    Canada: Regulating legal cannabis sales

    America’s northern neighbor made medical and recreational cannabis completely legal last year. Since then, the Canadian government has taken significant steps to regulate the sale and distribution of marijuana – restricting it to licensed on- and offline dispensaries.

    However, unlicensed black market Mail Order Marijuana services (MOMs) still dominate online sales – given their ability to undercut regulated sales on price, as well as their broader product variety and availability.

    While many lawmakers are content to dismiss this gray area as ‘teething issues’, law enforcement agencies are taking it more seriously, citing cybersecurity concerns: in many cases, buyers are essentially financing, and handing their data to, organized crime syndicates.

    UK: An online safety paradise?

    In the UK, there have been several interesting developments in the online safety space. Firstly, in a bid to prevent youngsters from accessing sexual content online, Britain is banning access to online pornography for those who can’t legitimately verify that they’re of adult age.

    In addition, a government whitepaper issued in April aims to make Britain the safest place to be online and calls for an independent regulator to ‘set clear safety standards, backed up by reporting requirements and effective enforcement powers’.

    The paper, ‘Online Harms’, sets out plans to take tech companies beyond self-regulation to develop ‘a new system of accountability’. This would see a number of key developments take shape, including social media transparency reports, greater scrutiny checks to prevent fake news from spreading, and a new framework to help companies incorporate online safety features into apps and other online platforms.

    It’s clear that there’s a lot of hype around online safety. But reading between the lines, it’s crucial to keep in mind the issues that are most likely to have a bearing on UGC-focused companies operating online.

    Safety first – liability still a gray area

    Safeguarding users seems to be a prominent issue. However, there’s also an overwhelming need to protect the innocent victims featured in malicious and harmful user-generated content – as is the case with sex trafficking, revenge porn, and even the illegal sale of exotic animals.

    However, there’s a strong argument that unless there’s clear evidence of a crime, the true perpetrator cannot be punished; a piece of UGC can provide exactly the proof needed to hold criminals accountable.

    But should facilitation and curation of harmful content be punishable? As we discussed in our recent video interview with Eric Goldman, law professor at Santa Clara University School of Law and co-founder of four ‘Content Moderation at Scale’ conferences, there’s a marked difference between how moderation, liability, and activity are treated, which has a number of bearings on how companies operating online should behave.

    For example, in the US, Section 230 of the Communications Decency Act shields users and site owners from liability for content posted by others. At the same time, it leaves sites free to remove ‘obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable’ content in good faith.

    In other jurisdictions, such as the UK and the EU, where governments are setting their own frameworks, online marketplaces face prosecution in the event of a breach. The danger here is that companies focus on compliance rather than the needs of their customers and communities.

    Limits on personal freedoms drive dangerous workarounds

    Although the safety and liability message is being heard loud and clear, the need to balance personal freedoms with the eradication of harmful content is a key concern. While the intent is protection, the notion of ‘enforcement’ remains at odds with individual freedom.

    For example, Britain’s online porn ban could arguably push youngsters toward more nefarious ways of circumventing the restrictions; the easiest way for users to bypass online blocks is to use the Tor browser or a Virtual Private Network.

    As demonstrated in Canada, forcing users to explore darker, more unregulated areas of the web can potentially make them more vulnerable to cybercriminals’ attacks.

    International enforcements

    Perhaps one of the biggest trends we can see, and one that is particularly concerning for online marketplaces, is the challenge of monitoring and abiding by laws across different jurisdictions.

    Laws governing the sale of weapons, drugs, and other restricted items differ between countries, regions, and states. Age restrictions can also vary.

    For example, in Canada, edible marijuana products aren’t yet legal – and, therefore, cannot be sold online. However, in the US states where recreational cannabis is legal, so too are ‘edibles’.

    While it’s not hard to imagine that online age, ISP, or location verification (or a simple ‘Where We Deliver’ policy) could solve such issues, these differences have major ramifications for sites that operate internationally.
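    To sketch what such a check might look like on the marketplace side, the snippet below gates an item category on the buyer’s jurisdiction and age against a per-jurisdiction policy table. The policy entries, category names, and age threshold are illustrative assumptions based on the examples above, not legal guidance.

```python
# Sketch of a jurisdiction-aware "can we offer this item here?" check.
# The policy table and its values are illustrative placeholders only.
from dataclasses import dataclass

# category -> jurisdiction -> (allowed, minimum buyer age)
POLICY = {
    "cannabis_edibles": {
        "CA": (False, None),   # not yet legal to sell in Canada, per the example above
        "US-CO": (True, 21),   # a US state where recreational cannabis is legal
    },
}

@dataclass
class Buyer:
    jurisdiction: str
    age: int

def can_offer(category: str, buyer: Buyer) -> bool:
    """Return True if this item category may be offered to this buyer."""
    allowed, min_age = POLICY.get(category, {}).get(buyer.jurisdiction, (False, None))
    if not allowed:
        return False
    return min_age is None or buyer.age >= min_age

print(can_offer("cannabis_edibles", Buyer("CA", 30)))     # False
print(can_offer("cannabis_edibles", Buyer("US-CO", 22)))  # True
```

    Even a toy table like this makes the operational cost visible: every new market multiplies the rules to maintain, and the data behind each row has to come from legal review rather than engineering guesswork.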

    And given that there’s rife speculation that Amazon could soon sell cannabis, it’s only a matter of time before these issues take center stage – which can ultimately only be positive for governments and marketplaces alike.

    One size doesn’t fit all

    Scale is also an important factor to consider. Laws and regulations designed to curb the huge amount of data that larger marketplaces curate can’t be deployed in the same way by smaller outfits, and vice versa.

    Governments are suing online businesses for failing to police their sites appropriately. While they may be right to do so, it can be tough for marketplaces of all sizes to allocate enough resources to manage content moderation professionally.

    Looking ahead

    Ultimately, we’re still in the ‘Wild West era’ of online regulation. What’s acceptable is very much culture-led, which is why we continue to see such diverse global and local developments.

    For example, in Thailand, where the King is held in the utmost regard, any content pertaining to him must be strictly moderated and often removed—unthinkable even in another ‘royal’ nation like the UK. General common sense can’t prevail in such a disparate regulatory environment where user attitudes are so polarized.

    In addition, governments’ involvement in setting a best practice framework all too often means that those championing issues like censorship, privacy, and accessibility online aren’t the experts in these matters.

    We hope that, moving forward, governments will continue to work proactively alongside industry players large and small to understand the true nature of the challenges they face, and to foster the better relationships needed to create an effective, lasting, best-practice solution that benefits users and is also realistically achievable for online businesses.

    We saw this recently at a European Parliament-run content moderation conference, where leading lights from some of the world’s best-known technology companies gathered to share their ideas and challenges with politicians.

    However, variety (as they say) is the spice of life. Standardizing the international regulatory environment wouldn’t be effective given the rich diversity of content moderation practices and culturally driven needs.

    What could work, though, is an adaptable set of guidelines that nations could adopt and customize to suit their user base – a framework that could be informed by both users and online marketplace owners themselves to map out the limit of acceptability. The only problem could be that the nature of UGC constantly changes in line with the way in which technology impacts our lives.

    All things considered, going forward, online marketplaces and classified sites will need to pay even closer attention to the trends, safety regulations, and legislation being set locally and globally.

    Otherwise, they may quickly be shut down for being non-compliant.

    The new laws can be hard to navigate, and it can be even harder to implement the actions, manpower, and tech needed to be compliant. Companies like Besedo are set up to help businesses like yours get everything in place in a fraction of the time and at a lower cost than going it alone.

    If you feel you could use a hand with your content moderation strategy, let us know and we’ll be happy to review your current setup and suggest beneficial alterations.
