User-generated content, or UGC, can open businesses up to a great deal of risk. While the text, video, and audio that a user creates is generally tied to their profile and listed under their name, all of that content is nonetheless presented as part of the business’s identity. Even if the effect is only unconscious, interacting with unwanted content can significantly affect a brand’s reputation – and if a user is led off-platform by a piece of UGC, any negative consequences are likely to remain associated in their mind with the platform they came from.

In spite of this, the adoption of UGC in online business continues apace. This is obviously the case for companies for which content and interaction are their raison d’être, such as social media networks, but it makes less obvious sense for those where this content is optional, such as online retail. If you can trade without it, why take the risk?

The consequences of creativity

There are many good answers to this question, of course. UGC brings users’ endorsements of the platform’s value into the heart of the user experience. It creates feedback loops of interaction which encourage people to stay on the platform. It enables businesses to develop a richer identity and culture with less up-front investment.

It could be argued, however, that these and other answers are really just examples of a larger, more fundamental point about the value of UGC. Where interacting with a traditional business means choosing options from a menu of what that business thinks a person might like, UGC capabilities give people a much more authentic sense of choice and agency. This freedom to shape one’s own path is what leads to all other outcomes, whether positive or negative.

People, in short, love to create, and it is the emotional driver of creativity that businesses are tapping into when they allow users to set up their own storefront, or create and join sub-communities, or craft an online dating profile which really feels like them.

As in the physical world, however, that freedom and agency does not come without potential issues. The freedom for users to present their authentic selves could be misused to imitate someone else. The creative leeway for sellers to brand their online shopfronts could be misused as an opportunity to lead buyers away to other platforms.

The consequences of creativity, then, are emotional fulfilment – but also a serious threat to a business’s sustainability.

The art of moderating art

Understanding creativity as the driver of the value businesses can glean from UGC, however, has important consequences for how we might think about managing and moderating that content.

In the example of users being led off-platform, for instance, the immediate consequences might include revenue loss, as users transact outside of the platform’s channel, and reputational damage, when users who are out of reach of the platform’s protections suffer losses. A traditional view of content moderation might be to maximise the ability to identify and eliminate these interactions; a well-trained AI system can spot signs of such activity, such as disguised URLs or phone numbers, and escalate cases to a human team of moderators when the nature of the interaction is ambiguous.
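
To make the ‘spot and escalate’ idea concrete, here is a minimal sketch of such a rule layer in Python. The patterns, thresholds, and routing labels are invented for illustration – this is not Besedo’s actual system, and a production setup would pair simple rules like these with a trained model:

```python
import re

# Illustrative patterns for off-platform contact attempts.
OBFUSCATED_PHONE = re.compile(r"(?:\d[\s\-.]*){7,}")  # "0 7 7-0 0 9 0 0 1 2 3" etc.
DISGUISED_URL = re.compile(
    r"\b\w+\s*(?:\(?\s*dot\s*\)?|\[\.\])\s*(?:com|net|org|io)\b", re.IGNORECASE
)

def route_message(message: str) -> str:
    """Decide what to do with a chat message: allow, escalate, or block."""
    signals = [p.search(message) is not None
               for p in (OBFUSCATED_PHONE, DISGUISED_URL)]
    if all(signals):
        return "block"     # multiple strong signals of off-platform steering
    if any(signals):
        return "escalate"  # ambiguous: queue for a human moderator
    return "allow"

print(route_message("lovely bike, more pics at mysite dot com"))  # escalate
```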

A mature approach might further use those tools to generate insight into how a platform is performing, and what the context of inappropriate actions tends to be. If issues are consistently being flagged around a particular product category, or in certain markets, businesses can take action such as modifying the user interface or adding targeted warning messages to make those events less likely.

If, on the other hand, we see what users are doing on a platform not just as interaction, but as creativity, that might point us towards the need to use moderation in a way which maximises their scope for self-expression. Rather than relying only on the ‘stick’ approach of punishing bad content – which will always shift the experience closer to the traditional model of having limited options from a business – we can also look to offer a ‘carrot’ approach which avoids a sense of limitation on what users can do.

This might, for instance, involve automatically promoting content which closely matches the brand’s values to the forefront of a user’s experience, giving them a clear social model of how they could or should behave on the platform. It might respond to potentially problematic content by asking the user to reconsider their approach, rather than immediately putting it in a queue for approval by a human moderator. It might even allow people to manage what kinds of content they are comfortable seeing, giving other users greater leeway to express themselves freely.
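
One way to picture this blend of carrot and stick is as a tiered routing policy rather than a binary keep-or-delete decision. Below is a minimal sketch, assuming hypothetical scores from upstream classifiers; the names and thresholds are illustrative, not a real system:

```python
from dataclasses import dataclass

@dataclass
class Scores:
    """Hypothetical outputs from upstream classifiers, each in [0, 1]."""
    brand_fit: float  # how closely the content matches the brand's values
    risk: float       # estimated likelihood the content breaks the rules

def route_content(s: Scores) -> str:
    """Tiered policy: remove, nudge, promote, or simply publish."""
    if s.risk >= 0.7:
        return "remove"   # the 'stick': clearly bad content is taken down
    if s.risk >= 0.4:
        return "nudge"    # ask the author to reconsider before publishing
    if s.brand_fit >= 0.8:
        return "promote"  # the 'carrot': surface exemplary content
    return "publish"

print(route_content(Scores(brand_fit=0.9, risk=0.05)))  # promote
```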

Ultimately, the goal of offering UGC options is to attract and retain the users who best match a brand’s personality and values. That means allowing them to exercise their creative instincts – and content moderation tools can be just as valuable here as they are for limiting inappropriate speech.

 

Find out more about working with us and request a demo today.


By Otis Burris

VP – Partnerships, Mergers & Acquisitions

In the technology industry, we like to talk a lot about ‘disruption’. Digitalizing something, we sometimes imagine, means that it will play by completely new rules, and moving it online means that we should forget everything we thought we knew about it.

In some ways this is true – but while the technology which underpins a sector might change, it is still ultimately selling to human beings. This means that, to a greater degree than we often acknowledge, the underlying nature of digital businesses is inherited from their pre-internet, analogue reality. The buying motivations for food or clothing, for example, are similar whether the shopper is on the high street or on an app.

Gaming is no different in this regard. Play is at least as old as civilization, and the emotional motivators of competition, cooperation, and self-improvement are as present in esports and online gaming as they are in any previous form of gamesmanship. Likewise, while history doesn’t record how the earliest sportspeople spoke to one another, it’s a fair guess that verbal communication has always been key to the pleasure we get from play: the meta-game of talking to one’s opponent is a fundamental ingredient.

The challenge of words and play

At the same time, online gaming does deliver something new in its ability to match players from across the globe, the speed at which communities develop and evolve around games, and – importantly – the responsibilities that a business therefore has to its game’s userbase.

The ability of online gaming to transcend geography sets it apart from non-digital play, where only elite players are likely to face international opponents. While this adds to games’ appeal, it also, in common with many online platforms, creates the potential for users to engage in damaging behaviors.

The fact that verbal abuse – often targeting things like a player’s age, gender, or ethnicity – is widespread in online gaming and difficult to combat has been widely recognized in academic studies. One recent analysis of how players interact in Dota 2, for example, worryingly found that, while younger players are more likely to be penalized for ‘communications abuse’, older players are more likely to actually take part in verbal abuse, suggesting that systems for managing abuse are not being properly targeted.

This difficulty in moderation is exacerbated by both the ambiguity of acceptable speech and the specific, rapidly changing language which is used in online games. “git gud”, for example, might be used abusively against a particular person, or just to bemoan one’s own lack of skill; “get rekt” might be part of a perfectly acceptable victory celebration, or a part of focused negative attention on an opposing player.
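
To see why a simple keyword list falls short here, consider a toy check in which the verdict depends on context rather than on the phrase itself – who it is aimed at, and how often. The signals and thresholds below are invented for illustration:

```python
def looks_abusive(phrase: str, directed_at_opponent: bool,
                  repeats_in_window: int) -> bool:
    """Toy context check: the same phrase can be banter or abuse."""
    ambiguous = {"git gud", "get rekt"}
    if phrase.lower() not in ambiguous:
        return False  # out of scope for this toy example
    # Self-directed use ("I need to git gud") is rarely abuse.
    if not directed_at_opponent:
        return False
    # Said once, it may be a victory celebration; repeated targeting of
    # the same player starts to look like focused negative attention.
    return repeats_in_window >= 3

print(looks_abusive("get rekt", directed_at_opponent=True, repeats_in_window=1))  # False
print(looks_abusive("get rekt", directed_at_opponent=True, repeats_in_window=5))  # True
```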

It’s an environment which poses real challenges to both AI and human moderation approaches. For humans, the speed and volume of interactions makes thorough oversight difficult; for AI, the variability of the language and its contextual nuance makes keeping up a tall order.

The power of speech

Ultimately, however, there is a clear need to carry the verbally interactive aspects of play forward into online gaming. A successful online game is one that not only brings players in, but keeps them engaged, constructing a community around the competition. While the negative potential of communication is a risk to that, the ability to communicate is also a key ingredient for sustainable growth.

Getting it right, therefore, means mirroring the complex nature of these interactions with a nuanced approach to content moderation. That means developing systems which combine the best of human and automated oversight in bespoke ways which are specific to the nature and dynamics of the game’s community.

Rules, after all, are also part of the essence of games. Just as referees in real-world sports keep play within appropriate limits, moderation of speech in online gaming can be seen not just as a way of punishing negative behavior, but as an opportunity to enable positive interactions. Sometimes, a positive interaction will be one in which players have the space and opportunity to taunt, criticize, and motivate one another; content moderation needs to evolve to keep up with that fact.

 

Find out more about working with us and request a demo today.

 


By Petter Nylander

CEO

Does UGC damage trust in the sharing economy?

The term ‘sharing economy’ is famously difficult to define. For some, it refers to any digital platform that connects people more directly than traditional business models. For others, a business is only truly in the sharing economy if it enables people to make money out of things they would buy and own anyway.

What all forms of the sharing economy share, though, is a reliance on trust. Whether you are hailing a ride, staying in someone’s spare room, borrowing a lawn mower, or paying someone to do a small one-off job, you’re entering into a transaction which starts with a decision to trust a stranger.

The difficulty of encouraging that decision is exacerbated by the fact that, from the user’s perspective, getting this wrong is potentially a high-stakes issue: while sometimes it might mean merely getting a dissatisfying product, interactions like borrowing a car or renting a room can pose serious risks to health and wellbeing.

Content’s double-edged sword

The question for platforms, then, is what kinds of structures and tools best encourage both positive outcomes and – almost as importantly – a sense of trust amongst the userbase.

Alongside approaches like strict rules on what can be listed and physical checks of users’ offerings, many sharing economy platforms turn to user-generated content (UGC) for this purpose. User reviews, photos of what is on offer, communication options, and even selfies can all help to humanise the platform, validate that users are real people, and generate a sense of trustworthiness.

At the same time, however, allowing UGC can open the door to specific risks. Low-quality images, for example, can worry people and erode trust, while giving users more control over how listings are presented creates greater potential for scams, fraud, and fake profiles. A permissive approach to content can also lead to users conducting business off-platform, side-stepping both safety and monetisation systems.

This is why there is such a variety of approaches to UGC in the sharing economy. Where some platforms, like Airbnb, encourage users to share as much about themselves and their property as possible, others, like Uber, allow only a small selfie and the ability to rate riders and drivers out of five stars. In between these open and defensive approaches, there are any number of combinations of content and sharing permissions a business might choose – but what delivers the best outcome?

Using the carrot, not just the stick

Intuitively, many might assume that the platforms which feel safest will be those with the strictest rules, only allowing interaction between users when absolutely necessary and banning those who engage in damaging behaviour. In recent research, a group of organisational psychologists described this as ‘harsh’ regulation, as opposed to the ‘soft’ regulation of supporting users, encouraging interaction, and influencing them to engage in positive behaviour.

Perhaps surprisingly, the research found that soft regulation has a stronger positive impact than harsh regulation. The sharing economy, after all, digitalises something humans have always done in the physical world: try to help one another in mutually beneficial ways. Just as we take our cues on how to behave in everyday life from the people around us, seeing positive engagements on platforms sets a standard for how we treat each other – and trust each other – in digital spaces. Being able to talk, share, and humanise helps people to engage, commit, and trust.

This suggests that we may need to shift how we think about managing content in order to make the most of its potential to drive long-term growth. Content moderation is seen, first and foremost, as a way of blocking unwanted content – and that’s certainly something it achieves. At the same time, though, having clear insight into and control over how, when, and where content is presented gives us a route towards lifting the best of a platform into the spotlight and giving users clear social models of how to behave. In essence, it’s an opportunity to align the individual’s experience with the best a platform has to offer.

Ultimately, creating high-trust sharing economy communities is in everyone’s best interest: users are empowered to pursue new ways of managing their daily lives, and businesses create communities where people want to stay, and promote to their friends and family, for the long term. To get there, we need to focus on tools and approaches which enable and promote positive interactions.

 

Find out more about working with us and request a demo today.

 


By Otis Burris

VP – Partnerships, M&A

For many, online dating is now the default way to meet new people. As we become increasingly time-poor, digital devices have become the way we navigate our day-to-day lives and our interactions with others, including our relationships.

Despite attitudes towards dating apps becoming more positive and platforms gaining popularity in recent years, throughout their short history they have attracted a great deal of attention for the risks they pose to users. While dating apps are an incredibly convenient way to maintain our love lives, they come with their own threats.

Risk vs risqué

Like any form of dating, connecting with strangers doesn’t come without risk. This is also the case when using an online dating platform. The exchange of information, be it a phone number, an address, or other personal details, can be exploited if it falls into the wrong hands. Dating scams, catfishing, and abuse attract headlines – and for platforms, advertising, misuse, and nudity also threaten to damage the user experience and brand reputation.

Finding the right balance between restricting content to protect users and allowing organic interactions to flourish is crucial if platforms are to grow and realise their true potential. The power of online dating is its ability to make connections virtually; the freedom which makes it possible to engage in negative interactions is also what makes genuine, authentic, and meaningful relationships possible.

Growing a dating platform means harnessing the opportunities in the content it creates. Platforms cannot be seen to ‘scaremonger’ users, but it’s imperative that they provide substantial safety features and guidelines to protect users and brand reputation, whilst using technology to enhance the user experience and focusing on retention to grow.

Creating a safe space, without killing the mood

The recent context of lockdowns demonstrated the power of online dating; even without in-person interaction, it functioned as a place to make human connections. It works best, therefore, when it delivers the same surprise, joy, and meaningfulness of speaking to someone new in real life.

With online dating, it is tempting to see shutting down opportunities to interact as the only way to remove risk. But this isn’t what users want. They want to feel protected by, and trusting of, the systems in place so they can interact in confidence. It is now an expectation, not a ‘nice to have’, for platforms to filter out all harmful content, from fake profiles to indecent imagery. A sophisticated app that lets users interact with whom they choose is more likely to build brand loyalty than one that blocks every connection which could be deemed harmful.

An engaging and reliable messaging experience is the foundation of retention on a successful dating platform. Creating a positive space to connect, however, relies on really understanding how people use the platform and what works for them. With many users engaging in conversations to meet new partners, it’s important that technology doesn’t get in the way and ‘kill the mood’ with an unstable or over-censored chat platform.

Content moderation can help strike the right balance. As well as blocking the most objectionable – or illegal – content, it delivers insight that enables dating sites to encourage sincere, positive behaviours. Online dating is a space of rapid innovation and as brands create new ways to help people connect more effectively, platforms need to ensure interactions remain safe, with custom moderation approaches.

 

Ultimately, stopping deceitful users from harming the user experience and removing unwanted content to keep people safe will protect brand reputations. With content moderation, your dating site can become the brand you want it to be.  

 

Find out more about working with us and request a demo today.


By Edmond Vassallo

Head of Customer Success

Why staying ahead by moderating new content formats is key to business growth

Media consumption is rapidly shifting from text to images to video. Social video now generates 1200% more shares than text and image content combined. It won’t stop there: this year we have also seen audio formats like podcasts and voice notes drive increasing amounts of engagement.

Many companies have responded to changes in content consumption and enriched their customer experience by adding video formats. Since Instagram introduced Stories and Reels, more marketplaces have incorporated videos of products, as well as allowing users to post video reviews so shoppers can see items in action. Adapting the user experience to new behaviours drives business results, too: consumers who end up on an e-commerce site through a user-generated video are 184% more likely to purchase – and spend 45% more.

As new types of content are added to enhance the customer experience and drive business growth, businesses become ever more shaped by the content their users share. This is a powerful opportunity to take the transformative interactivity of digital technology to the next level, building a brand’s userbase into its identity and value. It’s also a risk, of course – no matter the format, if content is harmful to the user, it’s harmful to the brand.

This is why content moderation technology is critical. If content is not moderated, harmful content goes undetected and valuable content goes underutilised. When users see content which damages their experience, their loyalty suffers and, ultimately, business growth is diminished.

Act now – reacting to harmful content is too late

Businesses must have proactive control over the user-generated content (UGC) on their website. It’s not enough to react when a harmful video appears on the site, as the damage will already be done. Businesses need to take a preventative approach where they stay ahead of damaging UGC to protect their users.

As more and more user-generated content is created, especially across several different formats, it becomes harder for businesses to keep up with content moderation needs. Businesses must constantly adapt their content moderation efforts to new consumer behaviours, but this product development takes time, and developer teams are placed under pressure to deliver constant innovation to handle every type of content. It’s a challenge that is time-consuming and requires constant agility – but left unaddressed, the potential for damage to the customer experience is huge.

CTOs and the tech teams they lead have a lot on their plate. Across different industries we can see incredible innovation happening with UGC, making the customer experience more sincere, more useful, more delightful, and more meaningful by empowering user interaction. Managing that content, however, can force an unpleasant choice between building that capability in house (and stretching developers yet thinner) and buying a solution off the shelf (and risking the discovery that it is not truly fit for purpose).

Lean on an expert business partner

Technologies like video UGC are powerful growth drivers – but without control over what appears on the platform, that growth is unsustainable. So how can businesses ensure their teams can keep up with moderation needs whilst reaping the benefits new formats provide to the user experience at the same time?

Working with a business partner who can offer content moderation tools that address new formats like video is extremely valuable. At Besedo we are currently developing moderation for video content. Outsourcing means the partner becomes a source of innovation for the business – and because our teams specialise in content moderation alone, we also take on the pressure of product development.

At the same time, Besedo offers a partnership, not just off-the-shelf tools, so moderation is fit for purpose and tailored to needs. We realise that every company’s UGC needs are different and build customised content moderation solutions that address them – from dating apps where users are connecting, to marketplaces where sellers are interacting with customers.

An expert partner can also ensure that all types of UGC are moderated proactively. We use a combination of artificial intelligence and human moderation to ensure no bad content slips through the cracks. Businesses need a trusted partner with whom they know their user experience will be safe.

When harmful content is prevented, the user experience improves, users will stay loyal, and lifetime user value increases. Being able to innovate your approach to content moderation to keep up with users’ content consumption habits is key to sustainable growth.

 

Find out more about working with us and request a demo today.


By Maxence Bernard

Global Head of Product – Besedo / Implio

What social shopping teaches us about the future of content

In June, Etsy committed to one of the largest ecommerce industry acquisitions of all time when it announced it would pay $1.625bn for the UK-based fashion marketplace Depop. It’s a bold move that represents an enormous vote of confidence in the future of peer-to-peer selling.

Marketplace platforms like Depop are not, of course, entirely new. In fact, eBay might have been the first truly breakout dotcom success, making the idea of buying online familiar to millions. Even so, we can foresee a much more social future for retail. Depop, after all, is distinguished partly by a user experience which is much more like using social media than traditional retail channels. A big component of what Etsy is buying isn’t just the revenue that Depop draws, but its style and userbase, which The Guardian describes as ‘mostly under 26 and mark[ing] out the future direction of retail: more online, sustainable and social’.

Between the acceleration of digital adoption brought about by the pandemic and a growing consumer concern with sustainability, the appeal of Depop is clear. This does not, however, mean that things will necessarily be plain sailing for Etsy’s new acquisition. Indeed, brands with similar offerings to Depop have struggled recently, with Poshmark losing over half of its value since going public at the start of the year.

A question of freedom and trust

The question, then, is how these businesses can become truly sustainable – not only reducing their users’ carbon footprints, but finding the growth they need to thrive in the long term and continuing to demonstrate positive social impact. It’s something that many CFOs and finance teams, at these and other disruptive industry players, will be weighing up as they balance the pressures of short-term expansion and long-term security.

One key to doing that successfully will be to fully understand that for these businesses revenue is driven not just by the quality of the physical products people are trading, but by the quality of the user-generated content (UGC) that people create to represent them.

The UGC-powered revenue model is unique to the internet age; it’s the bread and butter of the social networks which Depop’s interface aims to mirror. While other forms of mass media have opened the door to interaction with their audiences (as with readers’ letters to newspapers), only since the internet became ubiquitous has it been possible to place that content centre stage. For social media, that means relying on your users to share engaging ideas. For peer-to-peer selling, that means handing your users the task of creating engaging merchandising content.

At the same time, all of the requirements placed on ‘traditional’ retail – whether in-store or online – are still in play for peer-to-peer selling. Shoppers need to be able to trust that the products they see are being accurately represented. They need to believe that the prices they are paying are fair. They need to know that they will be supported if and when things go wrong.

All of this means that the highest of business standards will be required of something which is, by its nature, difficult to control. Much of the commercial power of UGC lies in the fact that it forms a personal connection, and – just as when we talk to people socially – the results can be surprising, joyful, and at times anarchic.

Building sustainable growth for long term success

Sustainable growth, then, will mean finding a way to have it both ways, upholding UGC’s potential for self-expressiveness and imbuing it with the reliability that retail demands. It would be a mistake to see this as something which can be figured out on the fly, waiting to see what kinds of problems arise and then developing responses to them. While there are times that customers will accept this style of working, a poor interaction with a retail business is likely to mean losing not just that customer, but a build-up of negative brand perception that can be fatal.

A proactive and preventative approach will place the quality of UGC at the heart of metrics like brand loyalty and user lifetime value, seeing it as a precondition for strong revenue, not as a secondary factor. No two businesses will be the same in this regard. For example, what constitutes a high-quality listing on Poshmark, which deals in homeware as well as fashion and focuses on luxury brands, will be different to Depop’s ideal listing of fun, youthful street fashion.

The contrast between the positioning of these two brands which offer fundamentally similar services highlights the fact that UGC is not simply a risk: while everyone is aware that bad content can damage a business, we also need to recognise that good content – however that is defined – is how these businesses thrive. This is content moderation as a core capability.

It’s not a challenge that’s unique to marketplaces. Other sectors, like online dating, are based on UGC, while others such as gaming are becoming increasingly reliant on it. Just as the future of shopping is more social, opportunities to find value in UGC are arising across industries. The implications for finance in these businesses will be a learning process: just as they now look at pipelines, funnels, and run rates, they may soon be tracking content health.

Find out more about working with us and request a demo today.


By William Singam

Regional Sales Director – APAC + France

The Future Of Dating Is Video: How Do You Keep Singles Safe Online?

When it comes to affairs of the heart – at a time when physical contact is off-limits – it’s time to get creative. And that’s exactly what online dating platforms are doing: using video interaction as the ‘date’ itself.

While it’s clear that online dating is innovative, exciting, and evolving at a rapid pace, how can dating site owners ensure they keep users safe?

 

Necessity Breeds Dating Invention

Video dating is nothing new. Well, in its current, interactive form it’s brand new, but the use of video as a way of introducing yourself to potential dating partners took off in the 1980s and 90s. Back then, though, agencies were involved, and like recruiters, it was their job to vet, interview, and engineer love matches based on compatibility and common likes and dislikes.

However, fast forward 35 years, and the ways in which we interact have shifted significantly – and they just keep innovating. Services like eHarmony, Tinder, and Bumble each offer their own unique approach to self-service matchmaking. And while social media platforms (Facebook Dating, anyone?) have been dipping their toes into the dating pool for a little while now, nothing groundbreaking has taken the sector by storm.

Most industry insiders saw the use of video as an ‘add-on’ to dating platforms but no-one was entirely sure how this would play out. And then, in March 2020, the COVID-19 pandemic hit. Lockdowns ensued internationally. Suddenly video took on a whole new role.

Communication evolved in one major direction – online video calls and meetings. Replacing face-to-face with face-to-screen encounters in a time of social distancing represents a huge cultural shift, unimaginable back in 2019.

Whether we’re learning at home or working remotely, how we stay connected has changed significantly. Substituting in-person conversations with video meetings is now par for the course.

Despite the ensuing Zoom fatigue, being advised to stay at home has undoubtedly led to a spike in online dating. And with traditional dating venues no longer a COVID-safe option, video dating has organically risen to the forefront.

 

Why Video Dating?

While not every dating site or user is engaging with video dating yet, many are trying it out. But what are the benefits of video dating? If your online dating platform is not already providing that service, are your users missing out?

Compared with traditional online dating, video dating has some great benefits. The most obvious reason to choose video dating is that it enables participants to experience the social presence that’s lacking in written communication. As a result, it can feel much more real and authentic than just exchanging messages or swiping photos.

With a video date, users have that experience of getting to know someone more slowly, finding out if they’re a good match in terms of personality, sense of humour, and other qualities. This means if you don’t click with someone, you’re more likely to find out sooner. Particularly at a time when in-person meetings are restricted, this is a huge advantage in terms of making the leap to meeting in person.

But swapping a bar or restaurant for a video meeting carries a different set of risks for participants. And for online dating platforms, video dating poses tough new challenges for content moderation. Especially when it comes to livestream dating with an interactive audience.

 

Dating Live & In Public

Dating in front of a live audience is nothing new. In the 1980s, television dating shows like ‘Blind Date’ in the UK experienced huge popularity. Contestants performed in front of a live studio audience and put themselves at the mercy of the general public – and the tabloid press(!).

In the 2010s, the television dating game show-style format was revived, following a wider trend for ‘reality TV’, with dating shows such as ‘Love Island’ emerging and growing in popularity. The legacies of these shows, however, have been tainted: a small number of contestants were poorly vetted – some even had previous convictions for sex offences – and others suffered serious mental-health conditions as a result of their appearance on the show.

Despite these warning signs, it seems inevitable that the trend for dating-related entertainment would be adopted by interactive online technologies, in the form of livestream dating. Often described as ‘speed dating in a public forum’, the trend for watching and participating in live video dating seems a logical extension of platforms like Twitch and TikTok.

But sites like MeetMe, Skout, and Tagged aren’t just a way of making connections – they’re also an opportunity for daters to generate revenue. Some platforms even provide users with the functionality to purchase virtual gifts which have real monetary value.

Needless to say, these kinds of activities continue to raise questions about users’ authenticity – whether they are really dating in pursuit of love. This is why, over the last decade, many industries have made a conscious move towards authenticity in order to build better brand trust. The dating industry is no different, especially since – despite exponential growth – it still faces major retention and engagement issues.

Video offers that sense of authenticity, particularly as we’re now so accustomed to communicating with trusted friends and family via live video.

Dating also has universal appeal, even to people already in committed relationships. There is an undeniable voyeuristic aspect to watching a dating show or live-streamed daters. And of course there are inherent safety risks in that.

Like other interactive social technologies, the livestream dating trend carries its own intrinsic dangers in terms of mental health and user experience. And just like any other interactive social media, there are always going to be users who are there to make inappropriate comments and harass people.

That’s where content moderation comes into play.

 

So How Can Content Moderation Support Safer Dating?

One-to-one video dating and livestream dating are happening right now. Who knows where they will evolve next?

Setting your brand apart in an already crowded dating industry is becoming more complicated in a time when social media technologies are rapidly evolving. How will you stay ahead of the curve and keep your users safe?

Of course, video moderation is not the only challenge you’re going to face. The unwanted user-generated content that comes with running an online dating platform includes:

  • Romance scams
  • Prostitution
  • Online harassment
  • Catfishing
  • Profanity
  • Nudity
  • Image quality
  • Underage users
  • Escort promotion

After all, brand trust means a better user experience. And a better user experience increases user lifetime value – and revenue.

On average, 1 in 10 dating profiles created is fake. Scammers and inappropriate content hurt your platform’s reliability, and left unaddressed, undesirable content undermines user trust and can take a heavy toll on your acquisition and retention.

Tackling all of this means taking a leap in terms of your overall digital transformation strategy, and adding AI and machine learning to your service.

With an all-in-one package from Besedo, you can get your content moderation in order across multiple areas. It’s built on over 20 years’ experience and now includes manual video moderation capabilities.

This means you can now review videos with play, pause, timestamp, and volume controls – and, more importantly, delete videos which don’t meet your site’s standards for user-generated content. Take a look at our short video guide to discover more.

 

Make dating online safer. Find out more about working with us and request a demo today.

 


By Martin Wåhlstrand

Regional Sales Director – Americas

Why creating sustainable growth means looking beyond the digital present

Over the past decade, it has become common to suggest that every company is now a tech company.

The exponential growth in digital usage quickly outgrew what we traditionally think of as the technology sector and, for users, the agility of the internet didn’t stay confined to the online world. Technology has shifted expectations about how everything can or should work. Soon, companies selling everything from furniture to financial services started to look and act more like innovative tech companies, finding new ways to solve old problems through digital channels.

In other words, business leaders seeking to guarantee growth turned to digital technology – to the point that, now, the Chief Technology Officer is a key part of the C-suite.

After a year when we’ve all relied on the internet more than ever, in every aspect of our lives, growth through digital has never been more apparent. For business, digital communication has at times been the only possible way of staying in touch with customers, and there’s no sign that the CEO’s focus on agility and technology is fading. In recent surveys, IBM found that 56% of CEOs are ‘aggressively pursuing operational agility and flexibility’, PwC found that they see cyber threats as the second biggest risk to business, and Deloitte found that 85% think the pandemic accelerated digital transformation.

If the exponential growth of digital has made every company a technology company, though, it has also made terms like ‘technology’ and ‘agility’ less useful. If every CEO is pursuing a digital strategy, that term must be encompassing a vast range of different ideas. As we look towards the next decade of growth – focused on managing the challenge of achieving more responsible and sustainable business along the way – we will need to think carefully about what comes next once digitalisation is universal.

Supercharged tech growth has skyrocketed user-generated content

Of course, the importance of agile technology has never been the tech itself, but what people do with it. For customers we’ve seen tech innovation create new ways of talking, direct access to brands, and large changes in how we consume media and make purchases.

As digital channels take on a greater share of activity than ever, one of the effects of an exponential growth in digital is an exponential growth in user-generated content (UGC).

This user-led interaction, from product reviews to marketplace listings to social interactions, fully embodies the agility that companies have spent the last decade trying to bring to their processes; because it is made by people, UGC is rapid, diverse, and flexible by default. While it may be too soon to say that every business will become a content business, it’s clear that this will become an increasingly important part of how businesses operate. Certainly, it’s already a major driving force for sectors as diverse as marketplaces, gaming, and dating.

A UGC business must be protected to maximise opportunity

In the move towards UGC, a business’s user interaction and user experience will have consequences across the organisation – from profit margin, to brand positioning, to reputational risk, to technological infrastructure. Across all of these, there will be a need to uphold users’ trust that content is being employed responsibly, that they are being protected from malign actors, and that their input is being used for their benefit. Turning content into sustainable growth, then, is a task that needs to be addressed across the company, not confined to any one business function.

Marketers, for instance, have benefited from digitalisation’s capacity to make the customer experience richer and more useful – but it has also introduced an element of unpredictability in user interactions. When communities are managed and shaped, marketers need to ensure that those efforts produce a public face in line with the company’s ethos and objectives.

While tech teams need to enable richer user interaction, their rapid ascent to become a core business function has left them under pressure to do everything, everywhere. Their innovation in how content is managed, therefore, needs a middle path between the unsustainable workload of in-house development and the unsustainable compromises of off-the-shelf tooling.

With the ultimate outcomes of building user trust being measured in terms of things like brand loyalty and lifetime user value, finance departments will also need to adapt to this form of customer relationship. The creation of long-term financial health needs investments and partnerships which truly understand how the relationship between businesses and customers is changing.

UGC as a vital asset for sustainable business growth

Bringing this all together will be the task needed to create sustainable growth – growth which is fit for and competitive in the emerging context of UGC, sensitive to the increasing caution that users will have around trusting businesses, and transparent about the organisation’s ethos, purpose, and direction. It will require not just investing in technology, but understanding how tech is leading us to a more interactive economy at every scale.

As digitalisation continues to widen and deepen, we may find UGC, and the trust it requires, becoming just as vital an asset for businesses as product stock or intellectual property. To prepare for that future and maximise their business growth from their UGC, businesses need to start thinking and planning today.


By Petter Nylander

CEO Besedo Global Services

If you keep your eye on content moderation as we do, you’ll be aware that the EU’s Digital Services Act (DSA) is on the road to being passed, after the European Commission submitted its proposals for legislation last December.

You’ll also know, of course, that the last year has been a tumultuous time for online content. Between governments trying to communicate accurately about the pandemic, a turbulent US election cycle, and a number of protest movements moving from social media to the streets, it’s felt like a week hasn’t passed without online content – and how to moderate it – hitting the headlines.

All of which makes the DSA (though at least partly by accident) extremely well-timed. With expectations that it will overhaul the rules and responsibilities for online businesses around user-generated content, EU member states will be keen to ensure that it offers an effective response to what many are coming to see as the dangers of unmanaged online discourse, without hindering the benefits of digitalized society that we’ve all come to rely on.

There’s a lot we still don’t know about the DSA. As it is reviewed and debated by the European Council and the European Parliament, changes might be made to everything from its definition of illegal content to the breadth of companies that are affected by each of its various new obligations. It’s absolutely clear, though, that businesses will be affected by the DSA – and not only the ‘Very Large Platforms’ like Google and Facebook which are expected to be most heavily targeted.

Many people looking at the DSA will instinctively think back to the last time the EU made significant new law around online business with the GDPR. The impact of that regulation is still growing, with larger fines being levied year-on-year, but it’s perhaps more important that internet users’ sense of what companies can or should do with data has been shifted by the GDPR. Likewise, the DSA will alter the terrain for all online businesses, and many industries will have to do some big thinking over the coming years as the act moves towards being agreed upon.

Content moderation, of course, is our expertise here at Besedo, and making improvements to how content is managed will be a big part of how businesses adapt to the DSA. That’s why we decided to help get this conversation started by finding out how businesses are currently thinking about it. Surveying UK-based businesses with operations in the EU across the retail, IT, and media sectors, we wanted to take the temperature of firms that will be at the forefront of the upcoming changes.

We found that, while the act is clearly on everyone’s radar, there is a lot of progress to be made if businesses are to get fully prepared. Nearly two-thirds of our respondents, for example, knew that the DSA is a wide-ranging set of rules which applies beyond social media or big tech. However, a similar proportion stated that they understand what will be defined as ‘illegal content’ under the act – despite the fact that that definition is yet to be finalized.

Encouragingly, we also found that 88% of respondents are confident that they will be ready for the DSA when it comes into force. For most, that will mean changing their approach to moderation: 92% told us that achieving compliance will involve upgrading their moderation systems, their processes, or both.

As the DSA is discussed, debated, and decided, we’ll continue to look at numbers like these and invite companies together to talk about how we can all make the internet a safer, fairer place for all its users. If you’d like to get involved or want insight on what’s coming down the road, our new research report, ‘Are you ready for the Digital Services Act?’, is the perfect place to start.

March marks the one-year anniversary of the WHO declaring COVID-19 a global pandemic. While vaccines are now being rolled out and a return to normality is inching closer, online trade is still heavily influenced and characterized by a year in and out of lockdown. And so are the content moderation challenges we meet in our day-to-day work with platforms across the globe.

Shortage of graphics cards increases electronics fraud.

Whether for work or entertainment, being homebound has caused people to shop for desktop computers at a level we haven’t seen for a decade. For the past 10 years, mobile-first has been preached by any business advisor worth listening to, but lockdowns have given desktop computers a surprising comeback and increased demand for PC parts.

The increased interest in PCs, combined with the late-2020 release of the new console generation and the reduced production caused by pandemic-mandated lockdowns, has created an unexpected niche for scammers.

We’re currently seeing a worldwide shortage of graphics cards, which are needed for both consoles and desktop computers, and scammers haven’t wasted a second in jumping on the opportunity.

In March we’ve seen a significant increase in fraud cases related to graphics cards with gaming capabilities. At times, more than 50% of the fraud cases we deal with have been related to graphics cards.

Puppy scams are still sky-high.

In March we post-reviewed puppy scams on 6 popular online marketplaces in the UK. We found that almost 50% of live listings showed signs of being fraudulent.

The pet trade has exploded since the beginning of the pandemic, and scammers are still trying to take advantage of those looking for new furry family members.

Sleeper accounts awaken.

Our moderators warn that this month they’ve seen an increase in sleeper accounts engaging in Trojan scams. These accounts post a low-risk item, then lie dormant for a while before they start posting high-value items. The method is used to circumvent moderation setups that only review the first items posted by new accounts.

High-risk items posted by these accounts are often expensive electronics in high demand, such as cameras or the Nintendo Switch.
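
One way to counter the pattern is to trigger review on behavioural shifts rather than on account age alone. Here is a minimal sketch of such a heuristic, with the field names and thresholds invented for illustration rather than drawn from any real moderation setup:

```python
from statistics import mean

def needs_review(price_history: list[float], new_price: float,
                 dormant_days: int) -> bool:
    """Flag listings whose value jumps sharply after a quiet spell."""
    if not price_history:
        return True  # brand-new account: review the first posts as usual
    baseline = mean(price_history)
    price_spike = new_price > 5 * baseline  # e.g. cheap books, then a camera
    woke_from_dormancy = dormant_days >= 30
    return price_spike and woke_from_dormancy

# A few low-value posts, a long silence, then a Nintendo Switch listing:
print(needs_review([8.0, 5.0, 12.0], new_price=280.0, dormant_days=45))  # True
```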

April is looking to be an interesting month in terms of content moderation challenges. With many countries tentatively opening up and others concerned about a third wave, we recommend that all marketplace owners keep a close eye on corona-related scams. From masks to fake vaccines to a potential incoming surge of forged corona passports, staying alert, keeping up to date, and keeping your moderators educated will be as important as ever.

If you need help reviewing your content moderation setup or are looking for an experienced team to take it off your hands, we’re here to help.