User-generated content, or UGC, can open businesses to a great deal of risk. While the text, video, and audio a user creates are generally tied to their profile and listed under their name, all of that content is nonetheless presented as part of the business’s identity.
Even when the effect is unconscious, encountering unwanted content can significantly damage a brand’s reputation. And if a user is led off-platform by a piece of UGC, any negative consequences are likely to remain associated in their mind with the platform they originally came from.
In spite of this, the adoption of UGC in online business continues apace. That makes sense for companies where content and interaction are their raison d’être, such as social media networks, but it is less obvious for businesses where this content is optional, such as online retail.
If you can trade without it, why take the risk?
The consequences of creativity
There are many good answers to this question, of course. UGC brings users’ endorsements of the platform’s value into the heart of the user experience. It creates feedback loops of interaction that can encourage people to stay on the platform. It enables businesses to develop a richer identity and culture with less up-front investment.
It could be argued, however, that these and other answers are just examples of a larger, more fundamental point about the value of UGC. Where interacting with a traditional business means choosing options from a menu of what that business thinks a person might like, UGC capabilities give people a much more authentic sense of choice and agency.
This freedom to shape one’s own path is what underlies every other outcome, positive or negative.
People, in short, love to create, and it is the emotional driver of creativity that businesses are tapping into when they allow users to set up their storefronts, create and join sub-communities, or craft an online dating profile that feels like them.
However, just as in the physical world, freedom and agency come with potential issues. The freedom for users to present their authentic selves can be misused to impersonate someone else. The creative leeway for sellers to brand their online shopfronts can be misused to lure buyers away to other platforms.
The consequences of creativity, then, are both emotional fulfillment and a serious threat to a business’s sustainability.
The art of moderating art
However, understanding creativity as the driver of the value businesses can glean from UGC has important consequences for how we might think about managing and moderating that content.
In the example of users being led off-platform, for instance, the immediate consequences might include lost revenue, as users transact outside the platform’s channels, and reputational damage, when users beyond the reach of the platform’s protections suffer losses.
A traditional view of content moderation might be to maximize the ability to identify and eliminate these interactions: a well-trained AI system can spot signs of such activity, such as disguised URLs or phone numbers, and escalate cases to a human team of moderators when the nature of the interaction is ambiguous.
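As a rough illustration, a first-pass filter of this kind might look something like the sketch below. The patterns, thresholds, and function names are assumptions for illustration only, not a description of any particular moderation system; real systems combine far more signals, typically with machine-learned classifiers.

```python
import re

# Patterns that often indicate attempts to move a conversation off-platform.
# Illustrative only; production systems use many more signals.
URL_PATTERN = re.compile(
    r"(https?://|www\.|\w+\s*(\.|\(dot\))\s*(com|net|io))", re.IGNORECASE
)
PHONE_PATTERN = re.compile(r"(\+?\d[\d\s\-()]{7,}\d|\b\d{3}[\s\-]\d{3}[\s\-]\d{4}\b)")


def score_off_platform_risk(text: str) -> float:
    """Return a crude 0-1 risk score for off-platform contact attempts."""
    score = 0.0
    if URL_PATTERN.search(text):
        score += 0.6
    if PHONE_PATTERN.search(text):
        score += 0.6
    return min(score, 1.0)


def route_message(text: str) -> str:
    """Decide what happens to a piece of UGC based on its risk score."""
    risk = score_off_platform_risk(text)
    if risk >= 0.9:   # clear violation: block automatically
        return "reject"
    if risk >= 0.5:   # ambiguous: escalate to a human moderator
        return "escalate"
    return "approve"  # low risk: publish immediately


print(route_message("Great price! Call me on 555-123-4567 instead."))  # escalate
print(route_message("Is this still available?"))                       # approve
```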
A more mature approach
A mature approach might go further and use those tools to generate insight into how the platform performs and the context in which inappropriate actions occur. If issues are consistently flagged around a particular product category or in certain markets, the business can take action, such as modifying the user interface or adding targeted warning messages, to make those events less likely.
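By way of a sketch (with hypothetical field names and data, since the real source would be a platform’s own moderation logs), the kind of aggregation that surfaces such hotspots can be very simple:

```python
from collections import Counter

# Hypothetical sample of moderation decisions; in practice these would come
# from the platform's moderation logs or analytics warehouse.
flagged_items = [
    {"category": "electronics", "market": "SE", "reason": "off_platform_contact"},
    {"category": "electronics", "market": "SE", "reason": "off_platform_contact"},
    {"category": "furniture",   "market": "DE", "reason": "counterfeit"},
    {"category": "electronics", "market": "SE", "reason": "off_platform_contact"},
]

# Count flags per (category, market) pair to surface hotspots that may
# justify UI changes or targeted warning messages.
hotspots = Counter((item["category"], item["market"]) for item in flagged_items)

for (category, market), count in hotspots.most_common(3):
    print(f"{category} / {market}: {count} flags")
```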
If, on the other hand, we see what users are doing on a platform not just as interaction but as creativity, that points us toward using moderation in a way that maximizes their scope for self-expression. Rather than relying only on the “stick” of punishing bad content, which always shifts the experience back toward the traditional model of limited options chosen by the business, we can also offer a “carrot” that avoids any sense of limitation on what users can do.
This might, for instance, involve automatically promoting content that closely matches the brand’s values to the forefront of a user’s experience, giving them a clear social model of how they could or should behave on the platform.
It might respond to potentially problematic content by asking the user to reconsider their approach rather than immediately putting it in a queue for approval by a human moderator. It might even allow people to manage what kinds of content they are comfortable seeing, giving other users greater leeway to express themselves freely.
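A minimal sketch of such a “nudge first” flow is shown below, assuming a risk score produced by whatever detection step the platform already runs (for example, the scorer sketched earlier). The thresholds, action names, and message text are illustrative assumptions.

```python
def handle_submission(risk: float, already_nudged: bool = False) -> dict:
    """Illustrative 'carrot' flow: nudge the creator before queueing for review.

    `risk` is assumed to come from whatever scoring step the platform already
    runs (for instance, the scorer sketched earlier in this article).
    """
    if risk < 0.5:
        return {"action": "publish"}
    if not already_nudged:
        # Give the creator a chance to rephrase before any moderation step.
        return {
            "action": "nudge",
            "message": ("This looks like it shares contact details, which "
                        "isn't allowed here. Would you like to edit it first?"),
        }
    # Resubmitted unchanged after a nudge: fall back to human review.
    return {"action": "queue_for_human_review"}


print(handle_submission(0.6))                       # {'action': 'nudge', ...}
print(handle_submission(0.6, already_nudged=True))  # {'action': 'queue_for_human_review'}
```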
Ultimately, the aim of offering UGC options is to attract and retain the users who best match a brand’s personality and values. That means letting them exercise their creative instincts, and content moderation tools can be just as valuable here as they are for limiting inappropriate speech.