The future! When did you last spend 5 minutes considering how to deal with upcoming content moderation challenges? It’s incredibly easy to get caught up in the day-to-day operations of ensuring content quality and protecting users from scammers, and the future often has to take a backseat.
The problem is that the future has a tendency to come knocking at the least convenient time, and if we don’t spend time preparing for it, we are left unprepared to handle the next generation of media, prevent new scams, and manage upcoming trends.
Equipping ourselves to deal with the future was the primary reason behind Besedo’s recent merger with ioSquare. We realized that to provide the content moderation of tomorrow, our two companies needed each other.
In the wake of the merger, it feels natural to sit down with the heads of both ioSquare and Besedo to discuss the future.
But just before we dive into what’s to come, let’s take a quick look at where content moderation stands right now. We asked both Patrik Frisk (CEO at Besedo) and Maxence Bernard (co-founder of ioSquare) to give us their view on the matter.
What Defines Content Moderation Right Now?
“It is being taken a lot more seriously than, say, five years ago. Companies now realize that moderation is not something you do just to gain a competitive edge; it is essential if you want to be taken seriously as a player in the industry.
The mobile revolution has shaped a new consumer – one with a shorter attention span, who is more demanding and expects more relevant content fast! At the same time, we live in a world where it’s all too easy to go viral for the wrong reasons and PR scandals can spread on social media like wildfire.
Content moderation can no longer be just an afterthought, and all smart site owners understand this.
At the same time, companies are starting to realize that they are better off outsourcing both the tools and services related to moderation, focusing their own internal resources on developing and improving their core products.
Previously, they couldn’t really rely on third-party tools for combined moderation and automation, simply because there were none available. But with SaaS solutions like Implio emerging in the market, the need to develop in-house solutions is gradually disappearing.
Furthermore, content moderation today is a lot more complex than it was 10 years ago. You can’t hire a few developers or content moderators and be done with it. You need a really good team of data scientists, analysts, linguists, moderators with industry knowledge, good coaches, recruiters, managers, and the list goes on. It’s a very complex operation to manage, and this is why the top players out there partner with companies that focus on excelling at moderation”, stated Patrik.
“AI moderation is another big thing. Companies can see that they will need automation, not just to cope with the exponentially increasing volumes cost-efficiently, but also for the added accuracy good AI offers”, said Maxence.
“As for focus and what the market needs right now: pictures are the recurring subject whenever we talk to site owners. They are all looking for a tool that can handle moderation of large volumes of pictures”, agreed both Maxence and Patrik.
What Kind of Content Moderation Will Be Needed in 10 Years?
“It will depend a lot on the technology that emerges over the next decade and on what the public embraces. 10 years ago, mobile phones looked like this:
…and the prediction for the future was that they would get even smaller. That didn’t happen at all! So you must be careful when predicting where technology is taking us.
At the same time, we can’t postpone our development, so we have to make educated guesses. Right now, it looks like VR and IoT (the Internet of Things) are gaining a foothold, which means that we, as a moderation company, need to consider how to moderate content produced and distributed through these new channels.
We are also looking closely at what researchers are predicting for the future. Take online dating as an example. In 2015, students from the MSc Management programme predicted that full-sensory virtual dating will be a thing by 2040. If that prediction comes true, we need to be ready to moderate it, and that’s going to be an interesting challenge. How do you moderate smells, for instance? Will we need to?
We are not saying that scent moderation is necessarily going to happen, but it’s a great thought experiment: to be prepared for the future, we need to consider all new communication forms and channels, predict how they can be abused, and design moderation solutions to manage that abuse”, said Maxence.
“One thing we can say with almost absolute certainty is that the future of moderation is going to be a lot more data-driven. We will increasingly use data and data patterns to let AI make the moderation decisions”, said Maxence.
“Yes”, agreed Patrik Frisk, “and we will also be using data to better understand the impact of different moderation approaches.”
“To be quite honest, the use of moderation metrics is pretty underwhelming right now and focused almost solely on operational KPIs. In the future, I think the use of moderation data will expand to show how moderation helps improve user traffic or conversion, tying it a lot closer to marketing and sales efforts”, continued Patrik.
“Speaking of conversion and user acquisition, AI will also be able to help here. Natural language generation, or NLG for short, is going to power AI-assisted content creation by helping users write their ad or profile. This will help optimize the content for conversion and minimize UX friction”, added Maxence.
What Will Be the Main Challenges for Content Moderation in the Future?
“There will be a lot of challenges in figuring out the best way to moderate future content, but the main challenge will probably be the same one we face now: staying one step ahead of the scammers. Take VR, for instance – will it become the new medium for sextortion? If so, we need to make sure that we are equipped to prevent that”, stated Maxence.
“Another big challenge will be to ensure that we have people with the right expertise to train and optimize the AI.
If AI is going to be the main moderation method going forward, and we believe it will be, then it needs to be as close to 100% accurate as possible. Achieving that will require big data sets created by excellent content moderators. And it will require linguists, data scientists, and analysts to tweak the algorithms on an ongoing basis. Ensuring that you have the right skills in place will be a major task, and that was one of the key reasons Besedo and ioSquare merged: Besedo complements ioSquare’s top-of-the-line technology with long-term expert knowledge about moderation!”, Maxence and Patrik agreed.
Patrik Frisk – CEO at Besedo
Maxence Bernard – Co-Founder of ioSquare