Is Moderation Censorship?

Recently Facebook came under fire for removing a Pulitzer Prize-winning photo titled "Napalm Girl." The picture hit their moderation filters because it features a naked girl, and Facebook has a very strict no-nudity policy. So strict, in fact, that they have previously been in trouble for removing pictures of breastfeeding women.

The incident of Facebook removing Napalm Girl went viral, and at the time of writing it still shows up as the third result on Google, despite the fact that it occurred over a month ago. Searches for the term have also spiked to an all-time high, as seen in the screenshot from Google Trends below:

Search spike for the term “Napalm Girl” after the Facebook moderation incident

Whatever you feel about Facebook’s reasoning behind removing the picture, this kind of bad media attention is something you definitely want to avoid.

On the other hand, you don’t want to end up at the other end of the spectrum, where the media berates you for providing a digital space for exchanging child pornography.

So how do you handle this very sensitive balancing act?

 

Moderation Is Censorship!

Let’s first address a claim that has come up a lot in the wake of the Napalm Girl case. Is moderation censorship?

According to Merriam-Webster, a censor is: “a person who examines books, movies, letters, etc., and removes things that are considered to be offensive, immoral, harmful to society, etc.”

That sounds an awful lot like the job of a moderator and the purpose of content moderation, doesn’t it?

With that definition in mind, moderation is censorship. But, and this is an important but, the word censorship is laden with negative connotations, mostly because throughout history it has been used to oppress people and slow down progress.

But censorship in the form of moderation is not always bad; it just has to be applied right. To do so, you need to understand your target audience and act in their best interest (not what you think their best interests are, but what they really are).

Not convinced? Well, let’s consider the case of a chat forum frequented by young adults. If someone starts harassing another user, should we not step in? In the spirit of free speech and to avoid censorship are we prepared to allow cyberbullying to rage uncontrolled?

What about discrimination? Should homophobic or racist statements be allowed online just to appease a false understanding of freedom of speech? Will that do anything positive for our society? Not likely! So there are very valid and altruistic reasons for moderating, or censoring if you will.

And that is not even touching upon the issue of legality. Some things just have to be removed because they simply aren’t legal and you as the site owner could be held liable for inaction.

 

How to Avoid Users Feeling Censored

You were probably on board with all of this already, but the real question is: how do you get buy-in from your users?

You need to ensure that you and your site adhere to these five key elements.

  • Understanding your target audience and their expectations
  • Transparency in rules and intent
  • Consistency
  • Managing support and questions
  • Listening and adapting

 

Understand Your Target Audience

This is super important because your moderation rules should look significantly different if you are running a dating site as opposed to a chat forum for preteens. And even within the same site category, there can be huge deviations. Moderating away swear words would probably feel like strict censorship on a general dating site, whereas if your target audience is fundamentalist Christians it would make complete sense.

Transparency in Rules and Intent

It has always been a source of confusion to us why sites would choose not to share their reasoning behind the rules and policies they set in place. If those rules are created with the best interest of the target audience in mind, then sharing them should just show users that they are understood and listened to.

But the fact is that very few companies are wholly transparent with their moderation process and policies.

The problem is that if users do not understand why certain moderation rules are enforced, those rules will feel a lot like censorship, and conspiracy theories tend to start. It is a great exercise to go through all your current moderation policies, check which of them you would not feel comfortable sharing, and really investigate why. Chances are the rules you wish to keep hidden are not in place for the benefit of your users, and if that is the case you will have a hard time getting buy-in. Transparency has a practical upside too: when you remove something, you can easily refer to the relevant clause in your policies.

Now, some rules may be in place to keep the company profitable, but with a proper explanation users can understand that without these rules the site would close down.
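To make this concrete, one lightweight way to keep rules transparent is to store each policy together with a user-facing rationale that can be quoted back to the user whenever their content is removed. The sketch below assumes a tiny hypothetical rule set; the clause number, wording, and `removal_notice` helper are illustrative, not a real policy:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ModerationRule:
    """A single moderation policy with a user-facing rationale."""
    clause_id: str   # e.g. "3.2" -- referenced in removal notices
    summary: str     # what the rule forbids
    rationale: str   # why the rule exists, written for the audience

# Hypothetical example rules -- a real site would define its own.
RULES = {
    "3.2": ModerationRule(
        clause_id="3.2",
        summary="No personal attacks or harassment of other users.",
        rationale="We want everyone to feel safe joining a discussion.",
    ),
}

def removal_notice(clause_id: str) -> str:
    """Build the message sent to a user whose post was removed."""
    rule = RULES[clause_id]
    return (
        f"Your post was removed under clause {rule.clause_id}: "
        f"{rule.summary} Why we have this rule: {rule.rationale}"
    )
```

Because the rationale is written down next to the rule itself, the explanation a user receives is the same one your moderators work from, which keeps the "why" from drifting out of sync with the "what."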

Consistency

The annoyance with Facebook for banning breastfeeding pictures caught fire because of their inconsistent approach to pictures of women showing skin. On one hand, they banned women performing the very natural act of feeding their babies. On the other hand, they allowed women in skimpy bikinis with (from a user's point of view) no hesitation.

If you don’t allow books on Adolf Hitler, then you need to ban all books about leaders who have committed genocide; Stalin biographies are a no-go as well.

Managing Support and Questions

As anyone who works within customer care will know, how you handle customer feedback and complaints is vital to achieving customer satisfaction.

The same goes for getting user buy-in. When you remove content posted by a user, it is critical that they understand why and that you answer any questions they may raise clearly. You can even reference your rules. They are clear and transparent after all, right?

Listen and Adapt

You can’t predict the future! This means that you will never be able to account for all possible scenarios which might arise when moderating your site. New trends will pop up, laws might change and even a slight shift in your target audience could cause a need to review your policies.

The Facebook/Napalm Girl case is once again a great example of this. On paper, it might make total sense to ban all photos of nude children, but in reality the issue is a lot more complex. Try to understand why your users would post things like that in the first place. Are they proud parents showing their baby's first bath, for instance? Then consider what negative consequences leaving pictures like this live could have for your users and for your site.

If you find that your policies are suddenly out of tune with user goals, laws or your site's direction, make sure you adapt and communicate the change, and the reasoning behind it, to your users as quickly and clearly as possible.

 

From Word to Action

Of the five key elements to secure user buy-in for moderation, consistency is the hardest one to continuously achieve.

It is very easy to set consistency as a goal, but a whole other thing to actually accomplish it, particularly in moderation, where the issue is complicated by multiple factors:

  • Rules and policies are often unclear, leaving room for many gray-zone cases.
  • A lot of moderation is done by humans, and each person may interpret the rules and policies in place slightly differently.
  • Moderation rules need to be updated continuously to keep up with current trends and events; this is time-consuming, and rules are therefore often outdated.
  • Moderating well requires expert knowledge of the field. Achieving expert level on any topic takes time, so the expertise required for consistent moderation is not always available.

 

The best way to achieve consistency in moderation is to combine well-thought-through processes and training of your human experts with machine learning algorithms, ensuring that the right decisions are made every time.
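As a sketch of what that combination might look like, the snippet below applies deterministic rules first (identical outcome every time, regardless of which moderator is on shift) and uses a model score only to route gray-zone cases to human review. The banned-term list, the threshold, and the `classifier_score` stub are all illustrative assumptions, not a real model or policy:

```python
BANNED_TERMS = {"slur1", "slur2"}   # placeholder terms, not a real word list
REVIEW_THRESHOLD = 0.8              # assumed confidence cutoff for human review

def classifier_score(text: str) -> float:
    """Stub for a trained model: probability that the text violates policy."""
    # A real system would call an ML model here; this stub merely flags
    # long all-caps messages as suspicious so the pipeline can be exercised.
    return 0.9 if text.isupper() and len(text) > 20 else 0.1

def moderate(text: str) -> str:
    """Return 'remove', 'review', or 'allow' for a piece of user content."""
    # 1. Deterministic rules: applied identically every time, by design.
    if any(term in text.lower() for term in BANNED_TERMS):
        return "remove"
    # 2. Model score: route uncertain cases to a human instead of auto-deciding.
    if classifier_score(text) >= REVIEW_THRESHOLD:
        return "review"
    return "allow"
```

The design choice doing the work here is the three-way outcome: hard rules give consistency on clear-cut cases, while anything the model is unsure about goes to a trained human rather than being decided inconsistently by the machine.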

So is moderation censorship? We’ll leave that to the semantic experts to answer, but as content moderation experts we boldly claim that moderation is needed today and will be even more so in the future. To get your users on board, make sure you cover all five key elements for user buy-in. Doing that will ensure that moderation is something your users appreciate rather than question.
