The importance of social media moderation goes far beyond defamation liability

by Larah Kennedy August 1, 2019

The recent ruling in the Dylan Voller defamation trial, that companies can be held liable for the activity of commenters on their pages, has only reinforced our belief in the importance of community management on social media pages. The kinds of behaviour organisations may need to moderate include, but are not restricted to:

  • defamatory statements
  • discrimination
  • racism
  • sexism
  • vilification on religious grounds
  • obstruction of justice
  • disclosure of self-harm or suicide

This is only one reason that active community management is fundamental for organisations operating in the social media space.

While organisations may be liable for these conversations and comments, I strongly believe they are also broadly responsible for ensuring the online spaces they create are safe and welcoming. After all, we wouldn’t walk into a business’s bricks-and-mortar establishment expecting to be verbally attacked or discriminated against by another person, and if that did happen, we would expect the business to intervene.

The same expectation applies online. However, we continually see the comments section of a Facebook page or group left as a free-for-all, where people can hurl abuse at each other from the safety of their keyboards.

As businesses operating in the social media space, we can do better.

Moderation is key

Moderation isn’t just about protecting an organisation from risk and liability (although that’s important); it is fundamental to ensuring that the social media space it has created is an inviting and pleasant place to be. It’s about:

  • protecting vulnerable populations
  • fulfilling a duty of care
  • reinforcing positive behaviours
  • denouncing bad behaviour
  • cultivating connections and building loyalty and advocacy

Moderation is so much more than just reducing your liability; it’s about creating a space that everyone can enjoy. An organisation cannot claim that it is unaware of the comments or discussions happening in an online space it runs. That attitude is lazy at best and downright negligent at worst.

Within the Australian community management industry are thousands of experienced and passionate community managers working across media, large corporations, not-for-profits and government, who take the role and responsibility of moderating online spaces very seriously. In fact, there is even a Code of Conduct that outlines the ethical and legal framework within which we strive to operate. More organisations need to look to community management experts and invest in moderation resourcing if they want to protect themselves from legal risks AND build safe and welcoming online spaces.

So please be aware: Facebook isn’t a broadcast channel, it’s a social network, and it’s time organisations wanting to play in that space took moderation and community management more seriously.

 
