In June 2019 the New South Wales Supreme Court ruled in favour of Dylan Voller, a former detainee of the Northern Territory’s Don Dale Youth Detention Centre, in a defamation case he brought against several large Australian media organisations.
The decision found that media organisations are legally responsible for defamatory comments posted on their social media pages. In his judgment, Justice Stephen Rothman said that the companies named in the case could not claim to be neutral parties, but rather “provided the forum for its publication and encouraged, for its own commercial purposes, the publication of comments.”
In June 2020 the New South Wales Court of Appeal upheld the ruling, affirming that publishers are accountable for user comments on their social media profiles.
What does the ruling mean for organisations?
The ruling as it stands means that any organisation publishing content on its social media pages may be liable for the content its users post, whether in reaction to that content or in general.
If you already manage your social media audience consistently, this is unlikely to mean much additional work. Ensure that existing staffing and workloads include proactive and reactive moderation of content or behaviour that may be risky from a legal, reputational or social perspective.
If you do not have a professional overseeing activity on your social media at least several times a week, it’s time to make that investment, whether by hiring, by partnering with a specialist, or by training existing staff who are in a position to support this work.
Moderation doesn’t just help capture defamatory content; it also offers a buffer against illegal or harm-inducing content and behaviour, such as hate speech, racism, sexism, harassment and threats of harm. It is an opportunity to create new value with users and show what your brand or business stands for. Additionally, time spent engaging with users offers useful business insights and audience development opportunities.
What does the ruling mean for social media managers?
If you oversee social media profiles and pages you may be implicated should legal issues arise.
Familiarise yourself with the Australian Community Managers Code of Conduct, which outlines the ethical and legal framework we strive to operate within, regardless of industry or subject matter.
Discuss a moderation plan with your employer or client and ensure you have both a practical and defensible strategy in place. Leverage automation and on-platform tools (where available) to reduce the time spent on the front lines and to allow for regular, systematic check-ins.
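As a very rough illustration of what that automation layer can look like, the sketch below is a hypothetical keyword-triage helper: comments containing terms on a watchlist are surfaced for human review. The `WATCHLIST` terms, function names and thresholds are all illustrative assumptions, not part of any platform’s API; real moderation stacks combine platform filters, tooling and trained human judgment.

```python
# Minimal, illustrative sketch of keyword-based comment triage.
# The watchlist is a placeholder assumption; it is not a recommended
# term list, and matching on keywords alone misses context and tone.

WATCHLIST = {"scam", "fraud", "liar"}  # hypothetical risk terms

def flag_comment(text: str, watchlist: set[str] = WATCHLIST) -> bool:
    """Return True if the comment contains any watchlist term."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return bool(words & watchlist)

def triage(comments: list[str]) -> list[str]:
    """Return only the comments that need human review."""
    return [c for c in comments if flag_comment(c)]
```

A check like this only routes risky comments to a person faster; it is a supplement to, never a substitute for, the regular human moderation the Voller ruling makes essential.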
The Voller precedent also means taking care of yourself and anyone on your team, as time spent on platforms, and potential exposure to harmful content, increases. Make sure your plan accounts for time off, and that you have a support network in place if you experience stress or trauma.
If moderation isn’t your speciality, speak with someone trained who can assist in creating a plan that protects you and your organisation while balancing your objectives around user engagement.
The Voller ruling may yet move to a higher court, and further precedents are likely to be set in this space. Businesses and social media professionals should watch developments closely to stay on top of any actions required.