As the dust settles on Justice Rothman’s ruling in the Dylan Voller defamation case, publishers face a major challenge. Publishers were already responsible for libellous comments left on their own websites; this ruling sets the precedent that they will also be responsible for comments left on platforms like Facebook.
This is a problem. Publishers – and their followers – generate a relentless tide of comments on Facebook, a platform that offers minimal moderation options to publishers who run their own page. One broadcaster we work with receives over 138,000 comments per week on just one of their pages. The only way to stop comments appearing on posts published to a page is to add word-based filters that automatically hide matching content until moderators can review and publish it. These filters are designed to hide profanity, but they can be ‘hacked’ – loaded with common words – to prevent every single comment from appearing. Facebook was built in the United States, where freedom of speech is constitutionally protected and libel in the media is hardly ever prosecuted, so robust defamation controls have never been a priority.
It’s obvious that this is not a workable solution. Either our defamation laws need to change to take account of social media, as has happened in Britain, or Facebook needs to provide better moderation services to publishers whose content drives an enormous amount of usage of its product. (Publishers also spend large amounts of money with the platform just to have their posts seen in the newsfeed.) Publishers can, and should, push for these changes.
While social media teams take stock of this new regulatory environment, I also invite them to pause and consider the broader landscape of the online communities they have created, on third-party platforms as well as in the comment sections of their own websites.
Are Australian online communities safe and respectful?
Do they inform and support their members? Do they align with the publisher’s organisational goals? I’d argue that, too often, the answer is no.
If social media platforms can be compared to a public square, we have allowed preachers of hate to enter our squares. We’ve allowed bots disguised as humans to infiltrate our places of discussion and manipulate conversations. We’ve given equal space to people just dropping by and those who’ve contributed to the community for years.
Reframing our online communities as intentional spaces is an opportunity to create value and drive website traffic. To do that, we need to look at the operating rules of our digital communities and make sure they are fit for purpose.
To that end, this week saw WPP interim AUNZ CEO John Steedman call for an end to anonymous online comments. “If somebody has something relevant to say about any issue, they should be required to log in,” he wrote in an open letter to media outlets.
While anonymity emboldens some online trolls, anyone who has moderated Facebook comments knows that people are also happy to say the same things under their full name. (For proof, look at some of the replies Change.org director and LGBTQI activist Sally Rugg received after appearing on the ABC TV show Q&A this week.) Anonymity also provides valuable opportunities to discuss sensitive topics such as mental health and abuse without fear of real-world repercussions.
Is there another way?
Websites like Reddit and The Guardian have tried to improve the quality of their online communities by implementing user tools that enable readers to ‘upvote’ some comments and downvote others. Reddit also rewards pro-social behaviour with its ‘karma’ system, elevating comments by members whose opinions are respected by the wider community.
The Guardian features comments it believes are particularly worthwhile as a ‘Guardian Pick’, which acts as both an incentive and an exemplar for others.
Creating intentional communities isn’t easy. Facebook needs to take the moderation concerns of publishers seriously and provide more tools and control for page owners. Publishers shouldn’t just invest in moderation services, but also in community management – taking a holistic view of their entire digital footprint.
In a sea of online news served up via algorithm, a great community is what can make your online presence worth visiting – and staying for.