Your guide to the new Australian social media laws

by Venessa Paech, April 28, 2019

The devastating terror attack in Christchurch, New Zealand, put social media at the centre of the terrorism debate. The alleged perpetrator exploited social media to document and amplify both his rationale for the crime and the crime itself.

The failure of social media platforms to identify and remove the violent content in a timely manner came under justifiable scrutiny, with researchers, policy-makers, the technology community and ordinary citizens largely unified in calling for the platform monopolies, in particular, to do better.

The New Zealand government acted promptly on gun control. But few expected the Australian government to move just as quickly on platform governance.

On Thursday 4 April 2019 – less than three weeks after the attack – the Australian government passed the Sharing of Abhorrent Violent Material Bill, effective immediately.

The new laws define terrorist acts, murder, attempted murder, torture, rape and kidnapping, wherever in the world they occur, as “abhorrent violent conduct”, and require that material depicting such conduct be removed from social media platforms “expeditiously”. Failure to remove this content can lead to fines of up to 10% of annual turnover for organisations and, for individuals, imprisonment for up to three years.

There are legal defences where the content in question relates to a news report in the public interest, a court proceeding, research or artistic work. Additionally, Australia’s Attorney-General has the discretion to prevent any prosecution the government considers inappropriate.

While most, including professional community managers, agree that social media platforms should work harder to police violent content (especially once it has been reported by community managers or users), the new Bill is widely viewed as knee-jerk and too ambiguous to create positive change. There are also serious concerns that it may make it harder for people who use social media to expose potentially criminal acts, such as police brutality or violence against dissidents.

The Voller ruling is another critical legal judgement affecting professionals who manage online communities on platforms like Facebook. Learn about it here.

The bill’s ambiguity is particularly troublesome for those who work in or with social media. Anyone who has built a social media service, maintains an online community, employs people to manage one, or manages one themselves may be implicated by this legislation.

“Expeditiously” is undefined in the bill, and interpretations of it differ radically depending on perspective.

Definitions of the platforms themselves are also problematic. Terms deployed in the legislation, like Content Service, Social Media Service, Internet Service and Hosting Service, are given the same meaning as in Australia’s Enhancing Online Safety Act 2015. Those definitions are not confined to major social networking platforms such as Facebook, YouTube or Twitter. Any service whose principal function is to connect people for social interaction online may fall under this definition (such as forums or group messaging services).

There is some language in the Enhancing Online Safety Act 2015 that speaks to exemptions, specifically that if:

    (a) an electronic service has controls on:
        (i) who can access material, or who can be delivered material, provided on the service; or
        (ii) the material that can be posted on the service; and
    (b) those controls will be effective in achieving the result that none of the material provided on the service could be cyber‑bullying material targeted at an Australian child;

the Commissioner may, by writing, declare that the service is an exempt service for the purposes of this section.

It is unclear if this exemption framing also applies to the new Sharing of Abhorrent Violent Material bill, and how this would affect peer-to-peer social networking where boundaries and controls are present, such as enterprise social networks or private groups on social networking platforms like Facebook.

What does it actually mean in practice?

  • It means that platform owners need to take down content that meets the bill’s definition as quickly as possible, or risk legal consequences.
  • It may mean that community owners are implicated if they encounter this content and facilitate its further distribution – or take no action at all.

It is likely the bill will face more formal challenges in the months to come, designed at least to resolve its ambiguities, if not to repeal it entirely in favour of more considered and consultative governance.

What do you need to do differently?

  • If you have an online social media presence – whether it is being managed as a community or a broadcast channel where audiences can react – and you do not have moderation policies in place, now is the time to create and implement those policies. Good moderation policies help you manage risk and may contribute to defensibility if you can demonstrate these provisions are in place and are followed in good faith. With so much uncertainty around the new legislation, it pays to take a prudent approach.
  • As above, if you are not actively moderating your social media presence and channels, start now. Moderation is important for reducing exposure and for creating healthy online environments that best support your business or social objectives. Use humans for this critical work – machines are not yet good at ‘grey areas’, and these evolving issues are full of them.
  • If you encounter content that doesn’t belong in a healthy commons or online community, report it – always, if it appears in your own channels or communities. Report it to the platform itself and, if it falls under the definition contained in this bill, escalate it to the relevant authorities. Document this process for the transparency and protection of all parties.
  • If your social media channels or online communities are at higher risk of attracting this type of content (e.g. an activist community aggregating and discussing violence), seek specific legal counsel.

You can read the bill in full here. To understand your legal position and possible liabilities in full, it is highly recommended that you consult a lawyer, ideally one with experience in social media and user content matters.
