“This has to stop.”
These were the defiant words issued by St Kilda Football Club following the online racial abuse of Indigenous AFL footballer Bradley Hill in the aftermath of a round 18 defeat this season.
This is only one of countless instances of online abuse faced by Australians in today’s digital age. The dangers posed by this emerging threat have been echoed across the Australian Federal Government, culminating in the Senate passing a historic Online Safety Act on 22 June this year.
The Act, which will come into effect in January 2022, gives the eSafety Commissioner unprecedented powers to regulate online activity and reform a pre-existing framework of online safety legislation. Paul Fletcher, Minister for Communications, Urban Infrastructure, Cities and the Arts, said that the Act reflects the growing need to keep Australians safe online.
“The new Act represents a step-change for eSafety, tightening its powers in existing areas, and creating a new reporting scheme that will allow our eSafety Commissioner, Julie Inman Grant, to take action to remove toxic cyber-abuse, when online platforms fail to do so,” he says.
The legislation responds to mounting pressure from a host of individuals and organisations, urging social media platforms to take more responsibility in content moderation. But despite this emphasis on social media platforms, the Act also applies to a range of other online service providers such as Google, Gmail, Amazon and Microsoft.
Dr Karen Lee, senior lecturer in the Faculty of Law at the University of Technology Sydney, tells upstart that the new legislation forms part of a wider pattern of attempting to embed digital platforms within a regulatory framework.
“I think the government has finally recognised that self-regulation is not working. They clearly don’t think that various platforms are doing enough to control the nature of the content on their platforms,” she says.
While the Act pushes for more accountability from service providers, it also pertains to end-users who engage in harmful online behaviour. The eSafety Commissioner possesses the capacity to obtain contact information from service providers regarding individuals exploiting anonymous accounts to engage in abusive behaviour.
Melissa Fai, partner at Gilbert + Tobin lawyers, tells upstart that service providers must ultimately fulfil this legislative requirement, despite having their own privacy policies.
“Under the Privacy Act, you can disclose personal information if it’s required or authorised by another Australian law,” she says.
The new legislation also establishes a world-first adult cyber-abuse scheme, offering Australian adults who are victims of online abuse an avenue for greater protection. However, a “serious harm” threshold and an intention test need to be satisfied before the eSafety Commissioner can issue a removal notice. This threshold would more than likely be met in relation to the recent abuse directed at AFL footballer Toby Greene, where a direct Instagram message to his partner told Greene to kill himself.
Dr Lee says that these requirements assist in balancing the competing interests of free speech and protecting adults from online harm.
“There are many instances where free speech will offend and the government has to allow for that,” she says. “But where offence amounts to something that is menacing or threatening and there’s some kind of intention to actually harm, then yes it’s appropriate.”
Given the relatively quick passage of the Act, some online service providers have sought greater clarity on certain aspects of the legislation. Fai says that some service providers are concerned about the absence of an exclusion in the Act for business-to-business services.
“If a co-worker is harassing their colleague and then complains to the eSafety Commissioner, it technically may fall on one of these hosting providers that provide the intranet service to find the material and block or remove it,” she says.
“From a service provider’s point of view, that should ultimately be the responsibility of the workplace.”
One distinct requirement of the Act reduces the time frame service providers have to remove harmful material, from 48 hours to 24 hours. As a member of a joint submission for online safety legislative reform in 2019, Dr Lee says the general consensus was that this condensed time frame is ultimately beneficial.
“We all agreed it was a good thing and that’s because in the online environment, we can see the damage that can be done even in time frames less than 24 hours,” she says.
On the flip side, Fai explains that this reduced time frame could limit the scope for service providers to challenge removal requests they believe have been issued mistakenly. In accordance with section 220 of the Act, a service provider would need to make a formal application to the Administrative Appeals Tribunal (AAT) for a decision of the eSafety Commissioner to be reviewed.
“There’s a time period which you have to answer the removal notice. Service providers are not going to have time to make an AAT claim,” she says.
“Over the long term as well, they didn’t see any kind of check on how the eSafety Commissioner was using her powers.”
The extent to which checks and balances are in place to monitor the eSafety Commissioner’s use of her powers has prompted some debate. In particular, her independence from the Australian Communications and Media Authority (ACMA) has raised concerns about a lack of transparency and accountability in decision-making.
Despite this independence, Dr Lee says that the eSafety Commissioner still maintains the capacity to utilise ACMA’s resources including access to the Consumer Consultative Forum.
“I think the eSafety Commissioner, while not under the direction of ACMA, can nevertheless seek their advice and draw on their resources when she needs to,” she says.
The ‘victim-focused’ legislation comes as a welcome relief not just for high-profile figures such as AFL footballer Bradley Hill, but also for everyday Australians who need greater protection from online abuse and harassment. Time will tell whether the Act will be met with compliance from service providers and fulfil its intended purposes, says Dr Lee.
“I’m quietly optimistic that the platforms will do something here, but it is unfortunate that they haven’t done enough to date.”
Photo: Woman holding silver iPhone 6 by Firmbee.com available HERE and used under a Creative Commons Attribution licence. The image has not been modified.
Article: Jonathan Potenza is a second-year Bachelor of Laws/Bachelor of Media and Communications (Journalism) student at La Trobe University. You can follow him on Twitter @j_potenza15.