The eSafety commissioner has rejected the Australian tech sector’s proposed regulations, saying they do not do enough to detect and remove online child sexual abuse and terrorist material from tech companies’ services.
Tech industry associations representing companies like Meta and Google now have until March 9 to supply an improved version of the regulations for registration under the Online Safety Act or face having the eSafety commissioner develop enforceable industry standards for them.
On Thursday, commissioner Julie Inman Grant announced she had written to the associations representing eight sections of the online industry to request they review and resubmit their draft industry codes.
“While I have not made a final decision, my preliminary view is that the draft codes we received in November are unlikely to provide the appropriate community safeguards required for them to be registered,” she wrote.
The industry codes are a new part of Australia’s rules for online material, created as part of the Online Safety Act 2021, which came into force at the beginning of 2022.
Under new powers in the act, the eSafety commissioner issued notices to six industry groups representing eight industry sections — including social media services, websites, search engines, app stores, internet service providers, device manufacturers, hosting services and services such as email, messaging, gaming and dating services — requesting that they develop industry codes.
These codes lay out how tech companies will respond to “illegal and restricted online content”. This is defined as online material listed as “class 1” or “class 2” by Australia’s National Classification Scheme, which includes illegal content such as child sexual abuse and pro-terror material.
In September last year, a draft of the industry’s proposed codes was published online for feedback. It includes commitments to policies such as default settings for children and reporting child sexual abuse material.
Once these codes have been registered, the eSafety commissioner can fine individuals up to $111,000 and companies up to $555,000 for failure to abide by them.
The draft codes have already faced criticism from digital rights advocacy groups, who accused the tech industry of writing its own “weak” regulation.
The eSafety commissioner did not announce why the draft codes were unlikely to meet her standards for approval and did not make the letters to industry available. Crikey understands that one concern was that the codes did not require tech companies to do enough to monitor and remove known examples of child sexual abuse and pro-terror material online.
Update: This article has been updated to accurately describe the material included in the industry codes.