Australian regulator to quiz digital giants on sexual abuse
Twitter, TikTok and Google will be forced to answer questions about how they tackle child sexual abuse and blackmail attempts on their platforms after the Australian eSafety commissioner issued legal notices to the companies.
The tech giants, as well as gaming platforms Twitch and Discord, will have 35 days to respond to the commissioner’s questions or risk fines of up to $687,000 a day.
The legal demands come six months after similar notices were issued to Apple, Meta, Microsoft, Snap and Omegle, which revealed some tech platforms were not using well-known safety measures to detect abusive content and protect users.
eSafety Commissioner Julie Inman Grant said she was particularly concerned about the treatment of illegal material on Twitter following massive job cuts to its Australian staff and its trust and safety teams.
“Back in November, Twitter boss Elon Musk tweeted that addressing child exploitation was priority number one but we have not seen detail on how Twitter is delivering on that commitment,” Ms Inman Grant said.
“We’ve also seen extensive job cuts to key trust and safety personnel across the company – the very people whose job it is to protect children – and we want to know how Twitter will tackle this problem going forward.”
The tech platforms must answer questions about how they detect and remove child sexual abuse content from their platforms, including live streams, how algorithms could amplify its reach, and how the companies deal with sexual extortion attempts against children.
These attempts typically involve tricking underage users into providing intimate images and later blackmailing them.
“The creation, dissemination and viewing of online child sexual abuse inflicts incalculable trauma and ruins lives. It is also illegal,” Ms Inman Grant said.
“It is vital that tech companies take all the steps they reasonably can to remove this material from their platforms and services.”
Child sexual abuse on digital platforms is widespread, with 29.1 million reports made to the US National Center for Missing and Exploited Children in 2021, including 875,783 from Google, 154,618 from TikTok and 86,666 from Twitter.
In a statement this month, Twitter reported it had suspended 404,000 accounts for engaging with child sexual exploitation on its platform in January, in what it called “a 112 per cent increase in CSE suspensions since November” 2022.
“Not only are we detecting more bad actors faster, we’re building new defences that proactively reduce the discoverability of tweets that contain this type of content,” the statement read.
Companies in the tech industry have also been asked to draft an enforceable code of conduct for dealing with illegal online material, with the eSafety Commissioner expected to accept or reject the code in March.
-AAP