Australia’s e-safety commissioner warned companies including Meta, YouTube and TikTok of “major gaps” in their enforcement of the social media ban for under-16s, almost four months after the law came into effect.
“While social media platforms have taken some initial action, I am concerned through our compliance monitoring that some may not be doing enough to comply with Australian law,” commissioner Julie Inman Grant said in a statement on Tuesday.
The legislation requires 10 of the largest social media networks, including TikTok, Instagram, Snapchat, YouTube, Facebook, and X, to keep under-16s away or face fines of up to A$49.5m (£26.5m), making it one of the world’s toughest digital restrictions.
As of early March, the e-safety commission noted in a report, the platforms had blocked around 5 million social media accounts due to age restrictions. However, “major gaps” remained in the way these companies had responded to the legislation.

The report warned that the platforms were allowing children under 16 to make repeated age-verification attempts in order “to ultimately obtain a 16+ outcome”.
The report drew on a survey of some 900 parents and carers conducted between 19 January and 2 February.
Nearly half of the surveyed parents reported that their child had an account on at least one platform before the ban; that share dropped to about 31 per cent once the law came into effect.
The report also found that some of the platforms failed to provide “effective pathways” for parents and carers to report age-restricted accounts.
Ms Inman Grant said the watchdog was “currently investigating potential non-compliance” by Facebook, Instagram, Snapchat, TikTok, and YouTube, but cautioned that proving companies had not taken reasonable steps to comply with the ban “would take time”. “The evidence must establish that the platform has not taken reasonable steps to prevent children aged under 16 from having an account. That means more than simply demonstrating some children do still have accounts,” she said.
“We certainly expect companies operating in Australia to comply with our safety laws.”
She warned the platforms “can choose to do so or face escalating consequences, including profound reputational erosion with governments and consumers globally”.
Responding to the report, a Meta spokesperson told ABC News that accurately determining a user’s age was “a challenge for the whole industry”.
“The most effective, privacy protective and consistent approach is to require robust age verification and parental approval at the app store,” the spokesperson said. In the meantime, the company would “keep investing in enforcement to detect and remove under-16 accounts”.