Meta announced on Thursday that it has begun removing users under the age of 16 from Instagram, Facebook, and Threads in Australia, marking the first steps toward complying with the country’s upcoming youth social media restrictions — the first of their kind anywhere in the world.
The technology company said it is initiating age-screening and enforcement measures across its platforms as part of a wider industry shift prompted by Australia's incoming law aimed at reducing online harms for minors. The new rules, expected to be phased in over the coming months, will require social media platforms to take reasonable steps to prevent children under 16 from holding accounts.
Meta stated that accounts belonging to users identified as underage will be suspended or removed, and additional systems are being deployed to detect minors attempting to create new profiles. The company emphasized that it supports efforts to create safer digital spaces but maintains that any national framework must be workable for both families and platforms.
Australia’s legislation follows growing concerns among policymakers, educators, and mental-health advocates about the impact of social media on young people, including exposure to predators, cyberbullying, and addictive design features. The government argues that stricter age limits, combined with mandatory age-verification tools, will significantly reduce such risks.
Meta’s early compliance move places further pressure on other platforms operating in Australia to clarify how they plan to meet the new legal requirements. The rollout is expected to set a global precedent as governments worldwide grapple with regulating young people’s access to social media.