Meta has removed nearly 550,000 Facebook, Instagram, and Threads accounts belonging to Australian users under 16, marking the most visible enforcement to date of Australia’s world-first social media ban for children. The removals, carried out between December 4 and December 11, come as the Albanese government ramps up efforts to limit young people’s exposure to algorithm-driven content online.
According to Meta, 330,000 accounts were deleted from Instagram, 173,000 from Facebook, and almost 40,000 from Threads, with enforcement beginning even before the law officially came into effect on December 10.
Why Australia introduced the under-16 social media ban
Passed in 2024, Australia’s minimum age legislation aims to protect children from targeted algorithms, harmful content, and addictive platform design. Companies that fail to take “reasonable steps” to prevent under-16s from holding accounts face fines of up to $50 million.
The ban applies to major platforms including Facebook, Instagram, Snapchat, TikTok, X, YouTube, Reddit, Twitch, Threads, and Kick, though exemptions exist for platforms primarily focused on gaming, health, or education. Enforcement is overseen by the eSafety Commissioner, which has warned that more platforms could be added over time.
Meta pushes back, warning of unintended consequences
While confirming its compliance, Meta has sharply criticised the legislation. In a blog post, the company argued that the ban fails to meet its stated goal of improving youth wellbeing and instead risks isolating vulnerable teens who rely on online communities for support.
Meta also warned that the law could push children toward less-regulated platforms, increasing rather than reducing risk. The company challenged the assumption that banning accounts eliminates algorithmic influence, noting that logged-out users are still shown algorithmically selected content.
“The premise of the law is false,” Meta said, calling for a more collaborative, industry-wide solution rather than what it described as a “blanket ban.”
Age verification remains a major challenge
A key point of contention is age verification. Platforms currently rely on a mix of government-issued ID, facial age estimation, and age inference technologies. Meta works with UK-based company Yoti for age assurance but says inconsistent standards across platforms undermine effectiveness.
Meta has urged the Australian government to require app stores to verify users’ ages and obtain parental consent before allowing under-16s to download social media apps, arguing this would create a consistent, privacy-preserving system.
Public opinion divided as impact becomes clear
While a Monash University survey found that 79% of Australian adults support the ban, younger users are far less convinced. An ABC poll found that 70% of under-16s oppose the restrictions, with many already migrating to alternative platforms not yet covered by the law.
The federal government is expected to release official data this week detailing how many children were removed across all platforms, as debate continues over whether Australia’s bold experiment will become a global model—or a cautionary tale.