Australia’s nationwide ban on social media use for children under 16 officially took effect at midnight on Dec. 10 — a first-of-its-kind law that has drawn global attention and stirred debate over online safety, privacy, and enforceability.
The law has been widely welcomed by child-safety advocates and many parents, who say it represents a major step toward improving young people’s mental health, reducing anxiety, and protecting self-esteem. Still, questions remain about whether tech companies can meaningfully prevent underage users from accessing their platforms.
“My eldest child is 10 … I can absolutely see the benefit,” one parent told local media, echoing the sentiment of many families relieved to see national action.
Prime Minister Anthony Albanese said the ban is meant to support parents and create a shared national standard, calling it “a source of national pride.”
“This law is about making it easier for you to have a conversation with your child about the risks and harms of engaging online,” Albanese said Sunday. “It’s also about helping parents push back against peer pressure. You no longer need to worry that by stopping your child from using social media, you’re making them the odd one out — you can point to a national ban.”
Under the new law, social media companies must “take reasonable steps” to prevent users under 16 from creating or maintaining accounts or accessing social-media-like features. Platforms are required to deactivate existing underage accounts and block attempts to create new ones, including through workarounds.
The ban currently applies to 10 major platforms, including Facebook, Instagram, Threads, Snapchat, TikTok, Reddit, Twitch, X (formerly Twitter), and YouTube, according to Australia’s eSafety Commissioner, the government’s independent online safety authority.
Tech companies, however, have pushed back — with some arguing the policy will have unintended consequences.
Snapchat called the law misguided, warning it could push children toward apps with weaker safety protections.
“Disconnecting teens from their friends and family doesn’t make them safer — it may push them to less safe, less private messaging apps,” Snapchat said in a Nov. 22 statement. The company urged lawmakers to consider age verification at the device or app-store level instead of platform-level bans.
Meta, parent company of Facebook, Instagram, and Threads, also voiced opposition, saying it supports online safety efforts but disagrees with a blanket ban.