Australia bans social media for children

Australia is set to become the world’s first country to ban social media for children under 16, blocking them from platforms including TikTok, YouTube, Instagram and Facebook.
Approved by Parliament last year, the ban is scheduled to take effect on Wednesday. Companies that fail to comply could face penalties of up to the equivalent of $33 million.
“From 10 December 2025, age-restricted social media platforms will have to take reasonable steps to prevent Australians under the age of 16 from creating or keeping an account,” the government said, calling the measure a way to protect children “at a critical stage of their development.”
Platforms will be required to use a mix of signals, including account activity, viewing habits and user photos, to identify underage users. They must also stop minors from circumventing age limits by using fake IDs, AI-generated images, deepfakes or VPNs.
Tech companies have criticized the ban, describing it as “vague,” “problematic” and “rushed.” TikTok and Meta said the law would be difficult to enforce but pledged to comply; Meta has already begun removing under-16 accounts ahead of the December 10 deadline. Snapchat and other platforms warned the measure could push young people toward “darker corners of the internet,” while Reddit sharply criticized the law as “legally erroneous” and “arbitrary.”
Other countries are exploring similar legislation with the stated aim of protecting children.
The European Parliament adopted a non-binding resolution in November calling for a minimum age of 16 on social media to ensure “age-appropriate online engagement.” Denmark has proposed banning users under 15, while France, Spain, Italy, Denmark and Greece are jointly testing an age-verification app. Malaysia has announced plans to ban social-media use for under-16s starting in 2026.
Last week, Russia banned Roblox, an online gaming platform marketed largely toward children, over what it called distribution of extremist content and LGBTQ propaganda.
Concerns over child safety online have led to mounting legal pressure. Meta is facing lawsuits in the US alleging it allowed harmful activity to persist on its platforms despite repeated violations, including adult strangers contacting minors and content related to suicide, eating disorders and child sexual abuse.