Meta has commenced notifying Australian teenagers under the age of 16 that their Instagram, Facebook, and Threads accounts will be shut down from December 4, ahead of the country’s groundbreaking social media ban set to take effect on December 10.
The ban, passed by the Australian parliament, covers nine platforms: Facebook, Instagram, Threads, TikTok, YouTube, X (formerly Twitter), Reddit, Snapchat, and Kick. Prime Minister Anthony Albanese has described the measure as “world-leading,” aimed at protecting young people from online risks and allowing “kids to be kids.” Meta estimates that approximately 350,000 Instagram users and 150,000 Facebook users aged 13 to 15 will be affected.
In its notifications, Meta is advising teens to download and save their posts, videos, and private messages before their accounts are closed. It is also encouraging users to update their contact details so they can be notified when they turn 16 and become eligible to reopen their accounts. While Meta opposes the ban, it has said it will comply with the law; platforms that fail to do so face fines of up to A$50 million.
For users who believe they have been incorrectly flagged as under 16, Meta has established an appeals process. They can challenge the account removal by submitting a “video selfie” for facial age estimation or providing government-issued identification such as a driver’s license. The facial age scans are conducted by Yoti, a third-party company, which claims not to store personal data after verification.
However, age-verification technologies face accuracy challenges. Independent testing, including a government-commissioned report, found that facial age checks can have a false negative rate of around 13.9% for 16-year-olds, meaning roughly one in seven eligible teens could be wrongly blocked. This raises concerns about how often the system will err and what recourse is available to those unfairly affected.
Meta has emphasized that compliance will be an ongoing process, and it plans to use artificial intelligence to improve detection of underage users who may have misrepresented their age. Rather than an outright ban, the company advocates an approach in which under-16s would need parental consent to access social media.
Australia’s eSafety Commissioner, Julie Inman Grant, supports the ban as a necessary step to shield teenagers from harmful online content and social pressures. As the December deadline approaches, other platforms such as Roblox are also adjusting their policies in hopes of staying outside the ban’s scope, highlighting the global ripple effects of Australia’s regulatory move.
