Meta begins early removals
Meta has begun removing Australian children under 16 from Instagram, Facebook and Threads, starting the process one week before the official youth ban takes effect. Meta said last month that it had notified users aged 13 to 15 about account closures starting on 4 December. The company expects around 150,000 Facebook accounts and about 350,000 Instagram profiles to be affected. Threads accounts also face removal because the service can only be accessed through Instagram. Australia’s new social media law takes effect on 10 December and requires platforms to block under-16s. Companies that fail to comply face penalties of up to A$49.5m.
Meta calls for standardised age checks
A company spokesperson told a British news outlet that compliance will be a layered and ongoing task. She said Meta will follow the law but wants a stronger, more unified system that also protects privacy. Meta is urging governments to require app stores to verify users’ ages when apps are downloaded, with parents approving access for under-16s. The company argues this would avoid repeated age checks in every app. Last month, Meta said users flagged as under 16 can save their posts, videos and messages before their accounts are deactivated. Teens who believe the system has misjudged their age can ask for a review and submit a short video selfie, a driver’s licence or another government ID to confirm their age.
Other platforms under pressure
The new ban also covers YouTube, X, TikTok, Snapchat, Reddit, Kick and Twitch. The government says the ban protects children from online harm, but critics fear it will isolate young people who rely on social platforms for connection. Experts also warn that children may shift to poorly regulated spaces online. Communications Minister Anika Wells said she expects early challenges but stressed the need to protect Generation Alpha. She warned that young people face powerful algorithms that act like behavioural traps, saying children are connected to a “dopamine drip” once they get smartphones and social media accounts. She added that she is monitoring newer apps such as Lemon8 and Yope to see whether children move there after the ban.
Lesser-known apps face scrutiny
Earlier this week, Australia’s eSafety Commissioner contacted Lemon8 and Yope and asked both to assess whether the ban applies to them. Yope’s chief executive said the company has not received formal questions but has already completed an internal review. He said Yope functions as a private messenger with no public content, comparing it to WhatsApp because users share moments only with trusted contacts. Reports say Lemon8 plans to block under-16s from next week, even though the app is not covered by the law. YouTube, which was initially exempt before being added to the ban, has called the law rushed. The company argues that removing teenagers from accounts with parental controls will make the platform less safe for them.
Global attention on the new law
Governments worldwide are watching Australia’s experiment closely. A government study earlier this year found that 96% of children aged 10 to 15 use social media. The report showed that seven in ten had seen harmful content, including violent material and posts promoting eating disorders or suicide. One in seven had experienced grooming behaviour from adults or older children, and more than half reported being cyberbullied.