Australian teenagers face stronger restrictions on a popular social media platform, but there are concerns about how enforceable the measures are.
Instagram announced on Wednesday it would automatically restrict the accounts of people under the age of 18 in Australia, the US and Britain after it faced calls to make its social media platform safer for children.
Teens’ accounts will only be able to receive messages from people they follow or are connected to, and sensitive content that includes violence or promotes cosmetic procedures will be limited.
A notification will tell them to leave the app after 60 minutes each day and a “sleep mode” will mute notifications and send an auto-reply to private messages between 10pm and 7am.
Current account holders will be transitioned to the new measures over 60 days.
Those aged above 16 will be able to turn off the added restrictions and those who are younger will need parents’ permission to disable the features.
Communications Minister Michelle Rowland welcomed the measures but called for further action to make children safer online, with evidence pointing to social media having a detrimental impact on kids’ mental health and wellbeing.
“The industry needs to do more and today’s announcement shows that they can do more,” she told reporters in Sydney on Wednesday.
The federal government has committed to a trial of age verification technology and pledged to introduce a minimum age for social media after the coalition called for it to be 16.
An appropriate age between 13 and 16 was being assessed, Ms Rowland said, but the trial needed to ensure the technology worked and companies were enforcing it.
“One of the reasons why it’s not getting enforced is that there isn’t a consistent set of age assurance standards across Australia,” she said.
Opposition communications spokesman David Coleman said it was easy to work around the restrictions without robust age verification technology.
“So if a 10-year-old signs up to Instagram in the future, the system is exactly the same as it is today and they can just say they are 20,” he said.
“Meta will do everything it can to avoid a real system of age verification because it will lead to them losing huge numbers of underage users over time.”
Parent company Meta, which also runs Facebook, acknowledged this was a risk and said it was working on age verification technology.
Cyber security envoy Andrew Charlton said Meta’s move validated the need to make social media safer for children but said other platforms should be regulated as well.
“We need to be thinking about what we can do in Australia, not just for Meta, but right across all of the platforms to have a nationally consistent approach that safeguards our young people on social media,” he told Sky News.
Dominic Giannini
(Australian Associated Press)