Australian children are easily circumventing the inadequate and poorly enforced minimum age rules employed by well-known social media services, most of which only ask kids to self-declare their age at sign-up, according to a recent report released by Australia’s online safety regulator.
The report combines the results of a national survey looking at the social media use of Australian children aged 8-15, and information provided directly to eSafety by social media platforms about how they enforce their own age restrictions.
The most popular platforms for kids aged under 13 were identified as YouTube, TikTok, Snapchat and Instagram, with YouTube the only platform that allows access to users under this age when attached to a family account with parental supervision.
Responses received from YouTube, Facebook, Instagram, TikTok, Snap, Reddit, Discord and Twitch cover the period between January and July 2024. They reveal that setting up an account as a child under 13 was a relatively simple process, with many services requiring only a self-declaration of age at sign-up.
The report also reveals mixed results in how platforms were enforcing their age limits, including through proactive tools and reporting systems to detect children under 13 already present on their services.
eSafety Commissioner Julie Inman Grant said the first-of-its-kind report shows very clearly that there is still significant work to be done by social media platforms relying on truthful self-declaration to determine age, with enforcement of the Government’s minimum age legislation on the horizon.
“Social media services not only need to make it harder for underage users to sign up to their services in the first place, but also make sure that users who are old enough to be on the service, but are not yet adults, have strong safety measures in place by default,” Ms Inman Grant said.
“Few have any real stringent measures in place to determine age accurately at the point of sign-up, so there’s nothing stopping a 14-year-old, for instance, entering a false age or date of birth and setting up an unrestricted adult account that doesn’t carry those extra safety features.”
Ms Inman Grant said this likely means the platforms are unaware of the true numbers of children and teens on their services. Some platforms also make it very difficult to report underage users who are on their platforms today.
“As a result, the reported numbers of monthly active users by services under the age of 18 likely underestimate the true numbers. This is likely also true in the very young cohort aged 13-15 which will be deemed too young to access some services when the Government’s minimum age legislation comes into force by the end of the year.
“Our survey also found 95 per cent of teens aged 13-15 reported using at least one of the eight social media services since January 2024, so we can expect the actual numbers to be much higher.
“We’ll be consulting with industry and other stakeholders this year about what reasonable steps platforms should be expected to take to give effect to the minimum age requirements, and this report will be one key input to that process. This report shows that there will be a tremendous amount of work to be done between now and December.”
Ms Inman Grant said age restrictions, while important, are just one part of the wider holistic approach eSafety is taking to ensure children and teens have safe, age-appropriate experiences online.
“We remain committed to working with teachers, parents and young people to not only ensure they are well informed about risks, but also well-equipped to thrive online by building digital literacy and resilience, and providing access to meaningful, co-designed educational content and resources,” she said.
More reading: A defining year for social media