
Better regulation rather than banning orders is the way to tackle harmful content, argues IAN RITCHIE
The much-anticipated Australian social media ban finally took effect in December. Children under 16 are now banned from holding accounts on ‘age-restricted’ social media platforms; providers can be fined up to A$50 million if they fail to take reasonable steps to prevent minors from having accounts on their platforms.
Although many other countries said that they wanted to examine the effectiveness of the Australian law before acting themselves, there has instead been an unruly rush to impose similar restrictions.
French lawmakers have just approved a bill to ban social media for children under 15, which is now going through the final legislative stages.
The Norwegians, Spanish, Austrians, Poles and Danes are currently undertaking consultations in preparation for fresh controls to be imposed during 2026. Several US states are also planning to introduce age-related restrictions, as are Canada and Malaysia.
And out of the blue last month the House of Lords tried to force the British government into imposing a social media ban by voting to amend the Children’s Wellbeing and Schools Bill. The amendment would still need to be passed by MPs in the House of Commons, which they are not currently prepared to do.
Of course, it is still very early days to judge how the Australian ban is going. There are reports that social media providers have locked or deactivated millions of Australian accounts in the first month, and there are early anecdotal reports of children taking up more sports activities, or ‘hanging out’ more in parks. Mind you, it is the height of summer Down Under right now, so those choices might not be surprising.
On the other hand, there have been reports from mental health experts of increased social isolation, particularly for marginalised youths, such as LGBTQ+ teens, who have previously relied on online communities for support.
However, the new Australian law remains a very blunt instrument. It relies on unreliable age verification, whether by showing ID or through facial recognition, and it makes no allowance for the fact that different teenagers mature at different rates.
Many parents are happy for their children to have smartphones for a variety of very legitimate reasons, and many children value developing friendships and contacts with others who share their interests, regardless of geography.
Age controls set using these methods are not fit for purpose. Most self-respecting 14-year-olds can bypass them with a fake ID or by borrowing an older face to pass the age test. Nerdy kids will easily be able to use a virtual private network (VPN) to sidestep controls imposed in their home country, and advise others how to do the same.
We are long overdue a more reliable and flexible system of controls. It is the duty of every government to protect its population from harm and laws are already in place to restrict the sale of harmful products such as alcohol, tobacco and weapons.

Cinemas have long operated a system of rating films by minimum viewing age, and television has a ‘watershed’ which restricts certain content to after 9pm. It is well beyond time to put similarly flexible controls on social media use.
Social media is almost always consumed on smartphones and tablets, and these devices run operating systems from only two suppliers: iOS (Apple) or Android (Alphabet/Google). If these companies were required to provide a ‘date of birth’ (dob) code embedded in their devices, it would be possible to provide controlled access to various services in a more flexible way.
If a ‘dob’ code were available on devices, access could be tailored individually, so that each teenager is allowed access to particular products at a particular stage of their development.
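To make the idea concrete, here is a minimal sketch of how such a check might work, assuming (purely hypothetically) that the operating system exposed a verified date of birth to apps. No such API currently exists on iOS or Android, and every name in the snippet is invented for illustration.

```swift
import Foundation

// Hypothetical sketch only: neither iOS nor Android exposes a verified
// 'date of birth' signal today, and all names below are illustrative.
struct DeviceAgeSignal {
    let dateOfBirth: Date   // set once by a parent or a verified identity provider
}

enum FeatureAccess {
    case allowed
    case blocked(minimumAge: Int)
}

// A regulator or parent could attach a minimum age to individual features
// (algorithmic feeds, messages from strangers, live streaming) and have
// the device gate each one against the stored date of birth.
func checkAccess(minimumAge: Int,
                 signal: DeviceAgeSignal,
                 calendar: Calendar = .current,
                 now: Date = Date()) -> FeatureAccess {
    let age = calendar.dateComponents([.year], from: signal.dateOfBirth, to: now).year ?? 0
    return age >= minimumAge ? .allowed : .blocked(minimumAge: minimumAge)
}

// Example: a teenager born in mid-2011 passes a 13+ check but fails a 16+ one.
let dob = Calendar.current.date(from: DateComponents(year: 2011, month: 6, day: 1))!
let signal = DeviceAgeSignal(dateOfBirth: dob)
print(checkAccess(minimumAge: 13, signal: signal))   // allowed
print(checkAccess(minimumAge: 16, signal: signal))   // blocked(minimumAge: 16)
```

The point of the sketch is simply that a per-device, per-feature age check is a far finer-grained tool than a blanket account ban.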
Such a feature is already available in Denmark as part of its national digital identity system and, as it happens, the Scottish Government has commissioned the Danish company Netcompany to develop a proof-of-age feature as part of its ambition to become a ‘digital-first’ nation, which it hopes to launch this year.
Such a system would allow much more sophisticated control of what is permitted on each device. Some teenagers are more mature than others and could be granted access to different products as and when they are mature enough.
It would be possible for government regulators such as Ofcom to set standards for various features of social media. Elon Musk recently disclosed the formula X uses to decide which posts you like best so that it can send you many more of the same. Social media companies should be required to disclose their formulas, and regulators could set standards to limit how addictive these sites become.
Regulators could require social media companies to identify and block hate mail and dangerous content on topics such as depression and suicide. They should also be required to build systems to catch those who spoof their identities, such as agents based in Russia or Iran interfering in elections around the world.
We need to get social media under control, not with the blunt instrument of banning use until a child’s 16th birthday, but by regulating these services in a more flexible way and making them behave in a way that does not cause harm.
Ian Ritchie is a tech company creator and investor