Children are not safe anywhere online. It’s time to hit tech companies in the bottom line
For those of us trying to protect children from online harm, it’s been like bailing a boat with a soup ladle while big tech is filling it with a fire hose
Children have a right to participate in the online world and to be protected from harm when doing it. They have a right to true and accurate information, and not to be exploited for commercial reasons.
That’s what the UN Convention on the Rights of the Child tells us, but, as we all know, these worthy goals are entirely disconnected from the reality we currently face.
This disconnect manifests itself in the disturbing global increase in mental health concerns for children and young people over the past 15 years. As CyberSafeKids’ Trends and Usage report, published today, shows, over 25% of children under 12 are troubled by what they encounter online, and a further 25% experience cyberbullying (a figure that rises to 38% among 12–14-year-olds). The majority of children (57%) tell us that their online experience is broadly not a positive one.
The playgrounds of this discontent are all the big tech platforms you will be familiar with, although YouTube and Roblox stand out as the places where young children are most likely to encounter harmful content, while Snapchat and TikTok are the worst purveyors of such content for older kids.
This data is deeply troubling for an organisation like ours, which, since being founded nine years ago, has worked hard to close the significant education gap for children, parent-guardians and educators about how to be safe online. We have achieved a huge amount, engaging with over 75,000 children since we started, but it’s been like bailing a boat with a soup ladle whilst big tech is filling it with a fire hose.
One thing we have realised is that whilst education remains a key strategy in better equipping children to be safe online, we need to do more to help parents and educators to support them. Parents have such a fundamental role and yet many don’t feel equipped to fulfil that role in a meaningful way. We urgently need to change that.
The other critical component of any solution is regulation. The tech companies behind the myriad services that Irish children use in huge numbers – Google, Meta, TikTok, Roblox and other gaming companies like Epic Games, the makers of Fortnite – have a responsibility to ensure that children will be safe on their platforms.
I’ve been told not to be naive: that tech companies will always put their commercial interests ahead of the safety of their users, and only change when governments hold them to account. There is truth in that. So regulation needs to hit the bottom line hard if these companies continue to fail in their duty of care towards children.
These companies are profiting enormously from children’s use of their services – to the tune of billions of dollars each year. There has to be some quid pro quo in terms of children’s safety and meaningful efforts to protect them. The current, heavily advertised safeguarding efforts that companies are making fall woefully short, and we should be deeply suspicious of their claims to be able to self-regulate.
To make an analogy with the offline world, consider the public spaces in which children gather: playgrounds, parks, schools. Specific rules apply to those spaces to ensure not just that they are suitable for children, but that children can thrive in them. Some will fall below the expected standard, but there are mechanisms in place to correct failings and enforce responsible ownership.
For far too long, there has been a complete lack of accountability in online spaces in relation to users in general, never mind children. Yes, that is changing in Ireland with the Online Safety and Media Regulation Act (OSMRA) and the Digital Services Act (DSA) at the European level, but concerning gaps remain.
We need to ensure that ‘safety by design’ is the default obligation for any online service that children use – not just the so-called ‘very large online platforms’ that fall under the DSA, but any service used by children. One gap concerns the gaming platforms that are enormously popular with children, the likes of Roblox or Fortnite. The regulation of these is unclear: they fall below the size threshold of the DSA and, not being headquartered in Ireland, they do not fall under the remit of the OSMRA either.
It’s time for child-specific online safety legislation, because the OSMRA does not go far enough in relation to child users: legislation that defines a ‘safe online environment’ for children, outlines the features that fall below that threshold, and sets out the age at which a young person can access those features, with robust age-verification mechanisms in place to protect younger users from them. The law also needs to strengthen the powers of Coimisiún na Meán so that it can properly hold these companies to account.
There is no single silver bullet, but there is clearly so much more that could be done. I worry for my own kids that these changes won’t come soon enough. But I remain optimistic that the realisation is dawning on us as citizens, and on our politicians as leaders, that we cannot delay action any longer.
Alex Cooney is Chief Executive of CyberSafeKids
Published Tuesday, 3rd September, 2024 in The Irish Times
Read our latest Trends and Usage Report ‘Left To Their Own Devices’ here
Posted on: Sep 13, 2024
CyberSafeKids
CyberSafeKids is an Irish charity, which has been empowering children, parents, schools and businesses to navigate the online world in a safer and more responsible way since 2015.