A teenager looking at his cell phone in his room.
24/05/2025

From a very early age, our teenagers are hooked above all on the Chinese platform TikTok. Their digital lives are very active on this and other social networks, which give them access to everything, with few barriers. They are also on Instagram, Facebook, WhatsApp, and Twitter, although these networks, also popular, appeal more to adults. Is it necessary to limit minors' access to this digital ocean? How? Who should do it? To what extent? These are questions that concern families, governments, the educational community, and mental health experts. There are, indeed, reasons for concern. But the answers are not so obvious.

Australia has passed a pioneering law in this regard, which affects children under 16: it will come into force at the end of the year and will carry fines of nearly €30 million if large technology companies do not enforce it. Along the same restrictive lines, and with the same age threshold, the Spanish government is promoting a law to protect minors in the digital environment, currently in parliamentary proceedings. And together with France and Greece, Pedro Sánchez's administration has urged Brussels to promote EU regulations that, among other things, would require all electronic devices sold in the EU that can connect to the internet—and therefore have access to social networks—to include parental verification. This means that parents would be the ones who decide whether their children can access social networks, and within what limits.

Is banning the solution? The first problem is that, technically, it is not that easy. There are tools, but they are not infallible. In fact, experts believe that young people will find a way around any digital barriers put in their way. Beyond that, they also believe that a ban will only generate greater interest among the excluded children, while preventing them from learning to use this ubiquitous communication channel properly.

The responsibility, in reality, lies with adults. On the one hand, and first and foremost, with parents, who are the ones who should set an example in the rational use of social networks—something we adults often do poorly. They are also the ones who must accompany their children and engage with them in their use of content. On the other hand, technology companies must actively assume their responsibility for overseeing what type of content they make accessible to minors. It is clear that they do not.

A legal ban, as some countries are beginning to propose, aside from its dubious effectiveness, could become, above all, a gesture for show to calm consciences. In practice, it could lead technology companies to further shirk their responsibility for content moderation, and lead no one to seriously consider the need to curb the commercial exploitation of data—an exploitation carried out using attention-grabbing techniques that generate a strong addiction and to which children are especially vulnerable. There is, therefore, room to regulate; but that regulation does not necessarily have to center on prohibition.
