It is necessary to put limits on neurotechnology
UNESCO approves the first global ethical framework for technologies capable of influencing or modulating brain activity
Neurotechnology, the set of tools capable of reading, modulating, or interpreting our brain activity, is advancing at a rapid pace. But with this rapid evolution comes a growing risk of unwanted and unauthorized interference with our minds, which can affect privacy and even inner freedom. Faced with this global challenge, UNESCO has taken a historic step: on November 11, 2025, it approved the first global ethical framework for neurotechnology, establishing universal standards to ensure that this scientific and technological revolution improves quality of life and respects human dignity. The text was drafted by a group of 24 international experts and incorporates thousands of contributions from civil society, the private sector, research institutions, and governments, gathered through a pluralistic and transparent process.
Neurotechnology is no longer confined to laboratories or hospitals. At the forefront of medicine, this discipline has helped develop treatments for diseases such as Parkinson's and depression, and it has enabled direct communication between the brains of people with paralysis and machines, making it possible to control prostheses with neural activity. Neurotechnology could define the next great frontier of human progress. But, as UNESCO itself warns, this revolution, for all its potential benefits, is not without risks. At stake are things we value deeply as human beings: the integrity of the mind, personal identity, mental privacy, and freedom of thought.
The problem is compounded when these increasingly affordable technologies begin to infiltrate everyday life. Headsets, earphones, wristbands, and other devices are now available that measure stress, optimize sleep, or improve attention using brain data, often while connected directly to the internet. This neural data, as it is called, is extremely personal and sensitive: it can reveal emotions, reactions, and even mental states. Yet in many cases it is collected without clear regulation.
Respecting fundamental rights
The recommendation adopted by UNESCO, which has already entered into force, defines a set of essential safeguards to ensure that neurotechnology contributes to human well-being and progress without violating fundamental rights. Among the key principles: the dignity of the human mind is upheld as an inviolable principle. Neural data is considered sensitive, requiring explicit consent, clear purposes, and transparency. Vulnerable groups, such as children, young people, people with disabilities, or those with mental health problems, receive special protection to prevent non-therapeutic uses in these communities. The use of neurotechnology in areas such as work, education, or consumption to monitor performance, condition behavior, control attention, or manipulate decisions without awareness and oversight is to be prevented. And any development, deployment, or commercialization of neural technologies must clearly explain its potential physical, cognitive, and emotional effects and offer guarantees of safety, accessibility, and equity.
The current context makes this regulation urgent and argues for its swift implementation. According to UNESCO data, investment in neurotechnology companies grew by 700% between 2014 and 2021. This economic boom has been accompanied by an accelerated proliferation of devices aimed not only at medical uses but at consumers in general, promising to access and monitor our brains.
In 2019, for example, the author of this article took part in an international forum entitled The Next Brain. The conference brought together 34 specialists from around the world to discuss these very issues, and it included the presentation of initial data from a project conducted in China to monitor, in real time, the attention levels of pre-adolescent and adolescent students. The project's stated aim was to identify educational practices that improved attention. However, a mobile app gave parents direct access to this data, potentially leading them to put more pressure on their children, and the data was also stored by the app.
Furthermore, the combination of these technologies with artificial intelligence has exponentially increased the possibilities for analysis: detecting neural patterns, emotions, and mental states and, sooner than we might think, reconstructing mental images. This opens up new risks. What does free thought, or even free will, mean if emotions can be read or induced? How do we ensure mental privacy? What weight will intimate identity carry in the face of increasingly sophisticated and efficient algorithms?
The UNESCO recommendation is not a universally binding law. In fact, it is a minimum standard, but an essential one. For now, its main virtue is its symbolic and moral force, as it is the first global standard to establish ethical limits on neurotechnology. Now the real work begins. It is up to member states to incorporate it into their legislation to transform these principles into laws, regulations, best practices, controls, and, above all, social awareness about what it means to possess and grant access to the human brain. With this standard, UNESCO invites us to reflect and decide.