
UN Calls for Ethical Safeguards in Brain Technology

By Conor Lennon | Human Rights | 2025-11-05, 9:05 PM


Young people learn gas welding techniques using virtual reality software.

As wearable devices begin to tap into our mental states, UN experts warn that without ethical safeguards, the right to freedom of thought could be at risk from unchecked innovation.

It may sound like science fiction, or even magic: the ability to communicate, control a computer, or move a robotic limb using only thought. However, it is not just possible—it is already transforming the lives of patients with severe disabilities.

In 2024, delegates at a UN conference in Geneva watched as a young man in Portugal with "locked-in syndrome"—a neurological disorder that left him unable to move any part of his body—"spoke" using a brain-computer interface (BCI) that translated his thoughts into words rendered in his own voice, allowing him to answer questions.

This striking example illustrates the growing field of neurotechnology, which offers hope for people living with disabilities and with neurological and mental disorders such as Parkinson's disease, epilepsy, and treatment-resistant depression.

Mental privacy: A lost battle?

While the use of neurotechnology in medical care is strictly regulated, its adoption in other areas raises concerns. Headbands, watches, and ear pods that monitor heart rate, sleep patterns, and other health indicators are increasingly popular. The data they collect can improve quality of life, but it can also reveal deep insights into private thoughts, reactions, and emotions.

This raises ethical and human rights challenges. Manufacturers can currently sell or share such data without restriction, meaning individuals risk having their most intimate mental privacy intruded upon: their thoughts could be exposed, monetized, or even manipulated.

“It’s about freedom of thought, agency, and mental privacy,” says Dafna Feinholz, acting head of Research, Ethics, and Inclusion at UNESCO. She warns that mental privacy is already being eroded in an age of social media, with users willingly uploading personal information to platforms owned by a handful of giant tech companies.

“People say ‘I have nothing to hide,’ but they don’t understand what they’re giving away,” she adds. “We are already profiled by AI, and now there is the possibility of entering thoughts, directly measuring brain activity, and inferring mental states. These technologies could even alter your nervous system, allowing manipulation. People need to know these tools are safe and that they can stop using them if they wish.”

The UN official emphasizes that while we must live with technology, humans must remain in control. “The more we surrender to these tools, the more we risk being overtaken. We need to control what they do and what we want them to achieve, because we are the ones producing them. This is our responsibility.”

Time for an ethical approach

Ms. Feinholz spoke to UN News from Samarkand, Uzbekistan, where delegates from UNESCO member states formally adopted a “Recommendation”—non-binding guidance on principles and best practices for national policies—on the ethics of neurotechnology. The emphasis is on protecting human dignity, rights, and freedoms.

The guidance advocates promoting well-being, avoiding harm, safeguarding freedom of thought, and ensuring developers, researchers, and users uphold ethical standards and accountability.

Member states are advised to implement legal and ethical frameworks to monitor neurotechnology, protect personal data, and assess impacts on human rights and privacy.

“Humans have to be in the loop,” Ms. Feinholz declares. “There must be transparency, redress, and accountability. For example, if you eat at a restaurant and get sick, you can complain. The same principle should apply to neurotechnology: even if you don’t fully understand how it works, there must be a chain of accountability.”