As neurotechnology enters mainstream consumer markets, Europe faces a critical policy decision: protect brain data now, or risk surrendering cognitive autonomy to commercial interests. With brain-sensing earbuds, electroencephalogram headsets, and dream-influencing apps already in circulation, current EU regulations lag the ethical demands of this rapidly evolving landscape. If the human mind is to remain sovereign, neural data must be recognized, regulated, and protected as the most sensitive form of personal information.
Chris Kremidas-Courtney
Soon, a pair of earbuds may do more than play your favorite podcast; it may also listen to your thoughts. With companies like Apple (US), myBrain (France), and Enertech (China) quietly preparing the next generation of brain-sensing wearables, we’re entering a world where brainwaves can be monitored, decoded, and even shaped, ushering in a new age of consumer neurotechnology that is evolving faster than our laws can adapt.
If we don’t act now to protect the sanctity of the human mind, we risk losing one of our most fundamental rights: the right to think freely.
Brain-computer interfaces were until recently the stuff of labs and clinics. But now they’re moving into homes, offices, and even school classrooms. Wearable electroencephalogram (EEG) headsets from companies like Muse (Canada) and Brainbit (Germany) are already being marketed for productivity and mental wellness. Neurostimulation devices like Halo Sport promise to supercharge learning and athletic performance. And if Apple follows through on its 2023 patent, we may see AirPods that passively record brain activity through sensors nestled in our ears.[i]
These devices may not require surgery, but they pierce the privacy of the mind. EEG headsets and neurofeedback tools can quietly collect vast amounts of information about a person’s inner life: focus levels, stress, emotional states, even reactions to ads or content. This opens the door to a new kind of behavioral surveillance that reaches inside our heads.
That danger is now recognized at the highest levels. In 2023, UNESCO convened global leaders, ethicists, and technologists who jointly called for stronger governance of neurotechnology to protect mental privacy, freedom of thought, and cognitive liberty. Their communiqué warned that “neurotechnology should never be used to exploit or manipulate human thoughts,” and urged states to establish legal safeguards against such risks.[ii] This global momentum reinforces the need for democratic societies to act before neuro-intrusive tools become normalized and ungoverned.
Neurotechnology holds enormous promise: from accelerating learning and improving mental health to supporting people with neurological conditions or enhancing communication for those with speech impairments. But these benefits can only be realized if the tools are developed and brought to market within a framework that protects cognitive privacy.
The past two decades of digital life taught us that every innovation in convenience comes with a trade-off in privacy. The rise of behavioral advertising and algorithmic manipulation showed just how easily our choices can be shaped when our attention is for sale (or we give it away). Now imagine what happens when not only our clicks and scrolls but our brainwaves become part of that system.
Legal scholar Nita Farahany puts it bluntly: “Neural data is the last bastion of privacy. Once breached, it gives corporations a window into our thoughts, biases, emotions, and even unconscious impulses.”[iii]
This isn’t hypothetical. Emotiv’s early EEG headsets allowed third-party data sharing for research and marketing.[iv] And brain-sensing devices like AirPods could normalize the casual collection of cognitive data at massive scale.
A 2024 study by the Neurorights Foundation found that 29 of the 30 companies in the consumer neurotechnology market already appear to have access to users’ neural data and “provide no meaningful limitations to this access.” The same study indicated that a clear majority of these companies’ policies allow them to share this data with third parties.[v]
Even without direct stimulation, neurotechnology can subtly shape how we think and feel. Neurofeedback apps that reward calmness or focus act as digital conditioning tools, training users to conform to predefined mental states. Combined with persuasive design and dopamine-driven feedback loops, real-time brain monitoring introduces new forms of behavioral shaping, often without our awareness or consent.
This shift marks a move from measurement to manipulation. What’s at stake is not only the collection of our most intimate data, but the quiet erosion of mental autonomy. As these technologies evolve, the line between observing brain activity and actively shaping it continues to blur.
In more advanced cases, sleep itself is being targeted. MIT’s Dormio Project has shown that wearables can detect hypnagogic sleep states and deliver targeted prompts to influence dream content, an effect that can later impact creativity, mood, and even behavior.[vi] A 2021 survey of 400 marketing firms indicated that 77% planned to experiment with dream advertising by 2025, raising deep ethical concerns.[vii]
In 2021, Molson Coors ran a promotional campaign claiming to “shape dreams” about its products using music and audio cues. This triggered an open letter from more than 40 cognitive scientists warning the US Federal Trade Commission (FTC) about the dangers of targeted dream incubation. As the researchers wrote, “Dream incubation advertising is not some fun gimmick, but a slippery slope with real consequences.”[viii] In the four years since that letter, the FTC has not issued any public statements, warning letters, or complaint filings specifically about dream advertising or targeted dream incubation (TDI).
The convergence of brain-sensing wearables, emotionally tuned content, and algorithmic prediction compounds this danger. The risk is not only that our data is being collected, but that we can be discreetly influenced without our knowledge or consent.
The Urgent Need to Regulate Neurotechnology
The European Union has been a global leader in digital rights, from GDPR to the AI Act. But when it comes to the inner frontier of brain data and cognitive liberty, its regulatory frameworks remain dangerously behind the curve.
Right now, neural data is not explicitly classified as sensitive personal data under GDPR.[ix] This leaves the door open for tech companies to process brainwave data as just another input stream rather than as extremely sensitive private information.
Farahany warns that neuro-intrusive features are being quietly normalized through wellness marketing and sleek design. This evolution reframes brain data extraction as a product feature, not a privacy risk. She further cautions that framing such tools as empowering self-optimization products masks their potential to embed behavioral surveillance into daily life via health apps, meditation tools, and personalized content. If left unregulated, these tools may reshape cognition under the guise of convenience, long before the public understands what has been traded away.[x]
European thinkers such as Marcello Ienca have been calling since as early as 2017 for updates to EU law to protect against invasive neurotechnology, even proposing that the concept of human rights itself be expanded.[xi] [xii] In 2021, a Council of Europe roundtable on Neurotechnologies and Human Rights concluded that neurorights, including mental privacy, mental integrity, and cognitive liberty, represent a necessary and emerging category of human rights tailored to the brain–mind domain. Participants saw these not merely as extensions of existing rights but as requiring a new layer of normative protection against the misuse of neurotechnologies.[xiii]
More recently, a 2024 European Parliament Science and Technology Options Assessment (STOA) on neurotechnology called for new policy frameworks to protect mental autonomy and mental integrity from non-consensual manipulation by neurotech.[xiv]
Elsewhere, in 2021 Chile became the first country to embed “neurorights” in its constitution: the rights to mental privacy, identity, free will, and equitable access to cognitive technologies.[xv]
In the United States, a growing number of states are also stepping up. Colorado, California, and Montana have each passed laws that explicitly protect neural data as sensitive personal information, making it subject to higher standards of consent, purpose limitation, and data handling. Montana’s law goes further by banning the use of neural data for discriminatory profiling, an essential safeguard for preserving psychological integrity and preventing algorithmic exploitation based on subconscious traits.[xvi]
These laws define neural data broadly, encompassing brainwaves, mental states, and other cognitive signals collected by consumer neurotechnology. Europe risks falling behind not only in innovation but also in digital ethics if it does not act with similar resolve.
What the EU Can (and Should) Do Now
Europe doesn’t need to start from scratch. It already has some of the world’s most advanced regulatory frameworks for data protection (GDPR) and medical device safety (MDR). The key is to apply these tools strategically and expand them where needed to protect cognitive self-determination in the age of neurotechnology. Here’s how:
1. Classify Brain Data as Sensitive Personal Data. Amend GDPR or issue guidance to explicitly categorize neural data such as EEG signals, cognitive states, and emotional profiles as sensitive personal data. This would trigger stricter consent and purpose limitation rules.
Colorado’s recent law offers a promising template. It defines neural data to include “data generated by the brain, spinal cord, or peripheral nervous system” and requires explicit, opt-in consent before such data can be collected or processed.[xvii] EU policymakers could adopt similar statutory language under GDPR or complementary regulation.
2. Expand the Scope of the Medical Devices Regulation (MDR). Leverage the existing MDR (Regulation (EU) 2017/745), especially Annex XVI, which already covers certain non-medical brain-altering devices. Actions could include:
- Clarifying that EEG-enabled wearables and neurofeedback tools qualify as Class IIa or higher risk if they interpret or influence cognitive states.
- Amending Annex XVI to include passive brain-monitoring devices, not just those that stimulate.
- Applying Software-as-a-Medical-Device (SaMD) oversight to apps using neural data for behavioral inference.
This would close current loopholes allowing “wellness” devices to avoid accountability despite their profound psychological impact.
3. Enshrine Cognitive Self-Determination in EU Law. If Europe is to lead ethically in the neurotechnological age, it must move decisively to safeguard the sovereignty of the human mind. Europe has a chance to do for neurorights what it did for digital data: set a global benchmark for citizens’ cognitive self-determination, a goal increasingly shared by international bodies such as UNESCO, which in 2023 called for global neurotechnology governance rooted in mental privacy, autonomy, and freedom of thought. To that end, the EU should introduce a Neurorights Charter guaranteeing:
- Mental privacy (protection against unauthorized brain data collection).
- Cognitive autonomy (freedom from manipulation or influence).
- Psychological integrity (safeguards against subtle behavioral programming).
An ideal model for a European Neurorights Charter would blend the urgency and clarity of Chile’s 2021 constitutional neurorights amendment with the legal depth and normative strength of the Council of Europe’s Oviedo Convention and the EU Charter of Fundamental Rights. One option could be to develop a new protocol under the Oviedo Convention specifically for neurorights, codifying protections against non-consensual mental manipulation, neuro-discrimination, and the commodification of brain data.[xviii]
Additionally, any European Neurorights Charter must align with and expand upon Article 1 (Human Dignity), Article 7 (Respect for Private and Family Life), and Article 10 (Freedom of Thought) of the EU Charter of Fundamental Rights, but push further by incorporating:
- Mental privacy as inviolable.
- Cognitive liberty as the foundation of democratic agency.
- Psychological integrity as a protected state.
- Transparency and consent as mandatory for any neuro-interactive product.
This approach could produce a principled, enforceable, and future-proof system of protections for citizens’ cognitive self-determination.
4. Require Ethical Transparency and User Control. Mandate transparency from any device that reads or alters mental states. This includes:
- Full plain-language disclosure of what is being collected and how it’s used.
- Independent auditing of any “neuro-wellness” claims.
- Opt-in consent (not opt-out defaults).
- Manual override or pause functions on devices used for neurofeedback or behavioral nudging.
5. Create a NeuroSafe Certification for Consumer Products. Establish a “NeuroSafe” label (similar to CE or Energy Star) for brain-interfacing technologies. Criteria should include:
- Transparent data handling.
- No covert behavior manipulation.
- Strict limits on third-party access to brain data without explicit, informed consent.
As Chile and a few US states create a patchwork of protections, the EU has an opportunity to set a new global benchmark through this NeuroSafe label, ensuring coherence, consumer trust, and ethical tech design from the outset.
6. Fund Independent Research and Public Education. Support long-term studies on the impact of consumer neurotech, and invest in public literacy campaigns about brain data, dream manipulation, and the importance of mental sovereignty.[xix] Public education in the EU should also focus on helping people think clearly, work with others, and use technology wisely in a world shaped by persuasive technologies such as neurotechnology and AI.[xx]
Consumer neurotechnology is already being woven into everyday life through earbuds, headbands, and even smart pillows.[xxi] The potential to sense, store, and subtly shape human thought is here, and advertisers, app developers, and platform owners are already experimenting with it.
If the 20th century taught us that human rights must evolve to meet new threats, the 21st century demands we extend those rights to the last frontier of freedom: the human mind. Cognitive liberty is the bedrock of democracy, dignity, and self-determination. As neurotechnology advances, so too must our resolve to protect citizens. Europe now has the tools, precedent, and moral imperative to act. By embedding neurorights into law and aligning innovation with ethics, the EU can ensure the digital future respects not just our data, but our very thoughts. The question is no longer whether we need protections but whether we can build them before it’s too late.
[i] Aslam, M. D. (2024, March 5). Apple’s next-gen AirPods with brainwave monitoring: A neurotechnology revolution. Medium.
[ii] UNESCO. (2023, July 18). Ethics of neurotechnology: UNESCO, leaders and top experts call for solid governance.
[iii] Farahany, N. A. (2023). The battle for your brain: Defending the right to think freely in the age of neurotechnology. St. Martin’s Press.
[iv] Ienca, M., & Andorno, R. (2017). Towards new human rights in the age of neuroscience and neurotechnology. Life Sciences, Society and Policy, 13(1), 5.
[v] Genser, J., Damianos, S., & Yuste, R. (2024, April). Safeguarding brain data: Assessing the privacy practices of consumer neurotechnology companies. Neurorights Foundation.
[vi] Haar Horowitz, A., Fenn, K. M., & Stickgold, R. (2020). Targeted dream incubation: Manipulating dream content using sensory cues. Consciousness and Cognition, 78, 102863.
[vii] Marlan, D. (2023). The nightmare of dream advertising. William & Mary Law Review, 65, 259.
[viii] Stickgold, R., Wamsley, E. J., Mednick, S., et al. (2021, June 8). Open letter on the ethical dangers of dream advertising.
[ix] Bublitz, J. C. (2022). The right to mental integrity and cognitive liberty. In Ienca, M. & Andorno, R. (Eds.), Responsible neurotechnology (pp. 125–140). Springer.
[x] Farahany, N. (2025, August 5). Your meditation app may soon have more legal protections than brain surgery. Thinking Freely.
[xi] Ienca, M., & Andorno, R. (2017).
[xii] Kremidas‑Courtney, C. (2023, June 14). From countering disinformation to cognitive self‑determination. Friends of Europe.
[xiii] Council of Europe. (2021). Round table on the human rights issues raised by the applications of neurotechnologies. Steering Committee for Human Rights in the Fields of Biomedicine and Health (DH-BIO).
[xiv] European Parliament, Panel for the Future of Science and Technology (STOA). (2024). The protection of mental privacy in the area of neuroscience (EPRS STU (2024)757807). European Parliamentary Research Service.
[xv] Guzmán H., L. (2022, February). Chile: pioneering the protection of neurorights. The UNESCO Courier, 2022(1), 13–14.
[xvi] Szabo, L. (2024, July 24). Colorado, California, Montana lead the way on neural data privacy. KFF Health News.
[xvii] Szabo, L. (2024, July 24).
[xviii] Council of Europe. (1997). Convention on Human Rights and Biomedicine (Oviedo Convention).
[xix] Genser, J., et al. (2024, April).
[xx] Kuiper, E., Świeboda, P., & Walther, C. C. (2025, April 7). Beyond skills: How to equip the EU with hybrid intelligence. European Policy Centre.
[xxi] Kremidas‑Courtney, C. (2024). The rest of your life: Five stories of your future. Brahmaloka Press.