Perspective

Why EU Policy Must Catch Up to the Neurotechnology Boom

Virginia Mahieu / Jul 7, 2025

Imagine this: you’re hard at work, when your headphones detect your brain is reaching its limit and suggest a break. During that break, you scroll through social media and see an advertisement that catches you at your most mentally vulnerable. You make an impulse purchase. Your headphones then sense your refreshed state and switch to a "focus mode" playlist.

This isn't science fiction. Every element of this scenario is theoretically possible with existing neurotechnologies, such as electroencephalography (EEG), which can read and interpret brain activity. These devices are rapidly entering the mainstream market with virtually no regulatory oversight.

After conducting a comprehensive analysis of nearly 300 neurotechnology companies worldwide, the Center for Future Generations discovered a surprising trend: among firms fully dedicated to neurotech, consumer firms now outnumber medical ones, making up 60% of the global neurotechnology landscape. And they are proliferating at an unprecedented rate: more than four times as many consumer neurotech firms were founded in the past decade as in the previous 25 years.

A regulatory grey zone

While medical neurotechnologies must undergo rigorous clinical trials and regulatory scrutiny, in the form of conformity assessments for safety and efficacy carried out by national Notified Bodies, consumer devices face far lower barriers to market. They need only comply with standard product safety regulations and, if the device collects and processes data, with European digital regulations such as the General Data Protection Regulation (GDPR) and the AI Act. At the same time, these devices are rapidly entering the market through everyday wearables like headphones, earbuds, and glasses, often marketed as "smartwatches for your brain" that can enhance productivity, improve sleep, or reduce stress.

EEG, the technology at the heart of this revolution, has been around since the 1920s. It's crude and can't read individual thoughts, but it can detect patterns of brain activity related to focus, fatigue, and even emotional states. And when coupled with artificial intelligence and other personal data—like location, buying behaviors, and biometrics—these patterns can reveal far more about us than we might imagine.

I've witnessed this firsthand. At policy conferences and workshops, I invite volunteers to record brief snippets of brain data—just a minute of calm breathing followed by imagining something painful. With their consent, I then feed this data—just a simple line graph—into ChatGPT. With no special training, the LLM can identify a person's mental state and detect shifts in attention. For example, it can note the moment someone moved from a calm, restful state to being focused and engaged, or detect that they gradually lost focus and drifted into idleness—all from a simple consumer neurotech headband purchased online. This demonstrates just how easy it may soon be for the average person to access brain data and extract highly sensitive (and potentially valuable) information from it.

Brain data as a target for manipulation

While this AI-driven analysis of the brain could unlock valuable insights for productivity, stress, and mental health, two key factors make this kind of brain data collection risky. First is knowing the context in which the data was recorded: what content you were viewing, or what you were doing or saying at the time. Second is access to longitudinal data: patterns in how your brain typically responds over time, which can reveal when you react unusually. Taken together, this data could pave the way for unwanted cognitive surveillance, as well as profiling and discrimination.

Furthermore, as in the Cambridge Analytica scandal, this raises concerns about behavioral targeting and democratic integrity—but with even greater stakes: with brain data, advertisers could potentially influence people on a massive scale, concentrating that power in the hands of those who can afford it.

As this technology moves into the mainstream, the potential for misuse becomes profound. Imagine pre-election advertising that adapts its messaging based on your emotional reaction. Imagine disinformation campaigns tailored to your subconscious fears, measured directly from your brain. Imagine authoritarian governments monitoring emotional responses to propaganda, searching for dissent in citizens' brainwaves.

This marks a critical moment for European policymakers. With increasing miniaturization and integration into everyday products—from earbuds to glasses—these technologies could reshape how people work, rest, socialize, and interact with digital systems, raising urgent questions about data privacy, consent, and autonomy.

International frameworks, such as the OECD's Recommendation and the European Charter on the Responsible Development of Neurotechnologies, as well as the anticipated UNESCO Recommendation on the Ethics of Neurotechnology, offer high-level principles—but they remain voluntary.

Some jurisdictions are taking concrete action. Chile amended its constitution to enshrine the right to mental integrity. California and Colorado have amended their privacy legislation to protect brain data specifically. But Europe risks falling behind.

Our research reveals that Europe accounts for 38% of consumer neurotech firms worldwide, second only to North America's 48%. As it is cheap, portable, and safe, EEG accounts for nearly 65% of all consumer neurotech. As these technologies become more integrated into mainstream devices, the window for establishing proper governance is rapidly closing.

What policymakers should consider

The European Union has significant blind spots regarding the protection of brain data, which is not named explicitly in any EU regulation. Consumer neurotechnologies do not fall under the Medical Devices Regulation and therefore do not undergo the same strict evaluation and validation procedures as medical devices, even though many such "wellness" devices have health implications or medically related functions. As the market grows rapidly, this raises the question of whether a new framework for overseeing the biometric wellness industry is needed.

Further, despite the European Union's progressive stance on digital rights, uncertainties and gaps remain in its legislation. The GDPR's definition of biometric data ("personal data, [...] physical, physiological or behavioral, [...] which allow or confirm the unique identification of that natural person") leaves room for interpretation as to how brain data should be classified, and thus how it should be handled, processed, and protected. Crucially, it is still unclear how the AI Act's provisions against profiling, manipulation, and subliminal messaging will be enforced—particularly outside a clinic, where public oversight is less rigorous.

With its strong industrial base, Europe has the opportunity to lead with both innovation and regulation that is anticipatory, proportionate, and grounded in rights protection—before public safeguards are outpaced by technological momentum. To achieve this, the EU must stress-test its regulatory and oversight ecosystem to ensure it is equipped to manage the blurred boundaries and unique risks posed by consumer neurotechnologies, ideally through a dedicated task force or inquiry.

Critically, the EU needs to take a strategic, multifaceted approach to neurotechnology: funding and competitive support for ethical innovation that addresses unmet medical needs; clear and cohesive rules specific to brain data; and broad public discourse and multistakeholder dialogue on the risks and benefits of these technologies before they become ubiquitous in our daily lives.

Mental privacy is a fundamental human right. Brain data must not become just another commodity in the digital economy. The time to act is now—before people's private thoughts become the next frontier of surveillance capitalism.

Authors

Virginia Mahieu
Dr. Virginia Mahieu is the Director of the Neurotechnology Program at the Center for Future Generations, an independent think-and-do tank helping decision-makers anticipate and govern rapid technological change. Virginia holds a PhD in sensory neuroscience from the University of Sussex and a Postgra...
