Neurorights: The Final Frontier of Privacy

This Christmas I gifted myself a pair of AirPods Pro 3. Because I deserve nice things, damn it!
… And because my previous set gave up after one too many accidental drops.
A few weeks later, I’m scrolling through my iPhone’s Health app (procrastinating, as one does) and I notice something new: my heart rate. Just sitting there. A neat little graph showing my pulse throughout the day.
My first reaction was “huh, that’s pretty cool.” I didn’t even know the new AirPods could do that.
But, being the privacy nerd that I am, something kept bugging me. These things are tracking my heart rate? All the time? Whenever I’m wearing them? Without me explicitly turning anything on or giving permission beyond whatever I mindlessly agreed to when I first connected them?
What else could these AirPods monitor without me even realizing?
That question sent me down a rabbit hole, and let me tell you, Apple keeping tabs on my heart rate is pretty tame compared to everything else that’s happening in the consumer wearables space.
Turns out there’s a whole category of headphones and headbands that track more than your pulse. They log your brain activity, using actual EEG sensors that read your brainwaves while you work, meditate, sleep, or game. And they’re marketed as wellness devices for focus, sleep, stress management, and meditation.
And who doesn’t want better sleep these days, am I right?
Here’s the thing, though: these companies are collecting medical-grade brain data with basically zero oversight. Hospitals use this same EEG technology to diagnose epilepsy. But when you buy the consumer version? None of the medical privacy laws apply.
And most of these companies can do whatever they want with your brain data, like share it with third parties. Or keep it indefinitely on their servers and use it to train AI models.
Here’s something else I learned. Your brainwaves aren’t like a password you can change. They’re unique to you. Once that data exists, it exists forever. And whatever algorithms can decode from it today is nothing compared to what AI will be able to extract from it five years from now.
So yeah, down the hole I went, trying to figure out what these devices actually do and which companies are handling our data. Spoiler: you should probably be concerned.
I’m going to show you what I found. And then I’ll give you the practical steps I’ve figured out to protect your brain data. Because wanting to sleep better doesn’t mean we’re ready to hand over our brainwaves, right?
Let’s start with how brain-tracking technology went from hospitals to your Amazon shopping cart.
From Hospital Equipment to Amazon Prime
Brain-computer interfaces aren’t new. Neurologists have been using EEG (electroencephalography) for decades to measure brain activity through electrodes placed on the scalp. Doctors rely on it to diagnose epilepsy, sleep disorders, and brain injuries.
What’s new is that this technology escaped the hospital.
Companies like Neuralink are grabbing the headlines with their surgical brain implants. Elon Musk’s company received FDA approval for human trials in May 2023 [1]. The technology is genuinely impressive. Researchers have achieved 97.5% accuracy decoding attempted speech in ALS patients [2], which could be life-changing for people with paralysis.
But here’s the critical difference: invasive brain implants are regulated as medical devices. They require surgery, FDA oversight, and full HIPAA protections for any data they collect.
With consumer neurotechnology, none of those safeguards exist.


