
Brain-reading headphones are here to give you telekinetic control

Neurable Headset
Neurable

For the past 45 years, SIGGRAPH, the renowned annual conference for all things computer graphics, has been a great place to look if you want a sneak peek at the future. In the 1980s, it was where animation enthusiasts Ed Catmull and John Lasseter first crossed paths. A decade later, they had created Toy Story, the first feature-length computer-animated movie. In the 1990s, it was home to a spectacular demo in which thousands of attendees used color-coded paddles to play a giant collaborative game of Pong. Today, online games played by millions are everywhere.


And, in 2017, it was where an early-stage startup called Neurable and VR graphics company Estudiofuture demonstrated a game called The Awakening. In The Awakening, players donned a VR headset along with head-mounted electrodes designed to read their brain waves. Using machine learning technology to decode these messy brain signals, Neurable was able to turn thoughts into game actions. Players could select, pick up, and throw objects simply by thinking about it. No gamepads, controllers, or body movement necessary. Is this the future of computer interaction as we know it?


In the closing days of 2019, Neurable has moved beyond gaming. But it’s lost none of its enthusiasm for building technology that can be operated by a person using nothing more than brain activity. Having just announced a new $6 million funding round, the company is preparing to create its first mass market product: what its creators claim will be the world’s first true “everyday brain-computer interface.” Coming soon to a head near you.

Reading your mind

According to Neurable CEO Dr. Ramses Alcaide, it all started when he was a kid. Eight years old, to be precise. “My uncle got into a trucking accident and lost both his legs,” he told Digital Trends. “[Since then], the idea of developing technology for people who are differently abled has been my big, never-ending quest.”


It’s a quest that took Alcaide first to the University of Washington for an undergrad degree in electrical engineering, during which he developed control systems for prosthetics. After that, it was on to the University of Michigan for a master’s and then a Ph.D. in neuroscience, focusing on brain-activity communication systems. While there, Alcaide played a key role in one of the big tech breakthroughs that led to Neurable.

“We came up with a significant breakthrough that let us increase the signal-to-noise [ratio] dramatically for EEG and other types of signal processing systems for brain activity,” he said.

EEG, short for electroencephalography, is a noninvasive method of monitoring the brain’s electrical activity. Using electrodes placed along the scalp, it measures voltage fluctuations caused by ionic currents within the neurons of the brain. While the resulting signal contains far less information than an fMRI (functional magnetic resonance imaging) scan, the technology is much more portable, since it doesn’t require the large, expensive equipment usually found only in clinics. That makes it more practical for use in real-world brain-computer interfaces.
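EEG analysis typically works in the frequency domain, looking for rhythms like the roughly 10 Hz alpha wave. As a purely hypothetical illustration (none of this is Neurable's actual code, and all parameters are invented), the sketch below simulates a noisy single-channel EEG trace containing a 10 Hz alpha rhythm and estimates its amplitude from a single DFT bin, the kind of band-power feature a brain-computer interface might extract.

```python
import math
import random

FS = 250          # sample rate in Hz, typical for consumer EEG
SECONDS = 2
N = FS * SECONDS

random.seed(0)
# Simulated scalp voltage: a 10 Hz "alpha" sine wave buried in heavy noise.
signal = [math.sin(2 * math.pi * 10 * t / FS) + random.gauss(0, 2.0)
          for t in range(N)]

def band_amplitude(x, freq_hz, fs):
    """Amplitude of the single DFT component nearest freq_hz."""
    n = len(x)
    k = round(freq_hz * n / fs)  # DFT bin index for this frequency
    re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
    im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
    return 2 * math.hypot(re, im) / n

alpha = band_amplitude(signal, 10, FS)  # should recover roughly 1.0
other = band_amplitude(signal, 37, FS)  # off-band bin: mostly noise
print(f"alpha-band amplitude ~ {alpha:.2f}, off-band ~ {other:.2f}")
```

Even with noise twice the size of the signal, averaging over many samples lets the in-band amplitude stand out clearly from an off-band bin, which is the basic reason EEG features are usable at all.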

Using cutting-edge A.I. algorithms to probe the EEG signals, Alcaide was convinced that the performance metrics were “significant enough” that a commercial product was possible. This could help people who lack the use of their limbs easily control the machines around them. Its appeal could extend beyond that group, too. The question was what form the final brain-reading product would take.
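Neurable hasn't published its decoding algorithms, but the general shape of such a pipeline is well known: calibrate on labeled brain-signal features, then classify new feature vectors into intents. A minimal, purely illustrative stand-in, using a nearest-centroid classifier on made-up two-dimensional features:

```python
import math

# Hypothetical calibration data: feature vectors (e.g., band powers)
# labeled with the intent the user was practicing at the time.
calibration = {
    "select":  [[0.9, 0.2], [1.1, 0.3], [1.0, 0.1]],
    "release": [[0.1, 0.8], [0.2, 1.0], [0.3, 0.9]],
}

# "Training" here is just averaging each class's features into a centroid.
centroids = {
    label: [sum(col) / len(col) for col in zip(*vectors)]
    for label, vectors in calibration.items()
}

def decode(features):
    """Return the intent whose centroid is nearest to the new features."""
    return min(
        centroids,
        key=lambda label: math.dist(features, centroids[label]),
    )

print(decode([1.05, 0.25]))  # prints "select"
```

Real EEG decoders are far more sophisticated, but the calibrate-then-classify loop is also why the DK1's two-minute calibration time, discussed below, mattered so much.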

The perfect form factor

Neurable’s first attempt at such a device was the DK1, an EEG-reading cap and software development kit. The company showed it off with demonstrations such as having users “think” a toy race car around a track or fly drones with their minds. The DK1 used six dry electrodes (look, Ma, no gel!) and took just two minutes to calibrate. That was a major step up from its previous proof of concept, which required 32 wet electrodes and a calibration time of 30 minutes. Still, Alcaide knew that this was not the Platonic ideal of what he was trying to create.


That ideal is what Neurable is now working to bring to market. While it hasn’t yet been shown publicly, Alcaide says it will be a set of headphones with the company’s electrode technology built in. Instead of looking like, frankly, the mad scientist rig of earlier EEG devices, it will resemble something all of us are already used to seeing on a daily basis. The headphones will appear entirely unremarkable, and that’s exactly the point.

“Because they’re headphones, you can just put them on and nobody will know that you’re wearing a brain-computer interface,” Alcaide said. “It just looks like an average pair of headphones, but [will give users] the ability to do hands-free control. It can also track your cognitive states throughout the day.”


These two main use cases, one active and one passive, are what excite Alcaide. So far, Neurable has shown that its technology works. But, much as with the earliest personal computers, it hasn’t yet been possible to do much more than show that it works. An everyday wearable headset could change that.

The “hearables” market, which covers ear-worn devices with smart features like A.I. assistants or health tracking, is one of the fastest-growing in tech. A recent report by International Data Corporation (IDC) suggested that smart earwear shipments will hit 273.7 million per year by 2023. Today, at 139.4 million, they are around half that. If Neurable is the first to introduce brain-reading tech into headphones, it would represent a unique market opportunity. If anything’s going to make Apple’s AirPods Pro look like yesterday’s news, this could well be it.


With plans for Neurable to unveil the product sometime in 2020, Alcaide was not yet able to say exactly what these headphones will be able to do. “On a high level, it’s going to be what you standardly want to do with a pair of headphones,” he said. “I don’t know if I’m allowed to go into deeper detail than that, but it’s pretty self-evident [what that might be].”

In some ways, that is. It doesn’t take a Ph.D. in neuroscience to figure out that a brain-computer interface built into a pair of headphones could be great for letting you start, stop, or skip tracks without having to use your hands or speak commands out loud. But it could go far deeper than that. For instance, the cognitive state-tracking that Alcaide referred to could be used to cue up the perfect soundtrack for the moment, depending on how you are feeling.

It could go even further down the rabbit hole by playing you the right song not just to match your mood, but to bring you to a desired emotional state. Rather than just triggering a certain playlist to match your mood, a person could theoretically enter their desired emotion and then have a customized playlist generated to provoke that response.

The next big interface

Alcaide doesn’t think his team is just building a cool gimmick. He believes that brain-computer interfaces represent the next big computing paradigm shift: the logical evolution in a string of popular mass-market technologies that started with the personal computer and then moved on to devices like the smartphone.

“Computing is becoming more and more mobile,” he said. “It’s also [going to become] more spatial. As it continues to go down that path, we need forms of interaction that enable us to more seamlessly interact with our technology.”


New ways of interacting with technology aren’t just about speed. The mouse, for instance, wasn’t just a faster keyboard. It changed the way we interacted with computers. Before the mouse, there was no way to carry over to a computer the universal shorthand of pointing to indicate an area of interest. As Steve Jobs told Playboy magazine (of all places) in the 1980s, “If I want to tell you that there is a spot on your shirt, I’m not going to do it linguistically: ‘There’s a spot on your shirt 14 centimeters down from the collar and three centimeters to the left of your button.’”

Instead, we do it by taking our finger (or our mouse cursor) and indicating the precise spot. The mouse empowered average users at a time when using a computer was scary. It ushered in the world of the graphical user interface, transforming computer operating systems into a recognizable, real world-inspired environment the user could move through.

A quarter century later, multi-touch did the same thing for mobile device touchscreens. Touching a device to navigate it removed the middleman of the mouse, making interactions even more intuitive. It also opened up more possibilities for real-world gestures such as “pinch to zoom,” while the adoption of sensors such as the accelerometer gave your smartphone a sense of its own orientation.

Alcaide is confident that brain-computer interfaces will change how we interact with machines to a similar degree. If augmented reality becomes as big as many believe it will be, we’ll need a way to interact with it. Interfacing on the move with AR glasses can’t easily be done with a mouse or by touching a mobile display. Nor is A.I. assistant-style voice interaction an obvious solution in every scenario.

But a computer system which knows what you’re thinking very well could be.

Do the rewards outweigh the risks?

Alcaide said that brain-reading technology will help create richer interactions. Take messaging, for instance. “Right now, the way we communicate is so limited,” he said. “If I was to send you a message via text, versus speaking to you in real life, we could interpret those messages differently depending on the media. Brain-computer interfaces allow us to add further context to what we are sending to one another. Imagine that you were able to send a text message and the color bubble can tell you how the person intends to say something: whether they’re angry or upset or being sarcastic.”

Or what if the technology, constantly monitoring your background cognitive state, knew when to present you with certain information? In a world in which users are bombarded with constant notifications, media, ads, and other distractions, Alcaide thinks such technology could prevent data overload. “Your brain is able to determine which information is not important,” he said.


Right now, we’re still at the start of this particular journey. Neurable is far from the only group working on brain-computer interfaces. Other interested parties range from cutting-edge research labs at top universities to (who else?) Elon Musk’s brain-computer interface company, Neuralink. But Neurable’s founder and employees are convinced that they are onto a good thing.

Should we be worried about the growing number of tech companies looking to peer into our brains to power next-gen computer systems? After all, it’s hard to look at some of the more egregious abuses of sensitive user data in recent years and think that Silicon Valley is entirely a force to be trusted. Things are bad enough when algorithms simply comb through our data to predict what we’re interested in. Would it get worse if they could actually read our brains?

“[People certainly] have the right to be concerned,” Alcaide said. However, he also insists that Neurable is working hard to protect users’ brain data. He additionally thinks part of his job at Neurable is to educate the public about what exactly he’s doing — and what brain-computer interfaces will mean for us.

“We’re not reading thoughts,” he said. “We are reading high-level brain electrical changes, and then interpreting those to make assumptions as to what are a person’s thoughts.” Besides, he notes, it’s very easy for users to protect themselves against brain-reading tech such as this. “You just don’t put [the headsets] on, right?”

If Neurable has done its job properly, Alcaide hopes that the rewards will far, far outweigh the perceived risks. The next few years will shed some light on whether he’s correct.

Luke Dormehl
Former Digital Trends Contributor