
From drones to bionic arms, here are 8 examples of amazing mind-reading tech

Mind-reading tech is here to help, not put you away for thoughtcrime

Elon Musk is a firm believer that brain-computer interfaces will be a big part of how we interact with computers in the future. But make no mistake: Mind-reading machines are here already. As science fiction writer William Gibson has noted, “The future is already here — it’s just not evenly distributed.”

Without further ado, then, here are eight examples of amazing mind-reading tech being explored in some of the world’s most exciting research labs.


Mind-reading hearing aids


Hearing aids are amazing inventions, but they run into problems in certain scenarios, such as crowded rooms where multiple people are speaking at the same time. One possible solution? Add in a dose of mind-reading.

That’s the broad idea behind a so-called “cognitive hearing aid” developed by researchers at the Columbia University School of Engineering and Applied Science. The device is designed to read brain activity to determine which voice a hearing aid user is most interested in listening to, and then focus in on it. It’s still in the R&D phase, but it could be a game-changer for millions of deaf or hard-of-hearing people around the world.

“Working at the intersection of brain science and engineering, I saw a unique opportunity to combine the latest advances from both fields, to create a solution for decoding the attention of a listener to a specific speaker in a crowded scene which can be used to amplify that speaker relative to others,” Nima Mesgarani, an associate professor of electrical engineering, told Digital Trends.
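The actual decoder works on neural recordings, but the core idea of matching an envelope reconstructed from brain activity against each candidate speaker can be sketched with toy data. Everything below (the synthetic envelopes, the noise level, the simple correlation-based matching) is an illustrative assumption, not Columbia's pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

def envelope_correlation(decoded, candidate):
    """Pearson correlation between the envelope decoded from brain
    activity and a candidate speaker's speech envelope."""
    return float(np.corrcoef(decoded, candidate)[0, 1])

def select_attended_speaker(decoded_envelope, speaker_envelopes):
    """Return the index of the speaker whose envelope best matches
    the envelope reconstructed from the listener's brain activity."""
    scores = [envelope_correlation(decoded_envelope, env)
              for env in speaker_envelopes]
    return int(np.argmax(scores))

# Toy data: two speakers; the "decoded" envelope is a noisy copy of
# speaker 0's, simulating a listener attending to that voice.
speaker_a = rng.random(500)
speaker_b = rng.random(500)
decoded = speaker_a + 0.3 * rng.standard_normal(500)

attended = select_attended_speaker(decoded, [speaker_a, speaker_b])
print(attended)  # index of the attended speaker
```

A real cognitive hearing aid would then boost that speaker's audio stream relative to the others.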

Future interrogation techniques


Want an idea of what future interrogation scenarios might look like? Researchers at Japan’s Ochanomizu University have developed artificial intelligence that’s capable of analyzing a person’s fMRI brain scans and providing a written description of what they have been looking at. Accurate descriptions can extend to the complexity of “a dog is sitting on the floor in front of an open door” or “a group of people standing on the beach.”

Ichiro Kobayashi, one of the researchers on the project, said that there are no plans to use it as the basis for a supercharged lie detector… just yet, at least. “So far, there are not any real-world applications for this,” he told Digital Trends. “However, in the future, this technology might be a quantitative basis of a brain-machine interface.”

Another project from neuroscientists at Canada’s University of Toronto Scarborough was able to recreate the faces of people that participants had previously seen.

Next-gen bionic prostheses


Bionic prostheses have made enormous strides in recent years — and the concept of a mind-controlled robot limb is now very much a reality. In one example, engineers at Johns Hopkins built a working prototype of just such a robot arm, allowing users to wiggle each prosthetic finger independently using nothing but the power of the mind.

Perhaps even more impressively, earlier this year a team of researchers from Italy, Switzerland, and Germany developed a robot prosthesis which can actually feed sensory information back to a user’s brain — essentially restoring the person’s sense of touch in the process.

“We ‘translate’ information recorded by the artificial sensors in the [prosthesis’] hand into stimuli delivered to the nerves,” Silvestro Micera, a professor of Translational Neuroengineering at the Ecole Polytechnique Fédérale de Lausanne School of Engineering, told Digital Trends. “The information is then understood by the brain, which makes the patient [feel] pressure at different fingers.”
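The team hasn't published its exact mapping here, but the general principle of translating a sensor reading into a stimulation level can be illustrated with a deliberately simple, entirely hypothetical linear mapping. All units, ranges, and the function itself are made up for the sketch:

```python
def pressure_to_stimulus(pressure_n, max_pressure_n=10.0,
                         min_ua=10.0, max_ua=100.0):
    """Hypothetical linear map from a fingertip sensor reading
    (newtons) to a nerve stimulation amplitude (microamps),
    clamped to a safe range."""
    frac = min(max(pressure_n / max_pressure_n, 0.0), 1.0)
    return min_ua + frac * (max_ua - min_ua)

print(pressure_to_stimulus(0.0))   # lightest touch -> minimum amplitude
print(pressure_to_stimulus(5.0))   # mid-range pressure
print(pressure_to_stimulus(20.0))  # excessive pressure, clamped
```

Real systems encode touch far more richly (timing, location, texture), but the translate-then-stimulate loop is the essential shape.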

Early warnings for epilepsy seizures


For people with epilepsy, seizures can appear to come out of nowhere. Unchecked, they can be extremely dangerous, as well as traumatic for both the sufferer and those around them. But mind-reading tech could help.

Researchers at the University of Melbourne and IBM Research Australia have developed a deep learning algorithm which analyzes the electrical activity of patients’ brains and greatly improves seizure prediction.

“Our hope is that this could inform the development of a wearable seizure warning system that is specific to an individual patient, and could alert them via text message or even a Fitbit-style feedback loop,” Stefan Harrer, an IBM Research Australia staff member who worked on the recent study, told Digital Trends. “It could also one day be integrated with other systems to prevent or treat seizures at the point of alert.”
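The Melbourne/IBM system relies on a trained deep learning model; as a stand-in, here is a deliberately simple anomaly detector on synthetic EEG that captures the same idea of flagging unusual electrical activity before it escalates. The sampling rate, threshold, and signal below are all illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(1)
FS = 256  # assumed sampling rate in Hz

def window_power(eeg, fs=FS, window_s=1.0):
    """Mean signal power in consecutive one-second windows."""
    n = int(fs * window_s)
    usable = len(eeg) - len(eeg) % n
    windows = eeg[:usable].reshape(-1, n)
    return (windows ** 2).mean(axis=1)

def warn_on_anomaly(powers, baseline_windows=10, factor=3.0):
    """Flag the first window whose power exceeds `factor` times the
    baseline average: a crude stand-in for the learned predictor."""
    baseline = powers[:baseline_windows].mean()
    for i, p in enumerate(powers[baseline_windows:], start=baseline_windows):
        if p > factor * baseline:
            return i  # index of the first suspicious window
    return None

# Toy recording: quiet background with a burst of high-amplitude activity.
eeg = 0.1 * rng.standard_normal(30 * FS)
eeg[20 * FS:22 * FS] += 2.0 * np.sin(2 * np.pi * 5 * np.arange(2 * FS) / FS)

alert = warn_on_anomaly(window_power(eeg))
print(alert)  # first window of the simulated burst
```

In a wearable, an alert like this would trigger the text message or feedback loop Harrer describes.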

Treating impulsive behavior


In similar work, researchers from the Stanford University School of Medicine have developed mind-reading tech that could be used to moderate dangerously impulsive behavior.

Their system watches for a characteristic electrical activity pattern in the brain which occurs prior to impulsive actions, and then applies a quick jolt of targeted electricity. (No, it’s not as painful as that makes it sound!)

“This is the first example in a translatable setting that we could use a brain machine interface to sense a vulnerable moment in time and intervene with a therapeutic delivery of electrical stimulation,” Dr. Casey Halpern, assistant professor of neurosurgery, told Digital Trends. “This may be transformative for severely disabling impulse control disorders.”
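The Stanford system's trigger is a learned neural signature, but the closed-loop shape of it (monitor, detect, stimulate) can be sketched with a crude amplitude threshold standing in for the real detector. Every threshold and signal below is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

def detect_signature(window, threshold=1.5):
    """Crude stand-in for the pre-impulse electrical signature:
    trigger when the window's RMS amplitude crosses a threshold."""
    rms = np.sqrt(np.mean(window ** 2))
    return rms > threshold

def closed_loop(signal, window_len=100):
    """Scan the recording; return the windows where a (simulated)
    therapeutic stimulation pulse would be delivered."""
    pulses = []
    for start in range(0, len(signal) - window_len + 1, window_len):
        if detect_signature(signal[start:start + window_len]):
            pulses.append(start // window_len)
    return pulses

# Toy data: calm signal with one high-amplitude episode in window 3.
signal = 0.2 * rng.standard_normal(1000)
signal[300:400] += 3.0

print(closed_loop(signal))  # windows that would receive stimulation
```

The point is the loop itself: sensing a vulnerable moment and intervening in real time, rather than stimulating continuously.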

Controlling virtual reality


Imagine if it were possible to navigate through a virtual reality world without having to worry about any handheld controller. That’s the idea behind a project by tech company Neurable and VR graphics company Estudiofuture. They’re busy developing the technology that will make brain-controlled virtual reality a… well, real reality.

Neurable’s custom headset monitors users’ brain activity using head-mounted electrodes to determine their intent. While there are limitations (it’s not ideal for typing or navigating menus), it could nonetheless be invaluable for making fields like VR gaming even more immersive than they already are.

Mind-reading drones


When we control a vehicle, what matters is how quickly our perception of an obstacle can be turned into action at the controls. In other words: we see something; we process it; our brain tells our hands to turn the wheel. Wouldn’t it be a whole lot easier if we just cut out the middleman?

That’s the concept behind neural interfaces which make it possible to steer drones (or even swarms of drones) using nothing more than our thoughts. Back in 2016, the University of Florida made headlines when it organized the world’s first ever brain-controlled drone race. Participants donned electroencephalogram headsets powered by brain-computer interface (BCI) technology, and then flew drones around a course using only their brainwaves.
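To give a minimal flavor of how an EEG headset's output might be turned into a flight command, here is a toy decoder built on a made-up rule (relative alpha-band power), not on any real BCI product's method. The sampling rate, frequency bands, and the hover/forward mapping are all assumptions for the sketch:

```python
import numpy as np

def band_power(eeg, fs, low, high):
    """Power of `eeg` in the [low, high] Hz band, via an FFT."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].sum()

def eeg_to_command(eeg, fs=128):
    """Map relative alpha-band (8-12 Hz) power to a drone command:
    strong alpha ("relaxed") -> hover, weak alpha -> forward."""
    alpha = band_power(eeg, fs, 8, 12)
    total = band_power(eeg, fs, 1, 40)
    return "hover" if alpha / total > 0.5 else "forward"

# Toy signals: a dominant 10 Hz sine vs. broadband noise.
fs = 128
t = np.arange(fs) / fs
relaxed = np.sin(2 * np.pi * 10 * t)
focused = np.random.default_rng(3).standard_normal(fs)

print(eeg_to_command(relaxed, fs), eeg_to_command(focused, fs))
```

A race-ready system would decode richer mental states and stream commands continuously, but the signal-to-command pipeline is the same in outline.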

While there’s still work to be done, this could potentially be a useful way of rethinking how future vehicles are piloted. Speaking of which…

The brainy way to drive a car


So you’ve got a new possible means of controlling a vehicle using brainwaves, but it’s not quite ready for prime time just yet. What do you test it on? Driving a car, of course — with the passengers inside. At least, that was the basis for an intriguing (if terrifying) experiment carried out by carmaker Renault late last year.

The company recruited three willing participants and gave them the opportunity to work together to mentally pilot a modified Renault Kadjar SUV. One person controlled the car’s left turns, another controlled its right turns, and the third handled its acceleration.

No, this is unlikely to make it to our roads any time soon, but it’s certainly a memorable tech demo. Even if, quite frankly, we’d rather walk to pick up our groceries!

Luke Dormehl
Former Digital Trends Contributor
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…