Your brain was meant for hacking, and the U.S. military and a professor of electrical engineering are among the latest parties to espouse that view. On one hand, there’s the recent news that the Defense Advanced Research Projects Agency (DARPA) is funding an initiative to observe and control the emotions of service members and veterans with mental illnesses. On the other hand, there’s research underway at the University of Washington demonstrating how subconscious emotional responses can be mined for revealing personal data.
On Tuesday, DARPA announced that it will fund two teams led by the University of California, San Francisco (UCSF) and Massachusetts General Hospital as part of the military-technology agency’s Systems-Based Neurotechnology for Emerging Therapies (SUBNETS) program. In simple terms, the two teams are working to create brain implants that can both record brain activity to better understand psychiatric and neurological diseases and deliver targeted stimulation to treat them.
Here’s a potential real-world application of this research: “Imagine if I have an addiction to alcohol and I have a craving,” says Jose Carmena, a professor at the University of California, Berkeley, who is part of the UCSF project. “We could detect that feeling and then stimulate inside the brain to stop it from happening.”
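Neither team has published its algorithms, but the closed-loop idea Carmena describes (sense a neural biomarker, then stimulate when it crosses a threshold) can be sketched in a few lines. Everything in the sketch below is hypothetical: the signal is simulated noise, and the band-power biomarker, the `CRAVING_THRESHOLD` value, and the `stimulate` call are illustrative stand-ins, not anything from the SUBNETS program.

```python
import numpy as np

FS = 250                    # sampling rate in Hz -- hypothetical
CRAVING_THRESHOLD = 1.5     # arbitrary placeholder, not a real biomarker cutoff

def band_power(window, fs, lo=13.0, hi=30.0):
    """Mean spectral power in a frequency band (here 13-30 Hz).
    Band-power features are common in BCI work, but the actual
    SUBNETS biomarkers are unknown; this is illustrative only."""
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(window)) ** 2 / len(window)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def stimulate():
    """Stand-in for commanding the implant to deliver a stimulation pulse."""
    print("stimulation pulse delivered")

def closed_loop(signal, fs=FS, window_s=1.0):
    """Slide a window over the recorded signal and stimulate whenever
    the biomarker exceeds threshold -- the basic sense-then-act loop."""
    step = int(window_s * fs)
    for start in range(0, len(signal) - step + 1, step):
        if band_power(signal[start:start + step], fs) > CRAVING_THRESHOLD:
            stimulate()

# Demo on two seconds of simulated noise standing in for neural data.
rng = np.random.default_rng(0)
closed_loop(rng.normal(size=2 * FS))
```

The point of the sketch is the loop structure, not the detector: in a real device the hard part is finding a biomarker that reliably tracks a state like craving, which is exactly what the funded research aims to work out.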
Drive about 12.5 hours up Interstate 5 and you’ll get to the University of Washington, where Howard Jay Chizeck and two graduate students are working on a brain-sensor device (officially dubbed a brain-computer interface, or BCI) that sits on a person’s head and can observe involuntary emotional responses to various images flashed during a game called “Flappy Whale.”
The device can, for instance, pick up a participant’s emotional response to seeing a company logo that flickers as they play the game. “I could flash pictures of [gay and straight] couples and see which ones you react to,” Chizeck says. “And going through a logic tree, I could extract your sexual orientation. I could show political candidates and begin to understand your political orientation, and then sell that to pollsters.”
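Chizeck describes only a “logic tree” over involuntary responses, so any concrete code is speculation. Still, the core statistical move (compare a person’s average involuntary response across two categories of flashed images) can be sketched. In the toy example below, the `infer_preference` function, the response amplitudes, the category labels, and the decision margin are all invented for illustration and are not Chizeck’s method.

```python
import numpy as np

def infer_preference(responses, labels, margin=0.5):
    """Given per-stimulus involuntary response amplitudes (e.g., an
    evoked-potential feature) and the category each flashed image
    belonged to, guess which category the viewer reacted to more.
    A toy stand-in for the 'logic tree' Chizeck describes."""
    labels = np.asarray(labels)
    responses = np.asarray(responses, dtype=float)
    means = {c: responses[labels == c].mean() for c in np.unique(labels)}
    (low_cat, low_m), (high_cat, high_m) = sorted(means.items(),
                                                  key=lambda kv: kv[1])
    # Only commit to an answer if the gap is large enough to matter.
    return high_cat if high_m - low_m > margin else "inconclusive"

# Simulated session: larger responses to category "A" stimuli.
rng = np.random.default_rng(1)
labels = rng.choice(["A", "B"], size=40)
responses = np.where(labels == "A", 2.0, 1.0) + rng.normal(0, 0.3, 40)
print(infer_preference(responses, labels))   # -> "A"
```

Even this crude averaging hints at why the researchers are worried: the signal is involuntary, so the viewer cannot simply choose not to answer.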
The goal of this research, according to Chizeck, is to anticipate how this kind of technology could be abused and to build privacy and security measures into it before it becomes widespread.
We’ve progressed far beyond throwing virtual trucks with our minds, and the implications are both promising and alarming.
[Image courtesy of Alex Mit/Shutterstock]