Jas Brooks, a long-haired engineer who looks as if they might moonlight as the roadie for a hair metal band, sat blindfolded in a room with electrodes up their nose and let people over the internet crank up the smells.
“It definitely looks … horrifying,” they told Digital Trends, likening the experimental setup to the Milgram experiments, a controversial series of 1960s studies conducted by a Yale psychologist, in which people were tested on their willingness to administer electric shocks to other participants.
In Stanley Milgram’s experiments, however, the participants weren’t really shocking anyone. Unbeknownst to them, they were being tested on whether they were willing to obey an authority figure in doing something they might themselves deem unconscionable. In Brooks’ setup, by contrast, the electrical currents were real. They just happened to register as warm, wasabi-like sensations or sharp whiffs of vinegar fumes rather than as shocks.
“It’s not painful for me,” Brooks said. “I was just sitting there being like, ‘Oh yeah, I feel this. This is what I’m perceiving right now.’ The basic setup was that I had this blindfold on and there was this screen [I shared] with instructions. It was this interface I had designed with [a picture of] my nose and a right and left button. They could click on it virtually to test the sensor.”
Brooks, a Ph.D. student in the University of Chicago’s Human-Computer Integration Lab, part of the computer science department, is focused on the shape of tech to come. And, at least based on this recent experiment, one shape that tech could take is a pair of electrodes, held in place by tiny magnets, inserted up the wearer’s nose.
To picture it, imagine some kind of high-tech anti-snoring device, or the sort of cyborgic data-gathering accessory Twitter’s Jack Dorsey might swap his nose ring for at Burning Man. The tiny, wireless, battery-powered wearable detects when wearers inhale and then uses its electrodes to stimulate the septum, the bit of cartilage that separates the nostrils.
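The sense-then-stimulate loop the article describes — fire the electrodes only when the wearer breathes in — can be sketched in a few lines. This is a hypothetical illustration, not the team’s actual firmware; the airflow threshold and sensor readings below are invented for the example.

```python
# Hypothetical sketch of the wearable's inhale-gated control loop.
# The threshold and the airflow units are invented for illustration;
# the real device's firmware is not public.

INHALE_THRESHOLD = 0.3  # arbitrary units; inhalation reads positive

def stimulation_events(airflow_samples):
    """Return the sample indices at which a stimulation pulse would fire.

    A pulse triggers on the rising edge of an inhale (the airflow
    crossing the threshold upward), so the wearer feels the "smell"
    as they sniff rather than continuously.
    """
    events = []
    inhaling = False
    for i, flow in enumerate(airflow_samples):
        if not inhaling and flow >= INHALE_THRESHOLD:
            inhaling = True
            events.append(i)      # fire the electrodes once per breath
        elif inhaling and flow < INHALE_THRESHOLD:
            inhaling = False      # reset; wait for the next inhale
    return events

# Two inhales in this trace -> two stimulation events
print(stimulation_events([0.0, 0.1, 0.5, 0.6, 0.1, 0.0, 0.4, 0.2]))
# [2, 6]
```

Gating on the rising edge, rather than on raw airflow level, is what keeps a long slow breath from registering as repeated pulses.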
Digital Trends previously covered the Human-Computer Integration Lab’s work when researchers there (Brooks included) developed a technique for replicating temperature in virtual reality by pumping odorless chemicals laced with trace amounts of capsaicin and menthol to simulate feelings of hot and cold. This was done using a low-power attachment affixed to a VR display. This time, however, the device the team has come up with involves no actual chemical stimulation at all. The wearer isn’t actually smelling an external smell; they’re simply having one of the nerve clusters associated with smell tickled in a way that makes them think they are.
“Most people might know that we perceive smell using our olfactory bulb, but really smell is a multimodal sensation,” said Brooks. “We have two systems that [contribute] to our smell perception. We have that olfactory bulb, and then we have the nerve endings in our nose that are perceiving things like the sharpness of vinegar, which is a very clear sensation, mediated by this nerve, as well as things like the refreshing aspect of mint.”
The Bluetooth nose wearable buzzes this latter, trigeminal nerve region in order to pull off its trick. Stimulating this easier-to-reach nerve cluster (easier, that is, than the olfactory bulb, which sits deep in the skull above the nasal cavity) adds certain sensations, which the brain then mashes together with the olfactory bulb data to conjure up what we perceive as smell.
The work carried out by Brooks and the rest of the team is cutting-edge. But it’s not the first time the world has entertained the notion of smell tech. On April 1, 2013, Google announced its Google Nose project, a new initiative that would, it said, expand the search space into the olfactory realm. A video produced by Google showed product manager Jon Wooley explaining how smell is a crucial part of the way we navigate the world, but one that had been cruelly overlooked by previous search methods.
The idea of Google Nose was to build upon a Google Aroma database of 15 million “scentibytes” from all over the world to allow users to “search for smells.” By clicking a new Google Smell button while using a laptop, desktop or mobile device, a user could, for instance, hold up their phone to a flower and receive a positive identification based upon its scent. “By intersecting photons with infrasound waves, Google Nose Beta temporarily aligns molecules to emulate a particular scent,” the video explained.
It was, unfortunately, an April Fools’ prank rather than a real product. The joke was in good fun, but it is also indicative of how smell tech has often been treated in recent history. Nobody disputes that the olfactory sense is powerful (there’s a reason real estate agents recommend baking fresh bread before a house viewing), but scent is a difficult sense to harness in the way that we can, for example, create bubbles of personalized sound with earbuds or control what the eye sees using a changing video display.
Efforts to do so have routinely been met with ridicule. Smell-O-Vision, for example, is often laughed off as the nadir of mid-20th-century movie theater gimmickry, deployed at a time when cinema was losing ground to television. The first Smell-O-Vision movie, 1960’s Scent of Mystery, pumped automated scents to theater seats through plastic tubing. The 30 different smells, ranging from perfume to shoe polish to wine, were designed to correspond to what was happening on screen.
An ad for the movie read: “First they moved (1895)! Then they talked (1927)! Now they smell (1960)!” As a gimmick, it stunk.
Olfactory control is far more feasible with this latest work from the Human-Computer Integration Lab. One of the device’s unusual features, for instance, is that it can activate each electrode independently, making it possible to smell in either stereo or mono. That is why Brooks’ virtual control panel, described earlier, had separate buttons for left and right. Stereo sniffing is remarkable because it isn’t part of how we typically sense aromas in the real world.
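That independent left/right control can be modeled very simply. The sketch below is hypothetical — the class name, channel names, and 0–100 intensity scale are invented for illustration, not taken from the team’s software — but it captures the mono/stereo distinction the article describes.

```python
# Hypothetical model of the two-channel (left/right septum electrode)
# control described in the article. The 0-100 intensity scale and all
# names here are invented for illustration.

class NoseInterface:
    def __init__(self):
        self.intensity = {"left": 0, "right": 0}

    def set_channel(self, side, level):
        """Drive one electrode, clamping the level to the 0-100 range."""
        if side not in self.intensity:
            raise ValueError("side must be 'left' or 'right'")
        self.intensity[side] = max(0, min(100, level))

    def mode(self):
        """'stereo' when the electrodes are driven differently,
        'mono' when they match (including both off)."""
        if self.intensity["left"] != self.intensity["right"]:
            return "stereo"
        return "mono"

nose = NoseInterface()
nose.set_channel("left", 60)   # a remote user clicks the left button
print(nose.mode())             # stereo
nose.set_channel("right", 60)  # right channel now matches
print(nose.mode())             # mono
```

In Brooks’ actual demo, clicking the left or right button on the shared screen played the same role as `set_channel` here: each side of the septum could be stimulated on its own.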
Don’t expect the nasal wearable to replicate more complex scents, though. Simulating a wider range of aromas might be possible, Brooks said, but not solely through stimulation of the trigeminal nerve. The olfactory bulb has a much wider palette of sensations. The trigeminal nerve is more like the tongue, which can detect just five tastes: sweet, sour, salty, bitter, and umami. (Much of the subtlety of what we call taste is actually smell.) Similarly, trigeminal stimulation can produce broad sensations that we recognize as smell, but without the finer notes. In other words, while you can replicate the tingling of vinegar fumes, you can’t do the same for the smell of fresh rain.
Stimulating the olfactory bulb involves a lengthy nasal swab, overseen by a doctor, that would make a COVID test look like blowing your nose. Brooks noted that the optimal way to achieve olfactory bulb stimulation would be by way of a tiny medical implant, although this is unlikely to be something most of us would entertain. There’s also the challenge of replicating smells on the level of code. “We don’t know what the parameters would be to actually encode a smell digitally or electrically so that it could be decoded by the bulb correctly afterwards,” they said.
As far as use cases go, the most obvious is making virtual reality more immersive. No matter how good the graphics might be, even if we master infinite walking in virtual reality or the haptic tech to feel textures and objects in the virtual world, a VR pine forest is, for many, always going to seem lacking if it doesn’t smell of pine trees.
But Brooks doesn’t view this purely as a gaming accessory. “We already have phenomenal smell experiences, maybe that we don’t pay too much attention to, in real life that are just super rich,” they said. “You’re walking down the street and an odor just hits you. In Chicago, there’s a pretty famous chocolate factory, and you just get clouds of this odor in the city. What I’m imagining this could lead to is purely olfactory augmented reality … really transforming how we interact with everyday odors instead of trying to produce a new set of odor experiences from scratch.”
This work, which is still in the future for the team, could focus on making the smell experience smarter. Where is a particular odor coming from? Can you dial up one odor that you liked and dial down another that you didn’t? How about odor notifications: Who wouldn’t want the sharp burn of wasabi in their nostrils whenever their boss messages them on Slack? Or, more seriously, could you be made to smell an odorless but deadly gas like carbon monoxide? Carbon monoxide detectors already do that job without requiring users to stick electrodes up their nose, but such a tool could conceivably be useful in certain scenarios, such as those faced by rescue workers.
“One of the things that we’re thinking about is, can we use this as an intervention technology like hearing aids for people that have smell loss?” Brooks said, pointing out that this could become more pressing in a post-pandemic world, with smell loss proving a lingering side effect of COVID-19 for many people.
And, of course, there’s always the possibility of other types of sensory entertainment beyond VR and gaming. “Chemical senses are so intense that it’s hard imagining, like, a three-hour smell opera that’s constantly stimulating you for those three hours and not giving you breaks,” Brooks said. But the idea is certainly alluring. “I’ve been thinking about, over the last year and a half, how much I would personally enjoy a smell Walkman.”
The notion of picking a playlist of smellscapes — from the smell of tomatoes on the vine to the aroma of fabric softener — and playing each one on demand is the stuff tech dreams are made of. Far-fetched, maybe. But not impossible. “It’s definitely not out of the question,” said Brooks.
A paper describing the team’s work was recently presented at the 2021 Conference on Human Factors in Computing Systems (CHI). Along with Brooks, other investigators who worked on the project include lab head Pedro Lopes, Romain Nith, Shan-Yuan Teng, Jingxuan Wen, and Jun Nishida.