Meet the MIT scientist who’s growing semi-sentient cyborg houseplants

Elowan: A Plant-Robot Hybrid

There’s little telling what the future holds for technology. Flying cars, once thought to be a 20th-century certainty, have barely made it off the ground. Facebook, once a tool for rating college co-eds, has become one of democracy’s greatest threats. And who but Jeff Bezos would’ve thought online book sales could turn into $150 billion?

So, it doesn’t sound so crazy when Harpreet Sareen, a designer in the Massachusetts Institute of Technology Media Lab, says our cities might someday be lush with plant-robot hybrids. Like a modern-day Dr. Frankenstein — or, uh, Plantenstein? — his vision is to give plants a new kind of life.

Harpreet Sareen

Enter Elowan, a cybernetic plant unveiled this month by Sareen and his team. Tethered by a few wires and silver electrodes, the plant-robot hybrid moves in response to the plant’s light demands. When light shines on its leaves, the plant produces bioelectrochemical signals, which the electrodes detect and transmit to the wheeled robot below. The robot then moves toward the light.

Elowan is more than just a plant on wheels. Sareen and his colleagues claim their project is an example of part-organic, part-artificial entities that may become more common in the future. Many of the functions we find in electronics — for example, the ability to sense surroundings and display data — first existed in nature. And they’re often more efficient and resilient in the natural world, less prone to wear, tear, and environmental damage. By identifying and interpreting the way plants function, the researchers hope to turn them into biohybrids that power, monitor, and converge with their technological surroundings.

This isn’t the first plant-robot partnership we’ve encountered. Vincross CEO Sun Tianqi created a robot tasked with keeping a succulent alive by monitoring its surroundings. But Elowan might be the most interesting. It takes the partnership one step further by directly connecting the plant with the machine.

We spoke to Sareen about his project and his vision for a world of cybernetic plants. This interview has been edited and condensed for clarity.

Digital Trends: What first motivated you to build a cyborg plant?

Harpreet Sareen: I’ve been interested in two aspects of research around nature. One is how we study capabilities in nature to power our future new interaction devices. Right now, we build everything out of the artificial world. It’s a very industrial way of thinking. We design everything artificially from the ground up.

“I wanted to show what it would be like if plants could walk like a human.”

In my research I’ve found many capabilities we can use in the natural world. For example, plants actually have electrical signals inside them that are similar to artificial circuits. That inspired me to think of new capabilities. So, I wanted to show what it would be like if the plant had mobility or could walk like a human, but could be powered by the plant itself.

How are you able to translate the plant’s electrical signals into movement?

Plants respond to a lot of environmental factors. In the morning, for example, plants try to orient themselves towards the sun to the east. As the sun keeps moving throughout the day, they reorient themselves more to get maximum sunlight. So, they respond to things like light conditions, gravity changes, impurities in the soil, and insects trying to eat their leaves. When that happens, the plant internally tries to communicate with its other organs. That communication is an electrical signal. It’s actually a bioelectrochemical signal.

The mimosa plant responds to light by opening or closing its leaves. Elbert Tiao

With Elowan, I placed circuits on the plant to read those signals, and was able to read them by just touching the plant or changing its environment. I discovered that its signals were really clear when I changed its light conditions. For this robot, I have lamps set up in either direction, which I turn on and off. During the transition, the signal is produced and that signal travels to the robot to trigger the robot to move left and right.
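The loop Sareen describes — read the plant’s electrical signal, watch for a spike when a lamp switches on or off, and steer the robot accordingly — can be sketched roughly as follows. This is a hedged illustration, not Elowan’s actual firmware, which isn’t published here: the `classify_spike` helper, the threshold value, and the convention that the spike’s sign picks the direction are all assumptions.

```python
SPIKE_THRESHOLD_MV = 5.0  # hypothetical: millivolt change that counts as a light response


def classify_spike(baseline, sample, threshold=SPIKE_THRESHOLD_MV):
    """Map a change in the plant's electrical signal to a steering command.

    A sharp deviation from baseline follows a lamp switching on or off.
    The idea that the deviation's sign distinguishes left from right is an
    assumption made for this sketch. Returns 'left', 'right', or None.
    """
    delta = sample - baseline
    if abs(delta) < threshold:
        return None
    return "left" if delta > 0 else "right"


def control_loop(samples):
    """Replay a sequence of (simulated) electrode readings into commands."""
    commands = []
    baseline = samples[0]
    for sample in samples[1:]:
        cmd = classify_spike(baseline, sample)
        if cmd:
            commands.append(cmd)
            baseline = sample  # re-baseline after responding to a spike
    return commands
```

Feeding the loop a simulated trace with one upward and one downward spike, e.g. `control_loop([0.0, 0.5, 12.0, 12.5, 4.0])`, yields `["left", "right"]` — small fluctuations are ignored, and only the lamp transitions move the robot.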

Your idea then is to use the plant’s built-in physiology as a sort of natural circuit system. And you want to replace artificial circuits with natural ones.

On a broader level, that’s what I’ve tried to communicate here. But, as an interaction designer, I’m focused on how interactions [between humans and machines] work right now.

“Plants might be the best kind of electronics we already have in the environment, things that we can only strive to create artificially.”

Two things happen when we use digital devices — sensing and displaying. When we sit in front of a computer, the computer is almost trying to sense what we want to do, and it provides an output based on that. Then there’s display, which comes out as the interfaces we see in the digital world. We create these artificial electronic devices to sense and display, but plants already have such capabilities.

Plants are self-powered, self-regenerating, and self-fabricating. They move and change color. Leaves open and close and grow. They can serve as inspiration for our electronics. Plants might be the best kind of electronics we already have in the environment, things that we can only strive to create artificially. Since we haven’t been able to recreate them, why not just align the designs with nature? I think the future of interaction design will put interfaces within nature itself.

What are some of the explicit advantages you see from having a hybrid device rather than exclusively synthetic ones?

Well, this process of hybridizing with nature would be a paradigm shift in how we think about designing future devices. For example, plants continuously absorb water, like little motors in the environment. Plants open and close, acting like a display. If we look at those capabilities, we can start to use some of them and morph them with electronic functionalities, so we don’t really have to design things from the ground up.


The second benefit is that in the era of IoT and smart environments, we tend to put sensors everywhere, but it won’t be possible to build everything efficiently at the scale we’re imagining for the future. And if we design everything artificially, we might put things into the environment that also destroy the environment, because they’re all made of silicon or metal. So how do we scale up? Plants might help us answer that question.

The way I see it, if we align ourselves with these natural capabilities, we can try to be convergent with nature. I call this convergent design. Right now, our environmental initiatives are always on the back foot. We say, “Okay, now that we’ve destroyed this part of the environment, how do we fix it now?” By hybridizing with nature and making cyborgs we won’t be passive in our efforts. We can be active and align our technological development with nature itself.

What sort of devices and infrastructure designs do you envision for this hybridized future?

My current project is called Cyborg Botany. Right now, we use plants mainly as food crops, but in some Asian cultures plants are also used as bridges: grown from one side of a river to the other, they become self-growing infrastructure. That’s one application where you can think of creating architecture out of a tree. Or think of the natural motor I mentioned. Plants could become monitoring platforms where they could monitor water quality, toxicity, or pollution, and then we don’t have to deploy artificial sensors.

“Plants could become monitoring platforms where they could monitor water quality, toxicity, or pollution, and then we don’t have to deploy artificial sensors.”

Other applications could connect to the digital world. I’m currently working on a plant that can be controlled with software, so you click on the leaves of the plant and the leaves close. That becomes a sort of bidirectional communication between plant and the computer. The leaf acts like a display.
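The software side of that bidirectional link — click a leaf on screen, and the physical leaf closes — might look something like the sketch below. Everything here is hypothetical: the `LEAF_CHANNELS` wiring map, the `stimulate` placeholder, and the pulse duration are all assumptions standing in for hardware details the article doesn’t describe.

```python
# Hypothetical sketch of a "plant as display": each on-screen leaf maps to
# a stimulation electrode channel, and clicking it sends a brief pulse that
# folds the corresponding real leaf (as in a mimosa's touch response).

LEAF_CHANNELS = {"leaf_1": 0, "leaf_2": 1, "leaf_3": 2}  # assumed wiring


def stimulate(channel, pulse_ms=50):
    """Placeholder for driving one stimulation electrode (hardware-specific)."""
    return f"pulse {pulse_ms}ms on channel {channel}"


def on_leaf_click(leaf_id):
    """UI callback: close the physical leaf the user clicked on screen."""
    if leaf_id not in LEAF_CHANNELS:
        raise KeyError(f"unknown leaf: {leaf_id}")
    return stimulate(LEAF_CHANNELS[leaf_id])
```

Calling `on_leaf_click("leaf_2")` would route the click to channel 1 and fire the pulse; in a real system, `stimulate` would talk to the electrode driver rather than return a string.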

As living organisms, plants have their own self-interest and don’t always follow the rules we set out for them. For example, tree roots grow through concrete or bushes grow in gutters. So, they might actually be more difficult to take care of than man-made devices. What sort of challenges do you face with cyborg plants that wouldn’t with synthetic devices?

There are two principles I have in this project that can make things difficult. One is that the plant shouldn’t be harmed and the other is that the environment shouldn’t be harmed. For example, if I’m growing something inside the plant or if I’m doing something in the environment, it shouldn’t hurt an animal that might come around and eat it.

It can also be challenging to study capabilities and interpret what they mean. When I listen to the plant’s electrical signals, I need to be able to tell that one signal happened because the light switched on and another signal happened because I put something in the soil. Based on those discrete interpretations, I’m able to really study a plant system and figure out if this is the right kind of system to use for my application.

You obviously value plants. I’m curious if you think plants have agency and if they can feel pleasure and pain?

It is very important to mention that plants do not have the nerves that humans do. Plants do not have emotions, but they do have evolutionary signals. They are systems on some level. I try to make interpretations of those evolutionary signals, but they are not emotional signals. They are just responses to the environment. But at the end of the day they are still living systems. Through Elowan, I amplify the thing the plant already wants to do.

Dyllan Furness
Dyllan Furness is a freelance writer from Florida. He covers strange science and emerging tech for Digital Trends, focusing…