There’s little telling what the future holds for technology. Flying cars, once thought to be a 20th-century certainty, have barely made it off the ground. Facebook, once a tool for rating college co-eds, has become one of democracy’s greatest threats. And who but Jeff Bezos would’ve thought online book sales could turn into $150 billion?
So, it doesn’t sound so crazy when Harpreet Sareen, a designer at the Massachusetts Institute of Technology Media Lab, says our cities might someday be lush with plant-robot hybrids. Like a modern-day Dr. Frankenstein — or, uh, Plantenstein? — his vision is to give plants a new kind of life.
Enter Elowan, a cybernetic plant unveiled this month by Sareen and his team. Tethered by a few wires and silver electrodes, the plant-robot hybrid moves in response to the plant’s light demands. When light shines on its leaves, the plant emits bioelectrochemical signals, which the electrodes detect and transmit to the wheeled robot below. The robot then moves towards the light.
Elowan is more than just a plant on wheels. Sareen and his colleagues claim their project is an example of part-organic, part-artificial entities that may become more common in the future. Many of the functions we find in electronics — for example, the ability to sense surroundings and display data — first existed in nature. And they’re often more efficient and resilient in the natural world, less prone to wear, tear, and environmental damage. By identifying and interpreting the way plants function, the researchers hope to turn them into biohybrids that power, monitor, and converge with their technological surroundings.
This isn’t the first plant-robot partnership we’ve encountered. Vincross CEO Sun Tianqi created a robot tasked with keeping a succulent alive by monitoring its surroundings. But Elowan might be the most interesting. It takes the partnership one step further by directly connecting the plant with the machine.
We spoke to Sareen about his project and his vision for a world of cybernetic plants. This interview has been edited and condensed for clarity.
Digital Trends: What first motivated you to build a cyborg plant?
Harpreet Sareen: I’ve been interested in two aspects of research around nature. One is how we study capabilities in nature to power our future new interaction devices. Right now, we build everything out of the artificial world. It’s a very industrial way of thinking. We design everything artificially from the ground up.
“I wanted to show what it would be like if plants could walk like a human.”
In my research I’ve found many capabilities we can use in the natural world. For example, plants actually have electrical signals inside them that are similar to artificial circuits. That inspired me to think of new capabilities. So, I wanted to show what it would be like if the plant had mobility or could walk like a human, but could be powered by the plant itself.
How are you able to translate the plant’s electrical signals into movement?
Plants respond to a lot of environmental factors. In the morning, for example, plants try to orient themselves towards the sun to the east. As the sun keeps moving throughout the day, they reorient themselves more to get maximum sunlight. So, they respond to things like light conditions, gravity changes, impurities in the soil, and insects trying to eat their leaves. When that happens, the plant internally tries to communicate with its other organs. That communication is an electrical signal. It’s actually a bioelectrochemical signal.
With Elowan, I placed circuits on the plant to read those signals, and was able to read them by just touching the plant or changing its environment. I discovered that its signals were really clear when I changed its light conditions. For this robot, I have lamps set up on either side, which I turn on and off. During the transition, the signal is produced, and that signal travels to the robot to trigger the robot to move left or right.
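The control loop Sareen describes — a spike in the plant’s electrical signal during a lamp transition, steering the robot toward the lit side — can be sketched roughly as follows. Everything here (function names, the voltage threshold, the robot interface) is a hypothetical illustration, not taken from the actual MIT project.

```python
# Illustrative sketch only: the threshold value, sample format, and
# robot interface are assumptions, not the Elowan hardware's API.

def detect_spike(samples, baseline, threshold=0.05):
    """Crude spike detector: flags when the latest electrode reading
    deviates from the resting baseline by more than `threshold` volts,
    standing in for the bioelectrochemical signal a light change produces."""
    return abs(samples[-1] - baseline) > threshold

def control_step(robot, samples, baseline, lit_side):
    """One loop iteration: if the plant signals during a lamp transition,
    drive the wheeled base toward whichever lamp is now on."""
    if detect_spike(samples, baseline):
        if lit_side == "left":
            robot.move_left()
        else:
            robot.move_right()
```

The key design point in the real system is the same as in this sketch: the robot carries no light sensor of its own — the plant’s signal is the only input that triggers movement.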
Your idea, then, is to use the plant’s built-in physiology as a sort of natural circuit system. And you want to replace artificial circuits with natural ones.
On a broader level, that’s what I’ve tried to communicate here. But, as an interaction designer, I’m focused on how interactions [between humans and machines] work right now.
“Plants might be the best kind of electronics we already have in the environment, things that we can only strive to create artificially.”
Two things happen when we use digital devices — sensing and displaying. When we sit in front of a computer, the computer is almost trying to sense what I want to do, and it tries to provide an output based on that. Then there’s display, which comes out as interfaces that we see in the digital world. We create these artificial electronic devices to sense and display, but plants already have such capabilities.
Plants are self-powered, self-regenerating, and self-fabricating. They move and change color. Leaves open and close and grow. They can serve as inspiration for our electronics. Plants might be the best kind of electronics we already have in the environment, things that we can only strive to create artificially. Since we haven’t been able to recreate them, why not just align the designs with nature? I think the future of interaction design will put interfaces within nature itself.
What are some of the explicit advantages you see from having a hybrid device rather than exclusively synthetic ones?
Well, this process of hybridizing with nature would be a paradigm shift that would lead us to rethink how we design future devices. For example, plants continuously absorb water, like little motors in the environment. Plants open and close, acting like a display. If we look at those capabilities, we can start to use some of them and morph them with electronic functionalities, so we don’t have to design things from the ground up.
The second benefit is that in the era of IoT and smart environments, we tend to put sensors everywhere, but it won’t be possible to build everything efficiently at the scale we’re imagining for the future. And if we design everything artificially, we might put things into the environment that also destroy the environment, because they’re all made of silicon or metal. So how do we scale up? Plants might help us answer that question.
The way I see it, if we align ourselves with these natural capabilities, we can try to be convergent with nature. I call this convergent design. Right now, our environmental initiatives are always on the back foot. We say, “Okay, now that we’ve destroyed this part of the environment, how do we fix it now?” By hybridizing with nature and making cyborgs we won’t be passive in our efforts. We can be active and align our technological development with nature itself.
What sort of devices and infrastructure designs do you envision for this hybridized future?
My current project is called Cyborg Botany. Right now, we use plants mainly as food crops, but plants in some Asian cultures are also used as things like bridges. They go from one side of the river to the other side and they are used as a self-growing bridge. That’s one application where you can think of creating architecture out of a tree. Or think of the natural motor I mentioned. Plants could become monitoring platforms where they could monitor water quality, toxicity, or pollution, and then we don’t have to deploy artificial sensors.
“Plants could become monitoring platforms where they could monitor water quality, toxicity, or pollution, and then we don’t have to deploy artificial sensors.”
Other applications could connect to the digital world. I’m currently working on a plant that can be controlled with software, so you click on the leaves of the plant and the leaves close. That becomes a sort of bidirectional communication between plant and the computer. The leaf acts like a display.
As living organisms, plants have their own self-interest and don’t always follow the rules we set out for them. For example, tree roots grow through concrete, and bushes grow in gutters. So, they might actually be more difficult to take care of than man-made devices. What sort of challenges do you face with cyborg plants that you wouldn’t with synthetic devices?
There are two principles I have in this project that can make things difficult. One is that the plant shouldn’t be harmed and the other is that the environment shouldn’t be harmed. For example, if I’m growing something inside the plant or if I’m doing something in the environment, it shouldn’t hurt an animal that might come around and eat it.
It can also be challenging to study capabilities and interpret what they mean. When I listen to the plant’s electrical signals, I need to be able to tell that one signal happened because the light switched on and another signal happened because I put something in the soil. Based on those discrete interpretations, I’m able to really study a plant system and figure out if this is the right kind of system to use for my application.
You obviously value plants. I’m curious if you think plants have agency and if they can feel pleasure and pain?
It is very important to mention that plants do not have the nerves that humans do. Plants do not have emotions, but they do have evolutionary signals. They are systems on some level. I try to make interpretations of those evolutionary signals, but they are not emotional signals. They are just responses to the environment. But at the end of the day, they are still living systems. Through Elowan, I amplify the thing the plant already wants to do.