There’s no doubt that device-filled connected smart homes are on the way. The real question is how we’re going to control them.
That was the starting point of a nifty proof-of-concept project created by interaction designer Ian Sterling and software engineer Swaroop Pal during a recent HoloLens hackathon in San Francisco. Their augmented reality pitch shows how smart devices could be controlled with glances and gestures — in what Sterling calls a “virtual Zen mode,” complete with calming lights and sounds.
“The primary goal of the app is to provide a 3D spatial UI for cross-platform devices — Android Music Player app and Arduino-controlled Fan and Light — and to interact with them using gaze and gesture control,” Sterling, a design student at California College of the Arts, told Digital Trends.
“The connectivity between Arduino and a mixed reality device is something which holds a huge amount of creative opportunity for developers to create some very exciting applications — be it [Internet of Things], robotics, or other sensor data visualization. Besides this, our app features some fun ways to connect devices. Our demo featured a connection between a music player and a light in order to set a certain mood in your home.”
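The project's source isn't described in detail, but the Arduino side of a setup like this is usually simple: the headset recognizes a gaze-plus-gesture event and sends a short command over a serial or network link, which the microcontroller maps to a pin toggle. As a rough sketch only — the device names, command bytes, and handler below are invented for illustration and are not from the IoTxMR app — it might look something like this:

```python
# Hypothetical sketch of the headset-side logic: translate a recognized
# gesture on a gazed-at device into a one-byte serial command for an
# Arduino. None of these command codes come from the IoTxMR project.

# Single-byte commands the Arduino firmware might listen for on its serial port.
COMMANDS = {
    ("fan", "on"): b"F",
    ("fan", "off"): b"f",
    ("light", "on"): b"L",
    ("light", "off"): b"l",
}

def command_for(device: str, action: str) -> bytes:
    """Map a (device, action) pair to its command byte."""
    try:
        return COMMANDS[(device, action)]
    except KeyError:
        raise ValueError(f"unsupported command: {device} {action}")

def on_air_tap(gazed_device: str, device_is_on: bool) -> bytes:
    """Toggle whichever device the user is gazing at when an air tap fires."""
    action = "off" if device_is_on else "on"
    return command_for(gazed_device, action)

# In a real deployment these bytes would be written to the Arduino's
# serial port (e.g. with pyserial's Serial.write); here we just print one.
if __name__ == "__main__":
    print(on_air_tap("light", device_is_on=False))  # b'L'
```

The Arduino firmware would then read each byte in its loop and switch the matching relay, which is what makes the "connect a music player to a light" mood-setting trick in the demo possible.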
While the so-called IoTxMR concept is just a demo, Sterling thinks it signals the way our interactions with smart devices need to go. “I feel that in order for smart devices to be less obtrusive, while at the same time becoming more robust in functionality, new forms of interaction will be a requirement,” he continues. “Whether this is with a simple app on your phone, or using mixed reality software on your phone or wearable device — such as HoloLens or Magic Leap — we will be interfacing with our smart home devices through a separate piece of smart technology.”
Hey, if it means we get to feel like Neo from The Matrix by controlling our devices with a simple wave of the hand, courtesy of Sterling and Pal’s “visually serene and relaxing” Zen mode, consider us sold.