Thalmic Labs’ founder on ditching touch screens, nixing mice, and why gesture control is the future

The concept of gesture control isn’t anything new – think of Nintendo’s clumsy Power Glove, which dates all the way back to 1989. In practice, however, we’ve been miles away from the effortless, graceful, almost instinctual computer interaction depicted in sci-fi movies such as Minority Report.

Thalmic Labs, a year-old company based out of Waterloo, Ontario, is aiming to change all that with a futuristic armband called the MYO (my-yo). Named after a Greek prefix meaning “muscle,” the MYO reads electrical muscle activity directly, promising seamless interaction with a broad range of technology. A YouTube demo video for the MYO (below), for instance, shows a man snapping his fingers to start an iTunes track, or throwing up “devil horns” to share a skiing clip on Facebook.

To use the MYO, you strap it on your forearm and perform gestures to issue commands. The company claims it’ll work out of the box with Macs and PCs when it ships sometime in late 2013 (the company is taking pre-orders for a second shipment of MYOs arriving in early 2014), and it could also be used with iOS and Android devices. The armband communicates with whatever computer or device you pair it with over a low-power Bluetooth connection. While the armband itself won’t be available until late in the year, the API will be released this summer so developers can design apps for use with it.
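The SDK hadn’t shipped at the time of this interview, so no real API names are available; purely as an illustration, here is a minimal sketch of what an event-driven gesture interface like the one described might look like, with every class and gesture name invented for the example:

```python
# Hypothetical sketch only: the real MYO SDK is not public, so all names
# here (GestureDispatcher, "snap", etc.) are invented for illustration.

class GestureDispatcher:
    """Maps recognized hand poses (e.g. "fist", "snap") to app callbacks."""

    def __init__(self):
        self._handlers = {}

    def on(self, gesture, callback):
        """Register a callback to run when a named gesture is recognized."""
        self._handlers.setdefault(gesture, []).append(callback)

    def dispatch(self, gesture):
        """Called by the (hypothetical) armband driver once muscle activity
        is classified as a known pose; fans out to registered handlers."""
        for callback in self._handlers.get(gesture, []):
            callback(gesture)

# Example: the "snap your fingers to play a track" behavior from the demo.
dispatcher = GestureDispatcher()
played = []
dispatcher.on("snap", lambda g: played.append("now playing"))
dispatcher.dispatch("snap")
print(played)  # ['now playing']
```

The design mirrors how most input APIs (mouse, touch, Kinect) expose events: the device driver handles the signal processing, and applications simply subscribe to named gestures.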

We first covered the MYO during a pre-order rush that ended up selling more than 25,000 armbands. To learn more about the device, we went straight to the source: Aaron Grant, one of the three founders of Thalmic Labs.

In a Skype interview, Grant shared with us what it’s like to start a company straight out of college, how the MYO differs from other gesture-control devices, some of the most interesting ideas people have for using the MYO, and what he sees in the future of gesture-control technology.

Digital Trends: So what makes the MYO really stand out from other gesture control devices like Leap Motion’s Leap Controller or Microsoft’s Kinect? How does it work?

Aaron Grant: Gesture control and this whole idea of more natural ways of interacting with your technology is taking off right now, but we’re not convinced that cameras are the way to do it. So, we’ve developed a system that’s completely different from that. It’s an armband that you wear around your forearm, which actually measures the electrical activity from your muscles. Based on that electrical activity, we can determine the pose that your hand is in at any given time, whether you’re making a fist, whether you’re touching your thumb to your index finger or your thumb to your ring finger, whether you’re swiping left to right or up and down. We can get all those different hand positions just from reading the muscle activity in your arm.
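Grant describes classifying hand poses from the electrical activity picked up by sensors around the forearm. The MYO’s actual classifier is proprietary, but a toy version of the general idea, matching a multi-channel EMG amplitude reading against per-pose templates, can be sketched as follows (the channel count, pose names, and template values are all invented for the example):

```python
# Toy nearest-template pose classifier, in the spirit of what Grant
# describes. Real EMG classification is far more involved; everything
# below is a simplified, invented illustration.

import math

# Hypothetical mean-amplitude "templates" for three poses, one value
# per EMG sensor around the forearm (4 channels here for brevity).
TEMPLATES = {
    "fist":        [0.9, 0.8, 0.7, 0.9],
    "pinch":       [0.2, 0.7, 0.1, 0.3],
    "swipe_right": [0.5, 0.2, 0.8, 0.4],
}

def classify(sample):
    """Return the pose whose template is nearest (Euclidean) to the sample."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TEMPLATES, key=lambda pose: dist(TEMPLATES[pose], sample))

print(classify([0.85, 0.80, 0.75, 0.88]))  # fist
```

Because the sensors read the muscles directly, this approach needs no camera and no line of sight, which is the core difference Grant draws against Kinect- and Leap-style systems.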

How has this kind of electronic muscle activity been used in technology previously?

There’s been some research done in terms of prosthetics, like reading the muscle signals and being able to control, for example, a prosthetic hand. The actual underlying technology, which is called electromyography or EMG, has been around for a long time in clinical settings. It’s similar to ECG, which is for your heart, so if you’re ever in the hospital and you’re hooked up to the little machine showing your heartbeat, it’s very similar to that.

From an insider’s perspective, what are some of the challenges that companies have faced in developing gesture-control devices in the past?

In the last year, we’ve seen tons of different people trying to do gesture control, whether it’s gaming platforms like Microsoft’s Kinect or Sony’s PlayStation Move, or other companies like Leap Motion. But everybody else is doing a system that’s based on cameras. We feel that there are a lot of disadvantages to that, the most obvious one being that the camera actually has to be able to see you, which essentially constrains you. For the Leap Motion device, you have to hold your hands above the device itself; it’s a relatively small workspace that you can actually move around in. There’s also a problem with being outside – sunlight affects the camera, because these systems all use some sort of infrared sensor, so you can’t use them in direct sunlight. And then, say you’re playing a multi-player game – there’s the issue of occlusion. Say you step in front of somebody, between that person and the camera. Then it can’t see them anymore.

What is the MYO like in action? What does it feel like to wear?

For the moment, it’s going to be this one device that works with everything. The device we’re shipping late this year will be a one-size-fits-all device, a single design that will fit the vast majority of the population, including kids 12 and up. The benefit of that is that you don’t have to have one that just fits you. If your friends come over, they can put it on and use it. Basically, it’s an expandable band that can grow or shrink as necessary. Lots of the time, I just wear one of our prototypes around all day. It’s not even the nice, final, industrial-design version, and after a while you just don’t even notice it.

In terms of your own background, what was it like starting Thalmic Labs straight out of college at the University of Waterloo?

The program is called Mechatronics Engineering, so basically an intersection of mechanical, software, electrical, and systems integration – really a jack-of-all-trades program. There are three co-founders: myself and two others who also graduated from Waterloo (Stephen Lake and Matthew Bailey). Same time, same class. We all knew that we wanted to work with each other and that we didn’t want to go off and work for big companies. We wanted to do something ourselves. The three of us graduated last April, and then founded the company about a week later. It’s certainly been quite a ride so far, and it’s going to be a pretty awesome year.

How did you feel seeing such a positive response to your pre-order campaign?

We were definitely excited, and I guess a bit relieved to see that it all went well. It’s great to see the success we’ve had so far with the preorder campaign, and it’s good to see that people actually want this. Up till that point, we had put in a ton of work developing the technology and getting it to a point where it actually works. We’ve had lots of software developers emailing us with cool ideas … every day we get emails from people who want to use it for things we’ve never even thought of before.

How did you all decide to get started in gesture control together?

I mean, this is kind of cliché, but we were literally sitting in a local bar talking about the future of computing, just throwing around ideas, and one of the things we were talking about was these new interfaces – things such as Google Glass, and where that kind of stuff is going. And what we realized was, there’s been a lot of work done on the output modalities, but not as much work done on new ways of actually interacting with this new technology. If the output technology is evolving, then the input technology is going to evolve with it.

What background did you have in gesture control, other than the program at Waterloo?

We hadn’t worked specifically in gesture control, but we all specialized in separate areas that actually made a product like this possible. One of my co-founders, Steve, spent a lot of time in the medical industry developing medical devices, which is where the idea to use muscle activity as the input came from. Myself, I have a pretty solid software background, and my third co-founder, Matt, has specialized mostly in mechanical product design, and more recently in machine intelligence.

What are some of the most interesting ways people want to use the MYO?

We’ve had a couple of doctors contact us who want to use this when they’re in the operating room. They’re scrubbed in, so they can’t actually touch anything with their hands, but they need to navigate the patient’s diagnostic imagery to locate whatever they need to work with for the surgery. Another one that we’ve had is DJs and entertainers contacting us – they want to use it to control their lights and effects on stage. So just a huge number of different things, a very wide variety. That’s just scratching the surface.

Has anyone had a completely outrageous idea for how to use the MYO?

We had the Star Wars reference on our website, “release your inner Jedi,” and some people have said that they want to use [the MYO] to control their kids. They suggested that we should make a complementary device, a collar that goes around a dog’s neck or a child’s neck. That was a pretty out-there, silly one. I wouldn’t be surprised if someone actually built that. It’s kind of creepy, yeah, but that’s definitely one that I remembered.

Do you see the MYO as a sort of fun Jedi mind-trick, or do you think it will fundamentally change the way people interact with technology?

Definitely the latter. That’s absolutely our goal. We don’t want to be another gimmick that you use once and put aside. Since personal computing took off in the early ‘80s, the form factor has been constantly evolving. First you had these clunky desktop computers, then you had laptops, then you had smartphones and tablets. And along with that, the input technology has evolved, too. So when the iPhone came out and Apple introduced multi-touch, that was a new way of interacting with your technology. It’s not a mouse, it’s not a keyboard, it’s something completely new. I definitely think that’s going to continue to evolve, and our goal is to be at the forefront of that.

Has anything surprised you about the pre-order phase so far?

Thalmic Labs founders (from left to right): Matthew Bailey, Stephen Lake, and Aaron Grant

Actually, interestingly, looking at our pre-order so far, we were expecting that the majority of people would be developers and hackers – Silicon Valley kind of people. But we’ve actually had only 30 percent of the orders from [them]. The rest are just general consumers who want to use it for everyday things. Additionally, only 40 percent of our orders have been within North America. Right now, we have orders from around 130 different countries, [including] Germany, Great Britain, Australia, China, Russia, [and] France.

What are you most excited about in the realm of gesture control for the future?

In the long term, I’m excited about this being something you would wear all the time. We think that the next evolution of computing is going to be things that are more closely integrated with you as a person. The buzzword we use is “contextual computing.” That’s definitely what I’m most excited about.

[MYO images copyright: Thalmic Labs]
