Exoskeletons with autopilot: A peek at the near future of wearable robotics


Automation makes things easier. It also makes things potentially scarier as you put your well-being in the hands of technology that has to make spur-of-the-moment calls without first consulting you, the user. A self-driving car, for instance, must be able to spot a traffic jam or swerving cyclist and react appropriately. If it can do this effectively, it’s a game-changer for transportation. If it can’t, the results may be fatal.


At the University of Waterloo, Canada, researchers are working on just this problem — only applied to the field of wearable robot exosuits. These suits, which can range from industrial wearables reminiscent of Aliens’ Power Loader to assistive suits for individuals with mobility impairments resulting from age or physical disabilities, are already in use as augmentation devices to aid their wearers. But they’ve been entirely manual in their operation. Now, researchers want to give them a mind of their own.

To that end, the University of Waterloo investigators are developing A.I. tools like computer vision that will allow exosuits to sense their surroundings and adjust movements accordingly — such as being able to spot flights of stairs and climb them automatically or otherwise respond to different walking environments in real time. Should they pull it off, it will forever change the usefulness of these assistive devices. Doing so isn’t easy, however.

The biggest challenge for robotic exoskeletons

“Control is generally regarded as one of the biggest challenges to developing robotic exoskeletons for real-world applications,” Brokoslaw Laschowski, a Ph.D. candidate in the university’s Systems Design Engineering department, told Digital Trends. “To ensure safe and robust operation, commercially available exoskeletons use manual controls like joysticks or mobile interfaces to communicate the user’s locomotor intent. We’re developing autonomous control systems for robotic exoskeletons using wearable cameras and artificial intelligence, [so as to alleviate] the cognitive burden associated with human control and decision-making.”

[Image: wearable robot exoskeleton camera. Credit: University of Waterloo]

As part of the project, the team had to develop an A.I.-powered environment classification system trained on the ExoNet database, which it claims is the largest-ever open-source image dataset of human walking environments. The dataset was gathered by having participants wear a chest-mounted camera and walk around local environments while their movement and locomotion were recorded; it was then used to train the neural networks.

“Our environment classification system uses deep learning,” Laschowski continued. “However, high-performance deep-learning algorithms tend to be quite computationally expensive, which is problematic for robotic exoskeletons with limited operating resources. Therefore, we’re using efficient convolutional neural networks with minimal computational and memory storage requirements for the environment classification. These deep-learning algorithms can also automatically and efficiently learn optimal image features directly from training data, rather than using hand-engineered features as is traditionally done.”
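To illustrate why efficient convolutional networks matter on a battery-powered wearable, here is a back-of-the-envelope sketch (not the Waterloo team's actual network) comparing the multiply-accumulate cost of a standard convolution layer with a depthwise separable one, the building block behind lightweight architectures such as MobileNet:

```python
def conv_macs(h, w, k, c_in, c_out):
    """Multiply-accumulates for a standard k x k convolution
    over an h x w feature map (stride 1, 'same' padding)."""
    return h * w * k * k * c_in * c_out

def separable_macs(h, w, k, c_in, c_out):
    """Depthwise separable convolution: a per-channel k x k
    depthwise pass followed by a 1 x 1 pointwise projection."""
    depthwise = h * w * k * k * c_in
    pointwise = h * w * c_in * c_out
    return depthwise + pointwise

# Hypothetical mid-network layer: 56x56 feature map,
# 3x3 kernel, 128 input and 128 output channels.
standard = conv_macs(56, 56, 3, 128, 128)
separable = separable_macs(56, 56, 3, 128, 128)
print(standard // separable)  # prints 8: roughly 8x fewer operations
```

The savings grow with kernel size and channel count, which is why this factorization is a common choice when a classifier has to run in real time on embedded hardware.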

John McPhee, a professor of Systems Design Engineering at the University of Waterloo, told Digital Trends: “Essentially, we are replacing manual controls — [like] stop, start, lift leg for step — with an automated solution. One analogy is an automatic powertrain in a car, which replaces manual shifting. Nowadays, most people drive automatics because it is more efficient, and the user can focus on their environment more rather than operating the clutch and stick. In a similar way, an automated high-level controller for an exo will open up new opportunities for the user [in the form of] greater environmental awareness.”

As with a self-driving car, the researchers note that the human user will possess the ability to override the automated control system if the need arises. While it will still require a bit of faith to, for instance, trust that your exosuit will spot a flight of descending stairs prior to launching down them, the wearer can take control in scenarios where it’s necessary.
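A high-level controller with this kind of override can be pictured as a small state machine: the vision system proposes a locomotion mode each control cycle, but an explicit user command always wins. The sketch below is purely illustrative; the class and mode names are invented, not taken from the Waterloo system.

```python
class ExoController:
    """Toy high-level exoskeleton controller: an environment
    classifier proposes a locomotion mode, but a manual user
    command always overrides the autonomous suggestion."""

    MODES = {"level_walk", "stair_ascent", "stair_descent", "stop"}

    def __init__(self):
        self.mode = "stop"
        self.manual_override = None  # set when the user takes control

    def user_command(self, mode):
        """Explicit user input; takes priority over the classifier."""
        assert mode in self.MODES
        self.manual_override = mode

    def release_override(self):
        """Hand control back to the autonomous system."""
        self.manual_override = None

    def step(self, classifier_prediction):
        """Choose the mode for one control cycle."""
        if self.manual_override is not None:
            self.mode = self.manual_override
        elif classifier_prediction in self.MODES:
            self.mode = classifier_prediction
        return self.mode

ctrl = ExoController()
ctrl.step("level_walk")            # autonomous: walk on level ground
ctrl.user_command("stop")          # wearer overrides before the stairs
print(ctrl.step("stair_descent"))  # prints "stop", not "stair_descent"
```

The key design point is that the override check comes first, so no classifier output can ever pre-empt a deliberate user decision.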

Still prepping for prime time

Right now, the project is a work in progress. “We’re currently focusing on optimizing our A.I.-powered environment classification system, specifically improving the classification accuracy and real-time performance,” said Laschowski. “This technical engineering development is essential to ensuring safe and robust operation for future clinical testing using robotic exoskeletons with autonomous control.”

[Image: wearable robot exoskeleton in use. Credit: University of Waterloo]

Should all go to plan, however, hopefully it won’t be too long until such algorithms can be deployed in commercially available exosuits. These suits are already becoming more widespread, thanks to innovative companies like Sarcos Robotics, and are being used in ever more varied settings. They’re also capable of greatly enhancing human capabilities beyond what the wearer could manage without the suit.

In some ways, it’s highly reminiscent of the original conception of the cyborg, not as some nightmarish Darth Vader or RoboCop amalgamation of half-human and half-machine, but, as researchers Manfred Clynes and Nathan Kline wrote in the 1960s, as “an organizational system in which … robot-like problems [are] taken care of automatically, leaving [humans] free to explore, to create, to think, and to feel.” Shorn of its faintly hippy vibes (this was the ’60s), the idea still stands: By letting robots autonomously take care of the mundane problems associated with navigation, the human users can focus on more important, engaging things. After all, most people don’t have to consciously think about the minutiae of moving one foot in front of the other when they walk. Why should someone in a robot exosuit have to do so?

The latest paper dedicated to this research was recently published in the journal IEEE Transactions on Medical Robotics and Bionics.

Luke Dormehl
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…