Whether it’s a cyclist about to lurch into the wrong lane or a pedestrian getting ready to cross the street, self-driving cars need to be hyperaware of what is going on around them at all times. But one thing they can’t do is see around corners. Or can they? In a paper presented at this week’s International Conference on Intelligent Robots and Systems (IROS), Massachusetts Institute of Technology researchers showed off technology that could allow autonomous vehicles and other robots to do exactly that — by looking for changes in shadows on the ground to reveal whether a moving object is headed their way.
“ShadowCam operates by detecting small differences in shadows and using this information to detect possible static and dynamic objects that are otherwise out of your line of sight,” Alexander Amini and Igor Gilitschenski, two MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) researchers who worked on the project, told Digital Trends via email. “First of all, we need to focus on the same region of interest as we move, which we achieve by integrating a visual motion estimation technique into ShadowCam. Based on this stabilized image of the region of interest, we [then] use color amplification combined with a dynamic threshold on the intensity changes.”
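The pipeline the researchers describe — stabilize a region of interest, amplify faint intensity changes, then apply a dynamic threshold — can be sketched in a toy form. The snippet below is a minimal illustration, not the actual ShadowCam implementation: it assumes the region-of-interest crops are already registered (the real system does this with visual motion estimation), and the function name, parameters, and thresholding rule are all illustrative assumptions.

```python
import numpy as np

def classify_roi(frames, amplification=5.0, sigma_factor=3.0, pixel_ratio=0.01):
    """Toy ShadowCam-style classifier for a stabilized region of interest.

    frames: (T, H, W) array of grayscale ROI crops, assumed already
    registered frame-to-frame. Returns "dynamic" if amplified intensity
    changes exceed a data-driven threshold in enough pixels, else "static".
    """
    frames = np.asarray(frames, dtype=np.float64)
    temporal_mean = frames.mean(axis=0)
    # Amplification step: exaggerate small deviations from the temporal
    # mean so faint shadow motion becomes measurable.
    amplified = temporal_mean + amplification * (frames - temporal_mean)
    # Per-pixel intensity change between consecutive frames.
    diffs = np.abs(np.diff(amplified, axis=0))
    # Dynamic threshold derived from the sequence's own statistics.
    threshold = diffs.mean() + sigma_factor * diffs.std()
    # Flag the sequence as dynamic if any frame pair has enough
    # above-threshold pixels.
    changed_fraction = (diffs > threshold).mean(axis=(1, 2))
    return "dynamic" if changed_fraction.max() > pixel_ratio else "static"
```

For example, a constant sequence classifies as `"static"`, while a sequence with a dark patch sliding across the frame classifies as `"dynamic"`; the consistent indoor lighting mentioned below is exactly what keeps a threshold like this reliable.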
In tests, the system allowed automated vehicles to detect an oncoming object and stop more than half a second sooner than with standard lidar. While half a second sounds like only the most incremental of improvements, this kind of hair-trigger reaction could mean the difference between colliding with another vehicle and stopping in time.
So far, the technology has only been tested in indoor settings. As with any robot-based system, lab conditions can be deceptive, since they make it easier to control variables. In this case, that means more consistent lighting, which helps the system better analyze shadows. Nonetheless, as with self-driving cars themselves (which, not too long ago, seemed destined to remain research projects), the tech continues to advance at an astonishing rate. With that in mind, we wouldn’t bet against technology like ShadowCam popping up as a driver-assistance staple before too much longer.