“Autonomous vehicles and autonomous driving [brought with it a huge amount] of hype. Everybody thought that, by 2020 or 2021, we would see a significant number of autonomous vehicles and autonomous services and autonomous robots. This didn’t happen. I think there is agreement that the reason for this is the lack of mature sensing technologies.”
On the face of it, this is a strange thing for Ram Machness, vice president of product at a company called Arbe Robotics, to say. After all, Arbe makes sensors that help autonomous vehicles to drive. This is a bit like if Tim Cook, CEO of Apple, were to say that the reason the smartphone market declined last year is because nobody’s making good smartphones.
Machness has a point, however. And it’s one he hopes the company’s new technology — unveiled at this year’s all-virtual CES — will help change. With that new sensing tech, Arbe believes it won’t be long before more autonomous driving systems hit the road. For real this time.
Machness points out that many of the people who built self-driving cars made an erroneous assumption: that the algorithms powering autonomous driving would have access to complete information about the world the cars were driving through. They didn’t. Instead of perfect information about their surroundings, the vehicles were hamstrung by sensing and perception challenges that needed to be solved before engineers could create the algorithms to power autonomous technologies for various applications.
That’s like trying to teach someone how to do their job in an office that’s just suffered a power outage. One problem needs to be solved before the other can be attempted. And until now that’s not been possible.
Could next-gen radar help change that?
Radar hasn’t been taken especially seriously as a way to get autonomous vehicles to perceive the world, other than as a means of detecting the velocity of objects that have already been identified through other sensors. Most of the discussion has involved either computer vision using standard cameras or lidar, which measures distance using bounced lasers. Both approaches have their positives and negatives.
Radar, which involves bounced electromagnetic waves, has been around a lot longer than lidar, but also has some pretty big challenges.
As an example, Machness shows an image of a black screen with a handful of glowing orange dots splashed across its surface. It looks like someone has spattered a small amount of colored paint on a dark wall or, perhaps, the reflection of city lights in water at night. It is virtually impossible to work out what you are seeing. This is traditional radar, he said, a technology that many cars are equipped with today for things like parking sensors, but which virtually no one takes seriously for imaging. What we are “seeing” is a street scene, complete with other cars and an assortment of additional obstacles.
Machness jumps to another video and we’re now seeing what appears to be a psychedelic dashcam of a car winding its way down treelined streets. Other than the fact that it looks like it was filmed in Predator-style heat vision, it’s perfectly readable — by humans, let alone machines.
The big upgrade, he noted, is the number of transmit and receive channels in the radar. Machness likens this to the number of pixels in a camera image. “If I count the amount of channels in today’s radars, they have 12 channels,” he said. “The more advanced ones have 48 channels. We see some competitors working towards 192 channels. [We’ve developed radar with] 2,000 channels. That’s the breakthrough. We’re able to process them simultaneously.”
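To get a feel for why channel counts climb so quickly, it helps to know that in MIMO radar each transmit/receive antenna pair effectively forms one “virtual” channel, so the total scales with the product of the two. The sketch below illustrates that arithmetic; the specific Tx/Rx splits are illustrative assumptions chosen to roughly match the figures Machness quotes, not Arbe’s (or any competitor’s) confirmed antenna design.

```python
# Illustrative sketch: in a MIMO radar array, each transmit/receive
# antenna pair acts as one "virtual" channel, so the channel count
# is transmitters x receivers. The configurations below are guesses
# picked to land near the numbers quoted in the article.

def virtual_channels(num_tx: int, num_rx: int) -> int:
    """Number of virtual channels formed by a MIMO antenna array."""
    return num_tx * num_rx

configs = {
    "basic radar": (3, 4),              # 12 channels
    "advanced radar": (6, 8),           # 48 channels
    "competitor roadmap": (12, 16),     # 192 channels
    "imaging-class radar": (48, 48),    # 2,304 channels
}

for name, (tx, rx) in configs.items():
    print(f"{name}: {tx} Tx x {rx} Rx -> {virtual_channels(tx, rx)} channels")
```

The takeaway is that a modest increase in physical antennas yields a quadratic jump in effective channels, which is how a radar can leap from dozens of channels to thousands without thousands of physical antennas.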
As announced January 11 at CES, Arbe’s new radar technology promises “4D” radar imaging for autonomous vehicles with the ability to separate, identify, and track objects in high resolution thanks to a next-gen radar that’s 100 times more detailed than any other radar on the market. This “2K ultra-high resolution” radar tech promises to be “road ready” by early 2022, a year from now.
The company is working with a large number of big, but as-yet-unannounced partners to bake this technology into future road-occupying vehicular platforms. “The problem Arbe is trying to solve is to bring imaging radar that has almost zero false alarms and very high resolution [to autonomous vehicles,]” Machness said.
One of the big advantages of radar is the possibility of using it in bad weather conditions. “Things that cameras and lidar are very sensitive to — like fog, rain, or dust — radar technology is significantly less sensitive to,” said Machness.
Not to say this is the case here, but CES demos, at the best of times, can be massaged to make technology look better than it is. Any live demo can. (Steve Jobs, famously, demoed the original iPhone in 2007 on a model that would fail if he didn’t follow a precise series of steps when showing how it worked.) Demos in the virtual era — such as at a livestreamed virtual show like CES 2021 — throw up even more opportunities for misrepresentation.
When it comes to autonomous vehicles and imaging, there are plenty of question marks. Until the problem of autonomous driving is perfected (and what exactly does that mean?), there will be disagreement about the best way to build one. Lidar, for instance, has its staunch proponents, while Tesla CEO Elon Musk has proclaimed it “unnecessary” and “a fool’s errand.”
But new possibilities such as this don’t just represent rival approaches, they represent breakthroughs that could form part of smarter hybrid systems that take advantage of the best of all worlds. In this capacity, Arbe isn’t alone in announcing autonomous sensing breakthroughs at CES. Also at this year’s show, Seoul Robotics — a South Korean company — is introducing its first mass market product, a multi-industry plug-and-play, next-gen lidar solution. Another startup, Cognata, is introducing Real 2 Sim, a new product that takes data recordings from drives and turns them automatically into simulations and datasets.
It’s not just self-driving cars this tech could benefit. Arbe, for its part, has a big focus on improving autonomous delivery robots so they can navigate better in the real world. “For the first generation, [creators went] overkill with the amount of sensors that they are using,” Machness said. “But to try and reduce the costs, [they are now trying to] reduce the amount of sensors, but also increase the safety of those robots and their capability to move everywhere.”
The same technology could also be used to drive autonomous trucks, buses, drones, and the many other machines expected to hit the road in increasing numbers over the next several years.
Autonomous vehicles have been a headline-grabbing part of CES since at least 2013. But this year, with coronavirus having stripped away the flash of the live event, hopefully what will emerge is more focus on substance and solving some of the problems that have kept autonomous vehicles partly in the realm of science fiction for so long.
Because who doesn’t want to see the next generation of vehicles sporting self-driving technology?