
The future of cars: A new spin on an old idea could revolutionize autonomous vehicles

“Autonomous vehicles and autonomous driving [brought with them a huge amount] of hype. Everybody thought that, by 2020 or 2021, we would see a significant number of autonomous vehicles and autonomous services and autonomous robots. This didn’t happen. I think there is agreement that the reason for this is the lack of mature sensing technologies.”

On the face of it, this is a strange thing for Ram Machness, vice president of product at a company called Arbe Robotics, to say. After all, Arbe makes sensors that help autonomous vehicles to drive. This is a bit like if Tim Cook, CEO of Apple, were to say that the reason the smartphone market declined last year is that nobody’s making good smartphones.

Machness has a point, however. And it’s one that he hopes the company’s new technology — unveiled at this year’s all-virtual CES — will help address. With its new sensing technology, the company believes it won’t be long before more autonomous driving tech is hitting the road. For real this time.

Machness points out that many of the people who built self-driving cars made an erroneous assumption: that their autonomous driving algorithms would have access to complete information about the world those cars were driving through. They didn’t. Instead of perfect information about their surroundings, the vehicles were hamstrung by sensing and perception challenges that had to be solved before algorithms capable of powering autonomous technologies for various applications could be built.

That’s like trying to teach someone how to do their job in an office that’s just suffered a power outage. One problem needs to be solved before the other can be attempted. And until now that’s not been possible.

Could next-gen radar help change that?

Radar makes a comeback

Radar hasn’t been taken especially seriously as a way for autonomous vehicles to perceive the world, other than as a means of measuring the velocity of objects that have already been identified by other sensors. Most of the discussion has involved either computer vision using standard cameras or lidar, which uses bounced lasers to measure distance. Both approaches have their positives and negatives.

Radar, which involves bounced electromagnetic waves, has been around a lot longer than lidar, but also has some pretty big challenges.
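To make those terms concrete, a radar recovers two basic quantities from each bounced wave: distance, from the echo’s round-trip time, and radial velocity, from the Doppler shift in the returned frequency. Here is a minimal sketch of that arithmetic (not Arbe’s implementation); the 77 GHz carrier frequency is an assumption, chosen because it is a common automotive radar band.

```python
# A minimal sketch of the two quantities a radar recovers from a bounced
# electromagnetic wave: range, from the echo's round-trip time, and
# radial velocity, from its Doppler shift. The 77 GHz carrier frequency
# is an assumption: a common automotive radar band.

C = 299_792_458.0  # speed of light, m/s

def range_from_delay(round_trip_s: float) -> float:
    """Distance to the target; the wave travels out and back, hence /2."""
    return C * round_trip_s / 2

def velocity_from_doppler(doppler_hz: float, carrier_hz: float = 77e9) -> float:
    """Radial velocity from the Doppler shift; positive means approaching."""
    wavelength = C / carrier_hz
    return doppler_hz * wavelength / 2

# A car roughly 60 meters ahead returns an echo after ~0.4 microseconds:
print(f"{range_from_delay(0.4e-6):.1f} m")       # ~60.0 m
print(f"{velocity_from_doppler(5130):.1f} m/s")  # ~10.0 m/s closing speed
```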

As an example, Machness shows an image of a black screen with a handful of glowing orange dots splashed across its surface. It looks like someone has spattered a small amount of colored paint on a dark wall or, perhaps, like the reflection of city lights in water at night. It is virtually impossible to work out what you are seeing. This is traditional radar, he said: a technology that many cars are equipped with today for things like parking sensors, but which virtually no one takes seriously for imaging. What we are “seeing” is a street scene, complete with other cars and an assortment of additional obstacles.


Machness jumps to another video, and we’re now seeing what appears to be a psychedelic dashcam feed from a car winding its way down tree-lined streets. Other than the fact that it looks like it was filmed in Predator-style heat vision, it’s perfectly readable — by humans, let alone machines.

The big upgrade, he noted, is the number of transmitting and receiving channels on the radar. Machness likens this to the number of pixels in a camera image. “If I count the amount of channels in today’s radars, they have 12 channels,” he said. “The more advanced ones have 48 channels. We see some competitors working towards 192 channels. [We’ve developed radar with] 2,000 channels. That’s the breakthrough. We’re able to process them simultaneously.”
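Machness’s pixel analogy lines up with how MIMO (multiple input, multiple output) imaging radars are typically described: every transmitter-receiver pairing behaves as one virtual channel, so channel counts multiply rather than add, and more channels buy finer angular resolution. The sketch below illustrates that relationship under textbook assumptions (a uniform half-wavelength array with a beamwidth of roughly 2/N radians); the transmitter and receiver splits are invented for illustration, not Arbe’s actual antenna layout.

```python
import math

# Illustrative MIMO radar arithmetic: each transmitter/receiver pairing
# acts as one "virtual" channel, so channels = n_tx * n_rx. For a uniform
# array with half-wavelength element spacing, the beamwidth (angular
# resolution) is roughly 2/N radians. The Tx/Rx splits are hypothetical.

def virtual_channels(n_tx: int, n_rx: int) -> int:
    """Virtual channel count of a MIMO array: transmitters times receivers."""
    return n_tx * n_rx

def angular_resolution_deg(n_channels: int) -> float:
    """Approximate beamwidth of an N-element half-wavelength array."""
    return math.degrees(2 / n_channels)

for n_tx, n_rx in [(3, 4), (6, 8), (48, 48)]:
    n = virtual_channels(n_tx, n_rx)
    print(f"{n_tx:>2} Tx x {n_rx:>2} Rx = {n:>4} channels "
          f"-> ~{angular_resolution_deg(n):.2f} deg resolution")
# 12 channels -> ~9.55 deg; 48 -> ~2.39 deg; 2,304 -> ~0.05 deg
```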

As announced January 11 at CES, Arbe’s new radar technology promises “4D” radar imaging (resolving range, azimuth, elevation, and velocity) for autonomous vehicles, with the ability to separate, identify, and track objects in high resolution thanks to a next-gen radar that’s 100 times more detailed than any other radar on the market. This “2K ultra-high resolution” radar tech promises to be “road ready” by early 2022, a year from now.


The company is working with a large number of big, but as-yet-unannounced, partners to bake this technology into future vehicle platforms. “The problem Arbe is trying to solve is to bring imaging radar that has almost zero false alarms and very high resolution [to autonomous vehicles,]” Machness said.

One of the big advantages of radar is the possibility of using it in bad weather conditions. “Things that cameras and lidar are very sensitive to — like fog, rain, or dust — radar technology is significantly less sensitive to,” said Machness.

Living up to the hype?

Not to say this is the case here, but CES demos, at the best of times, can be massaged to make technology look better than it is. Any live demo can. (Steve Jobs, famously, demoed the original iPhone in 2007 on a model that would fail if he didn’t follow a precise series of steps when showing how it worked.) Demos in the virtual era — such as at a livestreamed virtual show like CES 2021 — throw up even more opportunities for misrepresentation.

When it comes to autonomous vehicles and imaging, there are plenty of question marks. Until autonomous driving is perfected (and what exactly would that mean?), there will be disagreement about the best way to build a self-driving car. Lidar, for instance, has its staunch proponents, while Tesla CEO Elon Musk has proclaimed it “unnecessary” and “a fool’s errand.”


But new possibilities such as this don’t just represent rival approaches; they represent breakthroughs that could form part of smarter hybrid systems that take advantage of the best of all worlds. In that capacity, Arbe isn’t alone in announcing autonomous sensing breakthroughs at CES. Also at this year’s show, Seoul Robotics — a South Korean company — is introducing its first mass-market product, a multi-industry, plug-and-play, next-gen lidar solution. Another startup, Cognata, is introducing Real 2 Sim, a new product that automatically turns data recordings from drives into simulations and datasets.
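To see why a hybrid system could beat any single sensor, consider a toy fusion scheme that weights each sensor’s distance estimate by a confidence score reflecting current conditions. Everything here is invented for illustration; production fusion stacks rely on far more sophisticated machinery, such as Kalman filters or learned models.

```python
# Toy sensor fusion: a confidence-weighted average of range estimates.
# Sensors, readings, and confidences are all invented for the example;
# this sketches the hybrid idea, not any real fusion stack.

def fuse(estimates: dict) -> float:
    """Fuse {sensor: (range_m, confidence)} into one weighted estimate."""
    total_conf = sum(conf for _, conf in estimates.values())
    return sum(rng * conf for rng, conf in estimates.values()) / total_conf

# On a clear day, all three sensors see the obstacle and roughly agree.
clear_day = {"camera": (41.0, 0.90), "lidar": (40.2, 0.95), "radar": (40.6, 0.80)}

# In heavy fog, camera and lidar return unreliable readings at low
# confidence, while radar is barely affected.
heavy_fog = {"camera": (52.0, 0.10), "lidar": (47.5, 0.15), "radar": (40.6, 0.80)}

print(f"{fuse(clear_day):.1f} m")  # ~40.6 m: every sensor contributes
print(f"{fuse(heavy_fog):.1f} m")  # ~42.7 m: dominated by radar's reading
```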

It’s not just self-driving cars this tech could benefit. Arbe, for its part, has a big focus on improving autonomous delivery robots so they can navigate better in the real world. “For the first generation, [creators went] overkill with the amount of sensors that they are using,” Machness said. “But to try and reduce the costs, [they are now trying to] reduce the amount of sensors, but also increase the safety of those robots and their capability to move everywhere.”

The same technology could also be used to drive autonomous trucks, buses, drones, and more, all of which will appear in increasing numbers over the next several years.

Autonomous vehicles have been a headline-grabbing part of CES since at least 2013. But this year, with coronavirus having stripped away the flash of the live event, hopefully what will emerge is more focus on substance and solving some of the problems that have kept autonomous vehicles partly in the realm of science fiction for so long.

Because who doesn’t want to see the next generation of vehicles sporting self-driving technology?
