
The future of cars: A new spin on an old idea could revolutionize autonomous vehicles

“Autonomous vehicles and autonomous driving [brought with it a huge amount] of hype. Everybody thought that, by 2020 or 2021, we would see a significant number of autonomous vehicles and autonomous services and autonomous robots. This didn’t happen. I think there is agreement that the reason for this is the lack of mature sensing technologies.”


On the face of it, this is a strange thing for Ram Machness, vice president of product at a company called Arbe Robotics, to say. After all, Arbe makes sensors that help autonomous vehicles drive. This is a bit like if Tim Cook, CEO of Apple, were to say that the smartphone market declined last year because nobody’s making good smartphones.

Machness has a point, however, and it’s one that he hopes the company’s new technology — unveiled at this year’s all-virtual CES — will help address. With its new sensing tech, Arbe believes it won’t be long before more autonomous driving technology is hitting the road. For real this time.

Machness points out that many of the people who built self-driving cars made an erroneous assumption: that their autonomous driving algorithms would have access to complete information about the world in which the cars were driving. This didn’t happen. Instead of enjoying perfect information about the world they were moving through, the vehicles were hamstrung by sensing-perception challenges that needed to be solved before anyone could create algorithms to power autonomous technologies for various applications.

That’s like trying to teach someone how to do their job in an office that’s just suffered a power outage. One problem needs to be solved before the other can be attempted, and until now, solving that first problem hasn’t been possible.

Could next-gen radar help change that?

Radar makes a comeback

Radar hasn’t been taken especially seriously as a way for autonomous vehicles to perceive the world, other than as a means of detecting the velocity of objects that have already been identified through other sensors. Most of the discussion has involved either computer vision using standard cameras or lidar, which uses bounced lasers to measure distance. Both approaches have their positives and negatives.

Radar, which involves bounced electromagnetic waves, has been around a lot longer than lidar, but also has some pretty big challenges.
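To make the underlying physics concrete, here is a minimal sketch (in Python, with illustrative numbers rather than any vendor’s actual parameters) of the two classic radar relationships: range from the echo’s round-trip time of flight, and radial velocity from the Doppler shift.

```python
# Minimal sketch of two classic radar relationships: range from the
# echo's round-trip time of flight, and radial velocity from Doppler
# shift. Numbers are illustrative, not any vendor's specifications.

C = 299_792_458.0  # speed of light, m/s

def range_from_time_of_flight(round_trip_s: float) -> float:
    """Distance to a target, given the echo's round-trip travel time."""
    return C * round_trip_s / 2.0

def velocity_from_doppler(doppler_hz: float, carrier_hz: float) -> float:
    """Radial velocity of a target, given the measured Doppler shift."""
    return doppler_hz * C / (2.0 * carrier_hz)

# A 77 GHz automotive radar hearing an echo after ~0.67 microseconds is
# looking at a target roughly 100 m away; a 10 kHz Doppler shift at that
# carrier corresponds to about 19.5 m/s of closing speed.
print(range_from_time_of_flight(0.67e-6))   # ~100.4 m
print(velocity_from_doppler(10_000, 77e9))  # ~19.5 m/s
```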

As an example, Machness shows an image of a black screen with a handful of glowing orange dots splashed across its surface. It looks like someone has spattered a small amount of colored paint on a dark wall or, perhaps, the reflection of city lights in water at night. It is virtually impossible to work out what you are seeing. This is traditional radar, he said, a technology that many cars are equipped with today for things like parking sensors, but which virtually no one takes seriously for imaging. What we are “seeing” is a street scene, complete with other cars and an assortment of additional obstacles.

Machness jumps to another video and we’re now seeing what appears to be psychedelic dashcam footage of a car winding its way down tree-lined streets. Other than the fact that it looks like it was filmed in Predator-style heat vision, it’s perfectly readable — by humans, let alone machines.

The big upgrade, he noted, is the number of transmitting and receiving channels on the radar. Machness likens this to the number of pixels in a camera image. “If I count the amount of channels in today’s radars, they have 12 channels,” he said. “The more advanced ones have 48 channels. We see some competitors working towards 192 channels. [We’ve developed radar with] 2,000 channels. That’s the breakthrough. We’re able to process them simultaneously.”
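Channel counts can climb that quickly because, in a MIMO imaging radar, every transmitter-receiver pair acts as one virtual channel, so the virtual array size is the product of the two antenna counts. Here is a hedged sketch of that relationship; the antenna counts below are illustrative assumptions, not figures Arbe has confirmed.

```python
# Hedged sketch: in a MIMO imaging radar, each transmitter/receiver pair
# behaves like one "virtual" channel, so the channel count grows as the
# product of the two antenna counts. The antenna counts below are
# illustrative assumptions, not confirmed Arbe specifications.

def virtual_channels(n_tx: int, n_rx: int) -> int:
    """Virtual array size for a MIMO radar with n_tx transmitters and n_rx receivers."""
    return n_tx * n_rx

print(virtual_channels(3, 4))    # 12   -- the "today's radars" figure Machness cites
print(virtual_channels(12, 16))  # 192  -- the competitor class he mentions
print(virtual_channels(48, 48))  # 2304 -- roughly the ~2,000-channel class Arbe describes
```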

As announced January 11 at CES, Arbe’s new radar technology promises “4D” radar imaging for autonomous vehicles with the ability to separate, identify, and track objects in high resolution thanks to a next-gen radar that’s 100 times more detailed than any other radar on the market. This “2K ultra-high resolution” radar tech promises to be “road ready” by early 2022, a year from now.
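The “4D” label refers to the four dimensions measured for every detection: range, azimuth, elevation, and radial (Doppler) velocity. Below is a minimal sketch of what one such detection point could look like in code; the field names and data format are purely illustrative, not Arbe’s.

```python
# Minimal sketch of a "4D" radar detection: each point carries range,
# azimuth, elevation, and radial (Doppler) velocity. Field names are
# illustrative, not Arbe's actual data format.
from dataclasses import dataclass

@dataclass
class RadarDetection4D:
    range_m: float        # distance to the reflector, in meters
    azimuth_deg: float    # horizontal angle, in degrees
    elevation_deg: float  # vertical angle, in degrees
    velocity_mps: float   # radial velocity; negative means closing

# For example: a car 42 m ahead, slightly to the left, closing at 3 m/s.
point = RadarDetection4D(range_m=42.0, azimuth_deg=-5.0,
                         elevation_deg=0.5, velocity_mps=-3.0)
print(point)
```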


The company is working with a large number of big, but as-yet-unannounced, partners to bake this technology into future vehicle platforms. “The problem Arbe is trying to solve is to bring imaging radar that has almost zero false alarms and very high resolution [to autonomous vehicles],” Machness said.

One of the big advantages of radar is the possibility of using it in bad weather conditions. “Things that cameras and lidar are very sensitive to — like fog, rain, or dust — radar technology is significantly less sensitive to,” said Machness.

Living up to the hype?

Not to say this is the case here, but CES demos, at the best of times, can be massaged to make technology look better than it is. Any live demo can. (Steve Jobs famously demoed the original iPhone in 2007 on a prototype that would fail if he didn’t follow a precise series of steps when showing how it worked.) Demos at a livestreamed virtual show like CES 2021 throw up even more opportunities for misrepresentation.

When it comes to autonomous vehicles and imaging, there are plenty of question marks. Until the problem of autonomous driving is perfected (and what exactly does that mean?), there will be disagreement about the best way to build a self-driving car. Lidar, for instance, has its staunch proponents, while Tesla CEO Elon Musk has proclaimed it “unnecessary” and “a fool’s errand.”


But new possibilities such as this don’t just represent rival approaches; they represent breakthroughs that could form part of smarter hybrid systems that take advantage of the best of all worlds. In this capacity, Arbe isn’t alone in announcing autonomous sensing breakthroughs at CES. Also at this year’s show, Seoul Robotics — a South Korean company — is introducing its first mass-market product, a multi-industry, plug-and-play, next-gen lidar solution. Another startup, Cognata, is introducing Real 2 Sim, a new product that automatically turns data recordings from drives into simulations and datasets.

It’s not just self-driving cars this tech could benefit. Arbe, for its part, has a big focus on improving autonomous delivery robots so they can navigate better in the real world. “For the first generation, [creators went] overkill with the amount of sensors that they are using,” Machness said. “But to try and reduce the costs, [they are now trying to] reduce the amount of sensors, but also increase the safety of those robots and their capability to move everywhere.”

The same technology could also be used to drive autonomous trucks, buses, drones, and the many other vehicles that will hit the road in increasing numbers over the next several years.

Autonomous vehicles have been a headline-grabbing part of CES since at least 2013. But this year, with coronavirus having stripped away the flash of the live event, hopefully what will emerge is more focus on substance and solving some of the problems that have kept autonomous vehicles partly in the realm of science fiction for so long.

Because who doesn’t want to see the next generation of vehicles sporting self-driving technology?

Luke Dormehl
Former Digital Trends Contributor
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…
MIT’s shadow-watching tech could let autonomous cars see around corners

Whether it’s cyclists about to lurch into the wrong lane or a pedestrian getting ready to cross the street, self-driving cars need to be hyperaware of what is going on around them at all times. But one thing they can’t do is to see around corners. Or can they? In a paper presented at this week’s International Conference on Intelligent Robots and Systems (IROS), Massachusetts Institute of Technology researchers have shown off technology which could allow autonomous vehicles or other kinds of robots to do exactly that -- by looking for changes in shadows on the ground to reveal if a moving object is headed their way.

"ShadowCam operates by detecting small differences in shadows and using this information to detect possible static and dynamic objects that are otherwise out of your line of sight," Alexander Amini and Igor Gilitschenski, two MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) researchers who worked on the project, told Digital Trends via email. "First of all, we need to focus on the same region of interest as we move, which we achieve by integrating a visual motion estimation technique into ShadowCam. Based on this stabilized image of the region of interest, we [then] use color amplification combined with a dynamic threshold on the intensity changes."

Read more
Tesla reveals price range for Optimus Gen2, its ‘robot without wheels’
Tesla's 2022 Optimus robot prototype is seen in front of the company logo.

“The future should look like the future,” CEO Elon Musk said at the Tesla “We Robot” special event held in Burbank, California, earlier this week. Sure enough, Tesla’s much-anticipated autonomous robotaxi, the Cybercab, and its large-van counterpart, the Cybervan, seemed straight out of celebrated sci-fi movies. But as the name of the event hinted, a vision of the future would not be complete without robots: Several units of the Optimus Gen 2, the latest version of Tesla’s humanoid robot, were found serving drinks, holding conversations with guests, and even dancing at the event.

Tesla has recently pitched the Optimus as a potential replacement for factory workers in China and elsewhere. Musk previously said he expects the Optimus to start working at Tesla factories in 2025 and to be available to other firms in 2026.

Yet, at the event, the Tesla boss revealed his expanded vision of a household robot that can do “everything you want: Babysit your kid, walk your dog, mow your lawn, get the groceries, just be your friend, serve drinks.” He also gave a closer estimate of the robot’s price tag: Once produced “at scale,” Optimus should cost somewhere between $20,000 and $30,000. Musk had previously said the robot’s price would be about half that of a car.

Staying true to his sci-fi vision, the Tesla CEO referred to Optimus as a cross between R2-D2 and C-3PO, the famous droids from the Star Wars film series.

Ever since the first generation of the Optimus was revealed in 2022, Tesla has emphasized the continuity between its cars and the robot. “Everything that we’ve developed for our cars -- the batteries, the power electronics, the advanced motors, gearboxes, the software, the AI inference computer -- it all actually applies to a humanoid robot,” Musk said at the event. “A robot with arms and legs, instead of a robot with wheels.”

Tesla would not be the first to offer a domestic robot on the market. Hyundai-owned Boston Dynamics has already commercialized a home service-type robot called Spot, with a hefty price tag of $74,500. BMW and OpenAI are backing robots made by Figure, a California-based company. Meanwhile, Nvidia is developing Project GR00T to also deliver humanoid robots.

Earlier this year, Goldman Sachs forecast that the annual global market for humanoid robots could reach $38 billion by 2035, with shipments of 1.4 million units across both industrial and consumer applications. It also said that robots could become more affordable as their manufacturing costs have been decreasing more than expected -- leading to faster commercialization.

Read more
GM launches PowerBank, a battery that could rival Tesla’s PowerWall

Competition to provide the best energy savings to EV owners is heating up between automakers.

General Motors’ unit GM Energy has just released PowerBank, a stationary energy storage battery pack that gives electric vehicle (EV) owners the ability to store and transfer energy from the electric grid, and allows integration with home solar power equipment.

The PowerBank, which comes in 10.6kWh and 17.7kWh battery capacity variants, can power a home when there is an outage or help offset higher electricity rates during peak demand, GM said. In addition, customers can use PowerBank to store and use solar energy, supplement the charging of EVs, and provide power to a home without an EV being present. GM says that combining two of its 17.7kWh PowerBanks can provide enough energy to power the average American home for up to 20 hours.

The PowerBank can be bought as part of two bundles: the GM Energy Storage bundle at $10,999, or the GM Energy Home System bundle at $12,700. The latter includes a bidirectional EV charger that can deliver up to 19.2kW of power. By comparison, Tesla’s energy storage system, the PowerWall 3, can store 13.5kWh of energy and has a price tag of $9,300.

According to GM Vice President Wade Sheffer, one key advantage of the PowerBank is its “modularity,” which allows for easy integration with existing technology.

GM announced in August that it would provide vehicle-to-home (V2H) technology on all its model year 2026 models. It will now also offer vehicle-to-grid (V2G) technology, which can provide additional energy and financial savings.
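GM’s 20-hour figure implies an assumed average household draw. Here is a quick back-of-the-envelope check, using only the numbers quoted above (illustrative Python, not anything GM publishes):

```python
# Back-of-the-envelope check of GM's claim that two 17.7 kWh PowerBanks
# can power the average American home for up to 20 hours.
capacity_kwh = 2 * 17.7  # combined storage: 35.4 kWh
backup_hours = 20.0      # GM's claimed runtime

implied_load_kw = capacity_kwh / backup_hours
print(f"Implied average household draw: {implied_load_kw:.2f} kW")  # ~1.77 kW

def runtime_hours(capacity_kwh: float, load_kw: float) -> float:
    """Hours of backup a battery of a given capacity provides at a steady load."""
    return capacity_kwh / load_kw

# At that same draw, a single 13.5 kWh PowerWall 3 would last about 7.6 hours.
print(f"{runtime_hours(13.5, implied_load_kw):.1f} h")
```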
Energy savings coming from the integration of electric vehicles, solar-powered homes, and energy grids are increasingly at the center of EV manufacturers' offerings.
Nissan, BMW, Ford, and Honda have grouped together to offer the ChargeScape V2G software, which connects EVs to utilities and the power grid. EV owners can receive financial incentives to pause charging during peak demand or sell energy back to the grid.

While Tesla has so far backed off from embracing V2G technology, CEO Elon Musk has hinted that V2G tech could be introduced for Tesla vehicles in 2025.

Read more