
Driverless cars? Mother Nature may have a few things to say about that

2015 Honda CR-V
By 2:00 AM, I had been traveling for over 12 hours, hopping between planes and strange airports. As I staggered out to the parking lot to retrieve my press demonstrator, I was ready for my day to be done. Despite my yearning for rest, I faced a two-hour drive through a dark and stormy night over a stretch of rural Illinois freeway that hadn’t been improved since Abraham Lincoln was Commander in Chief.

Fortunately, I had the 2015 Honda CR-V waiting for me, one of the first affordable cars to include meaningful piloted-driving technology.

Though packed with piloted-driving tech, the CR-V is not perfect, a point that quickly became clear, much to my chagrin. On that dark, rain-soaked road, the CR-V’s autonomous tech locked onto an old, faded freeway lane marking and attempted to pilot itself across the current flow of traffic.

It was this frightening mistake that drove home the realization that, when it comes to dealing with Mother Nature, autonomous driving tech still has a long way to go. Current self-driving technology, including what’s found in the CR-V, is certainly impressive. When it comes to rain, snow, and nighttime driving, however, modern optically based systems are literally in the dark.

State of play

The 2015 Honda CR-V is a great example of where self-driving cars are going, not to mention an impressive piece of kit in its own right. Accordingly, it makes a perfect example with which to judge the current state of piloted-driving proficiency. The core of its self-driving capabilities is the combination of existing systems, including Lane Keeping Assist and Adaptive Cruise Control (ACC).

In most cars, ACC keeps pace with the flow of freeway traffic by using radar to track the speed and distance of the vehicle ahead. In conjunction, Lane Keeping Assist uses cameras to find and identify lane markings and warns the driver if he or she strays out of the lane. Some vehicles, like the current Mercedes-Benz E-Class, can even autonomously steer the car back into its lane. So can the CR-V, as long as the driver keeps his or her hands on the wheel.
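To make the division of labor concrete, here is a minimal sketch of the kind of logic ACC performs. This is an invented illustration, not Honda's actual software: the radar reports the gap to the car ahead, and a simple proportional controller nudges speed to hold a target following distance.

```python
# Toy illustration of adaptive cruise control: radar reports the gap
# to the car ahead, and the controller adjusts speed to hold a target
# following distance. A real system also tracks closing speed.

def acc_speed(current_speed, gap_m, target_gap_m=40.0, gain=0.5):
    """Return an adjusted speed that closes or opens the gap."""
    error = gap_m - target_gap_m  # positive: we're too far back
    return current_speed + gain * error

print(acc_speed(100.0, gap_m=50.0))  # too far back -> speed up
print(acc_speed(100.0, gap_m=30.0))  # too close    -> slow down
```

A production controller layers braking limits, driver overrides, and smoothing on top of this, but the core feedback loop is the same idea.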

How does it do this? The CR-V uses cameras paired with a powerful pattern recognition program to find lane markings. Then this information is sent to the electric power steering, which turns the wheels accordingly. The technology is, in essence, fairly simple. The complicated bit is the creation of a program that can consistently and accurately recognize lane markings in all of their various forms and states of disrepair.
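The camera-to-steering loop described above can be sketched in a few lines. Everything here is hypothetical, nothing like Honda's real pattern recognition: we pretend the cameras have already detected lane-marking pixel positions at a few scan rows, then compute how far off-center the car sits and a proportional steering correction.

```python
# Toy illustration of the camera-to-steering loop. Real systems run
# pattern recognition on camera frames; here we fake "detected"
# lane-marking x-positions (in pixels) at a few scan rows.

def lane_center_offset(left_marks, right_marks, image_center):
    """Average the left/right marking positions per scan row and
    return how far the lane center sits from the image center."""
    centers = [(l + r) / 2 for l, r in zip(left_marks, right_marks)]
    return sum(centers) / len(centers) - image_center

def steering_correction(offset_px, gain=0.01):
    """Proportional command: steer back toward the lane center."""
    return -gain * offset_px

# Simulated detections: the lane center sits left of the image
# center, meaning the car has drifted right.
offset = lane_center_offset([200, 202, 204], [400, 402, 404], 320)
print(offset)  # -18.0: lane center is 18 px left of image center
```

The hard part, as the article notes, isn't this arithmetic — it's producing those detected positions reliably when the paint is faded, doubled, or half-washed away.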

The problem

Admittedly, that sounds great. When I used the system on that dark and rainy freeway, though, things fell apart. The system constantly cut in and out as it found and lost lane markings. On one notable occasion, it even tried to drag me into the next lane over, following badly covered lane markings left over from an earlier phase of construction.

Why did this happen? It’s not because Honda’s system is defective, or even bad, but because of inherent problems with the sensors. Cameras, like the human eye, work great in daylight and good visibility. When light levels drop and rain increases, their ability to capture a full image drops significantly.

2015 Honda CR-V

A rain-soaked camera lens, for example, washes out the contrast between lane markings and pavement, or obscures the markings completely. Anyone who has ever tried to take a photo in a dark room understands the problem. The reduced input of information, in turn, makes the job of the pattern recognition software much harder. Computer programs, as a rule, can’t intuit results from incomplete information, something our brains are actually great at.
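A crude example makes the washed-out-contrast failure obvious. This is a deliberately simplistic stand-in for real marking detection: a brightness threshold that finds lane paint against dark asphalt in a clear image, but finds nothing once rain flattens the contrast.

```python
# Toy illustration: a simple brightness-threshold "detector" finds
# lane paint in a high-contrast image but fails when rain washes out
# the contrast. Values are 0-255 pixel brightnesses along one row.

def find_markings(row, threshold=150):
    """Return the pixel indices whose brightness exceeds the threshold."""
    return [i for i, v in enumerate(row) if v > threshold]

clear_row = [40, 40, 200, 210, 40, 40]   # bright paint, dark asphalt
rainy_row = [90, 95, 120, 125, 95, 90]   # contrast washed out by rain

print(find_markings(clear_row))  # [2, 3] -> markings found
print(find_markings(rainy_row))  # []     -> markings lost
```

Real detectors are far more sophisticated than one threshold, but they face the same underlying limit: when the signal separating paint from pavement shrinks, there is less and less for any algorithm to latch onto.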

As an example, when the CR-V tried to follow an old lane marking, it was obvious to me that we were about to move diagonally across the freeway, because I could take in the whole picture. The computer program, however, was doggedly doing the only thing it could: following the markings it recognized.

What I encountered is only the tip of the iceberg … literally. Snow and ice make optical systems essentially useless, covering up everything from lane markings to street signs. So, if this is the case, why even use cameras?

But wait!

Our entire driving infrastructure is based on human vision. Everything from traffic signs to brake lights is designed with the human eye and brain in mind. For that reason, any automated system that is going to operate within this infrastructure has to be able to use these same symbols. And, right now, the only cost-effective way of doing that is with cameras.

Truly autonomous cars get around some of these problems by augmenting cameras with other, more complex sensors, especially LIDAR. The term, a portmanteau of “light” and “radar,” gives a pretty good sense of how it works. On cars, LIDAR systems use a spinning array of lasers that bounce off the world around the vehicle; sensors detect the reflected light, building a picture of the car’s surroundings.
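The geometry behind that picture-building is simple to sketch. In this illustrative toy (not any vendor's actual processing), each laser return is an angle-and-range pair, which converts to an (x, y) point around the car; collect enough of them and you have a map of nearby obstacles.

```python
import math

# Toy illustration of how a spinning LIDAR builds its picture: each
# laser return is an (angle, range) pair that converts to an (x, y)
# point around the car. Real units stack many lasers vertically to
# produce a full 3D point cloud, many thousands of points per sweep.

def to_point(angle_deg, range_m):
    """Convert one laser return into Cartesian coordinates (meters)."""
    a = math.radians(angle_deg)
    return (range_m * math.cos(a), range_m * math.sin(a))

# One sweep: four returns, each from an obstacle 10 meters away.
sweep = [(0, 10.0), (90, 10.0), (180, 10.0), (270, 10.0)]
cloud = [to_point(angle, rng) for angle, rng in sweep]
```

Rain and snow complicate this in exactly the way the article describes: a raindrop mid-air reflects the laser too, producing a spurious “obstacle” point the software has to filter out.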

This system has the advantage of working in far more conditions than an optical camera, though lasers can still be fooled by rain and snow bouncing signals back. The big problem is that these systems are expensive and bulky, as the massive sensor package on top of Google’s autonomous cars can attest. In fact, that system by itself costs around $8,000, far more than automakers are willing to spend at this point.

For autonomous cars to get around this challenge, either cameras and their attached computers need to get much better at recognizing the world around them in a broader array of conditions, LIDAR or similar systems need to get much cheaper, or the way we mark our roads needs to start taking robots into account.

Conclusion

To really make autonomous driving work, sensors need to be complemented by Car-2-Car and Car-2-X communication, also referred to as vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I). This will allow cars not only to communicate with each other to fill in sensor gaps, but also to receive messages from the road infrastructure itself. The early stages of this technology are being researched and prototyped, but full implementation is a ways off.
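The “fill in sensor gaps” idea can be sketched as a simple fallback. The message format here is entirely invented for illustration — real V2V protocols are still being standardized — but it shows the principle: when the camera loses the lane, the car leans on position data broadcast by the vehicle ahead.

```python
# Toy illustration of V2V filling a sensor gap: when the camera loses
# the lane markings, fall back on the lane position broadcast by the
# car ahead. The message format here is invented for illustration.

def lane_estimate(camera_reading, v2v_message):
    """Prefer the car's own camera; fall back on the V2V broadcast."""
    if camera_reading is not None:
        return camera_reading
    return v2v_message.get("lane_center")

msg = {"sender": "car_ahead", "lane_center": 3.6}
print(lane_estimate(3.5, msg))   # camera works: use its reading
print(lane_estimate(None, msg))  # camera blinded: use the broadcast
```

A deployed system would have to weigh trust, latency, and sensor fusion rather than a blunt either/or, which is part of why full implementation remains a ways off.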

While the world waits for cars to start talking to each other, and for systems like LIDAR to improve and, more importantly, get cheaper, optical systems will remain the main option. Companies recognize this and are working to get the most out of camera-based systems and the software that runs them. Look ahead to our next Road Rave to see what some of those innovations might be.

In the meantime, optically based systems can still do an impressive amount. The Honda CR-V is already able to do what was nearly unimaginable a decade ago: drive itself on the highway. The dream of fully autonomous cars just might not be as close as some companies believe.

Peter Braun
Former Digital Trends Contributor
Peter is a freelance contributor to Digital Trends and almost a lawyer. He has loved thinking, writing and talking about cars…