
Here’s how Bosch teaches cars to see using artificial intelligence

A car needs to be able to see its environment before it can drive itself. Don’t let pareidolia fool you; its headlights aren’t eyes. The real eyes are cameras and sensors made out of metal, glass, and plastic parts, and they rely on an enormous amount of computing power to stay open. Without them, the car’s brain can’t make the right decision at the right time.

AI by Bosch is the brains behind many self-driving car platforms, and to see how it works — and how a car sees the world around it — the company gave us a chance to briefly explore the German countryside in one of its prototypes. It turns out Andy Warhol and the future of mobility have more in common than you might think.

Virtual 20/20 vision

The prototype we’re riding shotgun in looks like a garden-variety BMW 3 Series station wagon from the outside. There are tens of thousands of them on German roads, so what makes this one special? After settling into the leather-upholstered passenger seat, we notice it’s decked out with cameras, sensors, and radars attached to the windshield, though we’re told only the monocular camera is turned on during our trip. There is also an additional panel on the center console with various input ports, and a tablet mounted on the dashboard.

The technology is relatively simple – at least on paper. The windshield-mounted camera records footage and sends it to a PC stuffed in the Bimmer’s trunk. The information goes through a graphics processing unit (GPU) manufactured by Nvidia before traveling to the car’s on-board brain. The tablet on the dashboard is only there for demonstration purposes.
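The camera-to-GPU-to-brain pipeline can be sketched roughly as follows. Bosch hasn’t published its code, so every function name and the stubbed-out segmentation step below are illustrative assumptions, not the actual implementation:

```python
import numpy as np

def grab_frame() -> np.ndarray:
    """Stand-in for the windshield-mounted camera: returns one RGB frame."""
    return np.zeros((720, 1280, 3), dtype=np.uint8)

def segment_on_gpu(frame: np.ndarray) -> np.ndarray:
    """Stand-in for the GPU inference step: returns a per-pixel map of
    class indices (one of 19 categories per pixel)."""
    return np.zeros(frame.shape[:2], dtype=np.uint8)

def decide(class_map: np.ndarray) -> str:
    """Stand-in for the car's 'brain': brake if anything in the lower half
    of the frame belongs to a class the car must not drive over."""
    PEDESTRIAN = 11  # illustrative class index only
    lower_half = class_map[class_map.shape[0] // 2:]
    return "brake" if (lower_half == PEDESTRIAN).any() else "continue"

frame = grab_frame()
class_map = segment_on_gpu(frame)
action = decide(class_map)
```

The key point the sketch captures is the division of labor: the camera only records, the GPU only classifies pixels, and a separate decision module acts on the result.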


Artificial intelligence helps our prototype split up the outside world into 19 categories. Each one is identified by a different color, which creates a Pop Art-like view of what’s ahead. It knows the difference between a street and a sidewalk, and it can identify various objects such as traffic signs, traffic lights, pedestrians, and different types of vehicles including cars, trucks, and bicycles. Much like a human driver, the car recognizes which objects are safe to drive over and which ones it needs to brake for.
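The Pop Art effect comes from painting each pixel with the color of its predicted category. Here is a minimal sketch of that colorization step, using a few of the publicly documented Cityscapes classes and colors (the prototype’s exact palette isn’t disclosed):

```python
import numpy as np

# A few of the 19 Cityscapes evaluation classes and their standard colors.
PALETTE = {
    0: (128, 64, 128),   # road
    1: (244, 35, 232),   # sidewalk
    11: (220, 20, 60),   # person
    13: (0, 0, 142),     # car
}

def colorize(class_map: np.ndarray) -> np.ndarray:
    """Turn an (H, W) map of class indices into an (H, W, 3) RGB image."""
    out = np.zeros((*class_map.shape, 3), dtype=np.uint8)
    for cls, rgb in PALETTE.items():
        out[class_map == cls] = rgb
    return out

# A tiny 2x3 demo scene: road, road, car / sidewalk, person, car.
demo = np.array([[0, 0, 13],
                 [1, 11, 13]], dtype=np.uint8)
image = colorize(demo)
```

Applied to a full camera frame, this per-class coloring is what produces the flat, saturated, Warhol-like view described above.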

It’s smart enough to identify what’s ahead with surprising speed and accuracy; it’s been taught a street sign is not a tree or a small child riding a skateboard. It’s chilling to think about. We’re riding in a BMW station wagon that knows almost as much about driving in a city as its two occupants.

Back to school

The prototype learned everything it knows from members of Bosch’s research and development department.

“We have an offline training process,” explained research engineer Dimitrios Bariamis in an interview with Digital Trends. “We give the car images that we annotate, so we say ‘in this part of the image there is a pedestrian, this part of the image is a street,’ and so on. Then we get that into the car, and we give it the image from the camera, which is processed according to the parameters that have been previously learned. The system knows that this part of the image is a street because it looks like the street it saw during the training process,” he added.

Bariamis and his team have fed the system thousands of screenshots from on-board video footage taken in German cities like Munich, Frankfurt, and Stuttgart. They also sourced images from Daimler’s Cityscapes Dataset, which breaks the world down into the same 19 categories. These annotated images help the car learn as it moves along, even if it’s traveling in a town it’s never been to before. “Artificial intelligence generalizes the unknown,” Bariamis tells us.
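An “annotated image” in this context is simply a camera frame paired with a label map of the same size, where each pixel stores a class index. The toy sketch below shows what such a pair looks like and how a prediction can be scored against it; the format mirrors Cityscapes-style label images, and nothing here is Bosch-specific:

```python
import numpy as np

# The camera frame (here just random pixels) and its hand-made annotation:
# each label pixel is an integer from 0 to 18, one of the 19 categories.
rng = np.random.default_rng(seed=0)
frame = rng.integers(0, 256, size=(4, 6, 3), dtype=np.uint8)
labels = np.zeros((4, 6), dtype=np.uint8)   # everything is "road" (0)...
labels[:, 4:] = 1                           # ...except two "sidewalk" columns

def pixel_accuracy(pred: np.ndarray, truth: np.ndarray) -> float:
    """Fraction of pixels the model classified correctly."""
    return float((pred == truth).mean())

# A naive model that predicts "road" everywhere gets the 16 road pixels
# right and the 8 sidewalk pixels wrong.
naive = np.zeros_like(labels)
acc = pixel_accuracy(naive, labels)  # 16 of 24 pixels correct
```

Training drives this per-pixel score up across thousands of annotated frames, which is what lets the system recognize a street it has never seen before.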

The software classifies the world around it even in a heavy rainstorm, but it hasn’t been tested in the snow yet. Bariamis is optimistic, and he doesn’t think snow will impair the car’s vision. Right now, the only limitations his team has identified are linked to what the car has and hasn’t seen, and to hardware issues. For example, the system has never “seen” a highway, so it might not be able to identify a toll booth. It also goes without saying that the car loses its eyesight if something – e.g., the viscous contents of an avian digestive system – suddenly covers up the camera.

The project is the work of Bosch’s forward-thinking research and development arm. Where it goes next depends entirely on the company’s corporate arm and its clients.

Bariamis told us the technology can be integrated into relatively basic driver-assistance features like adaptive cruise control, state-of-the-art semi-autonomous software, and even a fully autonomous car. Crucially, it can be modified for various uses. The software we experienced in Germany sees the world in 19 colors, but it’s possible to either add more categories when more detailed information is required, or delete a few of them if they’re not needed.
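Deleting categories a client doesn’t need amounts to remapping class indices. One way this could work is a lookup table that collapses several fine-grained classes into one coarser one – here, merging cars, trucks, and bicycles into a single “vehicle” class. The indices follow the public Cityscapes convention; the merge itself is a hypothetical example, not Bosch’s configuration:

```python
import numpy as np

NUM_CLASSES = 19
VEHICLE_CLASSES = {13, 14, 18}   # car, truck, bicycle (Cityscapes indices)
VEHICLE = 13                     # collapse all three onto one index

# Build the lookup table once, then remap whole class maps in one
# vectorized indexing operation.
lut = np.arange(NUM_CLASSES, dtype=np.uint8)
for cls in VEHICLE_CLASSES:
    lut[cls] = VEHICLE

class_map = np.array([[0, 13, 14],
                      [18, 1, 11]], dtype=np.uint8)
merged = lut[class_map]
```

Because the remap is a post-processing step, the same trained model can serve an adaptive-cruise-control feature that only cares about “vehicle ahead” and a fully autonomous stack that wants all 19 categories.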

The Warhol-esque view of the world showcased by Bosch’s BMW-based prototype is what will make the advent of robot cars possible in the years to come. It’s an integral part of the technology package that will help the automotive industry transition from building cars to manufacturing intelligent cars.

Ronan Glon
Ronan Glon is an American automotive and tech journalist based in southern France. As a long-time contributor to Digital…