Self-driving cars use dashboard displays so passengers won’t stress out

Passengers may not be skittish once self-driving cars are commonplace, but in the early years, most of us will be on high alert. The self-driving system developer employs multiple visualization technologies to reassure passengers, as well as to help company engineers understand what the system is “seeing” and how it performs. In a recent Medium post, the company outlined four primary visualization tools it uses for internal study and passenger reassurance: dashboard displays, 3D data visualization, annotated datasets, and interactive simulations, all of which figure in product development.

Onboard displays

Passenger reassurance and comfort motivate the company’s dashboard displays. The onboard display combines data from lidar sensors and full-surround cameras to create 3D images as the car drives. By enhancing the image with data from radar, GPS, and an inertial measurement unit (IMU), the system helps passengers understand what the vehicle is about to do, as well as what it picks up with its various sensors.
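The core of a display like this is placing sensor returns on a shared map. As a minimal sketch, and not the company’s actual code, the hypothetical helper below rotates 2D lidar points from the vehicle frame into the world frame using an IMU-derived heading and a GPS-derived position:

```python
import numpy as np

def lidar_to_world(points, yaw, position):
    """Place lidar returns on the map: rotate (N, 2) vehicle-frame
    points by the IMU-derived yaw, then translate by the GPS-derived
    vehicle position. All names here are illustrative."""
    c, s = np.cos(yaw), np.sin(yaw)
    rotation = np.array([[c, -s], [s, c]])  # 2D rotation matrix
    return points @ rotation.T + position
```

A production system would do this in 3D with full calibration between each sensor and the vehicle body, but the rotate-then-translate structure is the same.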

The many uses of AI visualization

Off-board analysis

The company’s engineers use real-time data from cars to create 3D visualizations that include mapping, motion planning, perception, and localization and state estimation, plus a host of additional robotics elements. The full assemblage enables the company to dive deeper into self-driving performance.

Synchronizing the timing of the various sensor data signals is a crucial element of successful autonomous vehicle performance. By incorporating a wide range of vehicle sensor, mapping, and traffic network data into a single visualization, the engineers can tweak the various algorithms to enhance the timing coordination. This toolset also facilitates testing and variable analysis.
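One basic form of that timing coordination is matching each reading from one sensor stream to the nearest-in-time reading from another. The function below is a hedged sketch of that idea (the names and the tolerance value are assumptions, not the company’s implementation):

```python
import bisect

def match_nearest(target_ts, source_ts, tolerance):
    """For each target timestamp, return the index of the closest
    source timestamp, or None if nothing lands within `tolerance`
    seconds. `source_ts` must be sorted ascending."""
    matches = []
    for t in target_ts:
        i = bisect.bisect_left(source_ts, t)
        # The nearest source sample is either just before or just after t.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(source_ts)]
        best = min(candidates, key=lambda j: abs(source_ts[j] - t), default=None)
        if best is not None and abs(source_ts[best] - t) <= tolerance:
            matches.append(best)
        else:
            matches.append(None)
    return matches
```

For example, pairing 10 Hz camera frames against lidar sweeps with a 50 ms window flags frames that have no usable lidar partner, which is exactly the kind of gap engineers would want a combined visualization to expose.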

Annotated datasets

According to the company, it takes about 800 human hours to correctly label all the data collected during one hour of driving. Human annotators label the initial datasets; deep-learning AI then applies what it “learns” from the human-annotated data to label additional data quickly and reliably. The human annotators move on to new types of data and quality-check the machine-labeled data.

Visual data simulation


Working from what the company calls “massive libraries of scenarios,” engineers test and evaluate the company’s autonomous systems by using driving simulators in 3D visualized worlds. With the company’s autonomous system running in the background, the team can change elements such as traffic light patterns and pedestrian behaviors to observe how the self-driving program reacts and responds.
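Varying elements like light patterns and pedestrian behaviors across a scenario library amounts to expanding a base scenario over every combination of parameters. A minimal sketch of that expansion, with hypothetical parameter names, assuming nothing about the company’s actual scenario format:

```python
import itertools

def scenario_grid(base, variations):
    """Expand a base scenario dict into one scenario per combination
    of the varied parameters (e.g. light patterns x pedestrian styles)."""
    keys = list(variations)
    scenarios = []
    for combo in itertools.product(*(variations[k] for k in keys)):
        scenario = dict(base)          # copy the shared settings
        scenario.update(zip(keys, combo))  # apply this combination
        scenarios.append(scenario)
    return scenarios
```

Two light patterns crossed with two pedestrian behaviors yields four runs; real scenario libraries multiply many more axes than that, which is why the company describes them as “massive.”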

Bruce Brown
Digital Trends Contributing Editor Bruce Brown is a member of the Smart Homes and Commerce teams. Bruce uses smart devices…