
Drive.ai’s self-driving cars use dashboard displays so passengers won’t stress out

Passengers may not be skittish when self-driving cars are commonplace, but in the early years, most of us will be on high alert. Self-driving system developer Drive.ai employs multiple visualization technologies to reassure passengers, as well as to help company engineers understand what the system is “seeing” and how it performs.

In a recent Medium post, Drive.ai outlined the four primary visualization tools it uses for internal study and passenger reassurance. The company described how it uses dashboard displays, 3D data visualization, annotated data sets, and interactive simulations in product development.

Onboard displays

Passenger reassurance and comfort motivate Drive.ai’s dashboard displays. The onboard display combines data from lidar sensors and full-surround cameras to create 3D images as the car drives. By enhancing the image with data from radar, GPS, and an inertial measurement unit (IMU), the system helps passengers understand what the vehicle is about to do, as well as what it picks up with its various sensors.
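
The combined display is essentially sensor fusion rendered for the rider. As a rough illustration of one ingredient, here is a minimal pinhole-camera projection sketch, the kind of step used to overlay lidar points on a camera image. The function name and parameters are illustrative, not Drive.ai's actual code:

```python
def project_lidar_to_image(points_xyz, fx, fy, cx, cy):
    """Project 3D points (already in the camera frame, in meters) onto
    image pixels with a pinhole model.

    fx, fy are focal lengths in pixels; (cx, cy) is the principal point.
    Points behind the camera are dropped.
    """
    pixels = []
    for x, y, z in points_xyz:
        if z <= 0:           # behind the image plane
            continue
        u = fx * x / z + cx  # perspective divide, then shift to the
        v = fy * y / z + cy  # principal point
        pixels.append((u, v))
    return pixels
```

For example, with a 500-pixel focal length and a 640x480 image, a point 2 meters straight ahead lands on the principal point at (320, 240), while points behind the camera are filtered out.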


Off-board analysis

Drive.ai’s engineers use real-time data from cars to create 3D visualizations that include mapping, motion planning, perception, and localization and state estimation, plus a host of additional robotics elements. The full assemblage enables the company to dive deeper into self-driving performance.

Synchronizing the timing of the various sensor data signals is a crucial element of successful autonomous vehicle performance. By incorporating a wide range of vehicle sensor, mapping, and traffic network data into a single visualization, the engineers can tweak the various algorithms to enhance the timing coordination. This toolset also facilitates testing and variable analysis.
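
Timing synchronization of this kind often boils down to matching samples across streams by timestamp. Here is a minimal sketch of nearest-neighbor timestamp matching between two sensor streams; the function name and tolerance are illustrative assumptions, not Drive.ai's implementation:

```python
import bisect

def align_to_reference(ref_stamps, sensor_stamps, tolerance=0.05):
    """For each reference timestamp (e.g. a lidar sweep), find the closest
    sample in another sorted stream (e.g. camera frames).

    Timestamps are in seconds. Returns (ref_index, sensor_index) pairs
    whose gap is within `tolerance`; unmatched references are skipped.
    """
    pairs = []
    for i, t in enumerate(ref_stamps):
        j = bisect.bisect_left(sensor_stamps, t)
        # Compare the neighbors on either side of the insertion point.
        candidates = [k for k in (j - 1, j) if 0 <= k < len(sensor_stamps)]
        if not candidates:
            continue
        best = min(candidates, key=lambda k: abs(sensor_stamps[k] - t))
        if abs(sensor_stamps[best] - t) <= tolerance:
            pairs.append((i, best))
    return pairs
```

A real pipeline would also correct for per-sensor latency before matching, but the core idea is the same: bring every stream onto a common timeline before fusing it into one visualization.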

Annotated datasets

According to Drive.ai, it takes about 800 human hours to correctly label all the data collected during one hour of driving. Human annotators label the initial datasets; a deep-learning system then applies what it “learns” from that human-annotated data to label additional data quickly and reliably. The human annotators move on to new types of data and quality-check the machine-labeled output.
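
A human-in-the-loop pipeline like this typically routes the model's low-confidence labels back to people while accepting high-confidence ones automatically. A minimal sketch of that triage step, with a hypothetical threshold and tuple layout chosen for illustration:

```python
def triage_labels(predictions, confidence_threshold=0.9):
    """Split machine-generated labels into auto-accepted ones and ones
    routed to human annotators for review.

    predictions: list of (item_id, label, confidence) tuples.
    Returns (auto_accepted, needs_review), each a list of (item_id, label).
    """
    auto_accepted, needs_review = [], []
    for item_id, label, confidence in predictions:
        if confidence >= confidence_threshold:
            auto_accepted.append((item_id, label))
        else:
            needs_review.append((item_id, label))
    return auto_accepted, needs_review
```

Labels corrected by humans in the review queue can then be fed back into training, which is what lets the machine-labeling step keep improving.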

Simulation

Working from what the company calls “massive libraries of scenarios,” Drive.ai engineers test and evaluate the company’s autonomous systems by using driving simulators in 3D visualized worlds. With the company’s autonomous system running in the background, the team can change elements such as traffic light patterns and pedestrian behaviors to observe how the self-driving program reacts and responds.
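
Sweeping scenario elements such as light patterns and pedestrian behaviors can be sketched as a parameter grid expanded over a base scenario. The scenario keys below are hypothetical, not Drive.ai's actual format:

```python
import itertools

def scenario_grid(base_scenario, variations):
    """Expand a base scenario into one concrete scenario per combination
    of the varied parameters (e.g. light phase x pedestrian behavior)."""
    keys = list(variations)
    scenarios = []
    for combo in itertools.product(*(variations[k] for k in keys)):
        scenario = dict(base_scenario)
        scenario.update(zip(keys, combo))
        scenarios.append(scenario)
    return scenarios

# Hypothetical example: vary traffic-light phase and pedestrian behavior.
base = {"map": "downtown_4way", "ego_speed_mps": 10}
runs = scenario_grid(base, {
    "light_phase": ["green", "yellow", "red"],
    "pedestrian": ["waits", "jaywalks"],
})
# 3 phases x 2 behaviors -> 6 scenarios to run against the driving stack.
```

Each expanded scenario would then be played back in the simulated 3D world with the autonomous system in the loop, so regressions show up before the change ever reaches a real car.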

Bruce Brown
Digital Trends Contributing Editor Bruce Brown is a member of the Smart Homes and Commerce teams. Bruce uses smart devices…