
Cruise’s robotaxi service suspended by California regulator

Autonomous car startup Cruise has run into trouble in California after the state’s Department of Motor Vehicles (DMV) said Tuesday it was suspending its deployment and driverless permits with immediate effect.

The dramatic intervention comes just a couple of months after General Motors-owned Cruise was given permission to operate robotaxi services around the clock. But it also follows a number of troubling incidents involving self-driving Cruise cars on the streets of San Francisco, where the company has been testing on public roads in recent years.


The DMV’s order of suspension, seen by TechCrunch, listed a number of factors that prompted the move. They include an accusation that Cruise withheld video footage from an ongoing investigation relating to an incident earlier this month in which a female pedestrian was left trapped beneath a Cruise autonomous car immediately after she was struck by another vehicle. The woman is still recovering from her ordeal.

The DMV’s order said the footage Cruise initially shared did not show all of the maneuvers made by the Cruise car in the immediate aftermath of the incident, and that the department only learned there was more to see after speaking with another government agency. Cruise, however, claims it provided the DMV with “the full video.”

The DMV added that Cruise’s apparent failure to disclose the full video prevents it from effectively evaluating the company’s ability to safely operate its vehicles, a situation that presents a risk to public safety.

“Public safety remains the California DMV’s top priority, and the department’s autonomous vehicle regulations provide a framework to facilitate the safe testing and deployment of this technology on California public roads,” the regulator said. “When there is an unreasonable risk to public safety, the DMV can immediately suspend or revoke permits.”

Cruise must now complete specific steps set out by the DMV to have its permits reinstated.

The current suspension affects Cruise’s fully driverless cars and not those with a safety driver behind the wheel. But in a statement emailed to Digital Trends, Cruise said it would be pausing operations of all of its driverless cars in San Francisco.

“Ultimately, we develop and deploy autonomous vehicles in an effort to save lives,” it said. “In the incident being reviewed by the DMV, a human hit-and-run driver tragically struck and propelled the pedestrian into the path of the AV. The AV braked aggressively before impact and because it detected a collision, it attempted to pull over to avoid further safety issues. When the AV tried to pull over, it continued before coming to a final stop, pulling the pedestrian forward. Our thoughts continue to be with the victim as we hope for a rapid and complete recovery.”

It explained that its team proactively shared information with the California DMV and other relevant bodies, “including the full video.”

Cruise has hit the headlines on a number of occasions after incidents involving its autonomous cars in San Francisco. In August, a collision with a fire truck resulted in Cruise being ordered to halve its fleet in the city. In the same month, another Cruise car got stuck in wet concrete.

Trevor Mogg
Contributing Editor
Now anyone in LA can take Waymo robotaxi rides 24/7

It just got much easier to take a robotaxi ride in Los Angeles. Waymo announced on Tuesday that anyone in the California city can now take fully autonomous rides, removing the need to join a wait list.

Alphabet-owned Waymo started offering paid robotaxi rides in Los Angeles earlier this year via its Waymo One app, but strong demand resulted in a wait list of nearly 300,000 people wanting to join the service.

Waymo, Nexar present AI-based study to protect ‘vulnerable’ road users

Robotaxi operator Waymo says its partnership with Nexar, a machine-learning tech firm dedicated to improving road safety, has yielded the largest dataset of its kind in the U.S., which will help inform the driving of its own automated vehicles.

As part of its latest research with Nexar, Waymo has reconstructed hundreds of crashes involving what it calls “vulnerable road users” (VRUs), such as pedestrians walking through crosswalks, bicyclists on city streets, or high-speed motorcycle riders on highways.

Tesla posts exaggerate self-driving capacity, safety regulators say

The National Highway Traffic Safety Administration (NHTSA) is concerned that Tesla’s use of social media and its website makes false promises about the automaker’s Full Self-Driving (FSD) software.

The warning dates back to May, but was made public in an email to Tesla released on November 8.

The NHTSA opened an investigation in October into 2.4 million Tesla vehicles equipped with the FSD software, following three reported collisions and a fatal crash. The investigation centers on FSD’s ability to perform in “relatively common” reduced-visibility conditions, such as sun glare, fog, and airborne dust.

In these instances, it appears that “the driver may not be aware that he or she is responsible” for making appropriate operational selections, or may not “fully understand” the nuances of the system, NHTSA said.

Meanwhile, “Tesla’s X (Twitter) account has reposted or endorsed postings that exhibit disengaged driver behavior,” Gregory Magno, the NHTSA’s vehicle defects chief investigator, wrote to Tesla in an email.

The postings, which included reposted YouTube videos, may encourage viewers to see FSD (Supervised) as a “Robotaxi” rather than a partially automated, driver-assist system that requires “persistent attention and intermittent intervention by the driver,” Magno said.

In one of a number of Tesla posts on X, the social media platform owned by Tesla CEO Elon Musk, a driver was seen using FSD to reach a hospital while suffering a heart attack. In another post, a driver said he had used FSD for a 50-minute ride home. Meanwhile, third-party comments on the posts promoted the advantages of using FSD while under the influence of alcohol or when tired, NHTSA said.

Tesla’s official website also promotes conflicting messaging about the capabilities of the FSD software, the regulator said.

NHTSA has requested that Tesla revisit its communications to ensure its messaging remains consistent with FSD’s approved instructions, namely that the software is a driver-assist system requiring drivers to remain vigilant and ready to intervene at any moment.

Tesla last month unveiled the Cybercab, an autonomous EV with no steering wheel or pedals. The vehicle has been promoted as a robotaxi, a self-driving vehicle operated as part of a ride-hailing service, such as the one already offered by Alphabet-owned Waymo.

But Tesla’s self-driving technology has remained under the scrutiny of regulators. FSD relies on multiple onboard cameras to feed machine-learning models that, in turn, help the car make driving decisions based on what it sees.

Waymo’s technology, meanwhile, relies on premapped roads along with sensors, cameras, radar, and lidar (a laser-based ranging sensor), an approach that is costly but has met the approval of safety regulators.
