Anyone living in San Francisco knows that the city has been a testing ground for hundreds of self-driving cars, and there may be a good reason why: despite a series of incidents, including one in which a pedestrian was seriously injured by a Cruise robotaxi last year, California law enforcement's hands are tied when it comes to issuing moving violations when no human is behind the wheel.
Currently, California law enforcement can write traffic tickets only to humans, not robots, meaning that autonomous vehicles operating in driverless mode are susceptible only to parking tickets.
Now, with Cruise in hot water over allegations that it misled the California Department of Motor Vehicles about the incident in which the pedestrian was seriously injured, residents and activists are calling for tighter laws and new watchdogs, NBC reports.
Waymo has said that it has a permit to operate 250 robotaxis in San Francisco and that it deploys about 100 of them at any one time, according to NPR. Cruise, which halted all services after the pedestrian incident, had been running about 100 cars in San Francisco during the day and about 300 at night.
While autonomous vehicle makers argue that their cars won't get better without logging real-world hours behind the wheel, safety concerns of course abound. SFGate wrote back in August that the local fire department had logged almost 60 reports of "driverless AVs impeding their activities," including one incident in which firefighters had to smash the window of a Cruise car to stop it from running over a fire hose.
“I think all of us are still struggling to understand whether [driverless cars] really are safer than human drivers and in what ways they might not be,” Irina Raicu, the director of the Internet Ethics program at Santa Clara University, told NBC. “It seems like while they make fewer of the kind of mistakes that we see from human drivers, they make interesting new kinds of mistakes.”
Tesla's home state of Texas, however, does have legislation in place to issue moving violations to driverless vehicles. Arizona, a hotbed of autonomous driving, is working toward the same, with legislation stating that autonomous vehicle owners "may be issued a traffic citation or other applicable penalty if the vehicle fails to comply with traffic or motor vehicle laws."
Neither Cruise nor Waymo has reported any deaths involving its autonomous cars. Still, it's very early days, with Waymo tallying just over 7 million driverless miles and Cruise having logged 5 million miles before halting operations. The National Highway Traffic Safety Administration states that human drivers, on average, cause about one death every 100 million miles driven.
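For a rough sense of what those figures imply, here is a back-of-the-envelope sketch; it is illustrative arithmetic only, using the mileage totals and the NHTSA average cited above, and the fleet-wide human rate is not adjusted for road type, speed, or conditions.

```python
# Back-of-the-envelope check of the mileage comparison above.
# Uses the figures cited in the article: ~7M driverless miles for Waymo,
# ~5M for Cruise, and roughly one fatality per 100 million human-driven miles.

HUMAN_FATALITY_RATE = 1 / 100_000_000  # deaths per mile, NHTSA fleet average

av_miles = {
    "Waymo": 7_000_000,
    "Cruise": 5_000_000,
}

for company, miles in av_miles.items():
    expected = miles * HUMAN_FATALITY_RATE
    print(f"{company}: {miles:,} miles -> "
          f"~{expected:.2f} deaths expected at the human-driver rate")

# Combined, the two fleets' ~12M miles work out to ~0.12 expected deaths at
# the human rate, so zero deaths to date is consistent with that rate and
# does not yet, on its own, demonstrate a safety advantage.
```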
Still, amid the fallout from Cruise's accident, Waymo has issued a fresh round of data claiming its robotaxis are safer than human drivers. It cites a 57% reduction in police-reported crashes and an 85% reduction in crashes causing bodily injury, compared with human drivers, over 7.14 million rider-only miles.
Of course, while Cruise has been at the center of the controversy, it hasn't been smooth sailing for Waymo either; its problems have mostly involved blocking traffic. One incident reportedly involved five Waymo vehicles stalling on a single San Francisco street due to dense fog.