Waymo said its driverless cars hit the public roads last month. The company did not say whether it was testing the driverless cars in environments considered challenging for autonomous vehicles, like bridges or tunnels, or more difficult conditions, like driving at night or in rain and snow — usually not a big concern in the dry Phoenix climate.
While the prospect of cars without emergency drivers may raise concerns among some passengers, Waymo said it had confidence in the safety of its self-driving technology. It has included backup systems like a secondary computer to take over if the main computer fails. And though the cars are driverless, they are not entirely without humans, at least for now. Waymo employees sit in the back seat of the cars, monitoring them, a company spokesman, Johnny Luu, said.
Once passengers join the tests, they will be able to contact Waymo support staff with a button inside the car. If the cars are involved in a crash, they are programmed to respond appropriately, including pulling off the road on their own.
Driverless cars are regulated by a patchwork of state laws. Arizona, like many states, has no restrictions against operating an autonomous vehicle without a person in the driver’s seat. On the other hand, California, where Waymo is headquartered, requires any self-driving car to have a safety driver sitting in the front.
In December, Waymo published a report for California’s Department of Motor Vehicles about how frequently its cars “disengaged” — deactivated autonomous mode because of a system failure or safety risk, forcing a human driver to take over. In the report, Waymo said this happened once every 5,000 miles driven in 2016, compared with once every 1,250 miles in 2015.
Consumer Watchdog, a frequent critic of Alphabet, said that the data demonstrated that the cars were not ready to drive without any human intervention and that Waymo was following the Silicon Valley model of “beta testing” a new technology on the public.
“It’s the wrong approach when you’re dealing with self-driving cars,” said John M. Simpson, a director at Consumer Watchdog. “When things go wrong with a robot car, you kill people.”
Researchers believe self-driving cars can be safer than cars operated by human drivers because they are programmed to adhere strictly to traffic laws, they don’t get distracted, and they usually refrain from taking unnecessary risks.
Timothy Tait, a spokesman for the Arizona Department of Transportation, said the state was on pace to exceed 1,000 automobile-related fatalities this year and that its top priority was the public’s safety — particularly advancing efforts to reduce crashes and deaths on its roads.
“We are closely monitoring emerging technologies like self-driving cars that may ultimately support safer travel and open up opportunities for populations who today are unable to drive for themselves,” he said in a statement.
Waymo, which started as a research and development project for Google in 2009, maintains what many in the industry consider a technological advantage over its competitors. Waymo said its autonomous vehicles had driven more than 3.4 million miles on actual roads — with safety drivers — and that they log 10 million miles every day in a virtual simulator.
In his remarks, Mr. Krafcik said Waymo saw a ride-hailing taxi service as the first commercial application of the company’s driverless car technology, though there could be other uses in logistics and public transportation.
Taking the human out of the equation will fundamentally change transportation and how people buy cars, said Mr. Krafcik, who was an executive at Hyundai Motors before joining Google.
“Because you’re accessing vehicles rather than owning, in the future, you could choose from an entire fleet of vehicle options that are tailored to each trip you want to make,” he said. “They can be designed for specific purposes or tasks.”