COLCHESTER, Conn. (AP) — U.S. Sen. Richard Blumenthal, who took a firsthand look at self-driving vehicle technology on Tuesday, said it was frightening to see "no hands on the wheel" as his car approached a parked car and called for more safeguards to be added to federal legislation following two recent fatal crashes.
The bill awaiting action in the Senate should ensure people can manually override highly automated vehicles, the Democrat said. He called for the data and safety evaluations of such vehicles during an initial testing period to be made public and said any safety requirements should apply to cars already on the roads with autopilot functions.
"I'm not a luddite. I'm not simply standing in the way of progress. I believe that autonomous or driverless vehicles will be coming," Blumenthal said. "But in the meantime, while we're developing them, they have to be safe."
Blumenthal, a member of the Senate Committee on Commerce, Science, and Transportation, said his safety concerns have been heightened by the recent fatal crashes involving a Tesla vehicle operating on Autopilot in California and a self-driving Volvo SUV being tested by the ride-hailing service Uber in Arizona.
"There are a number of us who have reservations about simply putting these vehicles on the road, even during the testing period, without guarantees that in fact there will be potential safeguards and also data reporting," he said, referring to his Senate colleagues. "The most recent incidents involving Uber and Tesla certainly have given new force to those safety concerns."
David Friedman, director of cars and product policy and analysis for Consumers Union, the policy and advocacy division of Consumer Reports, said it's critical that safety provisions are added to the bill, which he said would potentially open the door for commercial sale and use of self-driving vehicles.
"The challenge right now is, I would argue, there's a race to be first instead of a race to be safe," he said.
If safety doesn't come first, he said, "you put people at risk and you potentially could set this technology back years if not decades."
Blumenthal on Tuesday was a passenger in two semi-autonomous cars already on the market during a visit to the Consumer Reports test track in Colchester. One car was a Tesla Model 3, which has the same autopilot technology as the car involved in the California crash. Blumenthal said he learned that a human being needs to override the system to avoid certain objects. In his case, it was a car parked along the track.
Blumenthal, riding alongside a Consumer Reports employee whose hands were off the steering wheel, said it was frightening to head toward the parked vehicle with no guarantee his car would stop, "looking next to me and seeing no hands on the wheel." The Consumer Reports employee used the manual override to avoid a crash.
"It would be funny, but it's serious because this technology is at its toddler state of development, and that's why we need more testing, more guarantees of safety and more protections," Blumenthal said.
Consumer Reports has stressed that drivers of vehicles with semi-autonomous features still need to pay close attention to the roadway.
"You need to be an alert driver," said Jennifer Stockburger, director of operations at the Auto Test Center. "It's meant to ease your fatigue on a long trip, but it should not be used as an autonomous driving function. You as a driver need to stay engaged."
Such vehicles allow drivers to manually override the self-driving technology.
One of the vehicles Blumenthal tested, a Cadillac equipped with Super Cruise, has a camera that can determine whether the driver's eyes are on the road. In addition, the semi-autonomous system can only be engaged on divided highways.
Stockburger said Consumer Reports would like to see similar technology included in more vehicles.