
An Uber self-driving Ford Fusion sits at a traffic light and waits to turn onto a street in May 2018 in Pittsburgh.

Half of American adults think automated vehicles are more dangerous than traditional vehicles operated by people, and nearly two-thirds say they would not buy a fully autonomous vehicle, according to a new Reuters/Ipsos opinion poll, Reuters reported April 1.

We’re not surprised, and we’re right there with the majority of the poll respondents when it comes to not trusting self-driving vehicles. Not yet.

The findings are similar to those in a Reuters/Ipsos poll last year, and are consistent with results in surveys by Pew Research Center, the American Automobile Association and others. In March 2018, shortly after last year’s Reuters/Ipsos poll, an Uber vehicle operating in self-driving mode struck and killed a pedestrian in Arizona.

Relatively few U.S. residents have seen or ridden in a self-driving vehicle, and proponents say suspicion of unknown technology is a factor.

“People are comfortable with things they know,” said investor Chris Thomas, co-founder of Fontinalis Partners and Detroit Mobility Lab. “When everybody understands the game-changing attributes of automated vehicles, how they can give you back all that time to read or work or sleep, they will start to ask about the value of that recaptured time.”

We understand that Thomas has a vested interest in extolling the virtues of self-driving vehicles. But considering how often we see drivers of conventional vehicles clearly looking away from the road and down at their phones — or worse still, texting while driving — we’re confident that less attention paid to the road by human drivers, to read or work or sleep, is the last thing we need.

Tesla, based in Palo Alto, California, came under fire after the death of Joshua Brown, a former Navy SEAL and entrepreneur, in a Tesla Model S in Florida in May 2016, KGO-TV in San Francisco reported.

Brown’s car was on Autopilot when it ran into a semitrailer on the road. Neither Brown nor the car’s driverless technology detected the white side of the tractor-trailer against a brightly lit sky, so the brakes were never applied. The driver of the semi said Brown was watching a Harry Potter movie at the time of the crash, though accident investigators found no evidence to corroborate that claim.

For companies investing in autonomous vehicles, public mistrust is a problem, Reuters reported. The passenger death in Florida and the pedestrian death in Arizona are not acceptable losses.

We want to see ample evidence of crash-test dummies having the experience of not being crushed or run over, and to see that knowledge applied to eliminate or significantly reduce incidents such as the one that took Joshua Brown’s life, before we’re ready to welcome self-driving cars to the intersection of Washington Avenue and Ohio Street in Racine, or to the numerous on- and off-ramps of the Marquette Interchange in Milwaukee.
