Can robot cars trust human beings?


Three years ago, Google’s self-driving car project abruptly shifted from designing a vehicle that would drive autonomously most of the time while occasionally requiring human oversight, to a slow-speed robot without a brake pedal, accelerator or steering wheel. In other words, human driving was no longer permitted.

The company made the decision after giving self-driving cars to Google employees for their work commutes and recording what the passengers did while the autonomous system did the driving. In-car cameras recorded employees climbing into the back seat and climbing out of an open car window while the car was in motion, according to two former Google engineers. “We saw stuff that made us a little nervous,” Chris Urmson, a roboticist who was then head of the project, said at the time. He later mentioned in a blog post that the company had spotted a number of “silly” actions, including the driver turning around while the car was moving.

Johnny Luu, a spokesman for Google’s self-driving car effort, now called Waymo, disputed the account, but said behaviour like an employee rummaging in the back seat for his laptop while the car was moving and other “egregious” acts contributed to shutting down the experiment.

We humans are easily distracted by our games and phones. And automotive engineers, computer interaction designers and, yes, lawyers, wonder if the self-driving cars they are working on will ever really be able to count on us in an emergency. “Do you really want last-minute handoffs?” said Stefan Heck, chief executive of Nauto, a startup based in Palo Alto, California, that has developed a system that simultaneously observes both the driver and the outside environment and provides alerts and safety information.

Nauto’s data shows that a “driver distraction event” occurs, on average, every 4 miles. Heck said there was evidence that the inattention of human drivers was a factor in half of the approximately 40,000 traffic fatalities in the United States last year.

Last month, a group of scientists at Stanford University presented research showing that most drivers who were playing a game on a smartphone needed more than five seconds to regain control of the car when abruptly required to return their attention to driving. Over-trust was what Google observed when it saw its engineers not paying attention during commutes in prototype self-driving cars.

Solving the over-trust issue is key to autonomous vehicles in the Level 3 category, in which the computer hands control back to humans.

New York Times News Service

[Source: The Hindu]

Loknath Das
Author

