Hard yards ahead for developers of driverless tech

BY IAN PORTER | 28th Sep 2017


DEVELOPERS of autonomous vehicles face enormous difficulties due to the complexity of the problems to be solved, and any misstep could endanger or delay the adoption of the new technology, according to a US expert.

Presenting at the 2017 Australian Intelligent Transport Systems (ITS) Summit in Brisbane this week, Steve Shladover, who heads the mobility section of the Partnership for Advanced Transportation Technology (PATH) at the University of California, Berkeley, said a further problem is that there is no way road authorities can design a test that would adequately assess an autonomous system.

Dr Shladover said the automation of aircraft was much simpler than what the motor vehicle industry is now dealing with.

“If you’re flying at 30,000 feet and something goes wrong, the pilots have tens of seconds to respond and take corrective action,” he told GoAuto after addressing the summit.

“When you’re driving at 100km/h, your system is going to have to respond in a tenth of a second in order to prevent a serious problem. In an urban area there are other vehicles, pedestrians and cyclists you might hit, there could be a dozen of those you are in danger of hitting.

“You will need to know your speed and position relative to each of them, and to a very high accuracy, in order to avoid a crash.”

Dr Shladover said the dilemma for the car-makers is where to draw the line between the utility of the vehicle and its safety.
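To put those reaction-time figures in perspective, a quick back-of-the-envelope sketch in Python (ours, not from the presentation) shows how far a vehicle travels while a driver or system is still reacting. The 0.1-second figure is Dr Shladover's; the 1.5-second figure is a commonly cited human perception-reaction time and is our illustrative assumption.

    # How far does a vehicle travel during a given reaction time?
    # Simple kinematics only: constant speed, no braking modelled.

    def distance_during_reaction(speed_kmh: float, reaction_s: float) -> float:
        """Metres travelled at speed_kmh during reaction_s seconds."""
        return speed_kmh / 3.6 * reaction_s

    # 0.1s is the figure Dr Shladover quotes for an automated system;
    # 1.5s is a commonly cited human reaction time (our assumption).
    for reaction_s in (0.1, 1.5):
        metres = distance_during_reaction(100.0, reaction_s)
        print(f"At 100 km/h, {reaction_s}s of reaction = {metres:.1f} m travelled")

Even at a tenth of a second, the vehicle covers nearly three metres, which is why knowing position and speed relative to every nearby road user to high accuracy matters so much.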

“You don’t want to be in a position where your vehicle is applying the brakes all the time for reasons that are unnecessary. That would be unacceptable for the user,” he said.

This is why Google’s autonomous cars have been involved in several accidents, he said.

“I live in Google’s neighbourhood in California and their cars drive very, very conservatively, so conservatively that they violate the expectations of other drivers,” he said.

“At stop signs they wait several seconds before they move. Most California drivers will just slow down and maybe not even stop before they start accelerating again. That’s how (Google) have had several of their rear-enders in recent years.”

Dr Shladover said legislators in California were now wrestling with the issue of how to regulate autonomous vehicles without hindering their development.

“Originally the State of California asked us whether we could design something like an acceptance test for an automated vehicle, like you have a licensing exam for a driver,” he said.



Left: University of California Partnership for Advanced Transportation Technology head of mobility Steve Shladover

“What would be the licensing exam you would give an automation system? We had to tell them, ‘Sorry, that’s not possible.’

“The complexity of the test you would have to go through to provide a high level of assurance of safety for that system would be unmanageable.”

In his presentation, Dr Shladover demonstrated that autonomous vehicles are not, in fact, safer than human drivers. That may change in the future, but it is not the reality today.

“Their statistics are nowhere close to what human drivers are doing all the time,” he said.

Data that Waymo (Google/Alphabet’s autonomous vehicle subsidiary) is required to submit to the State of California shows that its autonomous cars covered an average of 8000km between “critical events”.

In contrast, US traffic safety statistics show that human drivers cover three million kilometres between injury crashes and 15 million between fatal crashes.
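The size of that gap is easy to work out from the figures cited here. A minimal sketch, with the caveat that a Waymo “critical event” and an injury or fatal crash are not like-for-like measures, so the comparison is indicative only:

    # Order-of-magnitude comparison of the intervals cited above.
    # Caveat: a "critical event" is not the same measure as an
    # injury or fatal crash, so this is indicative only.
    KM_PER_CRITICAL_EVENT_WAYMO = 8_000        # average, California filings
    KM_PER_INJURY_CRASH_HUMAN = 3_000_000      # US traffic safety statistics
    KM_PER_FATAL_CRASH_HUMAN = 15_000_000

    print(f"Human injury-crash interval is "
          f"{KM_PER_INJURY_CRASH_HUMAN // KM_PER_CRITICAL_EVENT_WAYMO}x "
          f"Waymo's critical-event interval")  # 375x
    print(f"Human fatal-crash interval is "
          f"{KM_PER_FATAL_CRASH_HUMAN // KM_PER_CRITICAL_EVENT_WAYMO}x")  # 1875x

On these numbers, human drivers go hundreds of times further between incidents than the autonomous fleet does between critical events.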

Dr Shladover said this reality has been lost amid the excitement and hype that have surrounded the development of autonomous cars in recent years. The public perception of autonomy and its state of readiness is far ahead of what the technology can do now.

A good example of this is Tesla’s AutoPilot, which was heralded as an autonomous system.

“It’s not autonomous. It’s a Level 2 system, that’s all it is. This is one of the problems. They tend to go into hype mode and start misleading people into thinking it is more than it really is,” he said.

“The reason several people have been killed in their Teslas is that they were misled into thinking it was something more than it really was. They failed to supervise the way they were supposed to and they got into trouble when the system encountered situations it couldn’t handle.”

Dr Shladover said California had issued no licences allowing the use of autonomous vehicles on its roads, although 42 companies hold licences that permit trials to be conducted. As in other countries such as Australia, Level 1 and Level 2 systems are allowed because, while they contain some of the component systems required for automation, they are not capable of full automation.
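For readers unfamiliar with the levels Dr Shladover refers to, the sketch below summarises the SAE J3016 driving-automation taxonomy. The wording is our paraphrase, not SAE’s official text, and the code is purely illustrative.

    # Paraphrased summary of the SAE J3016 driving-automation levels,
    # for reference only (descriptions are ours, not SAE's official text).
    SAE_LEVELS = {
        0: "No automation - the human driver does everything",
        1: "Driver assistance - steering OR speed support (e.g. adaptive cruise)",
        2: "Partial automation - steering AND speed, driver must supervise constantly",
        3: "Conditional automation - system drives, human must take over on request",
        4: "High automation - no human fallback needed within a limited domain",
        5: "Full automation - drives anywhere a human could, no driver required",
    }

    # Tesla AutoPilot, as described above, sits at Level 2:
    print(f"Level 2: {SAE_LEVELS[2]}")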

“There is no technical standard that we can point to and say, ‘Here’s the things you have to meet, here’s the set of requirements your vehicle must show it can meet in order to go on the road.’ It would be incredibly complicated to create such a standard and that hasn’t happened yet.”

With no test available to certify autonomous systems as safe, Dr Shladover said authorities in the Golden State were instead focusing on ways to trial driverless vehicles safely.

The proposed protocol includes the use of restricted areas for testing and a wireless connection to a properly licensed remote operator, among many other requirements.

Legislation before the US Congress would overrule the California regulations and, in fact, prevent any state from regulating automation. Dr Shladover is worried about this legislation because it does not impose any performance requirements on system developers.

“The problem is there is no national approach,” he said. “What they did in their recent document is they said to industry, ‘It would be nice if you did this, and nice if you did these other things, but we are not going to require you to do anything.’

“This is where you have to look at the distinction between the good guys and the bad guys. You have the good guys in the industry who have done a thorough job of engineering their systems for safety, considering all of the hazards they are going to encounter, and they are fine.

“But then you have the fly-by-night operators, those that are technically incompetent, who are going to put a vehicle out on the road that’s dangerous, that is going to kill people, and there is no prohibition against that (in the Congressional legislation).

“It has had quite a lot of industry support and there are industry groups that have been pushing it quite aggressively. And the traffic safety people haven’t quite got their act into gear yet to see just what’s going on there and recognise the need to get more vocal.”

Dr Shladover said the danger is that the very positive public perception of autonomous vehicles, despite data that shows they are currently much less safe than human drivers, could see the Congressional legislation waved through.

“It’s the public perception that’s the important thing in the political domain and at the moment that public perception is somewhat distorted,” he said.

“If we get to the point where some of these vehicles start running over children and killing children, that perception could flip very quickly.

“The industry has created this image of perfection that is unrealisable and that means that when the public realises that’s not what it is going to be, there could be a pretty severe backlash.

“As someone who has worked in this field for 40 years, I don’t want to see that backlash. I want to see this become reality but I am worried there is a definite potential for backlash because of the overhype.”
