How to make self-driving cars safe
Last year, a Florida man became the first person to die in a crash involving autonomous driving technology. Forty-year-old Joshua Brown had his hands off the wheel when his car slammed into a semi-trailer making a left turn across his lane. The incident caused considerable consternation in the media, not least for underlining the glaring absence of autonomous vehicle (AV) regulations. "The fatal crash," the Los Angeles Times said, "highlighted what some say is a gaping pothole on the road to self-driving vehicles: the lack of federal rules."
The newspaper had a point. While AVs have increasingly been in the news in recent years, there are currently no federal regulations covering their manufacture and use. While that's not so unusual—technology often runs ahead of government attempts to regulate it—when it comes to autonomous technology, the issue is particularly fraught. AVs could transform society as radically as the introduction of the car itself, but if governments can't figure out how to regulate the technology properly, its adoption could stall.
Time is of the essence. Already, human drivers are sharing the highway with cars that can stay in their lane and brake or accelerate to match the pace of traffic, and soon fully autonomous vehicles will be commonplace. Who will write the rules? Who should?
Who writes the rules?
In the United States, the answer is complicated by the fact that responsibility for the regulation of cars and trucks is split between the federal government and the various states. The National Highway Traffic Safety Administration (NHTSA) has the authority to write what are known as federal motor vehicle safety standards (FMVSS). "A vehicle that complies with all federal motor vehicle safety standards is allowed to operate anywhere in the U.S.," says Ben Husch, natural resources and infrastructure committee director at the National Conference of State Legislatures.
These rules cover everything from bumper height to the specifications of brake and gas pedals. At present, the NHTSA hasn't enacted any rules relating to autonomous vehicles. Such vehicles might not need brake pedals, for instance, but would presumably require a demonstrable level of competence at self-navigation.
According to Husch, state regulations cover vehicle use, including licensing, registration, traffic safety, law enforcement, and insurance liability. As of April 2017, 13 states and the District of Columbia had passed AV-related legislation or issued executive orders. Their scope varies widely. Some, like Utah and Alabama, established commissions to study the issue and report back with recommendations. Other states, including California and Florida, have enacted detailed regulations concerning how AV testing may be carried out.
The lack of standardization could easily lead to confusion as the technology becomes more widely used. "If we had a single regulatory entity, that would make things simpler in many ways," says James Anderson, a senior behavioral scientist at the RAND Corporation. "We probably want to avoid a crazy-quilt patchwork setup where every state has a different set of requirements. That seems very inefficient and could potentially really slow down development."
To prevent this, the NHTSA last year issued a report laying out guidelines for the states to follow in crafting their rules. "A manufacturer should be able to focus on developing a single [AV] fleet rather than 50 different versions to meet individual state requirements," the report notes.
A major issue upon which much of the regulation of AVs will hang is: Who counts as the driver? The answer will depend on the sophistication of the technology. SAE International, an engineering organization, developed a widely cited six-level scale of driving automation. Level 0 designates cars that have no autonomous features, while Level 1 signifies cars that can take on some tasks some of the time (for example, via cruise control). Level 2 cars can handle steering and speed simultaneously, but a human must monitor the driving environment at all times. At SAE Level 3, cars can do some of the driving and monitor the environment, but a human must stand ready at all times to take back control if needed. At Level 4, cars can drive themselves without human monitoring, but only under certain conditions. Level 5 cars can operate under all conditions just as well as a human.
When it comes to regulation, the key transition occurs around Level 4, when a human driver is no longer required to be at the wheel. Without a human being on hand, who is responsible should something go wrong? Today, the vast majority of vehicles on the road are at Level 1. The most advanced vehicles, able to drive themselves under specific conditions as long as they have human oversight, are at Level 3. As long as such vehicles meet the FMVSS for conventional vehicles—essentially, having a human operator who can take over when needed—any incremental improvements in safety and comfort provided by automation are just gravy.
It's when you try to take humans out of the loop that the question of responsibility gets tricky. At Levels 4 and 5, it's not so obvious who the operator is. Is it the owner? The company that built or sold the car? Or perhaps the company that wrote the machine-learning algorithm that's doing the actual driving? If the car is functionally driving itself, then instead of the states licensing drivers, the federal government would have to assess the car's skill as part of its safety standards.
Some experts feel a tremendous sense of urgency about getting these issues sorted out. Several manufacturers say they expect to have fully autonomous cars on the road by 2020. Could the lack of regulatory clarity put the brakes on these ambitions?
There's no need to panic, according to Eric Morris, a professor of urban studies at Clemson University who specializes in transportation issues. "There is no way they're going to have a fleet of autonomous cars by 2020," he says. "I don't think they'll have it by 2030." He points out that human drivers supervising one major autonomous fleet still have to intervene on a regular basis. In order to be truly autonomous, the technology needs to perform thousands of times better.
Morris predicts that it will take many years for autonomous vehicles to achieve Level 4 or Level 5 functionality. "I'm relatively sanguine about the fact that the regulations are not going to be super-problematic," he says. "By the time we get to the world of 'OK, can the car drive by itself places and should we allow this?' we will be used to the fact that we've driven thousands and thousands of miles with the car driving itself with no problem. And so it will be easy to say, 'OK, we can just take this one further step.'"
Another approach will be to start with Level 5 autonomy in very limited conditions, in situations that are easier to achieve and safer in the event of mishap. "Even today you can buy low-speed shuttles that are truly autonomous, for low-speed operation on campuses or in cities," says RAND's James Anderson. "That may give us some experience with what autonomy in those conditions looks like. As these AVs become more common and more flexible, their geographical operating domain could expand, and eventually they might be able to operate anywhere."
Whichever path we ultimately take to full autonomy, the societal benefits could be tremendous. Every year some 35,000 people are killed in motor vehicle accidents in the U.S. alone, with driver error a major contributing factor. Autonomous technology could eliminate much of that carnage. It could also ease traffic congestion, reduce fuel consumption, and free up enormous quantities of productive time.
While society overall stands to benefit, some individuals will no doubt suffer. No technology is ever perfect, and it seems highly probable that there will be accidents involving autonomous vehicles, some resulting in injury or death. Who will be liable when things go wrong?
The details will have to be worked out in the courts, but Morris doesn't foresee significant problems. "I don't see how legal liability for a driverless car is really any different from [liability] for automakers right now," he says. "What happens now if your car blows a tire and you hit someone and kill them? Is the tire maker responsible? Well, no, actually. If there's some sort of systematic defect that they covered up, then yes, you can sue the tire maker. But if it's just the tire operating in normal conditions, you can't sue the manufacturer if you get a flat tire and get into an accident as a result."
While it would be comforting for manufacturers to iron out all regulatory issues well in advance, the fact is that it's impossible to work out detailed rules for a technology that doesn't fully exist yet. "It's important to recognize that we're very much in early days; there's a lot we don't know; and whatever predictions we might make, they're probably going to be completely wrong," says Anderson. "What we need to do is make sure we collect the information we need and learn the ways we're going to be wrong so we can change course."
In late June, Congress took the first steps on that journey. A package of draft bills circulated in the House of Representatives would make the NHTSA the lead agency for the regulation of self-driving vehicles and preempt states from passing laws that would restrict the deployment of AV fleets. There's still a lot to do, but the message is clear: We may not know quite where this road to autonomy will take us, how much it'll be regulated, or how long it will take to get there, but at last we've started moving.
Self-driving car safety: Lessons for leaders
- Nobody wants to stymie innovation, but we do need to create "the rules of the road." If governments can't figure out how to regulate the technology properly, its adoption could stall.
- One issue is what exactly those rules need to encompass. It can be argued that AV manufacturers are no more or less responsible for vehicle behavior than a tire manufacturer.
- Some legislation is in place already, but its scope varies widely. That's part of the problem.
This article/content was written by the individual writer identified and does not necessarily reflect the view of Hewlett Packard Enterprise Company.