In the finger-pointing and confusion that marked the end of the most recent legislative session, regulation for autonomous vehicles in the state was one of the issues left on the table.
Gov. Eric Holcomb last month called for a special session of the Indiana General Assembly to address some of the unfinished business, but self-driving cars will have to wait.
Legislative impasse
“It’s dead,” said Rep. Ed Soliday, R-Valparaiso. “We’re not bringing it back, at least this year.”
Soliday is the chair of the Committee on Roads and Transportation and the author of House Bill 1341, which sought to regulate autonomous vehicles in Indiana.
Transportation safety is a subject close to Soliday’s heart. He was previously the vice president of safety, security and quality assurance at United Airlines. He also served for 12 years at the National Academy of Sciences, eight of those as a member of its Aeronautics and Space Engineering Board.
“We want to encourage innovation, but we also want to protect the safety of Hoosiers,” he said.
Soliday said the committee worked with automakers such as Toyota, General Motors, Mercedes-Benz and Audi, along with companies in the trucking industry, such as engine maker Cummins, which he praised as the most cooperative, and the Self-Driving Coalition for Safer Streets, founded in 2016 by Ford, Lyft, Uber, Volvo Cars and Waymo. He said the acrimony between lawmakers and industry stakeholders was intense.
“This is probably the most political thing, and nastiest thing I’ve ever been involved in,” he said.
Soliday said the take-it-or-leave-it approach favored by some of the companies involved left legislators in the Indiana House of Representatives with little room to negotiate.
“I have never seen this level of pressure, spinning, dancing that I saw in this,” he said. “The House was never going to go along with unfettered access. … There was one line we weren’t going to cross, and that was: no accountability, no standards.”
First pedestrian death
Other states have allowed these companies to undertake such testing with relatively few restrictions. For example, a 2015 executive order signed by Arizona Gov. Doug Ducey directed state agencies to “undertake any necessary steps to support the testing and operation of self-driving vehicles on public roads within Arizona.”
That all changed when Elaine Herzberg, 49, became the first recorded pedestrian fatality involving a self-driving vehicle. Herzberg was struck March 18 by an Uber in self-driving mode in Tempe, Arizona. A 22-second video released after the crash by the Tempe Police Department shows Herzberg walking from a darkened area onto a street just before the vehicle strikes her. A human backup driver was also in the car, but the video suggests the driver was paying attention to her smartphone.
In Arizona, companies such as Uber only need to carry minimum liability insurance to operate self-driving cars, according to the Associated Press. They are not required to track crashes or report any information to the state.
By contrast, California requires a $5 million insurance policy, and companies must report accidents to the state within 10 days and release an annual tally documenting how many times test drivers had to take over.
Patrick Reilly is a partner at Faegre Baker Daniels LLP, where he concentrates his practice on complex litigation, including product liability and sports law matters. He said the federal government will at some point have to institute nationwide standards, as the current state-by-state model becomes increasingly untenable.
“Let’s say there’s a state government that requires a safety driver in the seat to intervene,” he said. “Or, maybe the government doesn’t require a safety driver, but somebody is nevertheless in the driver’s seat and fails to intervene. Are they liable?”
Because of the crash, Ducey suspended Uber’s testing in the state March 26.
“Events like what happened in (Tempe) are much like the Space Shuttle (Challenger disaster), they set the program back far more than having some standards that are out there,” said Soliday.
Who (or what) is at fault?
Zach Peterson, a spokesman for Aptiv — the auto-parts maker that supplied the radar and camera on the Volvo SUV that struck and killed Herzberg — told Bloomberg News Service on March 26 that Uber had disabled the standard collision-avoidance technology in the vehicle.
Reuters reported March 28 that Herzberg’s family had reached a settlement for an undisclosed amount with Uber, ending a potential legal battle.
Because this case was settled out of court, the question of liability in future cases remains open. The legal puzzle is mostly unanswered precisely because there is little to no caselaw to cite.
Lewis Wagner LLP partner Robert R. Foos Jr. represents commercial carriers and commercial drivers in catastrophic injury and wrongful death cases. Foos said that under normal circumstances, the reasonable person standard would apply in cases of negligence in the operation of a motor vehicle. But with a self-driving car, he said, such a claim would probably be tried as a product liability case.
“Taking the human element out of it altogether raises some really interesting issues for us,” he said.
Greg Laker is a partner at Cohen & Malad LLP and chairs its personal injury practice group. He said if the technology in the vehicle failed and killed someone, a lawyer would likely take that case. But injuries such as a broken leg would be much harder to litigate.
“Who is going to sue GM or Ford and take on that monumental defendant for a limited injury?” he said. “There’s going to be a real chilling effect. … I’m worried that a lot of people are not going to be able to have claims against what’s clearly a product defect.”
Judy Okenfuss is a managing partner of Ice Miller and the chair of the firm’s Internet of Things industry group. She said the coming of vehicle-to-vehicle communications in self-driving cars would help greatly reduce the number of accidents. In cases where injuries still occur, she said, the outcome would turn on how the vehicle’s algorithms were programmed to respond.
“Right now, the person deciding is whoever is programming that algorithm,” she said. “Down the road, does the government legislate that? … Does a person determine their own? Then, it’s going to be incredibly difficult, because everyone would probably program their car to save them, so at some point the benefits of autonomous are lost.”
Adam Ira, an attorney with Kightlinger & Gray, cited the civil liability defense of sudden emergency — when a driver is presented with a decision that gives them little time to choose between two options. He said this standard probably wouldn’t apply in the case of an autonomous vehicle.
“I think that the problem is the conscious, deliberate programming of a vehicle to make that choice in advance,” he said.
Jody L. Madeira is a professor at Indiana University Maurer School of Law. She said that as time goes on, ultimate questions of responsibility will be decided on a case-by-case basis.
“I think a lot of people are just afraid of change,” she said. “But, there are some real concerns, too, when you waive all regulation, because we don’t know how these things are being programmed.”•