Will we be liable for the decisions of our self-driving cars?


Post by Aron Solomon

It’s a perfectly nice Monday, and you are driving along your local mega road, listening to the Bangles’ “Manic Monday,” sipping your extra-hot, no-foam, four-extra-shots latte and thinking, yeah, this song is still so darn good.


As you signal to turn into the parking lot of your city’s kosher pet bakery, The Katz Meow, you motion to the driver trying to cut across the busy intersection that it’s fine to go. They wave back, proceed, and are T-boned by an SUV and killed.

Do you have any legal liability? All you did was look, see that it seemed safe, and wave your hand. You do. This is known as a waving accident, and while we all do “the wave” out of good nature and a desire to help, each time we do, it potentially attaches legal liability.

In most jurisdictions, when we wave for someone to go (or stop), and it contributes to an accident, there is the potential for us to be held legally liable. This has been the law in New Jersey since Thorne v. Miller, a 1994 case.

In Thorne, an accident occurred on a roadway with two lanes of traffic in each direction. Traffic was heavy, as was the norm in the area, and the defendant, Miller, wanted to make a left turn out of a parking lot into the westbound lane of the four-lane road. Cook, the other driver, was driving in the outer eastbound lane. When he was in front of the parking lot entrance, Cook waved Miller into his path, indicating that she could pass through to exit. He made this motion not once but twice.

You can guess how the rest of this story goes. As Miller crossed the westbound lane, she hit not one but two cars. The people injured in the accident sued both Cook and Miller. Cook’s lawyer tried and failed to have him removed from the case, arguing that Cook never struck anyone. Cook’s entire position was that the wave was intended simply as a helpful gesture and that, of course, no legal liability should attach to it. The court disagreed, kept Cook in the case, and acknowledged that Cook’s waving gesture contributed to the crash.

Okay. New scenario:

You are driving along in your self-driving car. 

You are being alert, doing all of the things you’re expected to do in your self-driving car, when the car itself behaves in a way that you neither programmed nor expected. It’s not the classic doom-and-gloom scenario painted by Luddites who feel we should never have forsaken that darned horse, but rather an unexpected movement or signal that the car initiated on its own.

Again, the move itself isn’t a catastrophic one in our example. The car doesn’t lurch into oncoming traffic or fail to stop at a school crosswalk. It does something more subtle: the horn sounds to encourage another driver to go; the lights flash twice, again signaling that it’s safe to move; or the opposite, the car does something to encourage other cars to stop, interrupting the flow of traffic.

[Image: Interior of a Tesla self-driving car]

When these things happen, as they surely will, with at least the same frequency we see today when people wave other drivers into potential danger, will there be any way for us to escape liability with a Bart Simpson-esque “I didn’t do it”?

Nicole Lombardi, a lawyer at Lombardi and Lombardi, P.A., in New Jersey, observes:

“In Pennsylvania and New Jersey, for example, you can be held legally liable if you wave another car into traffic. It’s not much of a logical leap to hold drivers responsible for the actions and inactions of the self-driving car they’re in and responsible for. How this area of the law develops will be very interesting to watch over the next very few years.”

Another interesting aspect to consider will be how our qualifications to drive and be insured evolve along with the capabilities of our cars. Will people with computer skills, who can better understand and prospectively diagnose self-driving car behaviors, qualify for lower insurance rates or more advanced driver’s licenses that grant privileges the rest of us don’t have?

This all creates an entirely new paradigm for how we drive and interact with our vehicles. Today, no one talks about “piloting” their car, nor do pilots say “I’m driving this 777 to SFO,” but this shift in terminology may be a handy starting point. A pilot is liable for making the decisions they were trained to make. In fact, if one looks at horrific air disasters (please don’t; it’s a horrible YouTube rabbit hole you should avoid), some of the worst, such as Tenerife, have at their core pilots not piloting according to standard procedures.

So as we evolve from driving our cars to piloting them, ensuring that all functions are working as they should, completing safety checks, and receiving the necessary training on the machine and its core technology, we should also look at what we are legally liable for. What that liability looks and feels like will continue to evolve.


About Aron Solomon

Aron Solomon is the Head of Digital Strategy for NextLevel.com and an adjunct professor of business management at the Desautels Faculty of Management at McGill University. Since earning his law degree, Solomon has spent the last two decades advising law firms and attorneys. He founded LegalX, the world’s first legal technology accelerator, and was elected to the Fastcase 50, which recognizes the world’s leading legal innovators. His work has been featured in TechCrunch, Fortune, The Hill, The ABA Law Journal, Law.com, The Boston Globe and many other popular publications.
