Who’s responsible when a driverless car has an accident?

Who’s responsible for an accident needs to be considered now rather than when we’re scrambling to implement laws

Eamonn Brosnan

We haven’t reached the point where driverless vehicles ply our roadways. However, as a recent incident on an Alberta highway demonstrates, we might be edging closer.

How close are automakers to producing such a vehicle and, more importantly, how close are lawmakers to allowing an artificial-intelligence-driven vehicle on city streets?

The industry has a six-tier scale to define self-driving autonomy in a vehicle:

  • Level 0: Your typical car with nothing more than an onboard navigation system or old-style cruise control.
  • Level 1: Includes vehicles that use a single computer-controlled driving system, like adaptive cruise control or lane centring.
  • Level 2: Partial driving automation, combining multiple driver-assist systems, such as adaptive cruise control and lane centring.
  • Level 3: Includes the driver-assist systems from Level 2 and can make environmental driving decisions, such as passing a slow-moving vehicle or reacting to road conditions it detects.
  • Level 4: High driving automation vehicles that typically don’t require human interaction or oversight. They include the full suite of computer-controlled driving systems, as well as an advanced artificial intelligence (AI) capable of reacting to changing road conditions, traffic, pedestrians and a wide variety of other variables.
  • Level 5: Full driving automation vehicles that don’t even have regular human driving controls.

In Alberta, the ‘driver’ of a Tesla was pulled over and charged with speeding and dangerous driving. Both front seats were reclined and the driver and passenger were sound asleep.

Does Tesla actually make a self-driving vehicle?

Though the company certainly builds systems above Level 0, they’re not capable of truly autonomous driving. Tesla’s Autopilot system, as included in the model involved in this incident, is considered Level 2. The marketing name Autopilot is perhaps a bit ambitious, since such systems still require a human in the driver’s seat, alert and able to take control of the vehicle at a moment’s notice.

These systems can’t handle a vast array of hazards. Things like sudden severe weather, dirt or a rock chip covering a radar sensor or camera, low trailers, lane detours due to road construction, lane markings wearing out or covered, and even corners with heavy traffic can confuse the system.

Tesla, like other automakers, has extra safety features to ensure drivers are awake and driving. If the vehicle doesn’t receive input from a driver, it will automatically slow down and then stop on the side of the road.

However, as with any computer technology, clever people have devised workarounds and after-market devices to defeat these safeguards. In this case, it appears the driver did exactly that.

Many automakers offer Level 2 systems, but there’s only one Level 3 system on the market, found in the Audi A8. And anything below Level 4 still requires an attentive driver who can take control of the vehicle. Several companies are developing Level 4 systems, but those can’t yet be used outside very specific test regions in certain cities.

Canada doesn’t have legislation permitting driverless (Level 4 or 5) vehicles on public roadways.

Ultimately, though, automated vehicles will become mainstays on our roadways. And why not?

AI-controlled vehicles will be safer and traffic will flow faster.

But serious legal considerations still need to be resolved and not just by the corporations developing these systems. Citizens should have input on how these vehicles operate. The insurance industry will also have concerns, although insurers will no doubt be thrilled with technology that lowers accident rates.

But questions like who’s responsible when an automated system has an accident should be considered now rather than when we’re scrambling to implement laws for existing technology.

Regulations are also needed for testing these vehicles and for ensuring their security systems can withstand the inevitable malicious hacking attempts.

I’m hesitant to endorse Level 5 automated vehicles unless they include an override that disengages all automated systems so a human can bring the vehicle to a safe stop.

Until the laws change, remember that you are tasked with controlling your vehicle, regardless of its level of sophistication. No matter how automated the vehicle might be, until the laws say otherwise there must always be a licensed, insured, sober and awake driver behind the wheel, capable of taking control without hesitation.

Eamonn Brosnan is a research associate with the Frontier Centre for Public Policy.


© Troy Media



The views, opinions and positions expressed by columnists and contributors are the author’s alone. They do not inherently or expressly reflect the views, opinions and/or positions of our publication.
