Driverless Cars—A Shift in Risk

An interview with James Anderson

CLE Credit — Approved in 4 States
AZ · General · 0.5 cr
CA · General · 0.5 cr
CT · General · 0.5 cr
NY · Areas of Professional Practice · 0.5 cr

When algorithms take the wheel and human drivers move to the back seat, who's to blame when an accident occurs? The future of driverless cars is already here, with Waymo testing its autonomous taxi service in Phoenix, AZ, and more companies, including GM, Nissan, and even Amazon, entering the race to market. As driverless cars become the norm, the laws governing their development and use will have to adapt accordingly. RAND Corporation's James Anderson discusses the complicated legal and policy issues that will need to be contemplated, including tort liability, the insurance regime, cybersecurity, and the regulatory framework.

Additional Resources

The Levels of Driving Automation According to the SAE

  • Level 0: no automation.
    • Example features of Level 0 include automatic emergency braking and lane departure warnings.
  • Level 1: driver assistance. A Level 1 car provides either steering or acceleration support to the driver. Most cars manufactured today include Level 1 automation.
    • Example features of Level 1 include adaptive cruise control or lane-centering assistance.
  • Level 2: partial automation. The car can steer, accelerate, and brake, but the driver must monitor traffic conditions and remain ready to respond.
    • Examples of Level 2 automation include Tesla Autopilot.
  • Level 3: conditional automation. At Level 3, the car can manage most aspects of driving, but the driver must be available to take over in certain scenarios.
  • Level 4: high automation. At Level 4, the car can perform all driving functions under most conditions; the driver has the option to take control.
  • Level 5: full automation. The car can perform all driving functions under all conditions without driver assistance.

Who is responsible when a driverless car is stolen?

"In the future, when cars can drive themselves, grand theft auto might involve a few keystrokes and a well-placed patch of bad computer code. At that point, who will be liable for the damages caused by a hacker with remote control of a 3,000-pound vehicle?" Article by RAND

 

About James Anderson

One of the challenges of regulation is being confident enough that you've got it right and can usefully specify the appropriate regulations.

James Anderson is a Senior Behavioral Scientist and the Director of both the Justice Policy Program and the Institute for Civil Justice at the RAND Corporation. He has been the principal investigator on projects ranging from the policy implications of autonomous vehicle technology to the effects of indigent defense systems. Prior to joining RAND, he clerked for the Honorable Morton Greenberg of the United States Court of Appeals for the Third Circuit and worked as an assistant federal defender representing death-sentenced prisoners. His research has been funded by the National Institute of Justice, the National Institutes of Health, the Bureau of Justice Statistics, the State of Pennsylvania, the Institute for Civil Justice, the Robert Wood Johnson Foundation, the Department of Defense, and the National Science Foundation. He has authored numerous works published in the Stanford Law Review and the Yale Law Journal and by Oxford University Press, among many others.