Programmed to Kill

A self-driving car is something that has been dreamed of, but considered unattainable, for decades. Now, self-driving cars are being tested by Uber in the streets of Pittsburgh and San Francisco.1 Although the list of benefits self-driving cars bring to society is nearly inexhaustible, they also raise a number of legal issues. Advances in technology tend to outpace current laws, and self-driving cars are no exception. One of the biggest legal issues facing courts and lawmakers is this: in the case of an inevitable crash, how should the car act, and whom do we hold liable if the car reacts the “wrong way”?

In the near future, there will be instances where a self-driving car will have to act when an accident is inevitable. When that happens, what should the car be programmed to do? Should the car try to minimize damage to society, or should it protect the passengers at all costs? Minimizing damage to society seems like a good idea until you are the one inside the car. Many people would feel uncomfortable riding in a car that might prioritize the well-being of others over their own lives.2 Instead, people would probably prefer a car that values their lives over those of others.3 However, a self-driving car that puts the lives of its passengers above all else would be extremely dangerous, as it could cause a substantial amount of harm to others to protect a single passenger. At some point, someone will have to decide how to appropriately program self-driving cars.

The issue of programming self-driving cars also raises a question of liability. Is the manufacturer, as the party that programmed the car, liable, or should the consumer be made liable as the owner of the car?
In many ways, holding the manufacturer liable for any injuries caused by a self-driving car’s programming makes sense. The manufacturer dictates what the car will do in the case of an emergency. As such, the car arguably suffers from a design defect, and the manufacturer should be held liable for the damages that stem from the car’s programming.4 In the majority of jurisdictions, it is enough to show that a product “does not perform safely as an ordinary customer might expect it to” to establish a design defect.5 A car that is programmed to minimize damage might depart from the ordinary consumer’s expectation, which is that the car should protect the owner first and foremost. However, such liability might disincentivize manufacturers from further developing self-driving cars.6 A reasonableness test could be applied to the actions of programmers, but that would require courts to determine which programming choices are reasonable and which are not after the damage is already done.

Instead of the manufacturer, owners could be held liable for the decisions made by self-driving cars. Holding the owner responsible also seems consistent with traditional notions of driver liability. However, owners of self-driving cars do not actually exercise control over the car. Holding owners of self-driving cars liable would, in effect, punish them for the programming decisions of manufacturers.7 Not only does this go against basic notions of justice, but it also disincentivizes consumers from purchasing cars, which would stifle technological advancement and deprive society of a social good.8

The introduction of self-driving cars to the world will undoubtedly change the legal landscape for automobiles. A simple reasonableness test can no longer be applied to drivers of cars. Moreover, it is unclear whom to hold liable, and for what. Hopefully lawmakers will be proactive in enacting legislation to solve these issues.

  1. Marco della Cava, Uber testing self-driving cars in San Francisco, USA TODAY, (Sept. 22, 2016, 3:42 PM), http://www.usatoday.com/story/tech/news/2016/09/22/uber-testing-self-driving-cars-san-francisco/90847962/.
  2. Will Knight, How to Help Self-Driving Cars Make Ethical Decisions, MIT TECHNOLOGY REVIEW, July 29, 2015, https://www.technologyreview.com/s/539731/how-to-help-self-driving-cars-make-ethical-decisions/.
  3. Id.
  4. Nick Belay, Note, Robot Ethics and Self-Driving Cars: How Ethical Determinations in Software Will Require a New Legal Framework, 40 J. Legal Prof. 119, 123 (2015).
  5. Barker v. Lull Eng’g Co., 573 P.2d 443, 455-56 (Cal. 1978).
  6. Belay, supra note 4, at 124.
  7. Belay, supra note 4, at 125.
  8. Belay, supra note 4, at 125.

Author: Patrick O'Donnell

Patrick O’Donnell graduated from The College of New Jersey in 2015 with a B.S. in Economics. During his first year of law school, Patrick decided to focus solely on his studies. As a second year law student, Pat was an Associate Editor on the Rutgers Computer and Technology Law Journal. Pat is now a Managing Articles Editor on the Rutgers Computer and Technology Law Journal. In his spare time, Pat likes to hike and read.