Automated Cars Should Never Be Called “Driverless”

The concept of self-driving cars is nothing new; people have dreamt of them since at least the 1930s.1 Unfortunately for those dreamers, the actual automation of consumer vehicles remained nothing more than science fiction until recent years.2 Today, however, it seems everywhere you look another company is trying to enter the market; companies from Google to Tesla, Apple, Toyota, and even Uber are getting on the autopilot bandwagon.3 The list includes seemingly every car, computer, technology, and leading transportation company you can think of, totaling an astounding 33 corporations to date.4 But how safe is the idea in the first place for those on the road? Furthermore, how much research has been put into just that question? Even more worrisome, how much attention have the heads of these corporations paid to the safety research in their quest to corner the new market?

Peter Valdes-Dapena with CNN explains that the greatest danger may be in the name.5 He argues that the term “autopilot” “…invites the driver to take their feet off the pedals and hands from the steering wheel for long stretches of highway travel.”6 But what drivers may be missing is that not all “autopilots” are equal. In 2014, the Society of Automotive Engineers International (SAE International) set out a classification system consisting of six levels of driving automation.7

Level 0: No Automation: the full-time performance by the human driver of all aspects of the dynamic driving task, even when enhanced by warning or intervention systems.

Level 1: Driver Assistance: the driving mode-specific execution by a driver assistance system of either steering or acceleration/deceleration using information about the driving environment and with the expectation that the human driver perform all remaining aspects of the dynamic driving task.

Level 2: Partial Automation: the driving mode-specific execution by one or more driver assistance systems of both steering and acceleration/deceleration using information about the driving environment and with the expectation that the human driver perform all remaining aspects of the dynamic driving task.

Level 3: Conditional Automation: the driving mode-specific performance by an automated driving system of all aspects of the dynamic driving task with the expectation that the human driver will respond appropriately to a request to intervene.

Level 4: High Automation: the driving mode-specific performance by an automated driving system of all aspects of the dynamic driving task, even if a human driver does not respond appropriately to a request to intervene.

Level 5: Full Automation: the full-time performance by an automated driving system of all aspects of the dynamic driving task under all roadway and environmental conditions that can be managed by a human driver.8

Knowing this, it makes much more sense that the Tesla Model S Owner’s Manual, for example, says some things that you might not expect given your prior understanding of “autopilot.”9 The Tesla Model S can maintain the car’s lane position, maintain a safe following distance behind traffic ahead, and change lanes when you signal.10 The car can even stop when there is something ahead; however, the manual warns that this feature will not always activate.11 In instances where there is a non-moving object in your path, or when you are moving more than 50 miles per hour and a moving vehicle changes lanes to reveal a stationary vehicle, the system is unlikely to brake.12 “Drivers are also warned that the system is intended for use by a fully attentive driver and only on highways without intersections.”13

Despite the misleading nature of the term “autopilot,” which can understandably cause drivers to operate these cars in an unsafe manner, the term is unfortunately not the only reason for fear. Seth Fiegerman with CNN reports that employees of the perceived leader in the field, Tesla, worried that the company was not taking every possible precaution to ensure the safety of the vehicles.14 Fiegerman cites that “[t]hose building autopilot were acutely aware that any shortcoming or unforeseen flaw could lead to injury or death . . . .”15 But Tesla founder and CEO Elon Musk believes that autopilot has the potential to save lives by reducing human error; a source close to Tesla says his driving force is “don’t let concerns slow progress.”16 Some Tesla employees struggled with this, telling CNN Money in interviews that they knew they were “pushing the limits” and that they were scared “someone was going to die.”17 David Keith, an assistant professor of system dynamics at MIT Sloan School of Management, says “It’s hard to believe a Toyota or a Mercedes would make that same tradeoff . . . [b]ut the whole ethos around Tesla is completely different: they believe in the minimum viable product you get out there that’s safe.”18

In the United States, there is little legislative history governing or prohibiting the use of automated cars.19 As of 2016, only eight states and the District of Columbia have enacted autonomous vehicle legislation.20 Additional states are following suit; for example, Arizona Governor Doug Ducey signed an executive order in August 2015 directing agencies to “undertake any necessary steps to support the testing and operation of self-driving vehicles on public roads within Arizona.”21

In January 2016, U.S. Transportation Secretary Anthony Foxx unveiled a new policy that updates the National Highway Traffic Safety Administration’s (NHTSA) 2013 preliminary policy statement on autonomous vehicles, as well as a commitment of almost $4 billion over the next ten years to accelerate the development and adoption of safe vehicle automation.23 Nevertheless, it seems California is leading the march against vehicle automation; a Senate bill proposed at the end of 2015 aims to restrict automation attempts despite promises by manufacturers to reduce the 94 percent of accidents that are caused by human error and to bring everyday destinations within reach of those who might otherwise be excluded by their inability to drive a car.22

Senate Bill 1298 first establishes “certain vehicle equipment requirements, equipment performance standards, safety certifications, and any other matters the department concludes is necessary to ensure the safe operation of autonomous vehicles on public roads, with or without the presence of a driver inside the vehicle.”23 Meanwhile, the second provision “requires people to operate their autonomous cars.”24 In addition, “driverless car manufacturers would also need to put their vehicles through a third-party safety test and provide measures to report accidents or car software hacks.”25 With these provisions, California would place a strong focus on safety in legislation surrounding autonomous cars, despite the frustration it causes to manufacturers.

  1. Marc Weber, Where to? A History of Autonomous Vehicles, Computer History Museum (May 8, 2014),
  2. Id.
  3. 33 Corporations Working on Autonomous Vehicles, CB Insights (Aug. 11, 2016),
  4. Id.
  5. Peter Valdes-Dapena, The most dangerous thing about Autopilot is that it’s called Autopilot, CNN Money (July 7, 2016),
  6. Id.
  7. Automated Driving Levels of Driving Automation Are Defined in New SAE International Standard J3016, AUTOMATED DRIVING
  8. Id.
  9. Valdes-Dapena supra note 5.
  10. Id.
  11. Id.
  12. Id.
  13. Id.
  14. Seth Fiegerman, Elon Musk’s push for autopilot unnerves some Tesla employees, CNN Money (July 28, 2016),
  15. Id.
  16. Id.
  17. Id.
  18. Id.
  19. Bryant Walker Smith, Automated Vehicles Are Probably Legal in the United States, The Center for Internet and Society at Stanford Law School (Nov. 1, 2012),
  20. Autonomous Self Driving Vehicles Legislation, National Conference of State Legislatures (Sept. 13, 2016). In some of these states the legislation merely defines the term “autonomous vehicle,” while, in others, the legislation expressly allows individuals with valid driver’s licenses to operate an autonomous vehicle. Id.
  21. Id.
  22. Sarah Buhr, A Proposed California Law Would Require Drivers For Driverless Cars, TechCrunch (Dec. 16, 2015), (citing efforts stated by Google to reduce accidents caused by human error and expand access to transportation to those who are unable to drive).
  23. Id.
  24. Id.
  25. Id.

Author: Erica Hauser

Erica Hauser graduated from Seton Hall University in 2008 with a B.A. in Graphic Arts and Interactive Design. Now a 4LE at Rutgers Law School, she is a Senior Editor of the Computer and Technology Law Journal. During her time at RU Law she has served as the Secretary of the Evening Students Association, worked as a Clinical Law Student with the Health and Special Education Law Clinic, and volunteered over eighty hours with Legal Services of New Jersey as a Law Clerk within their HCAP workforce. Erica enjoys painting and reading and hopes to spend time traveling in the future.