Crashing On Autopilot: Whodunit? The Car? Or The Driver?
Jan 28, 2022 | 3:33 PM

Two counts of vehicular manslaughter have been filed against the driver of a Tesla that ran a red light and struck a car in California, killing two people, all while Autopilot was engaged. A report from the AP says Kevin George Aziz Riad may be the first person in the US to be charged with a felony for a crash involving a partially automated driving system. He has since entered a not guilty plea.

Riad, a limousine driver, was behind the wheel of a Tesla Model S in December 2019 as it left a freeway and ran a red light at high speed. The car then struck a Honda Civic, killing Gilberto Alcazar Lopez and Maria Guadalupe Nieves-Lopez. Riad and a woman passenger were hospitalized with non-life-threatening injuries. Investigators from the National Highway Traffic Safety Administration (NHTSA) confirmed that Autopilot was in use at the time of the crash.

The same system has been blamed for several other crashes. In some cases, Teslas were reported crashing into vehicles parked at the side of the road, collisions that would seem avoidable if the driver were paying attention. A 2021 report from NHTSA’s Office of Defects Investigation listed 11 incidents since 2018 in which Teslas with Autopilot or “Traffic-Aware Cruise Control” engaged struck vehicles stopped at the side of the road.

In a 2018 case, a Tesla slammed into the rear of a fire truck parked at the side of Interstate 405 in Culver City, California. The investigation, helmed by the National Transportation Safety Board (NTSB), found that the Tesla’s Autopilot “permitted the driver to disengage from the driving task” before the car hit the parked fire truck. The report cited the driver’s “inattention and overreliance on the vehicle’s advanced driver assistance system” as the reason he failed to respond before the crash. It further stated that Tesla’s automatic emergency braking system did not activate and that the driver made no attempt to stop the vehicle. No one was injured in that crash, as the Tesla was traveling around 30 mph.

The Autopilot system has also been blamed for three other fatal crashes, including two in Florida. In both cases, one in 2016 and the other in 2019, the Teslas’ Autopilot systems failed to stop the cars before they crashed into semi-truck trailers. Another California incident saw the death of a Tesla driver whose Model X accelerated just before it crashed into a freeway barrier. Autopilot was also engaged in that incident.

In a similar case not involving a Tesla, charges were filed against Rafael Vasquez (who goes by Rafaela, according to a report from the BBC), the backup driver of a self-driving Uber vehicle undergoing testing in Arizona in 2018. Vasquez was streaming an episode of The Voice on her phone when the Volvo SUV she was in struck and killed Elaine Herzberg as she crossed the street with her bicycle. Herzberg’s death led Uber to cancel its self-driving vehicle testing. The company itself was found not criminally liable and was not charged. The difference between this case and the others mentioned is that the Uber vehicle was meant to be fully autonomous, with a human driver serving only as a backup.

So are these crashes the fault of the driver or the system? Is Autopilot, or autonomous driving in general, as safe as it could be? Should there be any room for failure? Before you can answer those questions, it helps to know how the Autopilot system and its sensors work.
According to Tesla’s Autopilot webpage, each vehicle is built for full self-driving capabilities through “software updates designed to improve functionality over time.” Models are typically fitted with front, rear, and side cameras; the front cameras can see up to 250 meters ahead, and the rear cameras can see as far back as 100 meters. There are two driver-assistance systems available: Autopilot and Full Self-Driving Capability. With Autopilot, drivers get Traffic-Aware Cruise Control, which matches the speed of a Tesla to that of surrounding traffic, and Autosteer, which assists with steering in a clearly marked lane. The step up to the Full Self-Driving Capability system lets owners autopark their car, change lanes automatically, and summon their Tesla (having it drive itself, with no one behind the wheel, to wherever the owner is standing). In beta are a feature that lets the Tesla navigate itself along a GPS route and the ability for Autopilot to recognize stop signs and traffic lights.

When either system is activated, drivers must agree to keep their hands on the wheel and to pay attention at all times. The vehicle will also warn drivers with an “escalating series of visual and audio warnings” if they do not place their hands on the wheel. If the warnings are ignored, Autopilot is disengaged for the remainder of the trip.
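Tesla does not publish how that escalation is implemented, but the behavior described above (a hands-off timer, a visual warning, an audio warning, and finally a lockout for the rest of the trip) can be pictured as a simple state machine. The Python sketch below is purely illustrative: the class names and timing thresholds are assumptions made for this example, not Tesla’s actual code or values.

from enum import Enum, auto

class AlertLevel(Enum):
    NONE = auto()        # hands detected, no warning
    VISUAL = auto()      # message on the instrument cluster
    AUDIO = auto()       # audible chime on top of the visual message
    DISENGAGED = auto()  # assistance locked out for the rest of the trip

class HandsOnWheelMonitor:
    """Hypothetical escalation logic; thresholds are invented for illustration."""
    VISUAL_AFTER_S = 15.0     # assumed seconds hands-off before a visual warning
    AUDIO_AFTER_S = 30.0      # assumed seconds hands-off before an audible warning
    DISENGAGE_AFTER_S = 45.0  # assumed seconds hands-off before lockout

    def __init__(self) -> None:
        self.hands_off_time = 0.0
        self.level = AlertLevel.NONE

    def update(self, hands_on_wheel: bool, dt: float) -> AlertLevel:
        """Call once per control-loop tick with the steering-torque sensor reading."""
        if self.level is AlertLevel.DISENGAGED:
            return self.level  # stays locked out for the remainder of the trip
        if hands_on_wheel:
            self.hands_off_time = 0.0
            self.level = AlertLevel.NONE
        else:
            self.hands_off_time += dt
            if self.hands_off_time >= self.DISENGAGE_AFTER_S:
                self.level = AlertLevel.DISENGAGED
            elif self.hands_off_time >= self.AUDIO_AFTER_S:
                self.level = AlertLevel.AUDIO
            elif self.hands_off_time >= self.VISUAL_AFTER_S:
                self.level = AlertLevel.VISUAL
        return self.level

# Example: a driver who never touches the wheel for 50 seconds ends up locked out.
monitor = HandsOnWheelMonitor()
for _ in range(50):
    level = monitor.update(hands_on_wheel=False, dt=1.0)
print(level)  # AlertLevel.DISENGAGED

Even in this idealized form, the system can only prompt the driver and eventually withdraw assistance; it cannot force anyone to pay attention, which is why the question of fault keeps coming back to the person behind the wheel.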
With all of these features and sensors, one would think operation would be safer, since the car seems to have eyes all over the road. So could it be the fault of the driver after all? More importantly, is it the driver’s fault that they have a heightened sense of security because of these automated systems? Or is it the fault of a clever sales pitch?

Tesla describes its Autopilot system as an “advanced driver assistance system that enhances safety and convenience behind the wheel.” And when used properly, Autopilot “reduces your overall workload as a driver.” So if the “workload” of being a driver is reduced, do drivers feel that they don’t need to pay as much attention? While the “workload” may be reduced, Tesla says the system may only be used by a “fully attentive driver,” and even warns drivers that they must agree to keep their hands on the wheel and be ready to take control of the car at any moment. The same webpage adds that the Autopilot system “does not turn a Tesla into a self-driving car nor does it make a car autonomous.”

But if the system doesn’t turn a Tesla into a self-driving car, why does the company advertise a package called “Full Self-Driving Capability”? That same package advertises the ability for the car to drive itself to its owner with no one behind the wheel (the Summon feature) and includes a beta feature, Navigate on Autopilot, that will let the vehicle handle a highway on its own from on-ramp to off-ramp. Despite the multitude of seemingly autonomous features, Tesla’s Autopilot barely fits the true definition of autonomous: denoting or performed by a device capable of operating without direct human control. Yes, a Tesla can drive itself, but only with specific directions. The car cannot make its own decisions, outside of choosing to stop when its sensors detect an imminent crash.

So how liable is a driver in these situations? According to personal injury attorney Steve Vasilaros, the driver is 100 percent responsible. “It’s not a failsafe system, the failsafe is the driver,” said Vasilaros. “The ultimate responsibility of the way that car is operated is the driver.”

In defending someone charged after a crash that happened while a car was on Autopilot, one might argue that the manufacturer still bears some responsibility. After all, it is the manufacturer’s system, and these systems have documented failures in certain crash scenarios, so it seems reasonable that the manufacturer would share some of the fault. Vasilaros, however, argues that manufacturers bear none of it: the law places full responsibility for the car on the driver the moment they get behind the wheel. “The statute says the driver is responsible. No one else, it’s as simple as that,” said Vasilaros.

In the case of Riad’s deadly 2019 crash, charges have been filed but the case still has to go to trial. Since a case like this has yet to be tried in the United States, there are going to be a lot of eyes watching how everything pans out. “The entire legal system is watching this case,” added Vasilaros. And indeed it is. The outcome will surely set a precedent for prosecuting drivers charged in accidents where a partially automated system was active.

Now there is a new set of questions: are companies going to address the shortcomings of autopilot systems? Is there anything a company can do to prevent human error or make drivers pay closer attention? Or is there really no way for companies to address this problem? The ball is now in the court of lawmakers and the companies behind these systems to determine how the future of automated driving is laid out.