Google Bears “Some Responsibility” for Self-Driving Car Hitting Bus in California
For years, Google has touted the nearly spotless record of its self-driving vehicles. The company routinely emphasized the safety record of its cars and pushed the narrative that the only problem with self-driving vehicles is the fallibility of the human drivers with whom they share highways and roadways. In support, the company cited an array of impressive statistics: over the six-year history of the program and until this accident, its cars had been involved in only 17 minor accidents over the course of more than two million road miles. Furthermore, Google nearly always qualified the circumstances surrounding those 17 previous accidents by stating, “Not once was the self-driving car the cause of the accident.” Unfortunately for the company and for proponents of autonomous vehicles, it appears that this streak has come to an end. In a recent crash involving one of the semi-autonomous cars, fault has been attributed to the self-driving vehicle.
How Did the Autonomous Car Hit the Bus?
In what is likely a first, a computer-controlled vehicle has caused a collision with another vehicle. However, the circumstances surrounding the accident were rather complex. On one hand, even the human driver, whom California law requires to be present in self-driving and semi-autonomous vehicles, did not intervene because he expected the bus to behave differently. On the other hand, if the promise of significantly improved safety through self-driving cars and trucks is to be realized, these vehicles will need to improve significantly so that they can handle any conceivable situation. The crash reportedly occurred on February 23, 2016, in Mountain View, California, while the vehicle, a Lexus RX450h according to news reports, was traveling eastbound on El Camino Real in autonomous mode. According to the accident report filed by the company with the California Department of Motor Vehicles (DMV), the self-driving Lexus approached the road’s intersection with Castro Street. The car moved to the right side of the lane to pass traffic stopped at the light and to make a right turn. However, the car apparently did not detect sandbags positioned in the right lane around a storm drain until it had already moved down the lane. To proceed, the car needed to merge back into the flow of traffic proceeding straight through the intersection but still traveling in the right lane. Reportedly, several cars passed, and then the car attempted to merge back into this flow of traffic as a Santa Clara Valley Transportation Authority bus approached.
The vehicle apparently “thought” that the bus would allow it to merge to avoid the obstacle in the lane. However, the bus maintained its speed, and both the self-driving car and its human driver failed to react to the changed circumstances. The self-driving Lexus therefore struck the bus, making contact with the flexible, accordion-like center section of the articulated bus. According to a statement released by Google, the speeds involved in the accident were relatively low: the company stated that the car was moving at “less than two miles per hour,” while the bus was reportedly traveling at less than 15 miles per hour. While no personal injuries were reported, both the bus and the car sustained property damage; the bus suffered minor damage to its flexible articulating center section. The 15 passengers aboard the bus were transferred to a second vehicle shortly after the car accident.
Who Was At Fault for the Bus and Autonomous Car Accident?
While fault has not been officially determined in the accident, Google did appear to anticipate that it may soon face blame. In a statement released following the accident, the company admitted, “we clearly bear some responsibility, because if our car hadn’t moved, there wouldn’t have been a collision.” However, it qualified this assertion by stating that “…our test driver believed the bus was going to slow or stop to allow us to merge into the traffic, and that there would be sufficient space to do that.” There was no police investigation into the accident, and the California DMV is not responsible for determining fault in an accident. However, a spokesperson for the Santa Clara Valley Transportation Authority indicated that the authority would conduct its own investigation into the circumstances surrounding and giving rise to the accident. An investigation to determine the exact apportionment of fault remains pending.
Google States It Is Using This Accident to Improve Its Driving Algorithms
Google states that the accident will be used to improve the quality and responsiveness of its self-driving software. Google claims that it has run “thousands of variations on [the accident] in our simulator in detail and made refinements to our software. From now on, our cars will more deeply understand that buses (and other large vehicles) are less likely to yield to us than other types of vehicles, and we hope to handle situations like this more gracefully in the future.” While it is certainly a positive development that Google will incorporate this data and begin to model different types of vehicles, the implications of this announcement are rather unsettling. Even a novice driver understands that different vehicles accelerate and stop in different manners, subject to the physics inherent in each vehicle’s characteristics. For instance, large, heavy vehicles like 18-wheelers, dump trucks, box trucks, and commercial moving vans accelerate more slowly due to their immense weight and other factors. Likewise, these larger vehicles come to a stop more slowly due to the momentum created by their mass multiplied by the speed at which they travel. Most truckers, bus drivers, and other commercial drivers therefore keep greater following distances. Furthermore, due to their large size and significant weight, these vehicles’ ability to maneuver to avoid an accident is significantly less than that of a smaller vehicle. On a smaller scale, any driver of an SUV can tell you that the vehicle’s higher center of gravity makes it less nimble and less able to swerve around obstacles than a sports car or roadster. The fact that Google is only now incorporating the characteristics of larger vehicles like buses, and their resultant effects on driver behavior, into its software is troubling.
Many have assumed that these systems were advanced enough to account for the fact that the behavior of a sedan driver is likely to differ significantly from that of a commercial truck driver. It is certainly a positive that the automated vehicles will now account for these differences in behavior and react more “gracefully,” but in light of Google’s push to remove the restrictions California places on its vehicles, this revelation is troubling and carries legal and ethical concerns.
Are California’s Autonomous Car Rules and Regulations Justified?
While many have accused California of standing in the way of progress or of allowing other states to take the lead in semi-autonomous and autonomous vehicle development, it appears that there are sound justifications for its approach. Under regulations adopted by the California DMV, a human operator must be capable of taking over for the autonomous vehicle at all times. Furthermore, the human driver remains liable for all crashes and traffic violations he, she, or the computer system may cause or incur. Since the current systems apparently do not account for every variable that a human driver can perceive and use to modify his or her behavior, the current human-software alliance appears justified. While this type of system permits only a National Highway Traffic Safety Administration (NHTSA) defined Level 2 autonomous vehicle, programmers can use driver data to continue improving the software’s logic. In the meantime, these semi-autonomous vehicles can be assisted by the careful guidance of a human driver.
Injured by a Semi-Autonomous Vehicle’s Self-Park, Lane Assist, or Summoning Feature?
The technology and features deployed in new vehicles are certainly impressive and are likely only a harbinger of what’s to come. But the technology is still immature despite many recent and rapid developments. Thus, many of these features, while impressive, remain imperfect and a potential source of catastrophic injury or death. If you or a loved one has suffered a serious injury due to an autonomous vehicle, contact the attorneys of The Reiff Law Firm. For more than 36 years, our personal injury lawyers have fought for severely injured Pennsylvanians. To schedule a free and confidential consultation, call us at (215) 246-9000 today.