At one point, Tesla founder Elon Musk said that fully autonomous, self-driving vehicles were no more than three years away. Recently, he shortened that timeline, stating that autonomous vehicles produced by his company are no more than two years out. This is despite engineers from rival companies stating that, while they possess the same technology as Tesla, they believe it is not yet ready for widespread deployment.
Furthermore, after Tesla’s much-hyped launch of its Autopilot feature and Musk’s recent pronouncements, the company’s latest actions show only mixed confidence in the technology that is, quite literally, driving these vehicles. While a new “summoning” feature was unveiled, the company also implemented restrictions on Autopilot, apparently intended to curb uses of the system that Musk characterized as “crazy.” Musk probably should have realized that the enthusiasts who purchase his vehicles are likely to push the limits of the system to discover its full capabilities. But whether you characterize these behaviors as “crazy” or merely as irresponsible exuberance, the fact remains that the Autopilot feature has been pared back by the company.
Understanding Tesla’s “Summoning Feature” and Its Risks
Version 7.1 of the Tesla software that controls its vehicles introduces a new beta feature known as “Summon.” On its surface, the Summon feature looks like something out of 1980s science fiction. The feature permits a Model S or Model X Tesla to park itself when you arrive home.
Additionally, when you decide to head out to work or out for the night, the Summon feature can allow the vehicle to autonomously back itself out of the garage to the street, or to another location where you are waiting.
One owner of a Tesla vehicle posted a video of the technology in action.
However, vehicles moving by themselves do present certain risks. For instance, the vehicle appears to interface with automatic garage door openers. It is not hard to imagine an absent-minded owner showing off the vehicle by slowly walking alongside it, duplicating its path while he or she films. The individual may walk under the garage door as it is automatically closing. In most scenarios, a personal injury would likely be prevented by motion detectors and other safety features, but these features are far from foolproof and can fail. If you are concerned after receiving an injury from a self-driving car, contact one of our car accident attorneys at The Reiff Law Firm.
It is also unclear exactly how sensitive the vehicle’s rear camera is, or how effective the underlying software is at detecting a hazard behind the vehicle. For instance, the vehicle would likely stop if a young child were standing in its path. But would it recognize the child in an unanticipated position, such as lying on the ground while using sidewalk chalk in the driveway? If there were a piece of inconsequential debris that the vehicle could safely travel over, such as an empty plastic bag temporarily inflated by the wind, would the vehicle proceed or stop unexpectedly? What would occur if a family member left items in the area where the vehicle typically parks? Would the vehicle alert the owner that it was unsuccessful in parking? Or would it simply idle in the garage doorway, permitting entry to anyone who might want to enter the home? While the software requires the owner to stand nearby, it apparently does not require line of sight, which would provide visual confirmation that the vehicle completed its task successfully.
These are all important safety questions, and they perhaps explain why the feature remains decidedly in beta status. However, unlike software that runs on a desktop computer, bugs and glitches in a car could be the difference between life and death.
The Restrictions Placed on Tesla’s Autopilot
Despite Musk’s recent pronouncements regarding the Autopilot feature, including his statement during a conference call with reporters that “[It’s] probably better than a person right now,” a number of restrictions were recently placed on it. The update now sets an electronic limit on how far over the posted speed limit the vehicle can travel on certain restricted roads. Restricted roads include residential streets and any road that is not marked with a center divider. On these roads, the system is limited to traveling no more than 5 mph over the speed limit. The update also changed the vehicle’s behavior when cornering, causing it to slow as a human driver would. This change shows the importance of autonomous vehicles conforming their driving to expected behaviors. Failure to do so, as in the previous update, can increase the risk of a collision with a human driver due to the vehicle’s unexpected driving behavior.