There is little debate in the auto industry that semi-autonomous and eventually autonomous systems are the future. In fact, most people fully expect that driving will be something of a lost art in the future as increasingly advanced computerized systems assist and eventually supplant human drivers in autonomous commercial trucks, personal vehicles, taxi fleets, warehouses, and anywhere else vehicles are manually controlled.
Aside from considerations of convenience and efficiency, one of the main motivators of this switch from human-controlled cars, trucks, buses, trains, and more is the perceived safety benefit that comes with humans giving up driving. Accidents due to fatigue, human error, illness, and other inherently human conditions would become relics of the past. Theoretically and ideally, vehicle accident rates on roads, highways, and in the workplace would fall to zero, and a post-accident world would come into existence.
Unfortunately, with any great advancement and significant societal shift, there are bound to be false starts, complications, and mistakes. The transition to autonomous vehicles guided by computers is no exception to our propensity to overestimate our ability to plan and engineer while underestimating our propensity to make oversights and to miss both foreseeable and unforeseeable complications. Engineers from Jaguar charge that their colleagues at Tesla may be falling victim to this decidedly human impulse.
Potential Problems and Limitations With Tesla’s Autopilot System
Many critics of autonomous and semi-autonomous vehicles have focused on mishaps with Tesla's autopilot, claiming that such incidents may set self-driving cars back by decades.
According to reports from owners and others, Tesla Model S drivers running software version 7.0 had virtual carte blanche when it came to engaging the autopilot suite of semi-autonomous driving systems. Tesla released its beta autopilot system in October 2015 and virtually left the door open for owners to use it responsibly – however each individual might define "responsibly." In some cases, it is clear that "responsible" meant very different things to different people.
One user reported that the autopilot system was not keeping him on the roadway as expected. The user wrote:
So far I have a little over 300 miles on autopilot, mostly 20 miles at a time on my commute to and from work. The first day when I was in the right lane, as I approached exit ramps, it would dive for the exit ramp. I quickly learned to apply torque to the wheel to hold the car on the interstate until I had passed the exit.
However, the interesting part is that the system seemed to learn from driver corrections. The user continued:
Each day the system seems to have less tendency to follow the exit ramps as I pass. The last two days it only gave a momentary wiggle and moved over maybe six inches towards the exit ramp then it recovered and moved on down the road. This morning it gave only a very slight hesitation, so little that I did not have to correct it at all. I find it remarkable that it is improving this rapidly.
Another driver who was using the system found that while it was learning fast, human control was still absolutely required. This user wrote:
I noticed that on sharply curved ramp connecting I-80 west with CA-113 north in Davis, the first time it took the curve at full speed and wasn’t able to stay in lane resulting in a “take control immediately” alert. After a few more times on this curve with firm pressure on the steering wheel it’s now learned to slow down and today had no issue taking the curve. Definitely learning.
However, other users aren't so sure that the system is "learning." They report unwanted and potentially dangerous behavior and attribute the apparent improvements to changes in driver behavior and expectations. One user wrote:

I agree that the perceived improvements may be due to changes in driver behavior more than anything else. I personally have learned which sections of roadway the car can handle and which it cannot, and am more likely to disengage in areas where I know the car will have trouble. AP is still an impressive achievement and I enjoy using it for about 90% of my daily commute.

Another user, however, reported troubling behavior by the autopilot system:

I was driving on the freeway, no curves, but the Tesla got spooked by a truck passing me on the left that had a plow on the front. It jerked to the right and the alarm went off for me to take over. The truck wasn’t encroaching as far as I could tell. Perhaps the plow was closer than the wheels and that’s what it sensed.
This autopilot free-for-all resulted in many owners posting hair-raising videos of near-crashes while using autopilot. Consider this video, in which auto-steer fails and the vehicle drifts across the double-yellow line into oncoming traffic. From the audible warning that auto-steer had failed, the driver had less than one second to take corrective action.
Likewise, in this YouTube video, a Tesla vehicle being controlled by autopilot veers off to the right when exiting a road. If the driver did not keep his or her hands on the steering wheel, it is highly likely that a collision would have occurred.
Unfortunately, many drivers believe that keeping their hands on the wheel and maintaining their focus and concentration while using autopilot defeats the purpose of the system. In fact, some users have even argued that Elon Musk, Tesla's CEO, himself does not keep his hands on the wheel when demonstrating the technology to the press. In other words, consumer expectations and foreseeable uses of this technology do not square with proper and safe operation of these systems.
Engineers From Jaguar Characterize Tesla’s Autonomous Systems Implementations As “Very Irresponsible”
Engineers from Jaguar state that their vehicles have all of the technology found on Tesla vehicles, but the company has refrained from enabling autonomous driving features because its engineers do not believe they are safe or ready for widespread deployment.
Jaguar states that its engineers made the decision to require human intervention not because its semi-autonomous collision-mitigation braking isn’t strong enough to stop the car, but because of concerns that drivers would disengage and tune out due to the “false sense of security” provided by current-generation autonomous systems. Thus, in Jaguar’s systems the vehicle will begin to slow itself and provide suggestive feedback, but the driver is still responsible for applying the brakes and bringing the car to a halt.
XF project manager Stephen Boulter believes that Tesla’s decision to provide autonomous features with few to no restrictions was “very irresponsible.” He states that he is worried about the type of high-profile tragic event that “could set the technology back a decade” due to fears and safety concerns by the public and lawmakers. Boulter’s fears aren’t entirely unfounded.
Consider that in the 1920s and 1930s, it was widely believed that dirigibles would usher in a new era of luxurious air travel, not unlike travel aboard the great ocean liners of the time. In fact, in the years leading up to the Hindenburg disaster, air travel of this type had already begun in earnest. But after the Hindenburg tragedy, public fear of blimps, airships, and dirigibles essentially ended serious consideration of commercial travel by this technology. Rather than a world where luxurious airship travel was commonly available, the technology was relegated to niche uses like the Goodyear blimp. Fears over the flammability of hydrogen likely also delayed further research into hydrogen as a fuel source.
Engineers from Jaguar worry that Tesla is gambling with a rapidly developing but still immature technology. By forcing the technology into the primetime before it is ready, these engineers fear that a catastrophic event will occur that causes the public to lose faith and trust in the technology.
Automakers have good reason to worry about premature deployment, yet the race continues. Despite its concerns with semi-autonomous technology, Jaguar, like many other automakers, is still moving forward with fully autonomous testing. Volvo, a leader in autonomous development, states that it prefers full autonomy and is pushing the envelope as fast and as far as possible. In fact, Volvo claims that it will have 100 customer-leased, fully autonomous XC90s on the roads of Stockholm in 2017.
Tesla Clamps Down on the Ability to Use Autopilot Without Restrictions
As described above and confirmed by Tesla CEO Elon Musk, many owners were doing crazy things with the autopilot system. As a result, Mr. Musk and Tesla decided to place additional restrictions on the autonomous systems, perhaps in an attempt to avoid further regulatory oversight after Hong Kong’s government forced Tesla to disable auto-steer and auto lane change on Model S cars in the city.
In a recent update, version 7.1, the company ratcheted down the accessibility of autopilot. The system now keeps the car at or below the speed limit when autopilot is engaged. The update also places a number of additional limits on auto-steer, one of the most beloved but contentious features of the autopilot suite.
Autopilot technology remains present in Tesla vehicles, albeit in a more limited fashion. Other high-end vehicle makers, however, have decided not to implement comparable technology – not because they can’t or lack the hardware and software, but because their engineers and corporate executives feel that semi-autonomous technology is still too dangerous for widespread deployment.