Elon Musk is a magnetic and charismatic man. His companies, Tesla, SpaceX, and the Boring Company, all promise to redefine transportation. And, to a degree, they have. The Tesla Model 3 was arguably the most hotly anticipated car of all time. SpaceX has made promising strides toward commercial spaceflight. The Boring Company recently signed a deal with Chicago to bore a tunnel from O’Hare International Airport to downtown Chicago.
Yet there is a specter looming, and it’s expectations. Musk and Tesla have poorly communicated, and at times obfuscated, the capabilities of Tesla’s Autopilot feature. In a recent tweet, Musk referred to an upcoming Autopilot update as offering “full self-driving features.” This is in spite of several fatal crashes attributed, at least in part, to overreliance on the Autopilot system. At what point does murky marketing language become irresponsible? And what is the culpability of Musk and Tesla in these deaths?
The Walter Huang Incident
Walter Huang was killed when his Tesla struck a central lane divider on a California highway. The vehicle was under the control of the Autopilot software at the time of the crash, and had been on Autopilot for about 19 minutes when Huang died. Reports after the incident attributed the crash to Huang’s overreliance on the Autopilot software. But what degree of culpability rests with Tesla and its CEO? Elon Musk is known for overhyping his vehicles and for referring to Autopilot as “self-driving.”
“Self-Driving?”
Another fatal incident occurred in Florida in 2016, involving a Model S in Autopilot. More recently, Teslas in Autopilot have collided with parked fire trucks in both Salt Lake City and Los Angeles. These incidents have led Tesla to make its reminder system more “nagging,” prompting drivers to keep their hands on the wheel and eyes on the road. However, many have noted that such features actively contradict claims that the system is self-driving.
While the Autopilot software is certainly impressive, it is clearly not self-driving. Assertions by Musk to the contrary are irresponsible at best and downright negligent at worst. As long as the system is implicated in fatal accidents and requires constant driver monitoring, it cannot be called self-driving. Calling it anything other than what it is, advanced cruise control, is anti-consumer and dangerous.
Maybe the day will come when Tesla vehicles are truly self-driving. Perhaps Tesla is on the verge of a breakthrough, and next year its vehicles will be totally autonomous. But until the day the system stops causing fatal accidents, that promise might as well belong to a distant future we will never live to see.