The brand’s cars can be equipped with optional Full Self-Driving (FSD) software, which grants limited vehicle autonomy to those who purchase or subscribe to the add-on. Naturally, the feature is frequently touted by company CEO Elon Musk on Twitter, who even claimed during an earnings conference call in January that he is “highly confident the car will be able to drive itself with reliability in excess of human [interaction] this year.” However, around 20 accidents involving Tesla-branded cars have been recorded in the US as of late, some of them fatal. The most recent collision involved an unspecified Tesla vehicle that crashed into an overturned truck on a highway in California, killing the EV’s driver. Federal highway safety regulators and the state’s highway patrol are currently investigating the cause of the crash, with some speculating that Tesla’s FSD software may be to blame.
FSD Beta V9.0 will blow your mind. — Elon Musk, the 2nd (@elonmusk) April 29, 2021

The California DMV noted in the memo that Musk’s Twitter claims about the self-driving capability of Tesla’s cars are extrapolated beyond what the technology can currently achieve. It confirmed this after a conference call with Tesla representatives, including the company’s Autopilot engineer CJ Moore, who explained that its vehicles can only achieve Level 2 (L2) autonomy at this time. For those unfamiliar, L2 autonomy handles only partial driving tasks such as acceleration, steering and braking, and still requires the driver’s full attention. It is often used to assist drivers on highways at regulated speeds; think of it as a more advanced version of cruise control. Level 2 is not capable of getting you from your point of origin to your destination automatically.
“Tesla indicated that they are still firmly in L2,” the California DMV said in the memo. “As Tesla is aware, the public’s misunderstanding about the limits of the technology and its misuse can have tragic consequences.” Another crash in April, which involved a 2019 Tesla Model S and killed the two individuals onboard, was initially thought to have been caused by the car’s Autopilot system. Police noted that one occupant was found in the front passenger seat and the other in the back of the vehicle, with no one seemingly driving. Musk himself disputed this on Twitter, revealing that data logs recovered from the crashed vehicle indicated that the Autopilot system was not enabled and that the owner had not purchased FSD.
Data logs recovered so far show Autopilot was not enabled & this car did not purchase FSD. Moreover, standard Autopilot would require lane lines to turn on, which this street did not have. — Elon Musk, the 2nd (@elonmusk) April 19, 2021

Regardless, as noted by the California DMV, the term “self-driving” or “autonomous driving” is itself capable of causing harm. This is especially true when the feature is associated with the potential capabilities of Level 5 autonomy, which requires no human supervision whatsoever. That, however, is merely a proposed technology milestone that has yet to be fully achieved by any manufacturer, and it may take several more years before it becomes a reality.

(Sources: Reuters / CNET | Header Image: Tesla Model X)