Unfortunately, on April 18, 2021, two people died after crashing into a tree in a fatal accident in The Woodlands, Texas. The car they were in was a Tesla, and there was reportedly no one in the driver's seat. The crash has sparked a discussion about whether Tesla is falsely advertising its cars' driving abilities, or at least whether its naming scheme is misleading. So far, Tesla has denied that the vehicle was operating in Full Self-Driving mode, because it did not have an active subscription to that system, and has also said that, according to the car's data logs, Autopilot was not engaged either. People have tried to disprove that the vehicle could have been operating in these modes by showing that if you unbuckle the seatbelt, the car warns the driver and eventually pulls over, as shown in this short video:
What the video fails to show is that this entire system can be cheated extremely easily by simply buckling the seatbelt behind your back. At that point the seat's weight sensor is effectively bypassed (the system naively assumes that when the seatbelt is buckled, there is someone sitting there). The only thing that then separates you from "driving" from any other seat is fooling the sensors embedded in the steering wheel, which has proven to be extremely easy (for example, by hanging a water bottle on the steering wheel).
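To make the flaw concrete, here is a minimal sketch of what such a naive driver-presence check looks like and why the belt-behind-the-back trick defeats it. All names, signatures, and values below are hypothetical illustrations in Python, not Tesla's actual firmware logic.

```python
# Purely illustrative sketch of a naive driver-presence check.
# All names and values are hypothetical; this is NOT Tesla's real logic.

def driver_presence_ok(seatbelt_buckled: bool,
                       seat_weight_kg: float,
                       steering_torque_detected: bool) -> bool:
    """Naive check that trusts the buckle switch instead of cross-checking sensors."""
    if not seatbelt_buckled:
        return False  # unbuckled belt -> warn the driver and eventually pull over
    # Flaw: once the belt reads "buckled", the seat weight reading is never
    # consulted, so a belt clicked in behind an empty seat passes the check.
    return steering_torque_detected  # a water bottle hung on the wheel can fake this

# The exploit described above slips straight through such a check:
print(driver_presence_ok(seatbelt_buckled=True,          # belt buckled behind the backrest
                         seat_weight_kg=0.0,             # nobody actually in the seat
                         steering_torque_detected=True)) # weight hanging on the wheel
# -> True: the car believes a driver is present
```

A more defensive design would cross-check the seat's weight reading against the buckle state instead of letting a single switch stand in for an occupied seat.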
Of course, Tesla shouldn't be held responsible for its customers' stupidity, but it should prevent them from exploiting these systems, especially this easily. And although Tesla now explicitly states on its website that Autopilot and Full Self-Driving are only driver-assistance, so-called "hands-on" (the steering wheel) systems, why name them as if they let you leave everything up to the car, when this is clearly not the case? Recently, in emails to the California DMV, Tesla even admitted that its supposedly Full Self-Driving system, currently available to only 2,000 beta testers, works only as a Level 2 (hands-on) autonomous system (on a scale from Level 0 to Level 5). In my opinion, all of the "autonomous" systems in Tesla's lineup are just Advanced Driver Assistance Systems (ADAS) and should be advertised as such – everywhere. The naming scheme itself is highly suggestive, implying the cars can do much more than they actually can.
Elon Musk has defended the name "Autopilot" by pointing out that airplanes also have an autopilot system and the pilot still has to stay attentive and ready to take over. I think this argument is invalid, because pilots (especially commercial ones) are actually trained to use these features and have to accumulate a certain amount of experience before they are allowed to operate such machines.
Although the official position for now is that the car was not operating in any of Tesla's autonomous modes, how was it even moving if there was no one in the driver's seat? Assuming the car actually did have Autopilot engaged, this would not be the first accident involving a Tesla operating in that mode, despite statistics saying that Tesla's Autopilot is almost 10 times safer than the "average" vehicle: one accident per 4.19 million miles traveled with Autopilot engaged, versus one crash per 484,000 miles for human-operated cars (see the quick check after this paragraph). Even though that number is impressive, it could still be improved if everyone driving a Tesla understood what the systems their cars are equipped with really are.
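As a quick check of where the "almost 10 times" figure comes from, dividing the two rates quoted above gives roughly an 8.7x difference:

```python
# Sanity check of the ratio between the two quoted crash rates.
miles_per_accident_autopilot = 4_190_000  # one accident per 4.19 million miles (Autopilot engaged)
miles_per_accident_average   = 484_000    # one crash per 484,000 miles (human-operated)

ratio = miles_per_accident_autopilot / miles_per_accident_average
print(f"Autopilot goes {ratio:.1f}x farther per accident")  # ~8.7x, i.e. "almost 10 times"
```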
Stay safe, and remember to always pay attention to the road, whether you are using ADAS systems or not. Don’t text and drive. Stay hydrated.
Sources:
https://edition.cnn.com/2021/04/19/business/tesla-fatal-crash-no-one-in-drivers-seat/index.html