Once again, trouble with the “Autopilot”!
Tesla’s supposed ability to drive autonomously has brought the company a lot of trouble. Among other things, several accidents, some with fatal outcomes, are said to be attributable to the assistance system. Now Tesla’s “Autopilot” has drawn the attention of the authorities once more.
The US traffic safety authority NHTSA is currently investigating several accidents from the past few years. In these crashes, Tesla vehicles running on “Autopilot” drove into existing accident scenes, some of which were already being attended to by first responders. There were reportedly numerous injuries and, in two cases, fatalities. The circumstances and causes have not yet been clarified.
According to media reports, in 16 cases the NHTSA was able to determine that “Autopilot” handed control of the vehicle back to the driver less than a second (!) before impact. In such a short time, the driver has no chance of taking over the wheel.
Tesla boss Elon Musk had repeatedly asserted in the past that data analyses showed the Autopilot was not active at the time of the accidents, and that the company was therefore not to blame.
However, if the ongoing investigation, now extended to up to 830,000 cars, finds that the Autopilot was deliberately deactivated immediately before a crash, for example to avoid liability, Tesla could face major problems. The carmaker’s image would likely suffer enormously.
Tesla boss Elon Musk, who has made corporate communications his personal domain and prefers to speak out via Twitter, has not yet responded to the latest allegations.
Tesla’s “Autopilot” is not an autopilot
Whether Tesla also has to fear legal consequences is still unclear. Although the brand likes to use the term “Autopilot”, suggesting that its vehicles can drive autonomously, in reality the system is “only” a driver-assistance system at so-called Level 2.
This means the cars accelerate, brake and steer on their own, but the driver remains solely responsible at all times and must not let go of the steering wheel.
Not so at Mercedes: the Swabian manufacturer recently became the first in the world to receive approval for so-called Level 3 autonomy. While the car is in control, the driver may turn to other things, such as reading e-mails, but must always be able to take back control.
Mercedes plans a handover time of ten seconds for this: if the driver does not react within that period, the car initiates an emergency stop that is safe for following traffic.
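To illustrate the idea, and not Mercedes’ actual implementation, a minimal sketch of such a takeover-request timer could look like this. Only the ten-second window and the safe-stop fallback come from the description above; the function names and the vehicle interface are assumptions.

```python
import time

TAKEOVER_WINDOW_S = 10  # handover time described above (Level 3)


def level3_takeover_request(driver_has_taken_over, initiate_safe_stop):
    """Ask the driver to take back control; fall back to a safe stop.

    driver_has_taken_over: callable returning True once the driver reacts
    (hypothetical driver-monitoring signal).
    initiate_safe_stop: callable performing an emergency stop that is safe
    for following traffic (hypothetical vehicle interface).
    """
    deadline = time.monotonic() + TAKEOVER_WINDOW_S
    while time.monotonic() < deadline:
        if driver_has_taken_over():
            return "driver in control"
        time.sleep(0.1)  # poll the driver-monitoring signal
    # Driver did not react within the window: bring the car to a safe stop.
    initiate_safe_stop()
    return "safe stop initiated"
```

The point of the ten-second window is exactly this fallback path: the system never simply drops control, it either confirms the driver has taken over or brings the vehicle to a controlled stop.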
In addition to the numerous accidents said to be related to Tesla’s Autopilot, complaints from drivers have also been piling up recently.
On the NHTSA website, many drivers report, among other things, so-called phantom braking: emergency braking on the open road even though there are no obstacles, no other vehicles and no other apparent reason.
A problem Tesla is not alone with: drivers of other brands also regularly report assistance systems that have a “life of their own”.