Tesla Motors Inc. was in the news last week after a Model S reportedly went rogue – now, another Model S has refused to activate automatic braking, forcing the driver to take over from Autopilot before ramming into another car at 40 miles per hour. Tesla developed its Autopilot program as the first stride toward outfitting cars with self-driving abilities. In fact, Tesla has smartly marketed Autopilot to look more like self-driving tech than a driver-assist system.
Nonetheless, industry experts seem to agree that Tesla's Autopilot is the best-in-class driver-assist technology on the market. However, rivals such as Ford, Volvo, and Daimler AG think Autopilot is overrated and that the loud hype makes it hard for consumers to pay attention to its limitations. Now, it appears that Elon Musk has spread the hype too thin, and drivers are starting to pay more attention to Autopilot's weaknesses.
Can you trust Autopilot with your life?
Ars Technica reports that a driver, Arianna Simpson, has claimed that the safety features in her Model S went into hibernation, refusing to come to her aid and prevent her from crashing into the back of another vehicle at speed. Simpson says she was driving north from Los Angeles on I-5 with Autopilot in control when the car acted irresponsibly.
In her words, “All of a sudden the car ahead of me came to a halt. There was a decent amount of space so I figured that the car was going to brake as it is supposed to and didn’t brake immediately. When it became apparent that the car was not slowing down at all, I slammed on the brakes but was probably still going 40 when I collided with the other car.”
However, Tesla has denied any liability in the crash, and it has data from the vehicle log in its defense. The firm says the data revealed that Simpson hit the brakes, deactivating Autopilot and traffic-aware cruise control and returning the car to manual control. Simpson's account of the event and Tesla's vehicle log seem to say the same thing – she applied the brakes and took over from Autopilot.
An official statement from the firm says, “Safety is the top priority at Tesla, and we engineer and build our cars with this foremost in mind… Autopilot is by far the most advanced such system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility.”
Yet we cannot ignore the fact that the driver was forced to make a split-second decision once it became apparent that Autopilot might not stop the car. A test re-creation of the crash might show Autopilot stopping the car if the human did not take over, but not many people can honestly claim they have the mental fortitude to trust a computer with their lives when a crash at highway speed appears imminent.
Autopilot is starting to make serious mistakes
On Thursday, we reported that the owner of a Model S claimed that the car drove itself off before crashing into the back of a trailer. Tesla Motors was quick to refute the driver's claims. In fact, the firm cited data (complete with dates and timestamps) from the vehicle logs to argue that the driver acted irresponsibly by activating Summon on the side of the road instead of using it on private property.
The driver of the "rogue Model S" has maintained his innocence despite Tesla's "data-driven" claims, arguing that he would have seen the car driving off during the 20 seconds to one minute he spent beside it. Moreover, if Summon was in control as the firm claims, common logic dictates that we ask why Summon couldn't activate automatic braking to stop the car from crashing into a trailer.
The less-than-intelligent (sometimes outright stupid) actions of some drivers when Autopilot is engaged are also making regulators lose sleep over the technology's suitability for public roads. Some drivers pick up a book to read while Autopilot is at work; others jump into the backseat to watch a video or take a nap. Thankfully, a Canadian expert has warned people to think twice about having sex in the car while trusting Autopilot to take them from point A to point B.
Many of us are enthralled by the idea of getting inside a car, putting our feet up, and doing something more interesting than driving while the computer takes us from point A to point B. However, Autopilot is far from the point where it can absolve you of driving responsibility.
Tesla hires Audi veteran to build Model 3
The Model S is probably the most reliable car on the road, and Consumer Reports has never seen a car with such a high level of buyer approval. However, the recent Autopilot issues might start to chip away at the armor of the Model S's reputation. The Model X still has many issues that need to be fixed (2,700 units have already been recalled), and it doesn't appear that the SUV will ever earn the kind of approval the Model S has enjoyed.
Tesla Motors has forayed into the mass market for automobiles, and it is no longer news that the Model 3 is now the most important vehicle for Tesla. The firm has already recorded almost 400,000 preorders backed by $1,000 deposits, yet the car is still a work in progress that won't hit the roads until 2017. Tesla wants the best odds of success for the Model 3, and the firm has hired Audi AG's Peter Hochholdinger as its new VP of vehicle production.
Hochholdinger's hiring marks a deviation from Tesla's usual pattern of hiring software engineers and people who don't have much experience building cars. He has 22 years of experience across Audi's manufacturing chain, where he was behind the production of the Audi A4, A5, and Q5. More interesting is the fact that he oversaw the production of almost 400,000 cars when he was at Audi. I think Musk made a smart hiring decision that could increase the Model 3's odds of success.