In response to reports that Autopilot-equipped Model S EVs have been developing minds of their own, Tesla Motors Inc has released a software update with additional safety features designed to prevent vehicles from going rogue. The over-the-air software update began going out to all Model S owners on Tuesday.
The firm was the subject of media reports and conspiracy theories last week when a couple of Model S vehicles were reported to have gone rogue. In fact, some folks went as far as hinting that hackers were now playing chess with the cars. The first Model S that allegedly went rogue started itself and then crashed into the back of a trailer.
Of course, Tesla denied that its Autopilot system was to blame, noting that data logs indicate the driver might have activated Summon accidentally. In the second instance, a woman claimed that the Autopilot in her Model S failed to apply automatic braking when the car ahead of her suddenly slowed down. The failure forced her to take manual control, but it was too late to prevent her from slamming into the other vehicle at 40 miles per hour.
Your Model S won’t be going rogue anytime soon
A posting on Reddit in the early hours of the day suggests that Tesla is sending out an over-the-air update that will make it harder for Summon to act of its own accord. The update now requires drivers to confirm the direction of travel, forward or reverse, so that they are aware Summon has been activated.
Before now, it appears that it was too easy to unwittingly activate Summon through the gear selector inside the car. The update ensures that drivers make a conscious decision to engage Summon by choosing the direction of the auto park feature. Before you can "mistakenly" engage Summon, you'll need to activate it via the gear selector, choose the direction of travel, and then enable Continuous Press; hence, there's a lower likelihood that you'll unknowingly empower your car to drive itself off.
However, it is a little too early to know whether the new feature will become another distraction that drivers learn to respond to without actually engaging their brains. Studies have shown that most people have become so saturated with software EULA warnings that they simply go through the motions without thinking, even when the warning says, "this software may harm your computer."
Model S won’t go rogue, but Tesla can’t say the same for crashes
Interestingly, the software upgrade does practically nothing to stop the car from crashing into objects directly in front of it. In the two instances of a Tesla gone rogue recorded last week, it is obvious that the cars couldn't avoid driving into a trailer and another car on the highway.
Tesla has explicitly said that drivers should keep an eye on the car while Summon is engaged; yet, that guidance rings hollow if people cannot trust their cars to apply the brakes before a crash. Many people will now start thinking twice about trusting Autopilot, and such skepticism is fodder on which Tesla's rivals tend to thrive.