Tesla Motors Inc. released Autopilot mode in the hope of reducing accidents caused by human error, but at the same time it warned that human supervision would still be needed. However, people have not paid heed to the warnings and have been attempting reckless stunts, leading to accidents and drawing the feature into controversy.
Tesla has been mostly quiet on this, but now it has taken to Twitter to discuss the issue and remind people of the correct usage of Autopilot.
Autopilot – how does it actually work?
On Friday, the EV firm took to its official Twitter account and shared Wired’s report on the actual working of its Autopilot. Tesla tweeted the Wired report, saying, “Let’s break down how Tesla’s self-driving autopilot actually works http://bit.ly/2by1A7K.” Over a week ago, another Tesla car crash was reported, which probably prompted the firm to send this tweet.
The aim of the report is to inform consumers that Autopilot does not mean the driver can stop supervising the car, and that despite being self-driving, Tesla cars are not at all designed to be driverless. The video report from Wired explores the different hardware components that enable the self-driving feature to function. In its video, Wired reviews how Autopilot’s radar and cameras work together to make driving safe, easy and fun.
In the video, a person is seen driving his Tesla Motors Inc. Model S in Autopilot mode, which prompts him to keep his hands on the wheel. But the driver does not listen, saying, “I’m supposed to keep my hands on the wheel, but you can see how relaxing it does feel. It lulls you to this sense of security.”
No fault of Tesla
A report in Electrek says that the latest crash took place on August 7 in Kaufman, Texas. The driver admitted that he was not paying attention when his Model S hit a guardrail.
Speaking to Bloomberg, the driver said, “I used Autopilot all the time on that stretch of the highway. But now I feel like this is extremely dangerous. It gives you a false sense of security. I’m not ready to be a test pilot. It missed the curve and drove straight into the guardrail.” He added that the car did not stop but continued to accelerate after first hitting the guardrail.
Speaking in Tesla’s defense, Electrek said that, contrary to what the driver suggests, Autopilot does not give a ‘false sense’ of security. Rather, the feature is meant to reduce the driver’s workload and make driving safer, but the driver still needs to stay vigilant while on the road.
Two weeks earlier, another accident involving a Model S took place in Beijing, China. There, however, the driver admitted that he was paying attention to his phone instead of the road when the crash took place, Electrek reported. Despite that, he blamed the Autopilot feature for the mishap.