Tesla Motors Inc. has made waves with the release of its Autopilot feature on the Model S, widely regarded as the best-in-class driver-assist system on the market right now. Autopilot uses a combination of radar, cameras and sensors to change lanes, speed up or slow down, and ease into parking spots. The tech world was smitten with the feature, and the rave reviews and YouTube videos showing it at work have all boosted Elon Musk’s image.
The release of Autopilot has also raised new questions about the safety of road users when humans hand over the wheel to bots. YouTube has become home to videos showing some of the crazy things Model S drivers do once Autopilot is engaged: one driver started snacking, and another climbed into the backseat. Now, Tesla wants to constrain Autopilot in order to force drivers to behave responsibly.
Tesla constrains Autopilot
The crazy choices that drivers are making behind the wheel once Autopilot is engaged have forced Tesla to start rethinking how the feature can be used. CEO Elon Musk, speaking during the earnings call, hinted at the need for extra constraints on the feature. In his words, “There’s been some fairly crazy videos on YouTube … this is not good. And we will be putting some additional constraints on when Autopilot can be activated to minimize the possibility of people doing crazy things with it.”
On Tuesday, James Chen, VP of regulatory affairs at the firm, confirmed the plan to rein in Autopilot. He said, “we are looking at ways that we can make Autopilot a little bit more user friendly and perhaps a little less capable of being abused… We have had some people do some pretty scary things, and we are looking at ways to improve the system so that we don’t see this type of abuse.”
Tesla Motors has said that the Autopilot feature does not make the Model S a self-driving car. In fact, drivers are supposed to keep their hands on the wheel at all times (or touch the wheel at intervals) so that they can take over when the need arises. Khobi Brooklyn, speaking for the firm in October, said, “We’ve been very clear with our customers what the intention of these features are, and we trust our customers and we expect them to be responsible.”
Tesla Motors must go on the offensive to avoid playing defense
Tesla Motors has yet to convince regulators in Hong Kong that its Autopilot feature can be trusted. The last thing it needs now is an accident caused by human error while Autopilot is engaged. Moving to constrain the feature is a smart decision that could save the firm from headaches down the road.
You can imagine the public outcry that Tesla would have to contend with should a careless Model S driver run into another road user simply because the driver left the wheel to Autopilot. The NHTSA, which has been largely silent about rules for self-driving cars since Tesla’s release of the feature, might suddenly discover its rule book, Tesla bears would have a field day, and folks with an ax to grind with the firm would start writing their negative reviews.