With self-driving cars appearing on the road in rapidly growing numbers, how do we know whether it is safe to ride in one, or even to drive near one?
Tesla, Inc. is a fully electric automotive company founded in 2003. It released its first production car, the Model S, in 2012, followed by the Model X in 2015 and the Model 3 in 2017. Tesla's stated goal is to make driving safer for the public and to reduce the risk of injuries and driving-related fatalities. To achieve that goal, the company has been continuously improving Autopilot, its driver-assistance technology intended to eventually enable full autonomy, which is available on all of its cars.
However, Tesla has made headlines in the past for accidents involving Autopilot: three fatal and two non-fatal, in which Autopilot failed to recognize an object on the road. Two of these accidents took place in 2016, when the technology was still in its first iteration. Another fatality occurred in 2018, when Autopilot failed to recognize a concrete barrier while accelerating.
Despite these crashes, Tesla had kept its safety reports and statistics a closed book ever since the Model S launched in 2012. This was most likely because the technology was new and Tesla wanted to keep collecting data, releasing it only once the cars could live up to their predicted safety ability.
Tesla, Inc. recently announced that it will release quarterly safety reports. On October 4, it published promising third-quarter data on Autopilot safety. In addition, the National Highway Traffic Safety Administration (NHTSA) released crash-test results on the safety of Tesla's cars.
In its October 4 report, Tesla announced that it registers one "crash-like event" (including near misses) for every 3.34 million miles driven with Autopilot engaged, and one for every 1.92 million miles driven without it. Tesla compared this data to NHTSA's 2016 figures, which found a car crash every 492,000 miles.
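To put these figures side by side, here is a minimal sketch of the comparison; the mileage numbers come from the report figures above, while the ratio calculation itself is my own illustration:

```python
# Miles driven per recorded crash-like event, taken from the figures above.
MILES_PER_EVENT = {
    "Autopilot engaged": 3.34e6,
    "No Autopilot (Tesla)": 1.92e6,
    "NHTSA 2016 average": 492_000,
}

baseline = MILES_PER_EVENT["NHTSA 2016 average"]
for mode, miles in MILES_PER_EVENT.items():
    # How many times farther a car travels per event than the 2016 national average
    print(f"{mode}: {miles / baseline:.1f}x the national average")
```

By this arithmetic, a Tesla on Autopilot goes roughly 6.8 times as far between crash-like events as the 2016 national average, and a Tesla without Autopilot roughly 3.9 times as far.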
Furthermore, the NHTSA concluded that Tesla's three production vehicles, the Model X, Model S, and Model 3, have the third, second, and first lowest probabilities of injury the agency has ever tested, respectively, each earning a full 5-star rating.
The public often questions the safety of autonomous vehicles. Many people are skeptical because of the media's portrayal of the few faults in Tesla's technology. However, data and third-party experts tell a different story.
According to The Guardian, "autonomy gives drivers a false sense of security." The newspaper reported on a Tesla stopped in the middle of a highway with a man fast asleep while using Autopilot (the car stopped because it was not receiving any human feedback). Because Autopilot fosters this false sense of security, The Guardian argues, drivers will become distracted, gazing off or at their smartphones, and will be slow to react in situations where the software does not know what to do. Even so, the newspaper does not say that the technology should be abandoned or that it is unsafe, only that it is "not failsafe."
Wired magazine's article "Tesla's Favorite Autopilot Safety Stat Just Doesn't Hold Up," published on May 4, 2018, questions how accurate Tesla's statistic is that Autopilot reduces crash rates by 40 percent. Experts interviewed say the statistic is unreliable because there are too many variables in classifying what a safe driver is or how an average driver drives. Age is also a factor, and with one-third of accidents involving drunk driving, the data for non-autonomous vehicles is skewed. Even though Tesla has released reports stating how "safe" its Autopilot is, Wired reiterated that "humans are deeply bewildered about the semiautonomous features."
A second Wired article, written just thirteen days later, "Is Tesla's Autopilot Safe? Finding Out Demands Better Data," reiterates that "every time a Tesla with its semiautonomous Autopilot feature crashes it makes headlines." The article goes on to explain that "many are fearful" and that the NHTSA's investigations of Autopilot crashes lead people to give Tesla and its technology the cold shoulder. Even while questioning Tesla's data, the article maintains that Autopilot has the ability to save lives and "minimize the fallout when human drivers get distracted."
Tim Menke, a Harvard graduate with a Ph.D. in physics, broke down how autonomous driving systems work. He explains that these systems use machine learning algorithms to train themselves; this process allows the computer to choose what action to take in a given situation. Using Tesla as an example, Menke cautions that "the technology is not yet ready for full autonomous driving," citing the incident in which a driver was killed when the "partial self-driving feature crashed into a turning truck that it did not recognize as an obstacle." Despite the accident, Menke strongly believes that the "new technology has potential to deeply transform transportation."
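The idea of learning actions from examples can be shown with a deliberately tiny sketch. This is purely illustrative and is not Tesla's actual system: real driving software uses vastly larger models and sensor suites, and the feature names and numbers below are invented for the example. It picks the action whose stored training situation most closely matches the current sensor reading (a nearest-neighbor rule):

```python
# Toy "learned" policy: each training example pairs a sensor reading with an
# action, and new situations reuse the action of the closest stored example.
# All values here are hypothetical; real systems learn from millions of miles.
TRAINING_DATA = [
    # ((distance_to_obstacle_m, obstacle_speed_mps), action)
    ((50.0, 0.0), "brake"),           # stopped obstacle at a moderate distance
    ((120.0, 25.0), "follow"),        # moving car far ahead
    ((15.0, 0.0), "emergency_stop"),  # stopped obstacle very close
]

def choose_action(distance_m: float, speed_mps: float) -> str:
    """Return the action of the nearest training example in feature space."""
    def sq_dist(features):
        d, s = features
        # Squared Euclidean distance between stored and current readings
        return (d - distance_m) ** 2 + (s - speed_mps) ** 2

    _, action = min(TRAINING_DATA, key=lambda pair: sq_dist(pair[0]))
    return action

print(choose_action(40.0, 0.0))  # a stopped obstacle fairly close ahead
```

The point of the sketch is that no rule for "40 meters" was ever written down; the system generalizes from the examples it was trained on, which is also why unfamiliar situations (like the turning truck Menke describes) can defeat it.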
The new Autonomous Vehicle Technology study team at MIT is studying exactly how drivers use the Autopilot feature in Tesla cars. The team is conducting an ongoing study using its own hardware and software to collect data on not only how Autopilot works but how humans work with it. The data collected include road characteristics, human reactions, and driver emotions. Lex Fridman, a member of the team, says that with this data the researchers will be able to understand how humans behave and interact with the Autopilot system. As for how safe the technology is today, he agrees that until autonomous driving systems are "sufficiently developed to 'solve' the full driving task… human beings will remain an integral part of the driving task."
Most experts agree that while autonomous driving technology is capable of driving a car, it still has much room for improvement and a long way to go before it can replace a human driver. Although experts state that the technology is safe as long as a human is paying attention at the wheel, some of the public remains skeptical. To gauge that skepticism, people with no experience of autonomous driving were surveyed about their opinions of the technology.
To find information on Tesla's Autopilot safety record, I first visited Tesla's own website to build a solid background on the company, its issues, and its successes. I then searched a database via researchIT CT and found popular sources from Wired magazine and The Guardian. Because the topic is still fairly new, I was unable to find sufficient academic sources about Autopilot through that process, so I went directly to Massachusetts Institute of Technology and Harvard University publications to find their own research on the topic.
To collect useful data on people's opinions of Tesla's Autopilot safety, I asked only those aged fourteen and older who had never driven a Tesla to fill out the survey. This was useful because it gave me a general idea of how the public views the safety of a machine making its own decisions to drive you.
The survey was filled out by twenty-eight participants ranging in age from fourteen to thirty-four over a three-week period. Fourteen (50%) were between fourteen and seventeen years old, eight (28.6%) were between eighteen and twenty-four, and six (21.4%) were between twenty-five and thirty-four.
According to the collected data, three (10.7%) of the participants are negatively influenced by the media's portrayal of Tesla's Autopilot, answering "Yes" (the choices being "Yes," "No," and "Maybe") when asked whether the media affects their opinion of the technology in a negative way. When asked if they would ever use the Autopilot feature given the opportunity, four (14.3%) chose "No" and eight (28.6%) chose "Maybe." These figures suggest that the technology still strikes many as a questionable concept. When asked to rate the safety of Autopilot on a scale of one to ten, fifteen (53.6%) of the participants rated the feature a seven or higher, with five (17.9%) choosing a nine or ten.
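The percentages above follow directly from the raw counts out of twenty-eight respondents. A minimal sketch of that arithmetic, using the counts from the survey figures above:

```python
# Raw survey counts out of twenty-eight respondents, from the figures above.
TOTAL = 28
counts = {
    "media negatively affects opinion (Yes)": 3,
    "would not use Autopilot (No)": 4,
    "unsure about using Autopilot (Maybe)": 8,
    "rated safety 7 or higher": 15,
    "rated safety 9 or 10": 5,
}

for label, n in counts.items():
    # Percentage of all respondents, rounded to one decimal place
    print(f"{label}: {n}/{TOTAL} = {100 * n / TOTAL:.1f}%")
```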
When asked why they chose the rating they did, many responses fell under the category of "it provides safer driving, without human error." However, when asked for other opinions on Tesla's Autopilot, two of the seven responses expressed negative views of the technology, with one stating, "Don't rely on technology [it] is potentially lethal." The other five responses held that the technology is safer than a human driving alone but still has a long way to go before it becomes a standalone technology.
Taken together, the experts' findings and the survey responses suggest that many people do in fact believe in the safety of autonomous vehicles, and of Tesla's Autopilot specifically. Even though some disagree and believe the technology should not be trusted, the data drawn from the millions of miles driven by these vehicles suggest that, with the help of artificial intelligence, safer driving lies in the very near future.
Electric cars make up only about 1 percent of the car market today, but Tesla's sales show its cars being purchased at a rapidly growing rate. These purchases will help improve Autopilot: judging by the data Tesla has already collected, the technology can only get better as it encounters more traffic situations. Those who dismiss the technology now will inevitably come face to face with it one day.
Between the experts' research and the data gathered through this survey, both groups seem to agree that the technology is not one hundred percent reliable. They also agree, however, that autonomous vehicles will soon be safe enough to further reduce motor vehicle accidents.
With engineers at Tesla and other companies continuously improving their autonomous driving features, Autopilot's ability to drive itself only seems to be getting better. As software updates roll out, more data is collected, and machine learning continues, autonomous vehicles will learn the ways of the road and how to navigate them properly and efficiently.
“News | Tesla.” Tesla, Inc., www.tesla.com/blog.
“Tesla Model X in California Crash Sped Up Seconds Before Impact.” Bloomberg.com, Bloomberg, 7 June 2018, www.bloomberg.com/news/articles/2018-06-07/tesla-model-x-in-california-crash-sped-up-seconds-before-impact.
Solon, Olivia. “Who's Driving? Autonomous Cars May Be Entering the Most Dangerous Phase.” The Guardian, Guardian News and Media, 24 Jan. 2018, www.theguardian.com/technology/2018/jan/24/self-driving-cars-dangerous-period-false-security.
Marshall, Aarian. “Tesla's Favorite Autopilot Safety Stat Just Doesn't Hold Up.” Wired, Conde Nast, 4 May 2018, www.wired.com/story/tesla-autopilot-safety-statistics/.
Marshall, Aarian. “Is Tesla's Autopilot Safe? Finding Out Demands Better Data (And a Lot More Math).” Wired, Conde Nast, 17 May 2018, www.wired.com/story/tesla-autopilot-crash-safety/.
Menke, Tim. “Self-Driving Cars: The Technology, Risks and Possibilities.” Science in the News, Harvard University, 28 Aug. 2017, sitn.hms.harvard.edu/flash/2017/self-driving-cars-technology-risks-possibilities/.
“Tesla Autopilot Miles.” MIT Human-Centered AI, hcai.mit.edu/tesla-autopilot-miles/.