Response to Tesla’s Autopilot defects is not an overreaction


Kayla Caplin, Staff Writer

Every day, thousands of people around the world are injured or killed in car crashes.  In the United States, motor vehicle accidents are the leading cause of death for teenagers.  Almost all accidents are caused by human error: impaired vision, drowsiness, intoxication, or simple mistakes.

In an effort to make cars safer, electric vehicle manufacturer Tesla introduced a feature called Autopilot in 2014 that lessens humans' role in driving by making cars semi-autonomous: able to steer, accelerate, and brake on their own on certain marked roads.  While this may seem like a great milestone on the road to fully self-driving cars, the feature has recently caused considerable controversy.

In August, the National Highway Traffic Safety Administration announced that it had opened an investigation into the safety of Tesla's Autopilot feature following eleven accidents.  Those crashes killed one woman and injured seventeen other people.

These accidents were alarming; most involved crashes into stationary objects, such as concrete barriers and parked emergency vehicles, that Autopilot failed to recognize.  A number of traffic safety and engineering experts welcomed the investigation and criticized the Autopilot software’s defects. 

However, others questioned whether all the attention paid to the Autopilot crashes is warranted.  After all, approximately 38,000 people died in 2020 in crashes involving human-driven cars.  That makes the eleven Autopilot crashes, which occurred over the course of several years among the hundreds of thousands of Autopilot-equipped Teslas on the road, seem insignificant in comparison.  Why, some may ask, are we not more alarmed by the hundred-plus non-autonomous car crash deaths that take place each day in the US?

“The number of autopilot deaths in Teslas sound really scary and it’s awful that so many people have lost their lives but would the number be less, the same, or more if there was a person in control of the wheel rather than a computer?” asked senior Ellie Shapiro. 

Human driving is inherently dangerous.  Last year's 38,000 deaths from conventional car crashes overshadow the handful linked to Tesla's Autopilot.  Although it is uncomfortable to weigh some lives against others, it is worth asking whether more of those 38,000 people would still be alive today had Autopilot been driving instead of humans.  This raises an important question: are accidents caused by computational errors really on the same plane as those caused by human errors?

“I don’t think we are overreacting to Tesla’s autopilot defects.  Even if there were more than 38,000 normal car crashes last year, they were caused by human error.  In Tesla’s Autopilot crashes, they can all be fixed and avoided by solving the error in computer programming which would have eliminated the eleven preventable accidents,” said freshman Naomi Caplin.  

Human error is inevitable.  A software defect is not.  It does not make sense to accept deaths that could have been prevented by recalling vehicles and correcting the software, precise actions that the NHTSA could force Tesla to take, depending on the results of the current investigation.

“We are underreacting to Tesla’s Autopilot defects.  A computer is not capable of driving completely safely and although humans cause their own accidents, Tesla’s system is rightfully being investigated and should be tested more,” said senior Talya Pecullan.  

Moreover, accidents caused by computers strike at random.  While human drivers cause far more crashes overall, individual drivers retain a degree of control over their own fate that autonomous cars remove.  Crashes may be inevitable in the aggregate, but most individuals can avoid needless crashes by simply driving responsibly.  When people place their lives in the hands of a third party, such as a computer, it is essential for that party to be completely reliable.  Even 99.99% safety is not enough when the unlucky 0.01% are determined by chance.

When an accident can be avoided with an updated program, it is essential that the defect be investigated and fixed to ensure the safety of everyone on our highways.  Tesla's Autopilot system should not be regarded as a standard of safety until it is no longer possible for a mere programming error to cost human lives.  The standard for Tesla's autonomous driving features should be nothing less than perfection.