By now, there is no doubt that you've heard about the fatal car crash of Saturday, May 7, which left 40-year-old Joshua Brown dead and the Tesla Model S he was riding in totaled.
Likewise, you've no doubt heard about the fallout from the event. The general public and much of the media have begun to question the viability of the vehicles that were once viewed as the future of driving. The National Highway Traffic Safety Administration (NHTSA) itself has begun to look into the matter, especially after a second incident, the high-speed rollover on the Pennsylvania Turnpike on July 1.
While there is nothing wrong with questioning just how viable semi-automated vehicles are, there is definitely something wrong with the way the public and much of the media are going about it. Why? Because statistics suggest that, generally speaking, such vehicles are safer than those driven by humans.
According to NHTSA, in 2014, the most recent year for which these statistics were recorded, there was one fatality for every 100 million miles driven in the country. By comparison, according to Tesla, the May 7 accident was the first death in 130 million miles of driving on Autopilot.
Yes, the death of Joshua Brown was a tragedy, but it was also one of many deaths on the road that fateful day, most of which went unnoticed. May 7, 2016 saw hundreds of other crashes, among them fatal ones such as a wreck in Chicago that left one person dead and six injured, and another on Florida's I-95 that left four people dead.
The numbers get even more staggering on a global scale, with the Association for Safe International Road Travel (ASIRT) noting that nearly 1.3 million people die each year in vehicle crashes worldwide. In other words, according to that statistic, there is a fatality roughly every 60 million miles driven; put in the context of this discussion, about 3,287 people died in car crashes around the world on May 7 alone.
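For readers who want to check the math, here is a minimal sketch of the arithmetic behind these comparisons. It simply divides the figures cited above; note that ASIRT's widely quoted per-day figure of 3,287 corresponds to an annual total closer to 1.2 million than 1.3 million.

```
# A quick sketch of the arithmetic behind the fatality rates cited above.
# Figures come from the sources named in the article (NHTSA, Tesla, ASIRT);
# the conversions are simple division.

US_MILES_PER_FATALITY = 100_000_000         # NHTSA, 2014: one death per 100M miles
AUTOPILOT_MILES_PER_FATALITY = 130_000_000  # Tesla: first death after 130M Autopilot miles
GLOBAL_DEATHS_PER_YEAR = 1_300_000          # ASIRT: nearly 1.3M road deaths annually

# Deaths per mile driven, so lower is safer.
us_rate = 1 / US_MILES_PER_FATALITY
autopilot_rate = 1 / AUTOPILOT_MILES_PER_FATALITY
print(f"Autopilot fatality rate vs. 2014 US average: {autopilot_rate / us_rate:.0%}")
# Prints 77%, i.e. roughly 23 percent lower per mile.

# Road deaths worldwide on any given day, May 7 included.
print(f"Approximate road deaths worldwide per day: {GLOBAL_DEATHS_PER_YEAR / 365:,.0f}")
# Prints 3,562; ASIRT's own per-day figure of 3,287 reflects the
# slightly lower annual estimate of about 1.2 million.
```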
When you consider those numbers, doesn't the coverage of this one event seem a little extreme? It's not as if the Tesla Model S was ever branded as foolproof, either. The Autopilot technology was openly in beta, and the driver was warned repeatedly to keep his hands on the wheel.
"Many unforeseen circumstances can impair the operation of Traffic-Aware Cruise Control," the vehicle's manual tells drivers. "Always drive attentively and be prepared to take immediate action."
Unfortunately, Brown was unprepared to take such action because he was reportedly too busy watching Harry Potter to notice the tractor-trailer turning across the highway in front of him.
However, even with all of these facts on the table, people are already calling for a ban on semi-automated vehicles. Forget that the Department of Transportation attributes 94 percent of car accidents to human error, or that this one could be chalked up to a mix of technological limitation (known from the get-go) and human error; humans, apparently, are infallible when it comes to operating heavy machinery, so the technology must be to blame.
This incident is most assuredly a tragedy, but the bottom line is that the crash is not an indictment of semi-automated vehicles. As they drive on Autopilot, these vehicles can learn from their mistakes and improve collectively as the software is updated and Tesla shares what it has learned with others, something human drivers simply can't do.
Perhaps, instead of using this incident to cast blame on a single entity, we should take the opportunity to learn the limitations of the very technology so many are trying to condemn.