What Tesla’s First Autopilot Fatality Teaches Us
What to do when your product kills someone.
Tesla is in a pickle today.
It has a product feature that has killed its user. Now Tesla has to walk a fine line among legal defense, public-relations sensitivity, technological defense, and moral culpability.
The National Highway Traffic Safety Administration (NHTSA) has opened an investigation into the circumstances surrounding the death of a Tesla motorist using Tesla’s much-hyped Autopilot feature.
The motorist was cruising on a divided highway when an oncoming tractor-trailer made a left turn in front of the car, leaving just enough clearance that the cruising Tesla drove right under the trailer, shearing off the windshield (and, one would assume, killing the inattentive driver instantly at that moment) before continuing to cruise into several more collisions. A diagram of the accident is here.
So here’s the diagram of the #TeslaCrash entirely Tesla driver’s over reliance on an incomplete system. #RIP pic.twitter.com/6SM1cxJ6zG
— TK (@BookMire) July 2, 2016
It’s a straight-up oncoming left turn, not some freakish move by another motorist that could not have been anticipated. And, the Tesla drove right through the truck without seeing it.
That’s scary.
Now, a few issues related to the use of new technology come to mind here, chiefly that new technology carries risks in general, and particularly when it involves moving your body at high speed. The first people to fly in airplanes faced a much higher risk of death than those of us who partake of air travel do today. It’s part of the process.
But… when the technology is uniformly hyped, results in a fatality, and its purveyor responds only by defending the technology, then the purveyor is on the hook.
Okay, well, no, probably not legally. As Tesla not so subtly posted in its blog about this crash, the company immerses the user–every user–in disclaimers about the technology while allowing the user to use it. From the blog post:
It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times,” and that “you need to maintain control and responsibility for your vehicle” while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.” The system also makes frequent checks to ensure that the driver’s hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.
That is a mouthful that basically says “use at your own risk.” And, that’s fine, but to place “public beta” software in charge of multi-ton hunks of metal and plastic moving at 90 feet per second (roughly 60 miles per hour) seems a bit…aggressive. And so I’m betting Tesla will face a bit of backlash about the technology.
Which brings us to Tesla’s other defense… the one using numbers. In its blog post, Tesla says:
This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.
And, this is fine, but it’s also indicative of a marketer or PR person using statistics–not a person who understands the stats themselves.
You want to make a fair comparison of the safety of your technology, Tesla? Great, then let’s do a couple of things…
First off, let’s remove from the 130 million miles cited the mileage accrued merely using active cruise control, because as I understand it those miles are aggregated in there. That is sure to be a LOT. That will leave the mileage driven in Teslas with “autosteer” engaged. That is the technology in question. That means that this fatality came with far fewer miles driven than the company claims.
Second, the company is appealing to the base rates of fatalities as evidence that Teslas on “autosteer” are safer than other cars, but the base rates used are highly misleading. Okay, so there is a fatality in the U.S. every 94 million miles driven. That’s fine, but it includes, for instance, people who die from overloading their 1978 Ford Pinto and then losing control after a blowout. It includes drunk drivers. It includes people driving old cars, and cars with mechanical deficiencies, and cars with bald tires. It also includes fatalities where the driver was killed by the negligence of other drivers, not simply their own.
In short, it’s not a base rate that’s comparable for Teslas driven by sober drivers on divided highways. The comparable rate is different, and the comparison it yields is likely less flattering for Tesla.
The correct base rate starts with the rate of fatalities among people driving brand-new luxury automobiles. And, that matters. In the chart below, you see the car models with the lowest driver death rates and the models with the highest rates. Tesla isn’t in league with the Kia Rio or the Hyundai Accent (two cars whose drivers happen to die at high rates). To suggest that it is amounts to padding the PR.
The correct base rate also includes cars on divided highways, and where the dead driver was the inattentive one who drove into an avoidable accident.
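To make the direction of these two adjustments concrete, here is a minimal back-of-the-envelope sketch in Python. Only the 130-million-mile and 94-million-mile figures come from Tesla’s post; the autosteer share and the “comparable driving” safety multiplier are hypothetical placeholders, since the real values aren’t public. The point is which way the correction moves, not the particular numbers it prints.

```python
# A rough, back-of-the-envelope sketch of the two adjustments argued above.
# Only the 130M-mile and 94M-mile figures come from Tesla's blog post;
# every "assumed_*" value is a hypothetical placeholder, because the real
# breakdowns are not public.

TESLA_CLAIMED_AUTOPILOT_MILES = 130e6   # from Tesla's post
US_MILES_PER_FATALITY = 94e6            # from Tesla's post, all US driving
FATALITIES = 1                          # the single known Autopilot death

# Adjustment 1: strip out miles where only active cruise control was on.
# The true autosteer share isn't public; 50% is purely illustrative.
assumed_autosteer_share = 0.5
autosteer_miles = TESLA_CLAIMED_AUTOPILOT_MILES * assumed_autosteer_share

# Adjustment 2: compare against a base rate for comparable driving
# (sober drivers, new luxury cars, divided highways). That driving is
# safer than the all-vehicle average; the 3x multiplier is illustrative.
assumed_safety_multiplier = 3.0
comparable_miles_per_fatality = US_MILES_PER_FATALITY * assumed_safety_multiplier

autosteer_miles_per_fatality = autosteer_miles / FATALITIES

print(f"Tesla's framing:   1 fatality per {TESLA_CLAIMED_AUTOPILOT_MILES / 1e6:.0f}M Autopilot miles "
      f"vs. 1 per {US_MILES_PER_FATALITY / 1e6:.0f}M miles for all US driving")
print(f"Adjusted framing:  1 fatality per {autosteer_miles_per_fatality / 1e6:.0f}M autosteer miles "
      f"vs. 1 per {comparable_miles_per_fatality / 1e6:.0f}M miles for comparable driving")
```

Plug in whatever assumptions you like; any values that shrink the autosteer denominator and raise the comparable baseline move the comparison against Tesla, which is exactly why the headline figures read like PR rather than analysis.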
So what?
First, the facts are actually not likely to be in Tesla’s favor even though Tesla has attempted to lead with facts. The legal case probably is in Tesla’s favor, but it’s not clear the moral case will be…because:
Second, I’m betting that Tesla drivers, like the one in this unfortunate crash, are using this nascent technology and expecting that it will not run them broadside into a tractor-trailer. The hyperbole surrounding the tech, and the feel-good ego boost of owning a six-figure investment in new technology, likely cloud judgment.
Third, when it comes to new products, it’s probably too aggressive to put highly dangerous equipment in people’s hands and call it a “public beta.”
As someone who is a fan of the prospect of self-driving cars but who had no idea how aggressively Tesla had placed this technology onto actual roads, I’d say it’s time for the company to go back to the drawing board.
One fatality does not make a new product category go away, but Tesla walks a very fine line between legal defense and technology evangelism. When your product kills someone, especially when it kills someone in a routine situation where the technology obviously should have worked, resorting to bad statistics and legal disclaimers is a bad idea.
The right thing to do is to fix the technology, not to spin its goodness.
As always, caveat emptor. And be careful out there.