Tesla Motors is receiving a lot of attention in the news these days for accidents, and even a death, related to its Autopilot feature, which was released in beta last October.
Tesla’s response to customers who have crashed their cars in autopilot has been measured, objective, and data focused. History has shown this may be the exact WRONG way to respond.
In a July 6 Wall Street Journal article, Mike Spector, Jack Nicas, and Mike Ramsey reported on Arianna Simpson, a venture capitalist in San Francisco, who got into an accident in her Tesla when the autopilot did not behave as expected:
Tesla responded that the April crash was [Ms. Simpson’s] fault because she hit the brakes right before the collision, disengaging Autopilot. Before that, the car sounded a collision warning as it should have, the car’s data show.
“So if you don’t brake, it’s your fault because you weren’t paying attention,” said Ms. Simpson, 25. “And if you do brake, it’s your fault because you were driving.”
Tesla’s Engineering Culture
The common theme in Tesla’s responses to Ms. Simpson and other customers is that a review of the log data shows the car worked properly, the driver was at fault, and the owner’s manual clearly states the limitations of Autopilot and the driver’s responsibility to remain alert and take control as needed.
This reminded me of my experiences when a customer would complain of a bug in my product because the system performed in an unexpected way. The engineering team would investigate and determine the code was behaving exactly as designed. The team and I would then joke, “It’s not a bug. It’s a feature!” Joking aside, I would explain to the team why this behavior was frustrating the user. When we followed up with the customer later, we agreed that our design had missed the mark, the behavior was undesirable, and that we would fix it.
The point being that from a customer’s point of view the product either does what they want it to do or it doesn’t.
That the product is working as designed doesn’t matter if the design doesn’t solve their problem or, even worse, causes a more serious issue.
As I thought about Tesla’s response to Ms. Simpson, it reminded me of Malcolm Gladwell’s analysis of Toyota’s missteps in responding to complaints of sticking gas pedals in his May 4, 2015 article in the New Yorker.
In the wake of the sticky-pedal problem, customers started complaining that Toyotas were prone to sudden, unintended acceleration. “Whenever someone called in to say, ‘I’ve had an episode of unintended acceleration,’ Toyota would dispatch a team of engineers,” said Roger Martin, a former dean of the University of Toronto’s Rotman School of Management and a member of the advisory panel that Toyota put together during the crisis. “And [the team] would do a thorough examination of the car and pronounce it fine—because it always was—and assure the owner that everything was going to be fine. They were probably just pressing the accelerator when they thought they were pressing the brake. There wasn’t a problem. Just be more careful next time. And [Toyota] got more and more complaints.”
Martin then worked with the Toyota engineers to help them view the customer with empathy. He asked them:
‘What do you imagine the [driver is] thinking? They’re shaking like a leaf at the side of the road and after that whole experience they are told, “The car’s fine. Chill out. Don’t make mistakes anymore.” Of course they are not going to be happy. These people are scared. What if instead [Toyota] sent people out who could be genuinely empathetic? What if you said, “We’re sorry this happened. What we’re worried about is your comfort and your confidence and your safety. We’re going to check your car. If you’re just scared of this car, we’ll take it back and give you another, because your feeling of confidence matters more than anything else.” ’
Martin explains “It was a sort of revelation. [The Toyota engineer] wasn’t a dumb guy. He was an engineer. He only thought about doing things from an engineer’s standpoint. [Toyota] changed what those teams did, and they started getting love letters from people.”
On July 6th, Tesla posted on their website that:
- Autopilot has been used safely in over 100 million miles driven
- customers using Autopilot are statistically safer than those not using it at all
- a collision on Autopilot was a statistical inevitability
- the system provides a net safety benefit to society
I believe Tesla’s claims are all true and, at the same time, of absolutely no comfort to any driver whose Autopilot failed to avoid a collision.
When your car is the one in the accident, you are not thinking about “net safety benefits” to society. You’re thinking, “Could this feature get me killed?” I speculate that Tesla, like Toyota, has an engineering-driven culture and that they might benefit from a little less data and a little more compassion.
I reached out to Ms. Simpson to learn more about the customer experience after her accident. Tesla does call all owners immediately after any crash. Kudos to them for being proactive and providing a human touch point. However, the call quickly becomes clinical and focuses on collecting the facts.
On Ms. Simpson’s call, they said they would be in touch in two days with the results of their diagnostics. It took a week and several attempts by Ms. Simpson to re-contact them for the follow-up. Upon her request, she finally spoke with a manager to review Tesla’s analysis. He basically said it was driver error and that because Tesla has disclaimers, they aren’t liable in any event.
Might Tesla better connect with their customers by acknowledging how scary the accident must have been for the driver, especially after putting their trust in the car’s Autopilot to keep them safe?
Like Toyota, Tesla could emphasize that their utmost concern is the driver’s comfort, confidence, and safety. They could explain the limitations of Autopilot and why it was unable to deal with the situation that caused the accident. They could walk the driver through Autopilot’s current limitations and how to use it more safely. If the driver is no longer comfortable with the feature, Tesla could offer to permanently disable it so it cannot be activated by ANY driver of the car (see note).
A Big Part of the Product Manager’s Job is to be the Voice of the Customer
Part of that job is helping our organizations understand and develop empathy for our users. When a crisis breaks on our product, we need to be there to guide the response and ensure the organization considers the customer’s point of view. Toyota learned this lesson the hard way. Is Tesla repeating the same mistake?
Note: enabling Autopilot on the car is a multi-step process in which the driver acknowledges that it is in beta, that it will not work in all situations, that they will keep their hands on the wheel, and that they must remain alert at all times. The driver can also disable Autopilot themselves. Still, the driver may take more comfort in knowing the car company disabled it, that re-enabling it would require a call to Tesla, and that no other driver, such as a son or daughter of driving age, would be allowed to turn it on.
Meet the Author
Greg Cohen is a 15-year Product Management veteran with extensive experience and knowledge of Agile development, a Certified Scrum Master, and former President of the Silicon Valley Product Management Association. He has worked and consulted with venture startups and large companies alike, and has trained Product Managers and Product Owners throughout the world on Agile development.
Greg is the author of the 280 Group’s Agile course as well as the best-selling Agile Excellence for Product Managers book.