

What Autonomous Driving Can Teach Us About Risk


Today, I’m going to go in a slightly different direction from a typical “Know Your Options” column. Instead of describing a specific options trading strategy, I’m going to explore the mindset of a successful trader – in the context of recent news about accidents in vehicles with autonomous driving capabilities.

Overall, it’s a story about how we view risk in general – often to our own detriment.

In the interests of full disclosure, I’ve been bullish on the shares of Tesla (TSLA - Free Report) for a long time, and I own them in my personal trading account. I also own a Model 3 and I like it very much.

Based on the most recent delivery figures, the company is now on pace to sell almost a million vehicles a year and they have literally billions of miles of data from cars they’ve sold previously that they use to continually refine their auto-pilot technology.

There have also been some widely reported failures that make for spectacular news stories. There have been tragic injuries and fatalities as well as property damage and occasionally even dangerous battery fires that present a significant challenge to the firefighters who respond.

Stories about drivers who ignore warnings to continue paying attention when auto-pilot features are engaged and, in some cases, appear to have not even been in the driver’s seat prior to accidents add a salacious quality to those reports. As Tesla and almost every other automaker pursue fully autonomous self-driving capabilities, there will almost certainly be more high-profile accidents with the technology itself taking at least a portion of the blame.

How we treat that information as a society is a function of how rationally we’re dealing with risk.

In a news story that I read recently in the New York Times about a fatal accident involving a Tesla Model 3 and the ongoing lawsuit that followed, autonomous driving technology was blamed for not slowing the vehicle fast enough when another car encroached into its lane.

I use the example of auto fatalities frequently in my regular discourse about risk in general. For most people, driving or riding in a car is the most dangerous thing we ever do - yet it has become such a common part of modern life that we can do it daily without giving the danger even a brief thought.

The National Highway Traffic Safety Administration (NHTSA) reported 36,096 traffic fatalities in 2019, which is a bit lower than long-term averages. That means that in the United States, approximately 100 people a day leave the house to go about their daily business and tragically never come home.

100 people. Every. Single. Day.

Many people are apprehensive about air travel. The concept of a plane crashing is dramatically terrifying. On commercial flights, however, it almost never happens. Fatalities globally have averaged about 300 per year for the past 20 years - and there have been many years in which not a single such death occurred in the US.

That figure is very similar to the number of people who die each year falling from ladders.

In fact, not only is flying in general much safer than driving a car, even skydiving is safer than driving a car. You are literally more likely to be killed in the car on the way to the airport to go skydiving than you are during the act of intentionally jumping out of the plane.

Most traffic fatalities involve some degree of error on the part of one or more of the drivers.

Theoretically, autonomous driving could significantly reduce the danger associated with motor vehicle travel, but the way the risk would be mitigated won’t necessarily be immediately apparent to most people. Because even the most highly developed autonomous driving systems will never be perfect in all situations, there is likely to be a large group of potential users who will remain doubtful about the technology despite the significant overall reduction in risk that it offers.

It’s not possible to do a controlled experiment, but many experts think that traffic deaths could conservatively be cut in half (or much more) with the widespread adoption of self-driving functionality.

But, even if the overall number of fatalities is reduced by 80%, the remaining accidents are likely to be both very visible and extremely terrifying. Imagine an interview with a driver who credibly claimed that the car he was driving intentionally steered or accelerated right into a dangerous situation and caused a fatal crash.

Upon hearing that, most people wouldn’t be thinking about the 80 people who didn’t die that day who otherwise would have, because they’re invisible and unknowable. Instead, they’ll be thinking about the 20 people who not only died, but were killed because the car made a mistake. Rogue self-driving vehicles are the stuff of science fiction nightmares. Focusing on that rather than the overall mitigation of risk is an unfortunately common mistake in human cognition.

It's not our fault. Human evolution has hard-wired us for snap judgements. For thousands of years, there simply wasn’t time to do a detailed mathematical analysis when confronted with a potentially dangerous situation. “Going with your gut” in situations that made you afraid was often the best way to stay alive.

Think about it. An 80% reduction in deaths means that 29,000 people will be spared each year, but we’re likely to instead pay attention to the horror-movie aspect of the accidents we will see.
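The arithmetic behind those figures is easy to check. A minimal sketch, using the NHTSA 2019 figure of 36,096 fatalities cited above as the baseline (the 80% reduction is the same hypothetical scenario discussed here, not a measured result):

```python
# Back-of-the-envelope check of the column's numbers.
annual_fatalities = 36_096          # NHTSA-reported U.S. traffic fatalities, 2019

per_day = annual_fatalities / 365   # roughly 100 people every single day
print(round(per_day))               # ~99

# The hypothetical 80% reduction from widespread self-driving adoption:
reduction = 0.80
lives_spared = annual_fatalities * reduction
remaining = annual_fatalities * (1 - reduction)
print(round(lives_spared))          # ~28,877 -- the "29,000 people spared"
print(round(remaining))             # ~7,219 deaths remaining, and highly visible
```

The point of the exercise: the roughly 29,000 invisible non-victims never make the news, while the remaining ~7,000 accidents will.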

The NYT article also implies that because the Tesla system relies on a driver keeping their hands on the steering wheel to determine that they’re paying attention, drivers find it easy to “cheat” and fool the system into continuing to drive for them while they read, watch movies, or even doze off. It also states that some competitors use a more comprehensive system of in-cabin sensors to determine things like where the driver’s eyes are focused.

The implication is that autonomous driving will work better if the car can impose stricter rules on driver behavior.

An anecdote from my floor trading days demonstrates why I think this logic is misguided.

In 1997, the firm I worked for was developing software for handheld computers to use on the exchange floor. One of the big advantages of the technology was that we could manually enter the trades we had executed into the handheld and immediately see the effect on our overall position.

(Prior to that, a human “runner” would physically carry a trading ticket to an adjacent booth and enter the trades into a conventional computer, taking 15 minutes or more before we could see position updates.)

The problem is that we made mistakes. When it was busy, we would frequently enter the wrong price, quantity, or buy/sell operator. The position updates would contain incorrect information that wouldn’t always be caught right away, so we’d rely on an erroneous report.

Some of my fellow traders asked our boss to have an extra confirmation dialogue box added to the interface in the hopes that we could catch an error before it was entered into the position.

The (very smart) boss somewhat sarcastically responded, “Sure, how about if I give you TEN dialogue boxes? Do you really want to enter this trade? Click ‘OK.’ Are you sure? Click ‘OK.’ Are you really sure? Click ‘OK’ …You know that when you’re busy, you guys are just going to click ‘OK’ ten times in a row and still not pay attention to the accuracy of the inputs.”

He was met with a bunch of nodding heads. We knew he was right. We didn’t need the technology to enforce additional rules; we needed to be more careful. The technology was hugely valuable, but it still took a competent human operator to make it work.

I feel like autonomous driving is going to evolve the same way. We’re still going to need to pay attention! The technology will be able to significantly reduce the risk of driving or riding in a car, but no number of safety features will completely relieve us of the responsibility to participate.

Even more importantly, we’re going to have to view the reduction in risk rationally. The news media has a built-in predisposition to present stories in the most spectacular way. I’m not claiming inherent bias or anything like that. Most fact-based news organizations try extremely hard to present information that’s fair and accurate. But there’s also an entertainment aspect to their end product.

In terms of headlines, “Runaway Self-Driving Car Causes Fiery Crash” is simply a lot more interesting than “Automobile Transportation is Getting Steadily Safer.”

Trading, Investing and Life

You’ve read this far. Why is this in “Know Your Options”?

Accurately ascertaining risk and acting accordingly is the key to successful trading and investing, but it also helps you strike a balance between danger and reward in all aspects of life. Those who can sort out what’s merely emotionally uncomfortable from what’s actually risky are going to have the best results in the long run.

Going with your gut is good, but carefully analyzing the numbers when you have time is better.

-Dave

Want to apply this winning option strategy and others to your trading? Then be sure to check out our Zacks Options Trader service.

Interested in strategies with profit potential even in declining markets? Maybe our Short List Trader service is for you.


 


