Just as hindsight bias leads us to create a make-believe history of grand
designs and chess-master-like wisdom
(since winners tend to write the accepted
accounts), the planning fallacy projects
those same fanciful renderings forward
with the idea that the future can somehow be managed—and perhaps controlled—despite the lack of any actual
historical support for the notion.
As Adam Gopnik sagely points out in
a recent edition of The New Yorker, “[w]hat
history actually shows is that nothing
works out as planned, and that everything
has unintentional consequences.” Indeed,
“the best argument for reading history
is not that it will show us the right thing
to do in one case or the other, but rather
that it will show us why even doing the
right thing rarely works out.”
The planning fallacy is related to optimism
bias (think Lake Wobegon—where
all the children are above average) and
self-serving bias (where the good stuff
is deemed to be my doing while the bad
stuff is always someone else’s fault). We
routinely overrate our own capacities and
exaggerate our abilities to shape the future.
Thus the planning fallacy is our tendency to underestimate the time, costs
and risks of future actions and at the
same time to overestimate the benefits
thereof. It’s at least partly why we underestimate the likelihood of bad results.
It’s why we think it won’t take us as long
to accomplish something as it does. It’s
why projects tend to cost more than we
expect. It’s why the results we achieve
aren’t nearly as good as we expect and
why they are so often disastrous.
It’s why I take three trips to Home
Depot on Saturdays and why it takes
me all day to finish a household chore I
expected to take maybe an hour (which
then doesn’t work right or look right).
As John Lennon put it, “Life is what
happens to you while you’re busy making other plans.”
The key lesson, then, should be interpretively Hippocratic: First and
foremost, do no harm. Humility is
paramount. As Nate Silver emphasizes
in his book, The Signal and the Noise,
we readily overestimate the degree of
predictability in complex systems. We
need to promise less and expect less.
Things are still not likely to turn out
the way we hope, expect or claim, but
at least our embarrassment will be less
when they don’t—when life happens.
A second lesson is related: Avoid
errors. A famous study by the U.S. Institute of Medicine concluded that up
to 100,000 people die each year due to
preventable medical errors. Since physicians are among the smartest and most
highly trained professionals imaginable,
desperately trying to do the right thing,
the aggregate level of error in human
life must be almost unimaginably large.
Unfortunately, even though most of us
readily acknowledge poor decision-making
in people broadly and generally, our bias
blind spot means we fail to expect or
recognize the problem in ourselves.
Sadly, errors cost us more than good
decisions help. When nearly everyone
is smart together, nobody wins. When
nearly everyone screws up together,
nearly everyone loses and loses much
more than they otherwise would have.
The more universal the error, the greater
the loss will be. Because this tragedy of
errors is such a major problem, dealing
with risk first is absolutely essential for
good investing. Thus learning what not
to do is more important than learning
what to do. As Charley Ellis famously
established, investing is a loser’s game
much of the time, with outcomes
dominated by luck rather than skill
and eroded by high transaction costs. If we avoid
mistakes we will generally win.
The third lesson should be obvious:
Plan for the worst even if and as you
hope for the best. In a financial context, this lesson has several particular
applications, including the following:
1) Because we discount future risk
too much, we ought to be particularly
skeptical about our various estimates
of results and outcomes and ought to
consider more carefully the consequences if (when!) things don’t turn
out as well as we planned.
2) We should value the benefits of
guarantees (when available) more than
the benefits of potential. Accordingly,
we should typically be concerned more
about the costs of failure than about opportunity costs. Income annuities look
particularly attractive in this context.
Another reason why this general problem is particularly acute for advisors is
the so-called “authorization imperative.” Our plans and proposals must be
approved by our clients and we have
a stake in getting that approval. This
dynamic leads to our tendency to understate risk and overstate potential.
Perhaps we see it as easier to get forgiveness than permission or perhaps it’s
just a sales pitch. Or maybe we have
convinced ourselves that we’ve got everything covered (confirmation bias!).
Whatever the reasons, and despite its
strategic benefits, we run the risk of
overpromising and underdelivering.
Things rarely turn out the way we
expect. We never have everything covered. Life happens. Act accordingly.
You have been warned.
Bob Seawright is chief investment and
information officer for Madison Avenue
Securities in San Diego. His Twitter feed
is @RPSeawright and he blogs at “Above
the Market” (rpseawright.wordpress.com).
Annuity Analytics