Saturday, June 8, 2019

The most important chart for long-term investors


Time is the investor's best friend. The longer the investment horizon, the less the returns depend on factors such as crashes and current valuation levels. It is well known that the chance of losing money in the stock market over a 20-year period has historically been about zero. This post expands on that fact and looks at how risky the U.S. stock market has actually been for long-term investors.
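The "about zero over 20 years" claim can be checked by counting overlapping holding periods that ended below inflation. Here is a minimal sketch of that calculation in Python; the return series below is randomly generated with roughly historical-looking parameters, purely as a stand-in for the actual Shiller data.

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical stand-in for the Shiller real total-return series:
# ~148 years of monthly real returns (illustrative only, not the real data).
monthly = rng.normal(0.07 / 12, 0.17 / np.sqrt(12), size=12 * 148)
index = np.cumprod(1.0 + monthly)  # real total-return index

def loss_fraction(index, years):
    """Share of overlapping lump-sum holding periods that ended below inflation."""
    w = 12 * years
    multiples = index[w:] / index[:-w]  # wealth multiple for each start month
    return float(np.mean(multiples < 1.0))

for years in (1, 5, 10, 20):
    print(years, loss_fraction(index, years))
```

Note that overlapping windows are heavily correlated, so these historical frequencies are rougher estimates than the number of windows suggests.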

As usual, we'll use Robert Shiller's data to answer these questions. The data begins in 1871, long before the actual S&P 500 index was created. We'll only consider lump-sum investing, since dollar-cost averaging is another story.

Let's first look at the inflation-adjusted returns for a U.S. investor, including reinvested dividends. Keep in mind that the U.S. stock market has been one of the best performing in the world, and future returns are likely to be lower because of high valuations and slower productivity and population growth. The upper and lower bands are the 95 percent prediction intervals, i.e. 95 percent of the time the investment return has fallen between them. The y-axis shows how many times your investment would have been multiplied. Notice that the axis is logarithmic.
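The prediction bands in the chart are just empirical percentiles of the holding-period wealth multiples. A minimal sketch of how such a band can be extracted, using made-up multiples rather than the actual Shiller-based results:

```python
import numpy as np

# Hypothetical wealth multiples from overlapping historical windows
# (illustrative values only, not the actual Shiller-based results).
multiples = np.array([2.1, 3.4, 5.0, 7.6, 9.8, 14.2, 21.0, 33.5, 48.0, 64.0])

# 95 percent band: 2.5th and 97.5th percentiles, plus the median.
lower, upper = np.percentile(multiples, [2.5, 97.5])
median = np.percentile(multiples, 50)
print(lower, median, upper)
```

Plotting such multiples on a logarithmic y-axis, as in the chart, makes equal percentage gains take equal vertical distance.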

This chart demonstrates how uncertain investing is. The range of outcomes is very large, and even it doesn't tell the full story. France once had a 66-year period in which stocks didn't beat inflation; for Italy the longest such streak was 73 years, and for Austria a painful 97 years. This is why global diversification is important. On the other hand, there has been a 5 percent chance that an investment in the U.S. would have increased 64-fold over the same period. The risk works both ways.

Let's also look at the nominal, non-inflation-adjusted returns to see how inflation eats into returns:

Inflation in the U.S. has been quite high, over three percent annually. Inflation of course affects different companies in different ways, but the net effect is that lower inflation does not necessarily lead to higher inflation-adjusted returns.
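The relation between the two charts is the Fisher equation: a real return is the nominal return deflated by inflation, not a simple subtraction. A small sketch:

```python
def real_return(nominal, inflation):
    """Exact Fisher relation: (1 + real) = (1 + nominal) / (1 + inflation)."""
    return (1 + nominal) / (1 + inflation) - 1

# Example: a 9% nominal return with 3% inflation gives ~5.83% real,
# slightly less than the naive 9% - 3% = 6%.
r = real_return(0.09, 0.03)
print(round(r, 4))
```

Over a single year the difference from the naive subtraction is small, but compounded over the multi-decade horizons in these charts it becomes substantial.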


Be sure to follow me on Twitter for updates about new blog posts!

The R code used in the analysis can be found here.

The original post had a problem with the calculation of dividends. The charts and code have now been updated, and the true returns were higher than in the original post. Sorry for the inconvenience.


1 comment

  1. Ah, the notion that one can predict the future by assuming the past still holds. It is a fact that the transition from the 19th century to the 20th was one of profound tech progress. That translated into healthy returns to stocks over many decades. Read Robert J. Gordon's book.

    That's just not true any longer. There has been a massive structural change to both the USofA economy, and Western economies generally. That change is the shift to financialization; every country thinks it can be Switzerland. There's a good reason interest rates remain at 'historic' lows: all those entities with excess moolah prefer low-risk, yet high-return, placements. This is what led to the tsunami of moolah into residential mortgages, esp. in the USofA. (And the Dot Bomb before it.) With that path closed, said moolah has been chasing Treasuries. Given that Treasuries are sold on a coupon value, rather than an interest rate, the resulting price frenzy pushes down the interest rate. Even if Treasuries were sold on an interest-rate basis, the secondary market would instantly adjust the rate down to what the market would bear. The loser would be the Damn Gummint, stuck paying the higher price.

    Why hasn't all that TARP/QE/etc. moolah generated massive inflation, one might ask? The answer: it has. But only locally, in asset markets, since the moolah was passed from the Damn Gummint to the 1%, not the 99% who actually buy consumables. And so on.

    If you want to predict the future based on time series, only do so when you can prove the data-generating process is static. God's processes qualify, more or less. Human ones not at all, since some humans get to change the rules to benefit themselves; just ask Blythe Masters.
