Solving the Myth of Rate of Return

Rate of return remains the single most motivating factor behind almost all investment decisions, yet many people are mistaken about how it really works. Using rate of return (ROR) as the primary way to evaluate an investment’s performance is practically universal. Early in our adult lives we become comfortable with the notion that the return provides a direct and accurate way to judge an investment’s value. Often, the pursuit of good returns is what leads us to change advisors, choose new allocations, or buy a particular property.

It makes sense. The power of compound interest combined with a good return can solve a lot of problems when it comes to building your nest egg for retirement. In my opinion, though, the interesting part is not our obsession with ROR; it is how poorly we understand it. To see just how misleading rate of return can be, we need to examine it during the two basic phases of retirement: accumulation (the deferred-growth phase) and distribution (the income phase).

Let’s begin with the accumulation phase, the period when the account is growing, without withdrawals, in preparation for retirement. Most of us rely on a “long-term” investment strategy in the capital markets; we expect that if we invest for the long haul, the market averages will work in our favor. In fact, when I ask my clients what they expect for a long-term return, I usually hear 8 or 10%. More tellingly, when I ask why they expect that rate, the answer is the same every single time: “because that is what the market averages.” So what exactly is the “average,” anyway? Let’s examine the calculation.

You see, there are really two ways to measure rate of return: average return and actual return. The difference is simple: LOSSES. When calculating an average return, gains and losses carry equal weight. In other words, a +50% year followed by a -50% year leaves an average return of zero. That part is straightforward: if you average zero, $100k invested should still be $100k. The interesting part comes when you calculate the actual return, the compounded return on your dollars. Take the same example and actually run the numbers. Apply +50% to $100k and your account is worth $150k. Now take -50% of that $150k and you are left with only $75k, a 25% loss from your original $100k. Why? Any time you have even one losing year, your average return will not equal your actual return, because losses have a greater impact on your actual dollars than equal-sized gains do.
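If you want to check the arithmetic yourself, here is a minimal Python sketch of the two calculations; the function names are my own, purely for illustration:

```python
# Minimal sketch: the "average" return vs. the "actual" (compounded)
# return for the +50% / -50% example above.

def average_return(returns):
    """Arithmetic mean of the yearly returns."""
    return sum(returns) / len(returns)

def actual_balance(start, returns):
    """Compound each yearly return against the running balance."""
    balance = start
    for r in returns:
        balance *= 1 + r
    return balance

yearly = [0.50, -0.50]  # a +50% year followed by a -50% year

print(average_return(yearly))           # 0.0      -> "average" return of zero
print(actual_balance(100_000, yearly))  # 75000.0  -> an actual 25% loss
```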

Another way to look at it is to review the Dow Jones since 1930. If you add up each year’s return and divide by 81 years, the return “averages” 6.31%; however, if you run the numbers the way we did above, compounding year by year, you get an “actual” return of 4.31%. Why is this so important? If you had invested $1,000 back in 1930 at 6.31% compounded, you would have about $142k; at 4.31%, only about $30k. As this example shows, relying on the average return as your only measurement can be devastating. This, however, is only one part of the equation.
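To see where those dollar figures come from, here is the same compound-growth arithmetic in Python, using the article’s 6.31% and 4.31% rates over 81 years:

```python
# Minimal sketch: $1,000 compounded for 81 years at the "average"
# (6.31%) vs. the "actual" (4.31%) Dow Jones rates quoted above.

def compound(principal, rate, years):
    """Grow a principal at a fixed annual rate for the given number of years."""
    return principal * (1 + rate) ** years

print(round(compound(1_000, 0.0631, 81)))  # ~142,000
print(round(compound(1_000, 0.0431, 81)))  # ~30,500
```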

Erik Krom - Fox Business
