A Brief History of Asset Allocation

Glassbridge has put out an ambitious white paper about the “evolution of asset allocation across the investment management industry,” one that begins with the basics of the Capital Asset Pricing Model and ends with quantitative analysis and crowdsourcing.

The premise is that new strategies, and new ranges of data, are disrupting traditional allocation, and that a step back—a long view—can help asset managers and investors move through this disrupted landscape.

Value Weighting, Equal Weighting

The authors, Jim Kyung-Soo Liew, Robert A. Picard, and Daniel Strauss, start their historical review with the CAPM, built as it was on the idea that if the asset market is to clear, then the market portfolio must be on the minimum variance frontier.

They also acknowledge that the CAPM has had trenchant critics, who have objected to the unobservable, black-box nature of this “market portfolio.” Liew, et al., cite Richard Roll for his famous critique. They observe, much in Roll’s spirit, that in principle the CAPM market portfolio includes everything, including fintech innovations not conceived of in the 1960s, such as cryptocurrencies.

But the key point for the Glassbridge authors is that through subsequent investigation of the CAPM it became clear that the market portfolio was value weighted. The Glassbridge-affiliated authors discuss briefly what they call the canonical example of value weighting in this context, a 60/40 equity/debt split.

There are at least three recognized sorts of weighting. In addition to value weighting there is the simpler “equal weighting” approach. If an investor has 10 assets, or 10 asset classes, in his portfolio, he might simply arrange things so that each constitutes 10% of the whole.
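As a minimal illustration of the difference between the two schemes (the market values below are made up, not figures from the paper):

```python
# Equal vs. value weighting for a toy three-asset portfolio.
# The market values are hypothetical, for illustration only.
market_caps = [800, 150, 50]

n = len(market_caps)
equal_weights = [1 / n] * n                        # every asset gets the same share

total = sum(market_caps)
value_weights = [c / total for c in market_caps]   # weight by market value
```

Equal weighting gives each asset a third of the portfolio; value weighting tilts heavily toward the largest asset.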

A Third Way: Risk Parity

In contrast to either equal or value weighting, though, there is weighting by risk, generally known as the “risk parity” approach.

Take the canonical 60/40 example named above: it didn’t take traders, investors, or their advisors long to notice that the equity side of such a portfolio generates a good deal more than half of its risk, and for that matter a good deal more than 60%. Under the risk-parity approach, the sensible response is to borrow and use the borrowed money to load up on bonds until the debt side of the portfolio reaches risk parity with the equity side.

This idea has been extended over time to cover a four-class portfolio. With a portfolio of stocks, bonds, commodities, and currencies, one can apply leverage to bring the risk level to 25% for each class.
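A naive version of this idea can be sketched in a few lines. The volatility figures below are hypothetical, and the sketch weights each class inversely to its standalone volatility, ignoring correlations, which a production risk-parity model would not:

```python
# Naive risk-parity weights for a four-class portfolio.
# Annualized volatilities are hypothetical, for illustration only.
vols = {"stocks": 0.15, "bonds": 0.05, "commodities": 0.12, "currencies": 0.08}

# Weight each class in inverse proportion to its volatility so that
# each contributes the same standalone risk to the portfolio.
inv_vol = {k: 1.0 / v for k, v in vols.items()}
total = sum(inv_vol.values())
weights = {k: iv / total for k, iv in inv_vol.items()}

# Standalone risk contribution of each class (weight x volatility):
# all four come out equal, i.e. 25% of total risk each.
risk_contrib = {k: weights[k] * vols[k] for k in vols}
```

Because low-volatility classes such as bonds end up with the largest weights, hitting an overall return target typically requires the leverage the paragraph above describes.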

The Glassbridge paper observes that there are risk-parity skeptics, and it paraphrases their views this way: the approach “completely ignores the expected returns component of each asset. How can institutional investors allocate across assets without using any expected returns of these assets?” Skeptics also ask, “Why are we not using conditional expectations as well as knowledge of the business cycle phases?”

The Development of Quant Trading

Later in the report, these authors explain that what is nowadays called quantitative, or quant, trading developed from pairs trading, a reversion-to-normalcy strategy involving any two highly correlated stocks or other assets.
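The mechanics can be sketched on synthetic prices (a real implementation would estimate a hedge ratio and account for transaction costs, both omitted here):

```python
import numpy as np

# Pairs-trading sketch on two synthetic, highly correlated price series.
rng = np.random.default_rng(0)
common = np.cumsum(rng.normal(0.0, 1.0, 500))    # shared driver of both prices
a = 100.0 + common + rng.normal(0.0, 0.5, 500)   # idiosyncratic noise in each leg
b = 100.0 + common + rng.normal(0.0, 0.5, 500)

# Trade the spread: when it stretches far from its mean, bet on reversion.
spread = a - b
z = (spread - spread.mean()) / spread.std()

short_spread = z > 2.0       # spread rich: short a, long b
long_spread = z < -2.0       # spread cheap: long a, short b
exit_trade = np.abs(z) < 0.5
```

Because the common driver cancels out of the spread, the position is insulated from broad market moves, which is the germ of the market-neutral idea discussed next.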

It was a natural step from pairs trading to the equity market neutral hedge fund strategy. That neutrality, these authors observe, can be “achieved across many different dimensions such as size, value/growth, momentum, etc., with almost scientific precision.”

The growing sophistication of such trades has been fueled by advances in computing, by machine learning and artificial intelligence. As a consequence of the new technology, the Glassbridge report says, “finding patterns in the data is not the problem; the real problem now is making sure that the patterns that are found are robust enough to work out-of-sample. Can you create a robust out-of-sample prediction model?”
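The out-of-sample question the report raises can be made concrete with a toy split: fit a model on early data, then score it only on a held-out tail (synthetic data and a plain linear fit here, purely for illustration):

```python
import numpy as np

# Out-of-sample robustness check on synthetic data.
rng = np.random.default_rng(1)
x = rng.normal(size=300)
y = 0.5 * x + rng.normal(size=300)   # true relationship: slope 0.5, plus noise

train, test = slice(0, 200), slice(200, 300)

# Fit only on the training window...
slope, intercept = np.polyfit(x[train], y[train], 1)

# ...then judge the pattern solely on data the fit never saw.
pred = slope * x[test] + intercept
oos_corr = np.corrcoef(pred, y[test])[0, 1]
```

A pattern mined from the full sample that fails this kind of held-out test is exactly the overfitting the report warns about.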

Crowdsourcing and a Bridge Too Far

The new technologies can derive some of their data (the “alternative data”) from what is called the “wisdom of crowds.” Empirical studies have shown, for example, that there are links between tweets and IPO price behavior.

In an experiment to determine whether crowdsourcing the task of finding robust patterns in data can produce alpha, a hedge fund recently placed a bet on Quantopian, a company whose users retain ownership of the code they author and decide whether to open their code or keep it private. The hedge fund offered “a substantial amount of capital if Quantopian’s users’ authored strategies could achieve some performance hurdles.”

Unfortunately, the experiment was not a success. That doesn’t mean that crowdsourcing isn’t useful, only that it hasn’t arrived at its utopia yet.

By way of conclusion, the Glassbridge-affiliated authors affirm that the “future of institutional investment and asset allocation will be irrevocably changed by alternative data and algorithms.”

They believe that asset managers will have to keep digging deeper into the concepts behind the new tools in order to gain an edge. Portfolios should be neither Q-less nor clueless.
