
Unlocking Alpha in the New Normal

May 6, 2013

By Louis Lovas, Director of Solutions, OneMarketData

If there’s one thing firms must have a strong grasp on in the financial markets, it’s data. These days, data comes from every direction possible, and it comes quickly. But taking full advantage of the ocean of information rushing toward you requires getting a handle on that data and then finding meaning within it to capitalize on opportunities.

We live in what I call the new normal, defined by higher regulatory scrutiny, increasing competition, tighter spreads, thinner margins and a lower risk appetite. To find alpha in this atmosphere, firms must broaden their data analysis to create smarter algorithms that reach beyond single-asset-class safe harbors.

Despite this market uncertainty, defining a new normal does not warrant emotional capitulation to lower returns or higher risk ahead.

In a relevant Linedata survey, a top challenge and subsequent goal for hedge funds (68%) and asset managers (59%) was maintaining investment performance. The undercurrent of the equity market’s exuberant highs is a continued downward trend in volumes and trader-loving volatility. Margins hinge on total market volume; whether a firm is making or taking liquidity, the net effect of less liquidity in the marketplace is a reduction in profits. According to TABB Group, U.S. equity trading on listed exchanges declined 18.5% in 2012. Some of that flow has moved to dark markets and broker internalization, resulting in still fewer opportunities. Moreover, retail trading volume (daily average revenue trades, or DARTs) is flat to 6% down, as reported by TD Ameritrade, E*Trade and Schwab.

Compounding this is the reduction in market volatility. The CBOE’s VIX is off 29% from a year ago as of this April. Declining volume begets falling volatility, making it hard for institutional and quantitative market participants to make money.

Paradoxically, that translates into the increased use of algorithms as firms look to squeeze alpha out of this diminishing pot.

Healthy returns can still be achieved by adapting to the new normal of the equity markets, diversifying into other asset classes and confidently managing the risks. Understanding, managing and analyzing data is the key. Data is the fuel for the engine of the trade life-cycle, from the research and discovery of new trade models to the measurement of executions against benchmarks for trade performance.
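As one concrete instance of the benchmark end of that life-cycle, a common measure is slippage against interval VWAP. The sketch below shows the basic arithmetic; the column names and DataFrame layout are illustrative assumptions, not any particular vendor’s schema.

```python
import pandas as pd

def vwap_slippage_bps(fills: pd.DataFrame, market: pd.DataFrame) -> float:
    """Average fill price versus interval VWAP, in basis points.

    `fills` holds our own executions; `market` holds all market trades
    over the same interval. Both need `price` and `qty` columns
    (illustrative names, not a specific feed's schema).
    """
    exec_avg = (fills["price"] * fills["qty"]).sum() / fills["qty"].sum()
    vwap = (market["price"] * market["qty"]).sum() / market["qty"].sum()
    # For a buy order, positive slippage means we paid above the interval VWAP.
    return (exec_avg - vwap) / vwap * 1e4
```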

But it’s important to plan strategically for how to address the sea of data you’re inevitably going to have to tame as you diversify your trading strategies in the pursuit of alpha generation.

What will be more integral to your strategy: to go faster, or to do more without getting slower? With a focus on latency, fast access to market data is now a must-have. Quant strategy decision-making is critically dependent on the immediacy of pricing data, often from multiple sources. With this in mind, the “Holy Grail” is actually to outsmart the crowd, which implies relying not solely on speed but also on being smarter than the competition. To this end, quants must demand access to a deeper pool of global historical data and use observations from the past to characterize relationships, understand behaviors and anticipate future developments.

Digging deeper into the data means you are increasingly dependent on data quality and accuracy. Tick data is usually derived from many sources. There are 13 major exchanges in the U.S. alone and 20 across Europe and Asia. Asia’s markets are characterized by unique technologies, cost structures, regulations and cultures. The disparity across markets is a natural barrier to algorithmic trading and can create challenges for efficient trading. The determinants of price discovery, volume and trading patterns define a structure unique to each market, asset class and geography, shaped by participants and current regulation.

To conquer this disparity across markets, data has to be analyzed and scrubbed clean. Crossing borders also means dealing with multiple symbologies, which requires symbol maps and currency conversions. For risk management, data quality goes a step further: accuracy demands statistical scrubbing algorithms. Once data is scrubbed, traders must then analyze it for trading opportunities, which are often fleeting.
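As a minimal sketch of what such scrubbing and normalization can look like, the Python snippet below filters statistically implausible ticks with a rolling-median test, then maps venue symbols to a global identifier and converts prices to a base currency. The symbol map, FX rates, column names and 5-sigma threshold are all illustrative assumptions, not any specific vendor’s method.

```python
import pandas as pd

# Illustrative reference data; real symbol maps and FX rates come from vendors.
SYMBOL_MAP = {"VOD.L": "VODAFONE", "VOD": "VODAFONE"}  # venue code -> global symbol
FX_TO_USD = {"GBP": 1.55, "USD": 1.0}                  # spot conversion rates

def scrub_ticks(ticks: pd.DataFrame, window: int = 50, n_sigmas: float = 5.0) -> pd.DataFrame:
    """Drop statistically implausible prices, then normalize symbols and currency."""
    ticks = ticks.sort_values("timestamp").copy()
    # A rolling median is robust to the very price spikes we are trying to catch.
    med = ticks["price"].rolling(window, min_periods=5).median()
    std = ticks["price"].rolling(window, min_periods=5).std()
    outlier = (ticks["price"] - med).abs() > n_sigmas * std
    clean = ticks[~outlier].copy()
    # Cross-border normalization: one global symbol, one base currency.
    clean["symbol"] = clean["symbol"].map(SYMBOL_MAP).fillna(clean["symbol"])
    clean["price_usd"] = clean["price"] * clean["currency"].map(FX_TO_USD)
    return clean
```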

Market inefficiencies, the lifeblood of alpha-generating strategies, are manifested by many things, including but not limited to human behavior, geo-political events and complex market structure. Quants must apply an empirically tested, rules-based approach to exploit these inefficiencies if they hope to outsmart the competition. Nonetheless, trade models have a short shelf life, and shifting market conditions can stress model performance. A side effect is growing demand for comprehensive historical data over extended time periods.

Historical analysis of high-quality, comprehensive data can lead to the recognition of similar market conditions in the past, which can shed light on their consequences. Back-testing your models against past market conditions enables you to fine-tune algorithms, manage inherent risk and reveal alpha.
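To illustrate, a back-test in its simplest form replays a historical price series through the model and compounds the resulting returns so they can be compared against a benchmark. The moving-average crossover below is a deliberately simple stand-in for a real trade model, not a strategy recommendation; the one-bar signal lag is there to avoid look-ahead bias.

```python
import pandas as pd

def backtest_crossover(prices: pd.Series, fast: int = 20, slow: int = 100) -> pd.Series:
    """Replay a price history through a toy moving-average crossover model."""
    fast_ma = prices.rolling(fast).mean()
    slow_ma = prices.rolling(slow).mean()
    # Long when the fast average is above the slow, flat otherwise;
    # shift one bar so today's signal trades at the next bar's price.
    position = (fast_ma > slow_ma).astype(int).shift(1).fillna(0)
    strategy_returns = position * prices.pct_change().fillna(0)
    return (1 + strategy_returns).cumprod()  # cumulative equity curve

# Comparing against buy-and-hold over the same history:
#   equity = backtest_crossover(daily_closes)
#   benchmark = daily_closes / daily_closes.iloc[0]
```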

This new normal that we live in today is defined by diminishing volumes, wild rallies and uncertain regulatory policy. But it does not signal an end to profitability and the discovery of alpha hidden within the depths of our markets – on the contrary, when equipped with data and the tools to tame it, this is where the quest to tap profitability and alpha begins.