
Minimum Correlation: Implementation in Excel

September 27, 2012

We are planning to release Excel worksheets next week showing the implementation of the minimum correlation algorithms, as well as maximum diversification, minimum variance, and risk parity. Michael Kapler of Systematic Investor worked with me to write the paper, and he has written an interesting post on how to implement the algorithm in Excel by calling R.

He has also written an interesting post comparing the speed of the algorithm against quadratic programming (mincorr is much faster). We both plan follow-up posts on different applications and variations of the algorithm. If you haven’t already added Systematic Investor to your reading list, it is highly recommended for more advanced readers.

Minimum Correlation Algorithm Paper Release

September 21, 2012

This is a draft preview of the Minimum Correlation Algorithm paper. We will be releasing an expanded version soon, including new data and analysis as well as a review of the research. We have also planned an SSRN release on a related but larger topic.

New Position

September 20, 2012

I recently joined a new firm, Flexible Plan Investments, as Vice President of Economic Research and Strategic Development. Flexible Plan was founded by Jerry Wagner, the co-founder of NAAIM and the creator of the Wagner Award for Active Management. Jerry is an industry pioneer in making absolute return strategies widely accessible to investment advisors in separately managed accounts. He has also been a major proponent of strategic diversification: combining a wide variety of uncorrelated, actively managed strategies to achieve more stable returns across market cycles. I am excited about the opportunity and look forward to contributing on the research side. I will be more active on the blog going forward, sharing new ideas and directions for quantitative research.

Minimum Correlation Algorithm (MCA)

September 12, 2012

Stay tuned for the release of our forthcoming whitepaper: “The Minimum Correlation Algorithm (MCA): A Practical and Effective Diversification Tool.” The paper will contain an overview of the algorithm’s logic as well as a comparison of the performance and diversification score of MCA versus Risk Parity, Maximum Diversification, and Minimum Variance. This heuristic algorithm is fast, relatively simple to calculate, and can be applied effectively to both asset allocation and trading strategies. The complete R code for replication will be released with the paper. The scheduled release date is next Thursday.
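
Until the paper’s R code is released, here is a rough sketch of a heuristic in the same spirit: overweight assets with a low average correlation to the rest of the universe, then scale by inverse volatility. To be clear, this is an illustrative stand-in of my own, not the actual MCA; the function name and the rank-based weighting are assumptions for the example.

```r
# Illustrative stand-in only -- NOT the actual MCA from the paper.
# Idea: favor assets with low average correlation to the rest of the
# universe, then tilt toward lower-volatility assets.
toy_min_corr <- function(returns) {
  rho <- cor(returns)
  diag(rho) <- NA
  avg_cor <- rowMeans(rho, na.rm = TRUE)   # each asset's average correlation
  w <- rank(-avg_cor)                      # lower avg correlation -> higher weight
  w <- w / apply(returns, 2, sd)           # inverse-volatility scaling
  w / sum(w)                               # normalize to full investment
}

# Example with simulated daily returns for 5 hypothetical assets
set.seed(42)
returns <- matrix(rnorm(250 * 5, 0, 0.01), ncol = 5)
round(toy_min_corr(returns), 3)
```

The appeal of this family of heuristics is speed: no optimizer is required, only a correlation matrix and a ranking, which is consistent with the speed comparison against quadratic programming mentioned above.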

NAAIM Wagner Award 2013: Call for Entries

August 31, 2012

Details here


NAAIM Wagner Award 2013: Call for Entries

August 29, 2012

The National Association of Active Investment Managers (NAAIM) has announced the call for papers for the 2013 Wagner Award competition. Now in its fifth year, the Wagner Award promotes awareness of active investment management techniques and the results of active strategies through the solicitation and publication of research on active management.

A $10,000 grand prize will be awarded to the best paper that critically looks at how investors might better manage portfolios during market downturns, yet still capitalize on the market’s periods of superior performance. Papers must be of practical significance to practitioners of active investing.

The competition is open to all investment practitioners, academic faculty and doctoral candidates in the field. The goal of the Wagner Award is to provide academic substantiation of the viability of active management and evidence of the validity of active investing.

Further details and rules for submission can be found here.


“The Final Problem”

July 22, 2012

“The Final Problem” (http://en.wikipedia.org/wiki/The_Final_Problem) is a Sherlock Holmes story that contains interesting parallels to modern game theory. Holmes, the legendary detective, is on a mission to foil an equally intelligent adversary, Moriarty. Holmes is forced to reason beyond conventional decision-making to account for the fact that Moriarty is likely to deduce his response based upon what he himself would do in such a situation. A good analysis/synopsis of the problem is presented by Blume in a presentation on game theory: http://www.pitt.edu/~ablume/Game%20Theory%20Principles/game-slides-3.pdf
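
For the curious, the Holmes-Moriarty pursuit can be solved as a 2x2 zero-sum game. A minimal sketch in R, using the payoffs commonly attributed to von Neumann and Morgenstern’s treatment of the story (these particular numbers are an assumption):

```r
# Payoffs to Moriarty (rows: Moriarty waits at Dover / Canterbury;
# columns: Holmes rides through to Dover / alights at Canterbury).
# 100 = Holmes is caught, -50 = Holmes escapes to the Continent,
# 0 = the chase continues (values assumed from the classic treatment).
A <- matrix(c(100,   0,
              -50, 100), nrow = 2, byrow = TRUE)

# Mixed-strategy equilibrium of a 2x2 zero-sum game with no saddle point:
# each player randomizes so the opponent is indifferent between moves.
den <- A[1, 1] - A[1, 2] - A[2, 1] + A[2, 2]
p <- (A[2, 2] - A[2, 1]) / den   # probability Moriarty goes to Dover
q <- (A[2, 2] - A[1, 2]) / den   # probability Holmes rides through to Dover
cat(sprintf("Moriarty -> Dover with prob %.2f; Holmes -> Dover with prob %.2f\n",
            p, q))               # 0.60 and 0.40: Holmes should usually alight
```

The equilibrium matches the story: the single play lands on each player’s more likely action, with Holmes alighting at Canterbury while Moriarty’s special train continues to Dover.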

Both central banks and large investors are constantly engaged in these types of games. This often baffles less sophisticated investors, who look for market responses that reflect the favorability of decisions made by central banks. Instead, the response is a complex reaction based upon expectations and the degree to which they are already priced in. It is worthwhile to think of “The Final Problem” as a useful analog when considering what direction the market may take given a policy event or announcement.

The Future is “Theory-Free”

July 20, 2012

Dusk is setting in on the world of factors and economic theory. If my views seem dramatic or extreme, consider that the markets did not make sense from a CAPM worldview even during the best (and simplest) of times. Today, the world’s markets are a strange and dangerous place. We are in the midst of the era of dominance for computerized trading (think Skynet from The Terminator). Floor-trader jackets will soon become museum relics or be sold as collector’s items on eBay. Most retail investors and individual traders have left the market entirely: outclassed, outgunned, and tired of poor returns and being unable to compete. We are simultaneously in the throes of perhaps the greatest financial mess of all time. Our pithy attempts to control the uncontrollable with the aid of simplistic economic theory, political idealism, and crony capitalism have put the world in a perilous situation. The stress in financial markets reflects unprecedented levels of global debt, government intervention, unexpected shifts in sovereign risk, and banks that can lose billions seemingly overnight for no obvious reason.

The complexity of the modern world, and the degree to which it is interconnected, has rendered simplistic linear theory almost meaningless. Human behavior, once predictable in its hybrid of rationality and cognitive bias, no longer manifests in markets as a fluid mass of “greed” and “fear.” Instead, game theory and behavioral finance now share equal importance as players engage in Keynesian beauty contests (http://en.wikipedia.org/wiki/Keynesian_beauty_contest) with more than six degrees of anticipation (not to be confused with six degrees of separation). The highly complex interaction between governments, banks, hedge funds, institutions, wealthy individuals, and high-frequency firms is a multi-dimensional, high-stakes game where the rules change by the day. Players must interpret or anticipate others’ actions, the perception that other players have of their own actions, and how the rules might change and how this may affect those actions and perceptions. And this is a simplistic description.
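
The “degrees of anticipation” idea can be made concrete with the Keynesian beauty contest linked above. In the textbook guess-2/3-of-the-average version (the parameters here are the standard ones, not from this post), each extra level of reasoning about other players multiplies the naive guess by 2/3:

```r
# Level-k reasoning in a p-beauty contest (guess 2/3 of the average):
# each additional "degree of anticipation" multiplies the naive guess by p.
p <- 2/3
naive <- 50                  # level-0: guess the midpoint of [0, 100]
levels <- 0:6
guesses <- naive * p^levels  # level-k guess after k rounds of anticipation
names(guesses) <- paste0("level-", levels)
round(guesses, 2)            # converges toward 0, the Nash equilibrium
```

The practical difficulty is that real players stop at different, unknown depths of reasoning, so the “right” guess depends on estimating everyone else’s level of anticipation.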

The future of investing must be theory-free, focused on creating algorithms that can adapt to change and adequately capture the nature of complex systems. Ed Seykota once correctly observed that a surfer does not need to know the theory of fluid mechanics to learn how to surf a wave in the ocean; he needs only to be able to make probabilistic estimates and adjustments. This example puts the financial economist at odds with the scientist or the engineer. The future favors the practical problem-solvers, the inventors, the gamers, and the deep thinkers. Investing is becoming like the evolution of mixed martial arts: no single style or background is likely to win, nor is a stubborn resistance to learning about other disciplines. I predict that the theory police will resist this transition and be left standing naked before their disciples when the tide goes out. It is much easier and more comforting to explain the world in deterministic terms, replete with graphs of marginal utility and “proofs.” Of course, it is also more comforting to play chess than to engage in a high-stakes game of “liar’s poker.”

It is important for quants to override the desire to be fed theoretical pablum and to avoid seeking academic approval from the high priests of a defunct religion. There is nothing wrong with learning finance, economics, or econometrics; in fact, it is crucial to understand these disciplines at a high level. But it is time to start thinking for yourself and to stop being a sheep. Ask hard questions, think independently, collaborate effectively, and, most importantly, spend your waking hours thinking about how to solve high-dimensional classification and game-theoretic problems.

Not Equal: A Comparison of “Risk Parity” and “Equal Risk Contribution”

July 19, 2012

The term “Risk Parity” is often confusing because it is defined and applied differently depending on the firm. With the advent of multiple variations in tactical asset allocation and indexing strategies, it is perhaps not a stretch to claim that Risk Parity is as vague and elusive as using the term “hedge fund” to describe a style of investment management. I have created tables below to clarify two variations that are often used interchangeably but are not the same. A good history of the origins of risk parity can be found here: http://en.wikipedia.org/wiki/Risk_parity. Systematic Investor, an excellent blog, provides some useful R code and testing here: http://systematicinvestor.wordpress.com/2012/03/19/backtesting-asset-allocation-portfolios/. Meb Faber, always a great resource for new ideas, provides an application of Risk Parity to asset allocation as well as a good list of articles here: http://www.mebanefaber.com/2012/03/22/risk-parity-vs-endowment-model-vs-permanent-portfolio/

In my opinion, it is best to consider Risk Parity as a broad class of risk-budgeting schemes in which each asset in the portfolio is leveraged (if necessary) to have the same volatility. Essentially, each asset has “equal risk,” and the portfolio is considered more diversified since returns and risk are less dependent on any one asset. In contrast, Roncalli (http://www.thierry-roncalli.com/riskparity.html) coined the term “Equal Risk Contribution” (ERC) for a distinct sub-class of Risk Parity that seeks to equalize each asset’s risk contribution to the portfolio. If both methods sound the same after reading various articles, that is because in the often-cited two-asset case they have the same mathematical solution and the same explanation for their validity. Nevertheless, not all Risk Parity portfolios are the same; it is necessary to understand the differences between the equal-risk and equal-risk-contribution approaches to avoid improper application or interpretation. Below, after a short code illustration, are tables that outline the similarities and differences of the two approaches and the relative advantages of each in the context of portfolio allocation.
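
The sketch below (in R; the four-asset volatilities and correlations are invented for illustration) computes both sets of weights. RP in the equal-risk sense reduces to inverse-volatility weights and ignores correlations; ERC needs a solver to equalize each asset’s contribution to portfolio variance:

```r
# Hypothetical 4-asset example (volatilities and correlations are invented)
sigma <- c(0.15, 0.20, 0.10, 0.08)
rho <- matrix(c(1.0, 0.8, 0.2, 0.1,
                0.8, 1.0, 0.3, 0.1,
                0.2, 0.3, 1.0, 0.6,
                0.1, 0.1, 0.6, 1.0), nrow = 4)
Sigma <- diag(sigma) %*% rho %*% diag(sigma)   # covariance matrix

# Risk Parity (equal risk): scale each asset to the same volatility,
# i.e. weights proportional to 1/sigma -- correlations play no role
w_rp <- (1 / sigma) / sum(1 / sigma)

# Equal Risk Contribution: equalize RC_i = w_i * (Sigma %*% w)_i,
# which uses the full covariance matrix
erc_obj <- function(x) {
  w <- x / sum(x)                    # impose full investment inside the objective
  rc <- w * as.vector(Sigma %*% w)   # (unscaled) risk contributions
  var(rc)                            # zero exactly when all RC_i are equal
}
w_erc <- optim(rep(0.25, 4), erc_obj, method = "L-BFGS-B",
               lower = rep(1e-6, 4))$par
w_erc <- w_erc / sum(w_erc)

round(rbind(RP = w_rp, ERC = w_erc), 3)  # weights differ because ERC
                                         # accounts for correlations
```

Note that under a constant correlation matrix the ERC solution collapses to the same inverse-volatility weights as RP, so the distinction only matters once correlations differ across pairs.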

Features and Qualities Shared By Equal Risk Contribution (ERC) and Risk Parity (RP)

| Algorithm Characteristic | Equal Risk Contribution (ERC) | Risk Parity (RP) |
| --- | --- | --- |
| Requires Historical/Expected Asset Returns | No | No |
| Considered a Heuristic Algorithm That Lacks Strong Theoretical Support | Yes | Yes |
| Generally Uses Leverage | Yes | Yes |
| Generally Uses a Target Portfolio Risk | Yes | Yes |
| Allocates Across All Assets in the Selected Universe | Yes | Yes |
| Low Turnover Relative to Traditional Models | Yes | Yes |
| Comparatively Less Sensitive to Estimation Error than Traditional Models | Yes | Yes |
| Generally Superior Return and Sharpe to Equal-Weight Portfolios | Yes | Yes |
| Better Returns and Sharpe than 60/40 Balanced Stock/Bond Portfolios | Yes | Yes |
| Solution for a Two-Asset Portfolio | Inverse volatility weighted | Inverse volatility weighted |
| Conditions for Being Equivalent to Equal-Weight Portfolios | Equal asset volatility and constant correlation | Equal asset volatility and constant correlation |
| Conditions for Being Optimal on the Efficient Frontier | Equal Sharpe ratios and constant correlations | Equal Sharpe ratios and constant correlations |
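
As a quick check of the two-asset row above: with weights $w_1 + w_2 = 1$, volatilities $\sigma_1, \sigma_2$, and correlation $\rho$, the risk contributions are $RC_1 = w_1\sigma_1(w_1\sigma_1 + \rho w_2\sigma_2)$ and $RC_2 = w_2\sigma_2(w_2\sigma_2 + \rho w_1\sigma_1)$. Setting $w_1\sigma_1 = w_2\sigma_2$ makes both equal to $(w_1\sigma_1)^2(1+\rho)$ for any $\rho$, giving $w_1 = \sigma_2/(\sigma_1 + \sigma_2)$, exactly the inverse-volatility weighting. This is why ERC and RP coincide for two assets but diverge as soon as correlations differ across pairs.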



Features and Qualities That Favor Equal Risk Contribution (ERC)

| Algorithm Characteristic | Equal Risk Contribution (ERC) | Risk Parity (RP) |
| --- | --- | --- |
| Objective Function | Volatility of risk contributions = 0 | Inverse volatility weighted |
| Uses Beta/Covariance Data | Yes | No |
| Always Assumes a Constant Correlation Matrix | No | Yes |
| Holdings Have Equal Risk Contributions to Portfolio Volatility | Yes | No |
| Performance Highly Sensitive to the Universe of Assets Selected | No | Yes (e.g., 5 equities and 1 bond create an imbalance) |
| Flexibility of Use for Risk Budgeting and Factor Tilts | Winner | Loser |
| Out-of-Sample Risk (Given the Same Target) | Winner | Loser |
| Out-of-Sample Return (Given the Same Target or Leverage Constrained) | Winner | Loser |
| Out-of-Sample Drawdowns | Winner | Loser |
| Out-of-Sample Sharpe Ratios | Winner | Loser |
| Out-of-Sample Diversification (Concentration and Average Correlation) | Winner | Loser |
| Favorable Hybrid Between Minimum Variance and Equal Weight | Yes | No |
| Can Be Equivalent to Minimum Variance | When the correlation matrix is minimized | Not possible |



Features and Qualities That Favor Risk Parity (RP) or Equal Risk

| Algorithm Characteristic | Equal Risk Contribution (ERC) | Risk Parity (RP) |
| --- | --- | --- |
| Simple to Calculate for the Mathematically Challenged (Napkin Factor) | Loser | Winner |
| Intuitively Easy to Understand and Use in Most Software Packages | No | Yes |
| Best-Sounding Name/Cool Factor | Loser | Winner |
| Practical Use for Dynamic Allocation with Long-Term Monthly Data | Loser | Winner |
| Sensitivity to Covariance Estimation Error | Loser | Winner |
| Generally Requires a Sophisticated Solver for Optimization | Yes | No |
| Speed of Calculation for a Large Universe | Slow | Fast |
| Performance on Very Large Universes with Limited Sample Size | Loser (too many covariances to estimate relative to sample size) | Winner (fewer variables to estimate relative to sample size) |

Adaptive Asset Allocation: Combining Momentum with Minimum Variance

July 17, 2012

The concept of Adaptive Asset Allocation (AAA) was presented in a whitepaper by Butler, Philbrick and Gordillo this summer: Adaptive Asset Allocation. One of the core principles of AAA is that portfolio allocation should be dynamic rather than strategic; in other words, an investor’s portfolio composition should adapt over time in response to changes in both the expected returns of different asset classes and the overall risk of the portfolio. This theoretically ensures that investors can adequately grow and preserve their capital, and withdraw to meet liabilities, through different economic regimes (deflation, inflation, and many other variants).

The alluring promise of AAA rests upon the ability to dynamically adjust to different economic conditions. In a world where assets compete for capital, the best way to forecast economic conditions is to observe the relative pricing of the most liquid securities across markets. The time series of major asset classes are observable variables that allow investors to detect changes in the economy. Equity markets let us observe expected business conditions, commodities let us observe expected changes in inflation, and bonds let us observe expected changes in interest rates. Real estate provides insight into all three factors and is a direct measure of consumer purchasing power. Observing world equity markets tells us how business conditions are evolving globally. These asset classes are akin to a diverse ecosystem where different species thrive at different times due to changes in the environment. If we can detect shifts in the system, we can understand in a probabilistic sense which species will do best across a range of likely scenarios. By analyzing the financial network of asset classes, it is possible to express our views concisely through a portfolio allocation that maximizes the chance of being right while minimizing the cost of being wrong.

The analysis of major asset classes through time series data requires integrating historical returns, correlations, and measurements of volatility to create an efficient portfolio allocation. The goal of this framework is to dynamically identify the best-performing asset classes and to manage risk at the portfolio level. The paper presents an example of a strategy (not the actual proprietary strategy) that effectively integrates these three variables into a simple and robust framework. The logic for this example is straightforward yet powerful in its simplicity and is validated by academic research. Momentum is chosen as the method to integrate expected-return information, since the rank of historical returns (versus the raw return) is a robust method of dynamically identifying which assets will perform best in the near future (2 weeks to 3 months). It is now well accepted, even by the greatest skeptics, that momentum is a powerful and reliable anomaly. Minimum-variance optimization (MVO) is chosen as the method to manage risk at the portfolio level. MVO is effective at managing portfolio risk because it minimizes variance as a function of both asset cross-correlations and volatilities. Since MVO does not use expected returns, relying instead on the highly forecastable elements of volatility and correlation, it is considerably more robust to estimation error than classic mean-variance optimization and performs better out of sample, in terms of ex-post Sharpe, than other conventional portfolio algorithms. Besides the theoretical appeal of MVO, these claims have been validated by numerous academic studies.
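
For reference, the minimum-variance problem is

$$\min_{w} \; w^{\top}\Sigma w \quad \text{subject to} \quad \mathbf{1}^{\top}w = 1,$$

where $\Sigma$ is the covariance matrix implied by the asset volatilities and correlations. Without a long-only constraint the solution has the closed form $w^{*} = \Sigma^{-1}\mathbf{1}/(\mathbf{1}^{\top}\Sigma^{-1}\mathbf{1})$; adding a long-only constraint requires a quadratic-programming solver. Expected returns appear nowhere in the problem.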

The test presented in the paper uses a 6-month (120-day) momentum parameter that selects the top 50% of asset classes, and a weighted version of minimum variance with an approximate lookback of 20 days to calculate portfolio allocations. The holding period for rebalancing was monthly. The obvious question is whether this particular combination of momentum and minimum-variance parameters was “lucky” and whether this represents a case of data snooping. The charts below summarize backtests run from 1995 to 2012 using the assets provided in the whitepaper with different momentum (ROC) and minimum-variance (MINVAR) parameters. What is clear from the tables is that performance in absolute (CAGR) and risk-adjusted (Gross Sharpe) terms is very consistent across parameters. While performance is more sensitive to the momentum parameter than to the minimum-variance parameter, both are very stable. It is quite rare in system development to see this type of consistency. When the average performance is very close to the optimal performance and the standard deviation of performance is low, the chance of regret in selecting any pair of parameters is also quite low. Of course, one can avoid this pitfall entirely by simply investing in the pool of all possible combinations. The methodology of this toy strategy can be vastly improved to increase risk-adjusted returns and robustness, but that is perhaps a subject for another post.
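
For concreteness, here is a minimal sketch of the mechanics described above. It is my reconstruction from the stated parameters, not the paper’s code, and it uses plain long-only minimum variance (via the quadprog package) rather than the weighted version:

```r
library(quadprog)

# Long-only minimum variance: min w' Sigma w  s.t.  sum(w) = 1, w >= 0
min_var <- function(Sigma) {
  n <- ncol(Sigma)
  Amat <- cbind(rep(1, n), diag(n))   # full-investment equality, then w_i >= 0
  bvec <- c(1, rep(0, n))
  solve.QP(Dmat = 2 * Sigma, dvec = rep(0, n),
           Amat = Amat, bvec = bvec, meq = 1)$solution
}

# One monthly rebalance at day t (t >= 120): rank by 120-day momentum,
# keep the top half, and weight the survivors by 20-day minimum variance.
# `returns` is assumed to be a matrix of daily asset-class (log) returns.
aaa_weights <- function(returns, t) {
  mom  <- colSums(returns[(t - 119):t, ])               # ~6-month total return
  keep <- mom >= median(mom)                            # top 50% by momentum
  Sigma <- cov(returns[(t - 19):t, keep, drop = FALSE]) # ~20-day covariance
  w <- rep(0, ncol(returns))
  w[keep] <- min_var(Sigma)
  w
}
```

With, say, ten asset classes this keeps the five strongest by momentum and then lets the short-horizon covariance matrix spread risk among them, which is the “adaptive” combination of return and risk information the paper describes.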

[Figures: tables of CAGR and Gross Sharpe across momentum (ROC) and minimum-variance (MINVAR) parameter combinations, 1995-2012]
