Where does big data fit in with investment theory?
A tension exists between theoretical models and big data. Big data has seen a surge in healthcare and other fields, but Mark believes a sound theoretical model, rather than something purely empirically driven, will win out.
“I don’t think big data will ever replace good theory and sound judgment. I don’t see big data in isolation getting us very far. In terms of computational power, there are lots of examples.”
One example is the dividend discount model, also known as the John Burr Williams model, introduced in 1938. It was an elegant summary of the discounted-cash-flow problem, and it became popular partly because computational power was scarce. At the time, it was practically impossible to project 30–40 years of dividends and discount each one back to the present. To get around this, some smart people came up with a closed-form solution: estimate a discount rate and a growth rate and apply them to the current dividend.
Another example is optimization. In 1952, Harry Markowitz developed modern portfolio theory (mean-variance analysis). Mark explained that, even by today's standards, this is a remarkably robust approach to constructing portfolios, though it has limitations.
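To make the closed-form shortcut concrete, here is a minimal sketch; the dividend, discount rate, and growth rate below are illustrative assumptions, not figures from the interview:

```python
# Hypothetical illustration of the dividend discount model's closed form.
# Assumed inputs: current annual dividend D0, discount rate r, growth rate g (r > g).
D0, r, g = 2.00, 0.08, 0.03

# Closed-form (constant-growth) solution: no long projection needed.
price_closed_form = D0 * (1 + g) / (r - g)

# The brute-force alternative: discount each projected dividend year by year.
price_projected = sum(D0 * (1 + g) ** t / (1 + r) ** t for t in range(1, 501))

print(round(price_closed_form, 2), round(price_projected, 2))  # both ~41.20
```

The closed form and the long explicit projection converge to the same value, which is exactly why the shortcut mattered when multi-decade projections were computationally out of reach.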
“One limitation is that it assumes one of two things in order for it to be valid. It assumes that either the returns are approximately elliptically distributed, which is much more forgiving than a normal distribution, or that it doesn’t have to be both—it’s one or the other—that investors have preferences that can be reasonably well approximated by mean-variance.”
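For readers who want to see what "preferences reasonably well approximated by mean-variance" looks like in practice, here is a minimal sketch; the assets, numbers, and risk-aversion coefficient are illustrative assumptions, not Windham's inputs:

```python
import numpy as np

# Minimal mean-variance sketch with made-up inputs: two assets,
# expected returns, a covariance matrix, and a risk-aversion coefficient.
mu = np.array([0.07, 0.03])                 # expected returns (stocks, bonds)
cov = np.array([[0.04, 0.002],
                [0.002, 0.0025]])           # covariance of returns
risk_aversion = 3.0

best_w, best_u = None, -np.inf
for w_stock in np.linspace(0, 1, 101):      # grid of stock/bond mixes
    w = np.array([w_stock, 1 - w_stock])
    utility = w @ mu - 0.5 * risk_aversion * (w @ cov @ w)
    if utility > best_u:
        best_w, best_u = w, utility

print(best_w)  # the mix with the highest mean-variance utility
```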
In situations where returns are not normal and investors have preferences that incorporate skewness or other features beyond mean and variance, mean-variance doesn't cut it. Here computational power can offer real advantages. Windham introduced an alternative: full-scale optimization. This approach takes into account every feature of the data and allows investors to specify more plausible preferences.
In full-scale optimization, an advisor supplies a sample of returns (historical or simulated) and specifies a utility function; within about 40 seconds, roughly 500,000 different asset mixes are evaluated and displayed, ranked from highest utility down.
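Here is a hedged sketch of that idea; the return sample, kinked utility function, and small grid of mixes below are illustrative assumptions, and the real product evaluates far more candidates:

```python
import numpy as np

# Full-scale optimization sketch: score each candidate mix by its average
# utility over the entire return sample, then rank the mixes.
# The kinked utility (extra penalty for returns below -10%) is an
# illustrative assumption, not Windham's specification.
rng = np.random.default_rng(0)
sample = rng.multivariate_normal([0.07, 0.03],
                                 [[0.04, 0.002], [0.002, 0.0025]],
                                 size=1000)          # stand-in return sample

def utility(r, threshold=-0.10, penalty=10.0):
    # log utility plus a kink that penalizes returns below the threshold
    return np.log(1 + r) - penalty * np.maximum(threshold - r, 0)

results = []
for w_stock in np.linspace(0, 1, 101):               # candidate asset mixes
    w = np.array([w_stock, 1 - w_stock])
    port_returns = sample @ w
    results.append((utility(port_returns).mean(), w_stock))

best_u, best_w = max(results)
print(best_w, best_u)  # mix with the highest average utility over the sample
```

Because every observation in the sample enters the utility calculation directly, nothing about the distribution is thrown away, which is the point the quote below makes about going beyond mean-variance.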
“You couldn’t do that 20 or 30 years ago. That’s an example of computational power allowing you to go beyond mean-variance optimization, to take into account significant features of the data that won’t be captured by the typical distributions that we assume, or preferences that don’t conform to mean-variance.”
Another problem that is very complex to solve is portfolio rebalancing. The point of rebalancing is to shift a suboptimal portfolio back to its optimal weights. To do this well, it's vital to weigh the cost of rebalancing against the cost of holding a suboptimal portfolio. The computation needed to find the optimal policy for a stock-and-bond portfolio is doable, but with five asset classes it is neither cost- nor time-effective.
“We tried to compute five asset classes using 20 computers, and it took two weeks of it running continuously to figure out what the optimal rebalance rates were.”
Fortunately, Markowitz came up with a quadratic heuristic that works well, so well that Mark and his team have tested it with up to hundreds of assets. This is possible thanks to pencil-and-paper mathematics, not computational power.
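To make the underlying trade-off concrete, here is a minimal sketch; it shows the basic comparison of suboptimality cost versus trading cost with illustrative numbers, not the quadratic heuristic itself or Windham's multi-asset solution:

```python
import numpy as np

# Sketch of the basic rebalancing trade-off: compare the certainty-equivalent
# cost of holding a drifted, suboptimal mix against the transaction cost of
# trading back to target. All inputs are illustrative assumptions.
mu = np.array([0.07, 0.03])
cov = np.array([[0.04, 0.002], [0.002, 0.0025]])
risk_aversion = 3.0
cost_per_unit_traded = 0.0025        # assumed 25 bps one-way transaction cost

target = np.array([0.60, 0.40])      # optimal mix
current = np.array([0.68, 0.32])     # weights after market drift

def ce(w):
    # mean-variance certainty equivalent of a mix
    return w @ mu - 0.5 * risk_aversion * (w @ cov @ w)

suboptimality_cost = ce(target) - ce(current)          # annual utility shortfall
rebalance_cost = cost_per_unit_traded * np.abs(target - current).sum()

print(f"suboptimality {suboptimality_cost:.4%}, trading cost {rebalance_cost:.4%}")
print("rebalance" if suboptimality_cost > rebalance_cost else "hold")
```

The hard part, as the quote above suggests, is that the real decision is dynamic and multi-asset: today's trade affects every future period, which is what makes the exact solution so expensive and the heuristic so valuable.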
“I really think that computational power and data will only get you so far. You need theory, you need good math, and you need sound judgment. I think the industry has come a long way.”
Perfecting risk analysis: Pitfalls in the conventional approach
Analyzing the probability of losing money is one of the top concerns for every investor and advisor. There are two measures to consider: the probability of losing money, and the value at risk of a chosen strategy. Robo-advisor solutions claim certain insights over conventional approaches, but for Windham Labs, sophisticated analytics are the key to true innovation.
Pitfall #1
Mark says the first and most important pitfall is that conventional risk assessment concentrates on the outcome at the end of the investment horizon and ignores what might happen in between, within that horizon. This is true in academia and in industry; the risk measures that people focus on are based on end-of-horizon distributions.
“What we find is that the probability of loss is sometimes 10 times greater than people perceive it to be. I think that people don’t know this. They don’t know that, although there’s only a 5% chance that their portfolio is going to go down more than 10% at the end of 10 years, there’s probably a 50% or 60% chance that it will be down 10%, sometime along the way. That’s something that people need to appreciate.”
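A simple Monte Carlo sketch illustrates the distinction; the expected return, volatility, horizon, and loss threshold below are illustrative assumptions, not Windham's figures:

```python
import numpy as np

# Monte Carlo sketch of end-of-horizon vs within-horizon probability of loss.
rng = np.random.default_rng(1)
years, steps_per_year, n_paths = 10, 12, 50_000
mu, sigma = 0.06, 0.12                      # assumed annual return and volatility
dt = 1 / steps_per_year

# Lognormal monthly returns; each row is one simulated 10-year path.
monthly = rng.normal((mu - 0.5 * sigma**2) * dt, sigma * np.sqrt(dt),
                     size=(n_paths, years * steps_per_year))
wealth = np.exp(np.cumsum(monthly, axis=1))  # wealth relative to a start of 1.0

loss_threshold = 0.90                        # a 10% drop from the starting value
end_of_horizon = (wealth[:, -1] < loss_threshold).mean()
within_horizon = (wealth.min(axis=1) < loss_threshold).mean()

print(f"P(loss > 10% at year 10):   {end_of_horizon:.1%}")
print(f"P(loss > 10% at any point): {within_horizon:.1%}")
```

With these assumed parameters the end-of-horizon probability comes out near 5% while the within-horizon probability is many times larger, which is the gap the quote above describes.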
Pitfall #2
The other pitfall is that conventional risk assessment is based on the full sample of returns. Losses are undoubtedly much more likely when markets are turbulent and less likely in calm periods. The problem with using the full sample to estimate a portfolio's volatilities and correlations is that you can grossly underestimate its exposure to loss.
“What you should instead do is say, ‘Since losses are much, much more likely to occur during turbulent periods, let me construct a subsample of returns that prevail during turbulent periods, and estimate my standard deviations and correlations from that subsample, and then use that to assess my portfolio’s exposure to loss.’”
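Here is a minimal sketch of that procedure; turbulence is flagged with a Mahalanobis distance on each period's returns, in the spirit of published work by Kritzman and colleagues, and the simulated data and percentile cutoff are illustrative assumptions:

```python
import numpy as np

# Sketch: estimate volatilities and correlations from a turbulent subsample.
rng = np.random.default_rng(2)
returns = rng.multivariate_normal([0.005, 0.003],
                                  [[0.0020, 0.0003], [0.0003, 0.0004]],
                                  size=600)          # stand-in monthly returns

mean = returns.mean(axis=0)
cov_full = np.cov(returns, rowvar=False)
inv_cov = np.linalg.inv(cov_full)

# Mahalanobis distance of each period's returns from the historical average.
deviations = returns - mean
turbulence = np.einsum("ij,jk,ik->i", deviations, inv_cov, deviations)

# Keep only the most turbulent quarter of periods and re-estimate risk from it.
turbulent = returns[turbulence > np.percentile(turbulence, 75)]
cov_turbulent = np.cov(turbulent, rowvar=False)

print("full-sample volatilities:     ", np.sqrt(np.diag(cov_full)))
print("turbulent-sample volatilities:", np.sqrt(np.diag(cov_turbulent)))
```

Risk estimated from the turbulent subsample is higher than the full-sample estimate, which is exactly why conditioning on turbulence changes the assessed exposure to loss.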
In 2007, Windham published an article in which they analytically compared the conventional approach with turbulent subsamples, both within and at the end of the horizon. They found that, using the standard evaluation for a 60/40 portfolio, the value at risk was 10%. When they plugged in the new analytics, the value at risk rose to 35%, a major difference. When the financial crisis hit in 2008, Windham went back and checked how the portfolio had actually done against the prediction; the loss was also 35%.
Bottom line
The balance between innovation and convention is shifting as machine learning/AI, big data, and computational power find new applications across industries. Big data is clearly useful in FinTech for tasks such as customer segmentation, risk analysis, and compliance, but investment theory still holds strong. There are complexities that computational power alone has yet to crack, and that for now only a creative mind can tackle. When it comes to risk analysis, Windham has identified two big innovations that many overlook, showing that FinTech, and the financial world in general, is still evolving.
About
Mark Kritzman has been in the finance business for a long time. He previously worked for Bankers Trust and AT&T, eventually going off on his own to found Windham Capital Management and Windham Labs. For 16 years he has had a dual career in academia as a senior lecturer at MIT. He has done extensive research and has published many articles in academic and practitioner journals.
Interviewed by Vasyl Soloshchuk, CEO and co-owner of INSART, a FinTech & Java engineering company. Vasyl is also the author of WealthTech Club, which researches Fortune-listed and startup robo-advisor and wealth-management companies in terms of their technology ecosystems.