THERE IS NO CYBER ALPHA
(Or CIS benchmarks vs gummies in a jar.)
In 1907, the English polymath Francis Galton reported on a county fair contest in which 787 fairgoers guessed the weight of an ox. While almost none of the guesses were exactly right, the average of all of them came remarkably close to the true weight. Among those in attendance there were, presumably, many lifelong farmers and cattle auctioneers with a better idea than most of how to gauge the weight of an ox. Yet somehow even these experts fared no better than the crowd's average.
This is an early example of what has come to be known as a “wisdom of the crowd” experiment. Oddly enough, the phenomenon is repeatable. More familiar examples involve inviting partygoers to guess the number of jellybeans in a large jar. The math is remarkably consistent: the average of all the guesses lands closer to the true count than the typical individual guess. And few, if any, people manage a more accurate estimate than the crowd.
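As a rough illustration (the numbers here are invented, not drawn from any actual contest), a short simulation shows why this works: if each guess is the true count plus independent, roughly unbiased noise, the errors largely cancel, and the crowd's average beats the vast majority of individual guessers.

```python
import random

# Illustrative simulation with made-up numbers: each partygoer's guess is the
# true bean count plus independent, roughly unbiased noise.
random.seed(42)
TRUE_COUNT = 400      # assumed true number of jellybeans in the jar
NUM_GUESSERS = 200    # assumed size of the crowd

guesses = [random.gauss(TRUE_COUNT, 120) for _ in range(NUM_GUESSERS)]
crowd_average = sum(guesses) / len(guesses)

crowd_error = abs(crowd_average - TRUE_COUNT)
individual_errors = [abs(g - TRUE_COUNT) for g in guesses]
beaten = sum(1 for e in individual_errors if e > crowd_error)

print(f"Crowd average: {crowd_average:.0f} (error {crowd_error:.0f})")
print(f"The crowd average beat {beaten} of {NUM_GUESSERS} individual guesses")
```

Run it a few times with different seeds and the crowd average reliably lands near the truth while most individual guesses do not.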
One of the earliest financial practitioners of this logic was John C. “Jack” Bogle (May 8, 1929 - January 16, 2019), the founder of the Vanguard Group and widely considered the father of index funds. An excellent read on this subject is his 1998 lecture entitled "Reversion to the Mean: Sir Isaac Newton’s Revenge on Wall Street." In it, Mr. Bogle argues that, in part because of mean reversion, attempting to “seek alpha” and do better than average is a Sisyphean task.
For the sake of those who may not be familiar with the subject, please accept the following oversimplification as an explanation.
First, an “actively managed mutual fund” is one in which an industry expert is placed in charge of buying a large basket of stocks and bonds. The theory is that this expert knows more than the investor who employs him and will use that knowledge to buy or sell individual assets at or near the most opportune times.
This is contrasted with a “passive fund,” which buys and holds a pre-defined basket of stocks and bonds with minimal transactions and adjustments along the way. Little if any real decision making goes into the timing of any particular trade. An excellent example is an “S&P 500 index fund,” which buys and holds roughly the 500 largest companies in the U.S. market, as selected by Standard & Poor’s.
This latter, passively managed fund is akin to being allowed to guess the average of everyone else's guesses at the jellybean jar. The argument for the former, actively managed fund is that a very intelligent, well-studied, and well-compensated expert is watching over your investments and making trades to safeguard and grow your money. Mr. Bogle's argument is that even these industry experts fail to outperform passively managed index funds. Moreover, he asserts that of those who do outperform the average in a given year, many if not all fail to do so in subsequent years. This regression, or reversion, to the mean is (according to him) as inevitable as gravity itself.
Imagine that in a given year, 10 of 100 active fund managers post better overall returns than the S&P 500. The natural response of investors is to sell their underperforming funds and buy one of the ten higher-performing ones. The natural response of the underperforming managers is to study the transactions and methodologies of the top ten, hoping to improve and to stem the exodus of money from their own funds.
Now suppose that in the second year, having learned from the top performers, 70% of funds outperform the average. Every fund that outperforms the average, by definition, pulls the average up. Further, for 70% of the industry to beat the average at all, the remaining 30% must have done substantially worse: if seventy funds return 10% and thirty return 2%, the average is 7.6%, and the seventy sit above it only because the thirty sit so far below. Those laggards will be replaced, or at least be under significant pressure to improve going forward.
In reality, some of the original top ten were more lucky than good in the first year. It is possible to be lucky a couple of years in a row, but more likely their methods were not as successful the second time around. The lucky managers' performance thus degrades over time, as falling returns drag down their individual averages.
But what about the managers who were “more good than lucky”? Over time their performance also degrades relative to the mean, because the mean itself rises to catch them. Pricing pressure on last year's high-performing stocks makes them less of a bargain going forward. The combination of an influx of cash and diminishing returns on last year's picks means the “good manager” is stuck buying lower-quality stocks just to keep all her money invested. Thus, to perform above average year after year, a fund manager's tactics must be good. But more than that, as her methods are discovered and her knowledge disseminated, she must keep innovating and finding new ways to beat an average that the rest of the market has already adjusted to.
In a game as simple as guessing “jellybeans in a jar,” tactics are easy to discern. Assume one enterprising individual, bent on winning the jellybeans, spends the year working out volume calculations. She gains a basic understanding of how many jellybeans fit per ounce. Showing up at the fair, she recognizes an 8, 16, or 20 oz jar, multiplies, and makes her guess; say she has found that roughly a dozen beans fit per ounce, so a 16 oz jar points to a guess near 200. Not a bad overall strategy. But after she wins three years in a row, her methods are discovered and the secret spreads. What happens? As more and more people learn the approximate number of jellybeans per ounce, each individual guess gets more accurate. But so does the average. The deviation between individual guesses shrinks. The lowest guess isn't as wildly low as in the first year, and the highest guess isn't as wildly high.
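Continuing the made-up numbers from the earlier sketch, the effect of the secret spreading can be simulated by shrinking each guesser's noise: both the extreme guesses and the crowd average's error tighten up.

```python
import random

random.seed(7)
TRUE_COUNT = 400   # assumed true number of beans
CROWD = 200        # assumed number of guessers

def fair_year(noise):
    """Simulate one year's contest with a given level of guesser noise."""
    guesses = [random.gauss(TRUE_COUNT, noise) for _ in range(CROWD)]
    average = sum(guesses) / CROWD
    return min(guesses), max(guesses), abs(average - TRUE_COUNT)

# Year one: almost nobody knows the beans-per-ounce trick (wide noise).
# Year three: the trick has spread (narrow noise). Noise levels are illustrative.
for label, noise in (("Year 1", 120), ("Year 3", 30)):
    low, high, avg_err = fair_year(noise)
    print(f"{label}: lowest {low:.0f}, highest {high:.0f}, crowd-average error {avg_err:.1f}")
```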
The point is, as human beings, we are pre-programmed to learn from each other. We exalt the highly successful, tell great stories of their genius, and recount epic tales of tragic failure. Innately, we learn from both. And if the game is simple, eventually everyone knows exactly how many jellybeans fit in a jar. If there is a right answer, the methods for determining it eventually become common knowledge.
But what about when the game is more complicated and prone to environmental changes, like stock picking? There is no “right price” at which to buy or sell a stock, or at least not a pre-determined one. Is the fund good, or is it merely good given this year's conditions? Were those the stocks of good companies, or will economic cycles cause them to underperform in the future? Was that manager good, or lucky?
The deviations in these complex markets tend to stay wide, which means that while the right choice might produce outsized returns, the wrong choice can devastate a portfolio. In these environments, without a clear vision of the future or the ability to reliably separate skill from luck, betting on the average will in fact return a better result than the average manager does.
Despite this, and despite over a hundred years of performance results, many managers persist in trying to outperform the market (to “achieve alpha”). This too is human nature. There are entire schools of thought offering technical indicators, or visions of the future, that seem to point to the obvious winners. But history indicates that in any given year only around 35% of managers outperform the market averages, and over multiple years that number drops below 20%.
What does this have to do with cybersecurity? Well, to begin with, cybersecurity is a complex system with no predetermined right answer. There is no checkbox that, once checked, indicates you have cybersecurity or you don't. The threat environment changes rapidly: new vulnerabilities, new exploitation methods, and new groups of hackers pop up all the time. These attacking or “red team” tactics are studied, and counterbalancing defensive or “blue team” strategies are developed to prevent and withstand exploitation. Without knowing the technology, it is nearly impossible for a non-technical executive to gauge the current and future competence of her cybersecurity team. She doesn't know how much her team knows, nor how well they have implemented what they do know.
But there are in fact generally agreed-upon strategies that are more successful than not, and the best, most highly skilled cybersecurity specialists share and publish them. From a best-practices standpoint, one organization, NIST (the National Institute of Standards and Technology), puts forth a Cybersecurity Framework. Another, CIS (the Center for Internet Security), publishes a set of benchmarks intended to directly harden the security of workstations and servers. These benchmarks are the aggregate work of specialists from hundreds of organizations around the world and are vetted and commented on by thousands more. They change over time, as information changes and the success of given strategies rises and falls. These recommendations are free and publicly available. Yet the average small business network scores between 28% and 32% when graded against them.
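To make that grading concrete, here is a minimal sketch of how a benchmark-style audit produces a percentage score. The checks, settings, and thresholds below are illustrative assumptions, not the actual CIS benchmark content: each recommendation reduces to a pass/fail test, and the score is simply the fraction of tests that pass.

```python
# Illustrative benchmark-style audit (hypothetical checks, not real CIS content).
# Each recommendation is a pass/fail test; the score is the fraction that pass.

system_state = {
    "password_min_length": 8,
    "screen_lock_minutes": 30,
    "firewall_enabled": True,
    "guest_account_disabled": False,
    "automatic_updates": False,
}

checks = [
    ("Minimum password length >= 14", lambda s: s["password_min_length"] >= 14),
    ("Screen lock within 15 minutes", lambda s: s["screen_lock_minutes"] <= 15),
    ("Host firewall enabled",         lambda s: s["firewall_enabled"]),
    ("Guest account disabled",        lambda s: s["guest_account_disabled"]),
    ("Automatic updates enabled",     lambda s: s["automatic_updates"]),
]

passed = [name for name, test in checks if test(system_state)]
score = 100 * len(passed) / len(checks)

for name, test in checks:
    print(f"[{'PASS' if test(system_state) else 'FAIL'}] {name}")
print(f"Benchmark score: {score:.0f}%")
```

Real benchmarks contain far more recommendations per operating system, which is why an unexamined network so often grades out at around a third.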
Many industries are developing ISACs (Information Sharing and Analysis Centers) around cyber and related threats. These ISACs provide real-time information on active exploitations and the methods being used to combat them. Yet many, if not most, organizations do not participate, even when an ISAC relates directly to their industry.
Further, these same industries are developing compliance regimes that test an organization's policies and procedures against a common cybersecurity framework or set of benchmarks. Examples include CMMC for DoD contractors, CJIS for law enforcement, HIPAA for medical information, and PCI DSS for those accepting payments via credit and debit cards. The number and complexity of compliance standards are growing considerably. But just because your organization isn't currently required to follow any of them doesn't mean you shouldn't. There is, after all, wisdom in the crowd's knowledge. There are things that can be learned and implemented that will strengthen an organization's cyber defenses.
Cybersecurity is a nascent arena, but there are standards that can be adopted, and if they are followed, you can expect to do substantially better than average at preventing breaches. These frameworks and compliance regimes are, of course, a jumping-off point for an organization's cyber resilience strategy; they are not in and of themselves a complete answer to current and future threat landscapes. They are, in effect, the index funds of the industry: a generally low-cost, high-impact way of managing the security journey.
They provide a framework by which a non-technical executive can begin to be assured that the organization's policies and procedures are reasonable, and not absurdly ineffective outliers. There is, in effect, no better way for her to judge whether a currently winning strategy is good or merely lucky.
In short, the average plumber who doesn't study the markets should stick to index funds and not try to achieve alpha and beat the market. Similarly, organizations without the resources to hire sophisticated, comprehensive cybersecurity teams should not defer to the knowledge of their own staff over the collective knowledge of an industry. When a non-technical executive hires an MSP (managed service provider) and abdicates all responsibility for cybersecurity to it, she might as well be taking stock tips from a plumber she met at a bar. It is an equally dangerous habit, and one that will lead to bankruptcy just as often and just as quickly.