Information is a foundational assumption of modern finance. The Efficient Market Hypothesis (EMH) uses information to back its case for efficiency. The EMH case is weak, but as Martin Sewell (2011) explains, until a flawed hypothesis is replaced by a better hypothesis, criticism is of limited value. This paper challenges the information assumption in EMH based on ideas first laid out by Kenneth E. Boulding (1966), highlights the body of work discussing information relevance, information irrelevance, and information content since Ball and Brown (1968), and illustrates how the ‘Mean Reversion Framework’ (2015) can be used to re-explain the transformation of information from relevance to irrelevance, referred to here as the ‘Reversion Diversion Hypothesis’.
We cannot understand ‘Value’ without understanding information. The market-efficiency view holds that no advantage can be drawn from information to the extent of beating the market, since markets are random and reflect all available information. The market-inefficiency view is the contrary: markets do not fully reflect available information, and information retains its relevance. The relevance or irrelevance of information is not a settled debate; the conflict between market efficiency and inefficiency is extensively observed and cited, and the asset pricing model has faced its own challenges regarding validity.
There are two ways to build a new hypothesis and challenge the EMH: first, at the product level, and second, by dealing with information at the conceptual level.
Granger (1992) mentioned, “To build a method that consistently produces positive profits after allowing for risk correction and transaction costs and if this method has been publicly announced for some time, then this would possibly be evidence against EMH…Only if a profitable rule is found to be widely known and remains profitable for an extended period can the efficient market hypothesis be rejected…Benefits can arise from taking a longer horizon, from using disaggregated data, from carefully removing outliers or exceptional events, and especially from considering non-linear models.”
An investment hypothesis in Pal (2015), ‘Is Smart Beta Dumb?’, showcased how a new indexing methodology can be built by combining value and growth with dual rebalancing. This paper develops the second approach, reexamining information as a building block underlying EMH to illustrate how EMH’s weakness as a hypothesis could be owing to its assumptions regarding information.
It was in 1968 that Ball and Brown considered the content of accounting information, the flow of information, the relevance of information, its predictive power, and its continuity and time dependence. The observed reversion was used as a validation of the predictive content in earnings. This relevance was later shown to cause a drift: an anomaly, an unexplainable behavior of the market. Bernard and Thomas (1989) showcased a seasonal positive autocorrelation in partial periods connected to the news, followed by a seasonal negative autocorrelation.
‘An empirical evaluation of accounting income numbers’, Ball and Brown (1968)
“Of all the information about an individual firm which becomes available during a year, one-half or more is captured in that year’s income number. Its content is, therefore, considerable. If the income forecast error is negative, we define it as bad news and predict that if there is some association between accounting income numbers and stock prices, then the release of the income number would result in the return of that firm’s securities being less than would have been expected. Such a result would be evidenced by negative behavior in the stock return residuals around the annual report announcement date. The converse should hold for a positive forecast error.”
From establishing content in information, to defining bad and good news, to connecting it with negative and positive stock behavior (prediction), the authors not only showcased the relevance of information but also connected it with predictive behavior.
“If the information is useful in forming capital asset prices, then the market will adjust asset prices to that information quickly and without leaving any opportunity for further abnormal gain. If as the evidence indicates, security prices do in fact adjust rapidly to new information as it becomes available, then changes in security prices will reflect the flow of information to the market. An observed reversion of stock prices associated with the release of the income report would thus provide evidence that the information reflected in the income numbers is useful. Our method of relating accounting income to stock market prices builds on this theory and evidence by focussing on the information, which is unique to a particular firm. Especially we construct two alternative models of what the market expects income to be and then investigate the market's reaction when its expectations prove false…Historically the income of the firms has tended to move together. One study found that half of the variability in the level of an average firm’s earnings per share (EPS) could be associated with economy-wide effects. In light of this evidence, at least part of the change in a firm’s income from one year to the next is to be expected. If, in prior years, the income of a firm has been related to the income of the other firms in a particular way, then knowledge of that past relation, together with a knowledge of the incomes of those other firms for the present year, yields a conditional expectation for the present income of the firm. Thus apart from confirmation effects, the amount of new information conveyed by the present income number can be approximated by the difference between the actual change in income and its conditional expectation.”
The 1968 paper showcased the utility of information, information flow, and the market’s reaction to information. The usefulness linked with behavior predictability was validated by the observed reversion. If expected earnings were higher than actual, the news was bad and hence followed by negative abnormal stock returns, and vice versa. The idea of reversion linked to the forecasting error showcased the dependency of the future on the past, and how information had influence.
“But not all this difference is necessarily new information. Some changes in income result from financing and other policy decisions made by the firm. We assume that, to a first approximation, such changes are reflected in the average change in income through time. In King (1966) it was estimated that about 30-40 percent of the variability in a stock’s monthly rate of return over the period March 1944 – Dec 1960 could be associated with market-wide effects. Market-wide variations in stock returns are triggered by the release of information which concerns all firms. Thus, since the market has been found to adjust quickly and efficiently to new information, the residual must represent the impact of new information…persistence in the drifts beyond the announcement month. Since the efficiency of the capital market is largely determined by the adequacy of its data sources, we do not find it disconcerting that the market has turned to other sources which can be acted upon more promptly than annual net income.”
There was subjectivity linked with information. There was news that was market-wide and hence adjusted to quickly, and there was new information, which was believed to be causing drifts. The idea of new information was open-ended, and the focus shifted to the quality of data sources. This could have started the information-industry wave and its perceived connection with stock market behavior.
In financial economics and accounting research, post–earnings-announcement drift, or PEAD (also named the SUE effect), is the tendency for a stock’s cumulative abnormal returns to drift in the direction of an earnings surprise for several weeks (even several months) following an earnings announcement.
“The nagging general question is what kind of equilibrium would support market prices that only partially reflect information as widely disseminated and freely available as earnings. Why the market would appear to react with a surprise to earnings information that is predictable, based on earnings for the prior quarter… Once a firm’s current earnings become known, the information content should be quickly digested by investors and incorporated into the efficient market price. However, it has long been known that this is not exactly what happens. For firms that report good news in quarterly earnings, their abnormal security returns tend to drift upwards for at least 60 days following their earnings announcement. Similarly, firms that report bad news in earnings tend to have their abnormal security returns drift downwards for a similar period. This phenomenon is called post-announcement drift.”
The drift was an observed anomaly. What kind of market structure caused such an anomaly? What was the reason for a 60-day drift? Why the positive and negative drift? Why was it symmetrical? Why was public information not fully reflected instantly? What was the reason for this partial organic assimilation of news? Why do we call it a surprise? Why the anomaly? Even after nearly 50 years, we have limited clarity on the above questions. The question of the interaction of information with market systems could possibly throw some light.
“A counterargument against market efficiency theory, PEAD is considered a robust finding and one of the most studied topics in financial market literature. The most widely accepted explanation for the effect is investor underreaction to earnings announcements. According to Bernard & Thomas (1990), PEAD patterns can be viewed as including two components. The first component is a positive autocorrelation between seasonal differences (i.e., seasonal random walk forecast errors – the difference between the actual returns and forecasted returns) that is strongest for adjacent quarters, being positive over the first three lag quarters. Second, there is a negative autocorrelation between seasonal differences that are four quarters apart.”
Anomalies pose a potential risk if they remain unexplainable for a generation. The interesting part is the symmetry of the drift and the tendency for reversion. The fact that a positive trend of three quarters changes to a negative tendency is not surprising. In our focus on the influence (indifference) of information and the predictability (unpredictability) of errors, we could have ignored the conceptual structure of the market that leads to its behavior.
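The sign pattern Bernard and Thomas describe, positive autocorrelation of seasonal forecast errors over the first three lag quarters and negative at the fourth, can be made concrete with a toy simulation. The sketch below is illustrative only: the MA(4) coefficients are assumptions chosen to reproduce the sign pattern, not estimates from the PEAD literature.

```python
import numpy as np

def autocorr(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

rng = np.random.default_rng(0)
shocks = rng.standard_normal(20_000)

# Toy 'earnings surprise' series built as an MA(4) so that, by construction,
# autocorrelation is positive at lags 1-3 and negative at lag 4:
# the sign pattern reported for seasonal differences in quarterly earnings.
sue = (shocks[4:]
       + 0.5 * shocks[3:-1]
       + 0.5 * shocks[2:-2]
       + 0.5 * shocks[1:-3]
       - 0.5 * shocks[:-4])

acf = [autocorr(sue, k) for k in (1, 2, 3, 4)]
```

With these assumed coefficients the theoretical autocorrelations are 0.375, 0.25, 0.125, and -0.25 at lags one to four, so the sample values recover the drift-then-reversal signature of the anomaly.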
‘Random Walks in Stock Market Prices’, Eugene F. Fama, 1965
“The assumption of the fundamental analysis approach is that at any point in time an individual security has an intrinsic value (or, in the terms of the economist, an equilibrium price) which depends on the earning potential of the security. If actual prices tend to move toward intrinsic values, then attempting to determine the intrinsic value of a security is equivalent to making a prediction of its future price; and this is the essence of the predictive procedure implicit in fundamental analysis. An “efficient” market is defined as a market where there are large numbers of rational profit-maximizers actively competing, with each trying to predict future market values of individual securities, and where important current information is almost freely available to all participants. In other words, in an efficient market at any point in time, the actual price of a security will be a good estimate of its intrinsic value. Now in an uncertain world, the intrinsic value of a security can never be determined exactly. Thus there is always room for disagreement among market participants concerning just what the intrinsic value of an individual security is, and such disagreement will give rise to discrepancies between actual prices and intrinsic values. In an efficient market, however, the actions of the many competing participants should cause the actual price of a security to wander randomly about its intrinsic value. If the discrepancies between actual prices and intrinsic values are systematic rather than random in nature, then knowledge of this should help intelligent market participants to better predict the path by which actual prices will move toward intrinsic values.”
Ball, Brown, and Fama were contemporaries. Ball and Brown’s challenge to the efficient view of information was based on content, relevance, and prediction. Fama’s argument regarding information was based on unpredictability, leading to information irrelevance at a performance (beating the market) level. According to Fama, information’s predictive content was an illusion because intrinsic value itself was a moving target. Information and market participants consistently pushed the equilibrium value, so much so that Fama believed the current security price was the equilibrium price. The level of disagreement between the players, and the resulting discrepancies, made prediction a failed effort. Moreover, the efforts neutralized each other, leaving prices wandering around the intrinsic value. Past information had no meaning; hence independence was to be expected. Though Ball and Brown’s arguments were substantiated and perfectly contrasting, with a robust anomaly backing them, Fama’s EMH continued to stand firm for a few decades. The reason could be that Fama’s hypothesis was based on a framework which explained how market participants and information interacted with the market. Now that Ball and Brown have more support and the ‘inefficient’ school’s strongly research-backed argument has further weakened the EMH, a new framework explaining information and the market could lead to a new hypothesis.
“When the many intelligent traders attempt to take advantage of this knowledge, however, they will tend to neutralize such systematic behavior in price series. Although uncertainty concerning intrinsic values will remain, actual prices of securities will wander randomly about their intrinsic values. Of course, intrinsic values can themselves change over time as a result of new information. This says that the “instantaneous adjustment” property of an efficient market implies that successive price changes in individual securities will be independent. A market where successive price changes in individual securities are independent is, by definition, a random-walk market. Most simply the theory of random walks implies that a series of stock price changes has no memory – the past history of the series cannot be used to predict the future in any meaningful way. The future path of the price level of a security is no more predictable than the path of a series of cumulated random numbers. It is unlikely that the random-walk hypothesis provides an exact description of the behavior of stock-market prices. For practical purposes, however, the model may be acceptable even though it does not fit the facts exactly. Thus, although successive price changes may not be strictly independent, the actual amount of dependence may be so small as to be unimportant.”
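Fama’s “series of cumulated random numbers” can be sketched directly. The minimal simulation below is a toy, not a market model; it only demonstrates the “no memory” property the random-walk argument asserts for successive price changes:

```python
import numpy as np

rng = np.random.default_rng(42)
increments = rng.standard_normal(20_000)
price = np.cumsum(increments)   # a series of cumulated random numbers
changes = np.diff(price)        # successive "price" changes

# In a random-walk market, changes have no memory: their serial
# correlation should be statistically indistinguishable from zero.
c = changes - changes.mean()
lag1_corr = float(np.dot(c[:-1], c[1:]) / np.dot(c, c))
```

The lag-one serial correlation of the changes hovers near zero, so the past history of the series carries no usable information about its future path, which is exactly the independence property the quote describes.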
The mention of intrinsic value and equilibrium points to Fama’s acceptance that the market does need a dynamic mean to oscillate around, since efficiency means a move towards value that is intrinsic to the system under study. Fama also talks about disagreements between market participants leading to discrepancies, noise, or divergence from intrinsic value, the very reason the market needs intrinsic value for equilibrium. The move towards equilibrium is key to Fama’s hypothesis. His argument revolves around divergence from, and reversion to, intrinsic value, which is not static but constantly changing in time. This is similar to the thought Galton laid out in his 1886 paper, ‘Regression towards mediocrity in hereditary stature’. The ‘Mean Reversion Framework’ (2015) explained how Galton laid down a framework for the functioning of natural systems with a balance of reversion and diversion; he primarily focussed on illustrating reversion and did not build a case on divergence. Fama, just like Galton, acknowledges diversion (discrepancy) but does not focus on it. The treatment of discrepancy as noise is the reason EMH affords to accept anomalies and ignore inefficiency, relevance, and dependence, choosing to stick to independence, irrelevance, and the efficiency of the system to revert, over the composite of reversion and divergence, a broader framework closer to actual market functioning.
History of Debate
Even with the joint hypothesis problem, attacks on randomness, research highlighting that the benefits of collecting information outweigh the costs, proof that a section of the market trades on anything but information, the market’s innate ability to generate information, the market’s reaction to new information, and the coexistence of predictability with randomness, the EMH remains a weak but not redundant hypothesis. This is unlike the CAPM, which has a stronger case of redundancy against it. The table below illustrates the history of the debate in the context of information. A part of the chronology has been updated from Sewell (2011), ‘History of the Efficient Market Hypothesis’.
Information drives the knowledge process. Even if a part of it keeps getting reflected by the market, information generation is a continuous process, which makes information an entity that can never reach an ideal state of being fully reflected in the market. The ‘fully reflected’ state cannot be considered an ideal state; it is one of the many states which the interaction of information and market brings. Our understanding of information should improve in time, but it is important to consider alternative ways to look at information and its interaction with the market entity. Kenneth Boulding, not coincidentally, was a contemporary of Ball, Brown, and Fama. While Ball and Brown were looking at information relevance and Fama was looking at the process of information becoming irrelevant, Boulding laid down guidelines that could assist in structuring an information framework.
Boulding talked about this transformation of information into knowledge and how the assumption of knowledge being costless was incorrect. He explained how the income of arbitrageurs might be regarded as the cost of acquiring the knowledge necessary to operate the market, which other people are willing to pay rather than become arbitrageurs themselves. What is an anomaly today need not stay an anomaly in the future. An anomaly proves that our understanding of how the market interacts with information is limited, and that our current systems are limited in their potential to address these gaps.
John C. Bogle has argued that no value premium exists, claiming that Fama and French’s research is period dependent. “While gross performance reverts to the mean … [there is a] self-evident pattern of mean reversion. Yet as we observe these extended cycles of mean reversion, it must occur to you that investors ought to be able to capitalize on them, riding one horse until it tires, then leaping to the other.”
Boulding talked about how static systems were weak in capturing the dynamics of systems. There was a clear dependence of the future on past information. A decision-making theory which could not consider how inputs from the past determine the future was “pretty empty”. Knowledge, for Boulding, was integral to the dynamic system; one could not separate the dynamic system from the information it faced and the information it generated. The system thinking was more important than the content of the information; the system was above the information and its content. This was conceptual thinking ahead of its time. While randomness was getting rediscovered, the dependence and independence of subsequent price changes were debated, and EMH took center stage, Boulding’s work on information remained relevant.
‘The Economics of Knowledge and the Knowledge of Economics’, Kenneth E. Boulding, 1966
“The absence of any unit of knowledge itself, however, and perhaps the intrinsic heterogeneity of its substance makes it very difficult to think of the price of knowledge as such, and indeed has probably contributed to a certain resistance which we feel to thinking of knowledge as a commodity…The theory of economic development is part of the general problem of evolutionary change, and its poor condition reflects the general poverty of theory of dynamic systems. Throughout the sciences, physical, biological, and social, we are still really more at home with equilibrium systems than we are with dynamic systems…The recognition that development is essentially a knowledge process has been slowly penetrating the minds of the economists, but we are still too much obsessed by mechanical models, to the neglect of the study of the learning process which is the real key to development…The decision is always the choice among alternatively perceived images of the future. The study of the decision, therefore, must concentrate on how these images of the future are derived from the information inputs from the past. The epistemological theory of decision making is, of course, pretty empty unless we can specify ways in which the inputs of the past determine the present images of the future.”
There are challenges to measuring information and the related knowledge owing to its heterogeneity and society’s limited understanding of dynamic systems. Researchers are more comfortable with equilibrium than with anything dynamic, yet dynamic systems can have multiple states of equilibrium. The linkage of past with future is integral to decision making, so independence of subsequent changes is an empty argument.
“We have here a certain epistemological paradox, that where knowledge is an essential part of the system, knowledge about the system changes the system itself. This is a kind of generalized Heisenberg Principle, which is particularly troublesome in the social sciences. What this means, of course, is not that the knowledge is unattainable, but that we must regard it as a part of a total dynamic system. That is to say, we are not simply acquiring knowledge about a static system which stays put, but acquiring knowledge about a whole dynamic process in which the acquisition of the knowledge itself is a part of the process…Thus in the case of the operations of a market and the behavior involved in buying and selling, it is doubtful whether the knowledge of economics as such makes very much difference. I am inclined to attribute a good deal to good luck and noneconomic forces. An enormous intellectual task still awaits the economist. We are a very long way from writing finis to this chapter of the human enterprise. We still cannot handle some of the most elementary problems regarding economic development, economic dynamics, and the function of the price system.”
Knowledge about the system changes the system, which means there are both internal and external influences. This indicates stages of dependence and independence between price movements. Equilibrium is essential for any system. Galton’s 1886 work illustrating a natural system which moves towards equilibrium was an observation that laid the foundation for linear regression. Fama’s work also acknowledges the dynamic nature of stock markets but assumes equilibrium to be an intrinsic value, summarizing all information and activity around it. Just like CAPM, though not redundant, EMH is one of the many idealized states which stock market systems witness. Markets are dynamic systems with both internal and external informational influences. These influences drive the constant and consistent reversion and diversion process. It is in its many transient states that information goes through cycles of relevance and irrelevance, making markets efficient and inefficient. The assumption of independence between subsequent price movements is not incorrect but incomplete. It is time to accept that random and non-random behavior coexist in markets, the very reason for a new framework.
Reversion and Diversion
Behavioral finance agrees that anomalies cannot be identified and exploited on a persistent basis; the behavioral model accepts its temporal limitations. The paper ‘Arbitraging Anomalies’, Pal (2015), explained the circular argument around anomalies and how a behavioral explanation of the key anomalies is an extension of the reversion-diversion process of stock market systems. The paper explains five anomalies, 1) the equity premium puzzle, 2) predictability, 3) dividends, 4) volatility, and 5) the volume myth, as reversion failures.
The paper ‘Mean Reversion Framework’, Pal (2015), re-explained the original work by Galton on mean reversion in 1886: how it emphasized the relative before the absolute, discussed the relation of the variable with the sample average, pointed out the balance between convergence and divergence, and showcased the cross-domain expression of mean reversion. Though mean reversion as an idea has been in the open domain for 130 years, there has been no attempt to extend the Galtonian definition of natural systems into a framework that could allow for a better understanding of the functioning of natural systems and also explain the failures of reversion. Any proxy that expresses Galtonian reversion should be simple, relative, and universal. The stock market case can be redefined as a framework that builds on the Galtonian explanation of a natural system and incorporates the ideas of relative ranking, relative average, balancing forces of convergence and divergence, and the universal workability of the framework across domains.
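The emphasis on relative ranking and reversion toward a group average can be illustrated numerically. In the hedged sketch below, the persistence parameter `rho` and the group size are arbitrary assumptions, not parameters from the ‘Mean Reversion Framework’; the point is only that when relative positions persist imperfectly, today’s top-ranked components tend to fall back toward the group average while the bottom-ranked ones rise:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5000           # size of the cross-section (assumed)
rho = 0.5          # persistence of relative position (assumed, < 1)

# Relative positions of n components around their group average (mean 0).
x_now = rng.standard_normal(n)
# Next period: positions partially persist, plus fresh dispersion that
# keeps the cross-sectional spread constant.
x_next = rho * x_now + np.sqrt(1 - rho**2) * rng.standard_normal(n)

order = np.argsort(x_now)
bottom, top = order[: n // 10], order[-n // 10:]

top_change = float(np.mean(x_next[top] - x_now[top]))          # extremes revert down
bottom_change = float(np.mean(x_next[bottom] - x_now[bottom])) # and up
```

The top decile drifts down and the bottom decile drifts up on average, which is the relative, ranking-based reversion the framework builds on, even though no individual path is predictable.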
The paper ‘Momentum and Reversion’, Pal (2015), explains how momentum and reversion have always been seen as independent of each other and never as a composite. The two behaviors are not only connected but also get transformed into each other. These dynamics drive not only stock market systems but all natural systems. One reason researchers did not see this composite behavior is the focus on independent components (asset prices) rather than a group of components (a collection of stock prices), and the lack of an adequate framework to illustrate the two key behaviors together. The ‘Mean Reversion Framework’ explains how natural systems witness reversion and divergence simultaneously across different periods of time.
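That momentum and reversion can coexist in one series, rather than being independent behaviors, is easy to demonstrate numerically. The sketch below is a stylized toy (the cycle length and noise level are assumptions): a slow cycle plus noise yields a single return series with positive autocorrelation at short lags (momentum) and negative autocorrelation at half the cycle (reversion):

```python
import numpy as np

def autocorr(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

rng = np.random.default_rng(1)
t = np.arange(2000)

# One series, two behaviors: a slow 100-step cycle plus small noise.
price = np.sin(2 * np.pi * t / 100) + 0.01 * rng.standard_normal(t.size)
returns = np.diff(price)

short = autocorr(returns, 1)    # positive: momentum at short horizons
long_ = autocorr(returns, 50)   # negative: reversion at half the cycle
```

Measured at one horizon the series looks trending; measured at another it looks mean-reverting. The composite behavior only becomes visible when both horizons are examined together, which is the framework’s point about studying the two behaviors as one process.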
Galton’s work talks about relativeness, whether in the form of comparisons, ratios, deviations, proportions, degrees, scale, or extremes. He gives relativeness more importance than absolute values. Even deviates (deviations) are referred to in the context of comparisons between variables. The focus is not on the absolute values of the mean but on how they shift relatively towards the average (mediocrity). The further the natural data moved from the mean, the stronger its tendency to revert. The rule was simple: a positive extreme was prone to revert down, while a negative extreme was prone to revert up. Convergence was more of a focus than divergence. Galton classified his system as organic. There was succession, continuity, and periodicity. There was context, proportionality, and extremes in the data set studied. There was dynamism between the data sets: increasing and decreasing change, accelerating and decelerating change. There was a mean relative to the data. Despite the difference and change, the pattern persisted and balanced out, reaching a state of equilibrium, of compactness. There was a context of top and bottom. There were opposing actions: dispersive and converging forces, opposing tendencies, a spring-like action. There was a process of generational transformation, a process of replacement by which the data set freshened itself up, a sequence of stages. Despite the order, there was a sense of randomness; less scattered data led to more scattered data and vice versa. Galton’s reversion is connected to the dispersion in the system. The two forces act together to keep the dynamism going. His focus is primarily on divergence from the mean rather than top or bottom rankings. Galton considered reversion extremely regular, leading to a state of constancy. Regularity meant that reversion and dispersion followed a cyclical process; he talked about an alternation between the two.
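Galton’s balance of converging and dispersive forces can be sketched in his own hereditary-stature setting. In this illustrative toy the population mean, the spread, and Galton’s two-thirds reversion ratio are assumed inputs: children of extreme parents regress toward mediocrity, yet fresh dispersion keeps the generation’s overall spread from collapsing.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000
mean_h = 68.0      # population mean height in inches (assumed, Galton-style)
sd_h = 2.0         # population spread (assumed)
r = 2.0 / 3.0      # Galton's reversion ratio toward mediocrity

parents = mean_h + sd_h * rng.standard_normal(n)
# Converging force: children move r of the way back toward the mean.
# Dispersive force: fresh variation sized so total spread stays constant.
children = mean_h + r * (parents - mean_h) \
           + sd_h * np.sqrt(1 - r**2) * rng.standard_normal(n)

slope = float(np.polyfit(parents, children, 1)[0])        # regression slope < 1
spread_ratio = float(np.std(children) / np.std(parents))  # ~ 1: spread persists
```

The regression slope comes out near two-thirds, so extremes revert, while the spread ratio stays near one: reversion and dispersion balance, which is exactly the equilibrium-with-dynamism Galton described.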
Galton observed a complete cycle from one generation to the next, from dispersion to reversion.
Building on what Boulding mentioned, information leading to knowledge is integral to the ‘Framework’. It is hard to separate the system from the information it interfaces with and generates. The system is above the information and its content. The current state of the system is connected to its past state, and the component- or group-specific information it generates is a function of the current state of the group and its components.
Reversion Diversion Hypothesis
The ‘Reversion Diversion Hypothesis’ is intrinsic to natural systems, including stock market systems. The hypothesis is based on the ‘Mean Reversion Framework’, which explains how reversion and diversion cannot be seen independently. It is the reversion and diversion process that drives assets into price momentum and into price reversion. The ‘Framework’ classifies price momentum into ‘Value’ and ‘Growth’ and illustrates how ‘Value’ and ‘Growth’ could be statistically driven. The Reversion Diversion Hypothesis addresses the failing of EMH in addressing discrepancies by building a system approach that explains how the idea of intrinsic value could be extended from the current price to something relative and dynamic. EMH accepts that the market needs a dynamic mean to oscillate around but does not develop a comprehensive framework to reconcile the arguments against market efficiency. EMH fails to address information in a comprehensive way, and its insistence on market efficiency and the independence of the past from the future, despite a large body of work on market inefficiency, renders it incomplete. The ‘Framework’ was tested across time frames and across various groups for absolute and stationarity trends. Its transformation across states was orderly and significant. The Reversion Diversion Hypothesis shows that random and non-random systems, inefficient and efficient market systems, dependence and independence, and relevance and irrelevance of information can co-exist.
Regression towards Mediocrity in Hereditary Stature, Galton F. (1886)
Random Walks in Stock Market Prices, Fama E. F. (1965)
The Economics of Knowledge and the Knowledge of Economics, Boulding K. E. (1966)
An empirical evaluation of accounting income numbers, Ball and Brown (1968)
Does the Stock Market Overreact?, De Bondt, W. F. M. & Thaler R. (1985)
Post Earnings Announcement Drift, Bernard & Thomas (1990)
Forecasting stock market prices: Lessons for forecasters, Granger (1992)
A Non-Random Walk on Wall Street. Lo A. W. & MacKinlay A. C. (1999)
The Stock Market Universe – Stars, Comets, and the Sun, John Bogle, (2001)
Universality in Multi-Agent Systems, Parunak et al. (2004)
The BRIC Model from a Japanese Perspective, Pal & Nistor, (2010)
The Divergence Cyclicality, Pal & Nistor (2010)
History of the Efficient Market Hypothesis, Sewell (2011)
Mean Reversion Indicator, Pal M. (2012)
Mean Reversion Framework, Pal M. (2015a)
Markov in the Mean Reversion Framework, Pal M. (2015b)
Momentum and Reversion, Pal M. (2015c)
Is Smart Beta Dumb, Pal M. (2015d)
Arbitraging the anomalies, Pal M. (2015e)
Stock Market Stationarity, Pal & Ferent (2015)