A Distinct Non-Randomness to Fully Random Errors


Bell and Howell was a creation of the era when the motion picture business resided in Chicago. Donald J. Bell was a projectionist in movie theaters around the Chicago area who came to know quite well the business of showing movies, rather than the more glamorous side of creating them, and who saw opportunity in the clunky and nonstandard nature of the equipment. He applied for his first patent in 1906 to improve "framing" on the dominant 35mm Kinodrome. While tinkering on his own, he met his future partner, Albert S. Howell, then working at the Crary Machine Works in Chicago, the shop where almost all such equipment went to be machined and serviced. In February 1907, Bell and Howell was born, ostensibly to repair movie equipment.

By the time of the Great Depression, the company was forced into many changes, including some rather innovative techniques and products. These included a zoom lens called the "Varo" and a spool camera that eventually brought the company into contact with the military. During WWII, it would design and manufacture gun cameras and the reflector sights used on B-29's. Post-war, the company worked through hard times but never lost contact with the military, especially the Air Force (formerly the Army Air Corps). New management led by Charles Percy, who was only 28 when he took over as President, expanded products and services further, both within and beyond motion pictures, and even earned an Oscar in 1951 for "technical achievement."

So it was no wonder that Mr. Percy and Peter G. Peterson, the firm's executive vice president, were visiting NORAD on October 5, 1960. They were joined by, among others, Thomas J. Watson, Jr., the President of the dominant computer firm IBM. At some point during their visit, the group was informed that equipment located at Thule Air Base in Greenland had flashed a Soviet launch warning, claiming 99.9% certainty that the station's radar had picked up a nuclear salvo already launched.

The base immediately declared DEFCON 1, a condition that would not become widely known until WarGames, Hollywood's 1983 dramatization of military simulation. Strategic Air Command was called and bombers were given high alert status, but no one could verify that the attack was, in fact, real. Greenland's Ballistic Missile Early Warning System was the only equipment declaring, essentially, nuclear war. As this was about two years before the Cuban Missile Crisis, and really within the infancy of electronic early warning (itself based on computers that were themselves nothing but probability constructions), confusion, alarm and near-chaos reigned, if only for a few brief minutes. I can only imagine what Percy, Peterson and Watson were feeling, having rather more intimate knowledge of the military than most.

Fortunately for humanity there was no nuclear exchange at that instant, and no indication that it went so far as to be anything more than a few heart-pounding moments confined to a small cadre of network nodes at NORAD and SAC. It was established that Thule's equipment, 99.9% certain of Soviet missiles already over the horizon, had instead picked up the rising moon.

It may seem like such a simple mistake on such an extremely important system, but the dangerous gaffe reveals something far too common to probability-based approaches. The claim of 99.9% certainty is nothing of the sort, as that mathematical number refers only to the narrow set of circumstances that the system's parameters have taken into account. The system's designers do everything in their power to ensure that, in narrowing the set of variables to be included and then ranking them by priority, little escapes the math. What's left after those exhaustive computations is believed by statistical "science" to be nothing but random errors - or, more precisely, errors thought and calculated to occur no more often than random chance would allow.

In the case of the statistical assumptions hardwired into Thule's software, the moon was a "random error." This case makes it especially easy to see how statistical computations are not "science" in terms of predictability, as they can be just as subjective as raw emotion. The designers' failure to account for the moon appearing in the detection field of view at just the wrong time was a bland oversight, but its exclusion was subjective all the same. There was nothing random about the moon.
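To make the point concrete, here is a minimal sketch of the general idea (my own toy illustration with made-up numbers, not the actual Thule logic): a two-hypothesis Bayesian detector that only knows "noise" and "missile." Any strong return, including one from a cause the designers never encoded, is forced into one of the two modeled categories, so the reported "certainty" says nothing about the unmodeled world.

```python
# A toy two-hypothesis detector (hypothetical parameters, purely illustrative).
import numpy as np
from scipy.stats import norm

# Modeled return-strength distributions
noise   = norm(loc=0.0, scale=1.0)   # background clutter
missile = norm(loc=6.0, scale=1.0)   # expected missile echo

prior_missile = 0.001                # launches assumed rare

def posterior_missile(signal_strength: float) -> float:
    """Posterior probability of 'missile' given only the two modeled hypotheses."""
    p_m = missile.pdf(signal_strength) * prior_missile
    p_n = noise.pdf(signal_strength) * (1.0 - prior_missile)
    return p_m / (p_m + p_n)

# The moon produces a strong echo the model never anticipated; it has no slot
# in the hypothesis set, so the detector can only read it as "missile".
moon_echo = 7.5
print(f"Reported certainty of launch: {posterior_missile(moon_echo):.4f}")
```

Run as written, the unmodeled echo is reported as a launch with effectively 100% confidence, which is the Thule failure mode in miniature.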

In the field of chaos theory this is known as "exploding" error terms, since the static nature of the mathematical computation assumes far too much. That is especially relevant to what it leaves out, by subjective inference as well as by computation. The Thule radar software would have been just as wrong had the Soviets developed an ICBM with some kind of stealth capability (this was, of course, not realistic, but I am digressing into allegory here) and then launched. In that case, the ballistic missile warning system would have reported a 0% chance of a launch that was 100% underway, the inverse error but just as wrong as the 1960 sequence.

By the definitions of the stochastic processes that defined the launch detection apparatus, a stealth missile (one that evaded the CIA's or DIA's notice) would have been treated as an "exogenous" factor subsumed within the framework of randomness. You can't model what you can't see or don't know about, but that doesn't make it any less dangerous to simply assume it away as random.

This is most relevant now in the fields of both quantum physics and economics. In quantum physics, at least, the overriding use of statistical models presents only a challenge to epistemology rather than to science. The math works, and there are very few instances where random errors disprove prior established theory. That is why theoretical work can proceed by leaps and bounds to new areas and discoveries without having to re-litigate past doctrine over and over. It does not matter that there is no established means, right now, of turning that math into the real world (a Higgs boson, for example, may not be "real" in the sense of "something" that exists in our view of existence; at present it is just a variable in a complex maze of equations that allows them to balance).

For economics, the matter is far different, though it is rarely treated as such. Within the discipline there currently rages a debate with established lines of academic divide. The "saltwater" economists (those from schools positioned near the oceans: Berkeley, Princeton, MIT, etc.) cannot seem to agree much with their "freshwater" colleagues (those near the Great Lakes, such as the University of Chicago, Carnegie Mellon and others), with the disagreements spilling into often acrimonious, child-like public arguments. From the public's perspective, the difference simply boils down to one set's models being slightly different from the other's.

In the face of the larger questions about economic sanctity, the "big" issues are relatively settled and agreed upon. As the eminent economist Greg Mankiw wrote almost a decade ago,

"In macroeconomics, as the older generation of protagonists has retired or neared retirement, it has been replaced by a younger generation of macroeconomists who have adopted a culture of greater civility. At the same time, a new consensus has emerged about the best way to understand economic fluctuations...Like the neoclassical-Keynesian synthesis of an earlier generation, the new synthesis attempts to merge the strengths of the competing approaches that preceded it."

The earlier "synthesis" to which he referred was the also-heated debates of the 1960's that so often pitted those predisposed to government overreach and activism against those ostensibly judging themselves more of "markets." This was the great contest to which Samuelson and Solow and their "exploitable" Phillips Curve dominated one side, and Milton Freidman and Edmund Phelps the other. Their answer to the idea that a central bank or government could and should "buy" higher employment with higher inflation was what became known as the "natural unemployment rate."

This strand of economics took its basis from Wicksell's interpretation of the "natural rate of interest," which he defined in 1898 as a "certain rate of interest on loans which is neutral in respect to commodity prices, and tends neither to raise nor to lower them." In 1966, Friedman, working simultaneously with but independently of Phelps, described his version of the idea as follows (from Guidelines, Informal Controls and the Market Place, University of Chicago Press, 1966):

"...there is what might be termed a ‘natural' level of unemployment in any society you can think of. For any given labor market structure, there is some natural level of unemployment at which real wages would have a tendency to behave in accordance with productivity...If you try, through monetary measures, to keep unemployment below this natural level, you are committed to a path of perpetual inflation at an ever-increasing rate."

As it turned out, Friedman (and Phelps) was absolutely correct in his theory, which is why it gained widespread acceptance at the direct expense of Samuelson and Solow's prescriptive view of the Phillips Curve. What subsequent study would show, stochastically of course, was that the Phillips Curve seemed to hold, but only for the short run, until economic agents "normalized" their behavior to expected conditions. In the case of exploitative monetary policy, that meant "buying" nothing but nominal changes that over the long run went missing, as "real" wages adjusted back lower no matter how much nominal "inflation" central authorities inaugurated and dedicated.
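In standard textbook notation (my own compact rendering, not a quotation from Friedman or Phelps), the expectations-augmented Phillips curve they proposed looks something like this:

```latex
% \pi_t = actual inflation, \pi_t^{e} = expected inflation,
% u_t = unemployment, u^{*} = the "natural" rate.
\begin{equation}
\pi_t = \pi_t^{e} - \beta\,(u_t - u^{*}), \qquad \beta > 0
\end{equation}
% Short run: with \pi_t^{e} fixed, policy can hold u_t below u^{*} only by
% delivering \pi_t above \pi_t^{e}. Long run: expectations catch up,
% \pi_t^{e} rises toward \pi_t, and unemployment returns to u^{*} at any
% steady inflation rate; holding it below u^{*} requires accelerating inflation.
```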

Having observed this in actual function at the beginning of the Great Inflation, economists set out to model these new theories and create their own set of policies and "laws" by which an economy might be differently commanded. Among the most influential was Robert Lucas, who took inspiration from both Friedman and Phelps, often corresponding at length with the latter. Lucas would play a central role in advancing this style of view, allowing it to predominate from the 1980's forward (and forming the basis of both saltwater and freshwater modeling).

The world in which relatively primitive econometrics operated was centered on the idea of a "general equilibrium." This was nothing new, as economists since the time of Malthus, Mill and Simon Newcomb had believed that there was a method of quantifying any and all economic function. The equations would, as the name general equilibrium implies, have to balance. The central debate revolved around how price changes were set and modified, especially owing to monetary and time variables.

What Lucas did, in his famous 1972 paper Expectations and the Neutrality of Money, was to assume general equilibrium from the very start. Departing from a regime of "adaptive expectations," Lucas asserted "rational expectations." What that meant was neutralizing the equations of price expectations so that the difference between actual and expected prices is, on average, zero. In that sense, price behavior could then be adapted into a general equilibrium format, and the whole set of Friedman/Phelps "natural unemployment rate" econometrics would balance (I am simplifying here intentionally).
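In the usual textbook shorthand (again a deliberate simplification, not the full apparatus of the 1972 paper), the assumption amounts to:

```latex
% p_t = actual price level, p_t^{e} = expected price level,
% \mathcal{I}_{t-1} = all information available when the expectation is formed.
\begin{align}
p_t^{e} &= \mathbb{E}\left[\,p_t \mid \mathcal{I}_{t-1}\,\right] \\
p_t - p_t^{e} &= \varepsilon_t, \qquad \mathbb{E}[\varepsilon_t] = 0
\end{align}
% The gap between actual and expected prices is, by construction, a mean-zero
% random error; systematic surprises are assumed away, which is what allows the
% general equilibrium system to close and be solved.
```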

The implications of such a mathematical breakthrough were striking and recognizable in the aftermath of the Great Inflation. The most direct interpretation, and the one on which modern, 21st century economics turns, is that "rational expectations" leads to models in which quantitative monetary policy prescription and evaluation can be made. This is perhaps best known in the "rules-based" paradigm of monetary alternatives, such as the Taylor rule, but it also forms the central theory even behind QE (especially the "Q").
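The Taylor rule is the clearest example of that quantitative prescription; as originally published in 1993, it sets the policy rate mechanically from just two observed gaps:

```latex
% i_t = prescribed policy rate, \pi_t = inflation over the prior year,
% y_t = output gap in percent; Taylor set both the equilibrium real rate
% r^{*} and the inflation target \pi^{*} at 2 percent.
\begin{equation}
i_t = r^{*} + \pi_t + 0.5\,(\pi_t - \pi^{*}) + 0.5\,y_t
\end{equation}
```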

The problem, as with quantum physics, is that "rational expectations" is not a real-world phenomenon and certainly not directly relatable or transferable. It sounds as if it might be consistent with our experience of economic reality, since setting the differential of actual and expected prices to zero represents something like total market efficiency. It means that "market" prices are always correct, and therefore econometric models need not concern themselves with initial equilibriums - they are always just assumed to be in that state. Inside the math, market prices are presupposed to always be market-clearing, and thus not subject to stochastic tests.

Even though the assumption of "rational expectations" appears to have no real-world counterpart, it sits at the center of all mainstream economic assumptions. Furthermore, like most economic and monetary paradigms, it is unfalsifiable. By adopting "rational expectations" at the start, any statistical tests are contained within the paradigm that all "market" prices are true and "correct." That is a dangerous proposition when real-world economic and financial parameters are supposed to flow solely from what is simply a means of finding a solution within a system of stochastic equations describing only general equilibrium.

That places these models, at the very start, at odds with any micro view of the economy, in which equilibrium is beyond even a foreign concept. This includes any "network effects" where expectations might be simply self-fulfilling, a property that, in contrast to rational expectations, has been directly observed over and over. If nothing else, the entirety of economic history is replete with deviations, often sustained, that strongly suggest any equilibrium concept is alien rather than the norm.

Under the dominant view of rational expectations, however, the "proper" general treatment of any deviation from the generalized equilibrium format is to view such error as random! The set of stochastic equations that describes the equilibrium is derived from past history through a subjective culling of relevant variables. Economists have tried to get past this obvious deficiency by incorporating "dynamic" properties that add new data as it becomes available through history (which is why the current set of models is DSGE, as in dynamic stochastic general equilibrium). However, this never solves the problem of simple evolution and innovation.
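A minimal sketch of what that means in practice (illustrative parameters, not any published model): whatever the specified structure does not capture gets funneled into a serially correlated "shock" driven by independent Gaussian draws.

```python
# Toy DSGE-style exogenous shock process: deviations from the modeled structure
# are represented as an AR(1) series with i.i.d. Gaussian innovations.
import numpy as np

rng = np.random.default_rng(0)
rho, sigma, T = 0.9, 0.01, 200       # persistence, innovation std dev, periods

z = np.zeros(T)                      # log "technology" (or any exogenous) shock
for t in range(1, T):
    z[t] = rho * z[t - 1] + sigma * rng.standard_normal()

# A structural change -- a new funding market, a new instrument -- has no slot
# of its own; estimation simply attributes its effects to larger draws of the
# innovation, i.e., to "random error".
print(f"mean {z.mean():+.4f}, std {z.std():.4f}")
```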

Like the hypothetical Soviet stealth missile, these DSGE equations never integrate actual innovation because they cannot. There is no way to model what has not been invented or even dreamed up. To simply assume that all market prices reflect an actual state of knowledge, when that knowledge applies only to outdated information, is nonsensical. That especially applies under circumstances of non-random error.

In this case I am speaking about the eurodollar market. As the persistence of the "global savings glut" idea showed expertly, economists still have not caught up with the modern wholesale system as it actually exists. Even in DSGE models, then, the eurodollar market is treated as a random error, much as the rising moon once was in a different context. The drastic and sustained shift of global banking toward off-shore funding models of speculation rather than trade has been left outside the settled thought on how the economy and finance reach and maintain "equilibrium." Thus, asset bubbles are thought to be random occurrences of some "exogenous" factor unknowable to the set of current assumptions.

This is why economists become unglued by asset bubbles: they have no way of controlling for them in models that simply assume rational expectations. The burden supposedly lies with something apart from policy because the main interpretation of these equations also includes monetary neutrality, and thus the fault is in the stars rather than the "money." In other words, by mathematical construction alone, econometrics is forced to see bubbles as "proper" market function, even though that makes no sense whatsoever, just to preserve the equality of general equilibrium theory.

Pure observation cures any such notions, and begs for a rediscovery of some theoretical foundation free from even the contours of equilibrium. Real life, as chaos theory again describes, is decidedly non-linear, the raw product of uncountable variables. It is beyond hubris to think that equilibrium is anything more than a purely mathematical concept designed to make equations workable. To then turn that around and fit the real world to the math is just backwards; the models should endeavor to describe reality rather than try to theorize reality around the models.

This would include "market" history dating almost always back to the same point in 1995. Nearly any financial factor you wish to study attains a new and higher trajectory right at that moment. The consistency of this appearance demands attention, yet rational expectations says it is random and unimportant even though regularity is itself a statistical signal of distinct non-randomness. Whether it is the PE on the S&P 500, the number of new homes built in the US, or the ratio of "net worth" in the US to domestic GDP, all surge sharply higher starting in 1995 beyond any historical bounds.

The other major occurrence of 1995 was the sudden and sharp introduction of eurodollar-based finance into US credit. This is not some theorized concept; we can directly observe its intrusion. The surge in the Swiss banks' proportion of "dollar" assets, for example, both held and interbank, dates to exactly that time, and the Swiss were just one component of the new "dollar" paradigm. In other words, for most of the prior history of eurodollars, those "off-shore" arrangements largely remained off-shore as a matter of financing actual trade, which is why DSGE models largely ignored the evolution. However, in a most non-random turn, the change in Federal Reserve mechanics and the evolution of banking after the S&L crisis left a huge hole, especially in domestic mortgage capacity, which this new funding "market" entered and eventually dominated.

Whatever "equilibrium" the Fed assumed it was attaining was instead another mathematical figment of pure, stale theory and calculation, shorn by intent of any mechanical means to describe what was really happening. That was shown conclusively in observation, not theory, by the belated attempt of Greenspan and Bernanke to suggest that the world was glutted by global savings rather than some endogenous shift in "dollar" financing that had taken place right under their very noses.

By simple intuition, there is likely something of an "observer effect" in monetary economics that functions not unlike the one in quantum physics. There, the act of observation collapses the wave function so that probabilities no longer apply and the object being observed takes on the state we experience. In economics, the idea of rational expectations and perfect market efficiency masks the effects of central bank influence upon "markets" by simply assuming at the start that everything is a market process; market prices acted upon by central bank activity no longer retain their market-clearing nature because organic markets have been changed by the very act of central bank imposition.

That is why there is a distinct non-randomness, indeed even a growing regularity, to what "they" judge as fully random errors; bubbles are indeed bubbles, but don't blame "markets" when these sets of equations amount to carte blanche for central bank activities (as we have come to know all too well, up close and personal, in the 21st century). The dedication to this single mathematical variable has been hacked at, brutalized and challenged by more tears and shattered dreams than anything since the last time we were led into a centralized model, starting just over a century ago.

Perhaps one of the great ironies of this story, to me, especially given the purported devotion to "markets" among the rational expectations adherents, is that Peter G. Peterson, who was present in 1960 for the Thule event, went on not only to head Bell and Howell but also to become, briefly, Commerce Secretary under Richard Nixon, dedicated to fiscal conservatism (even as the administration instituted wage and price controls on the rest of the country). In 1985, he co-founded Blackstone Group, the private-equity Wall Street powerhouse that has perhaps benefited like few others from continued "random" variations from expected equilibriums.

In 2006, a think tank in Washington was renamed the Peterson Institute for International Economics after Mr. Peterson made a tremendously generous donation to the foundation's endowment. Its board of directors includes former Fed Chairman Paul Volcker, former Secretaries of the Treasury Larry Summers and Paul O'Neill, Jean-Claude Trichet, the former head of the ECB, and Maurice Greenberg, the former head of AIG. There is no mathematical solution in existence that could define a general equilibrium whereby the grouping of those particular people was entirely random, a fitting observation for a persistent non sequitur.

 

Jeffrey Snider is the Chief Investment Strategist of Alhambra Investment Partners, a registered investment advisor. 
