Americans Are Sheltered and Wholly Unaware
Probability theory is not strictly a 20th century innovation, but its uses and now-ubiquitous appearance perhaps epitomize the transitions that so drastically altered pretty much everything. Christiaan Huygens, a 17th century Dutch mathematician and astronomer, published in 1657 a volume on the calculus of probabilities drawn from communications with mathematical luminaries such as Blaise Pascal and Pierre de Fermat. From there, the path of probability toward a separate and distinct branch of mathematics, let alone applied math, was winding and certainly uneven.
When Harry Markowitz arrived at the University of Chicago in the late 1940's, a number of new theories were being nurtured about how to view the economy and the financial world much differently than in the past. Small wonder, given that the catastrophe of the Great Depression was still relatively fresh in the collective memory. The ideas for quantifying economic factors were equally old, studied seemingly exhaustively in the late 19th century, but they found new potential in the combination of greater openness toward them and the technology and theoretical proficiency to express it all coherently.
Where Markowitz stood out, however, was in applying probability theory to equity behavior. When he defended his dissertation to Milton Friedman, of all people, Friedman, as Markowitz recalled in his 1990 Nobel lecture, "argued that portfolio theory was not Economics, and that they could not award me a Ph.D. degree in Economics for a dissertation which was not in Economics." The work was revolutionary enough to start a branch of the economics discipline that didn't really exist before.
There are more than a few great contributions in his work, but the most relevant to the current age, I believe, is the one he described early in that Nobel speech:
"It seemed obvious that investors are concerned with risk and return, and that these should be measured for the portfolio as a whole. Variance (or, equivalently, standard deviation), came to mind as a measure of risk of the portfolio. The fact that the variance of the portfolio, that is the variance of a weighted sum, involved all covariance terms added to the plausibility of the approach. Since there were two criteria - expected return and risk - the natural approach for an economics student was to imagine the investor selecting a point from the set of Pareto optimal expected return, variance of return combinations, now known as the efficient frontier."
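The mechanics of that passage can be sketched numerically. A minimal example, with entirely hypothetical weights, volatilities and correlation, computes the variance of a two-asset portfolio from the full covariance matrix, showing how the covariance term pulls portfolio risk below the weighted average of the individual risks:

```python
import numpy as np

# Hypothetical two-asset portfolio illustrating Markowitz's point:
# the variance of a weighted sum involves all covariance terms.
weights = np.array([0.6, 0.4])   # portfolio weights, summing to 1
vols = np.array([0.20, 0.10])    # assumed standard deviations of each asset
corr = -0.3                      # assumed correlation between the two

# Covariance matrix built from the volatilities and correlation
cov = np.array([
    [vols[0] ** 2,               corr * vols[0] * vols[1]],
    [corr * vols[0] * vols[1],   vols[1] ** 2],
])

# Portfolio variance: w' Sigma w -- every covariance term contributes
port_var = weights @ cov @ weights
port_vol = np.sqrt(port_var)

# Weighted average of the individual vols, for comparison
naive_vol = weights @ vols
print(port_vol, naive_vol)
```

With a negative correlation, the portfolio's standard deviation lands well below the naive weighted average, which is the entire diversification argument in one calculation.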
There is a lot of intuition in that intellectual framework, so much so that it has been adopted widely. Interest in this kind of quantification spread, particularly as the overarching monetary evolutions of the 1950's and especially the 1960's reduced the relevance of not just prior theories but actual operative circumstances. In banking, for example, risk management had always been concentrated far more on the liability side, owing largely to the fractional lending of money, or at least currency. One of the expansionary trends post-Markowitz was to adopt this style on the asset side, in the hope of producing enough knowledge and foresight about bank risk as to render liability runs inapplicable.
We know a great deal about the final stages of those efforts, symbolized in the Basel bank regimes, but there was much about the pre-Basel stage that remains highly relevant to the overall context. The rise of the eurodollar standard was far more than just the final and excruciating demise of gold; it introduced an entire array of variables to which the concept of risk would have to apply. There were floating currencies and the parallel rise of derivatives markets with which to try to manage it all. In 1975, the SEC issued the Uniform Net Capital Rule (UNCR), which sought to crystallize risk management through standardized quantification, placing bank assets into twelve classes.
While that represented somewhat of a regulatory departure, the true transcendence was that capital treatment of those twelve classes would differ, opening the door to capital arbitrage expressed in terms of risk management.
From that process, banks and financial firms came to build on essentially Markowitz foundations, using probability statistics to quantify risk. Among the results was Value-at-Risk (VaR), a broader take on the task that viewed individual securities not in isolation but as contributors to potential loss scenarios, unifying them into a combined whole. The idea of VaR is simple and straightforward: to calculate, across the entire asset structure, how much a bank or financial firm might effectively lose in a specific timeframe given a set of specific parameters.
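That "simple and straightforward" calculation can itself be sketched in a few lines. This is a historical-simulation flavor of VaR run on simulated daily P&L; the figures are illustrative stand-ins, not any firm's actual method:

```python
import numpy as np

rng = np.random.default_rng(42)
# Simulated daily portfolio P&L in dollars, a stand-in for a real trading history
pnl = rng.normal(loc=0.0, scale=1_000_000, size=2_500)  # roughly ten years of days

# One-day 99% VaR: the loss threshold exceeded on only about 1% of days.
# Historical simulation: take the 1st percentile of the P&L distribution.
var_99 = -np.percentile(pnl, 1)
print(f"1-day 99% VaR: ${var_99:,.0f}")
```

The single output number is precisely the appeal Hendricks would later describe: the market risk of the entire portfolio conveyed in one dollar figure.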
The growth and adoption of VaR in the 1980's was more limited, but by the 1990's, as the shadow system sprang up and took over out of the ashes of the S&L crisis, VaR became common across every major firm in some form or another. A big break came in 1995, coincident with the rise in speculative eurodollars (i.e., the birth of the serial bubbles), when JP Morgan for the first time allowed total public access to its extensive (and quite impressive) database of variances and covariances for a far-reaching and meticulous set of securities and asset classes. Morgan called it RiskMetrics, allowing software to be developed and marketed on that basis.
By April 1996, even the Federal Reserve had noticed the small, if mostly inside, fury. In a paper titled Evaluation of Value-at-Risk Models Using Historical Data, published by FRBNY's Economic Policy Review, author Darryll Hendricks described the potential which was driving such interest in the math:
"Value-at-risk models aggregate the several components of price risk into a single quantitative measure of the potential for losses over a specified time horizon. These models are clearly appealing because they convey the market risk of the entire portfolio in one number. Moreover, value-at-risk measures focus directly, and in dollar terms, on a major reason for assessing risk in the first place - a loss of portfolio value."
That had an effect not just on bank operations but on regulatory interest. Indeed, the Basel rules adopted at that time were a melding of these kinds of mathematical concepts with the "bucket" approach first introduced by the UNCR. Under the Basel II framework, a bank's capital "charge" could be very heavily influenced by VaR provided it met a minimum set of standards: data sets needed to be updated every three months; VaR was calculated daily; a 99th percentile, one-tailed confidence interval applied; a 1-year minimum of historical observations was required; and 10-day movements in prices were used as the form of "instant price shock."
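Those parameters combine into a simple parametric calculation. A sketch, assuming a normal distribution and the common square-root-of-time scaling convention, with all portfolio figures hypothetical:

```python
import math

z_99 = 2.326                  # one-tailed 99th percentile of a standard normal
daily_vol = 0.012             # daily portfolio volatility, estimated from >= 1 year of data
portfolio_value = 50_000_000  # hypothetical portfolio size in dollars

# Daily 99% VaR, then scaled to the 10-day "instant price shock" horizon
var_1day = z_99 * daily_vol * portfolio_value
var_10day = var_1day * math.sqrt(10)
print(round(var_1day), round(var_10day))
```

The 10-day figure is simply the daily figure stretched by the square root of ten, which is why so much of the regulatory outcome hinges on that one daily volatility estimate.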
The idea was for the firm to back-test its VaR calculations against historical observations to ensure that the model performed close to its designed parameters. Regulators would be satisfied that banks were only as risky as they claimed, given consistency in the back-test, while firms could garner great capital "relief" through the re-arranging of risk-weighted assets.
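The back-test itself amounts to counting exceptions. A minimal sketch, with simulated returns and a hypothetical split between estimation and testing windows: fit VaR on one window, then count the days in the next whose losses breached it; a well-calibrated 99% model should be breached on roughly 1% of days:

```python
import numpy as np

rng = np.random.default_rng(7)
returns = rng.normal(0.0, 0.01, size=1_000)  # simulated daily returns

# Estimate 99% VaR on the first half of the history...
var_99 = -np.percentile(returns[:500], 1)

# ...then back-test on the second half: how often did losses exceed VaR?
exceedances = int(np.sum(-returns[500:] > var_99))
expected = 0.01 * 500  # about 5 breaches if the model is well calibrated
print(exceedances, expected)
```

When the breach count stays near the expected handful, the test "confirms" the model - which, as discussed below, is exactly how complacency set in.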
The fact that these tests were almost uniformly confirming, of course, led to complacency and then recency bias. Mr. Hendricks, in that 1996 paper for the New York Fed, largely agreed that VaR was as advertised, but issued what amounted to a quite prescient warning:
"Virtually all of the approaches produce accurate 95th percentile risk measures. The 99th percentile risk measures, however, are somewhat less reliable and generally cover only between 98.2 percent and 98.5 percent of the outcomes. On the one hand, these deficiencies are small when considered on the basis of the percentage of outcomes misclassified. On the other hand, the risk measures would generally need to be increased across the board by 10 percent or more to cover precisely 99 percent of the outcomes...The outcomes that are not covered are typically 30 to 40 percent larger than the risk measures and are also larger than predicted by the normal distribution. In some cases, daily losses over the twelve-year sample period are several times larger than the corresponding value-at-risk measures. These examples make it clear that value-at-risk measures - even at the 99th percentile - do not "bound" possible losses."
That was a lesson learned the hard way beginning in August 2007 for the entire eurodollar system. What changed as a result of the adoption of VaR as more than a regulatory fashion was the very nature of banking and money. By tying quantification of risk to capital, the regulatory structure made that quantification money-like. For example, if a bank bought some specific security, it bore a specific capital charge which would affect the ability of the bank to further expand and carry out operations, particularly if that security were "risky," defined here under Markowitz reasoning as exhibiting great variance. However, if the same security were then paired with an offsetting security, some kind of hedge that reduced the calculated fluctuation potential given the VaR horizon and specifics, that also reduced the capital charge associated with the trade and thus effectively expanded the leverage of the balance sheet. Hedging could, in theory, in math, and certainly in practice, act as the money printing press.
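That balance sheet arithmetic can be made concrete. A sketch with entirely hypothetical numbers: if the capital charge is taken as proportional to VaR (a simplification of the Basel treatment), a hedge that strips out a fraction of the variance lets the same capital support proportionally more notional:

```python
import math

position_vol = 0.02      # assumed daily volatility of the unhedged position
hedge_effect = 0.6       # assumed fraction of variance removed by the hedge
z_99 = 2.326             # one-tailed 99th percentile of a standard normal
notional = 100_000_000   # hypothetical $100 million position

var_unhedged = z_99 * position_vol * notional
var_hedged = z_99 * position_vol * math.sqrt(1 - hedge_effect) * notional

# With the capital charge proportional to VaR, the notional supportable by a
# fixed amount of capital scales inversely with measured VaR.
leverage_multiplier = var_unhedged / var_hedged
print(round(leverage_multiplier, 2))
```

Under these assumptions the hedge alone permits roughly 1.58x the balance sheet per dollar of capital, which is the "printing press" mechanism in miniature.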
By the mid-2000's there were far more important aspects to this kind of shadow banking behavior than actual currency or whatever the Fed could provide directly. For its part, the Fed's activities during the bubbles may seem remote, even distanced enough to allow some (seemingly) plausible deniability in connecting the asset bubbles to monetary policy. After all, in 1996, when Hendricks published his positive review of VaR, bank reserves on account at the Fed totaled between $18 and $19 billion, a relatively paltry sum given the almost $500 billion in daily eurodollar turnover. By the time the dot-com bubble peaked in April 2000, there were just $5 to $8 billion in "reserves"; by August 9, 2007, about the same.
Monetary policy was not "printing money" as is commonly believed, even today; rather, the banking system was, through VaR direction of capital leverage and the associated fractioning of sophisticated bank liabilities. While total bank reserves became irrelevant (and actually remain so), the "global dollar short" exploded as a combined expression of banking hubris about risk management (recency bias) and the interplay of Greenspan-type faith in smoothing out large historical variations in asset prices. That latter part, also known as the Greenspan "put", had very real effects upon just these kinds of mathematical calculations, even if only through what amounted to a self-fulfilling prophecy: financial agents believed the Fed could, if pressed, print liquidity in some form, which depressed mathematical estimates of expected variations in asset prices, reducing VaR numbers and thus allowing for far higher levels of shadow/capital leverage.
By August 2007, while bank reserves were just $8 billion or so, the global dollar short had exploded to $2 trillion, $5 trillion, perhaps as much as $10 trillion! That all came from new forms of money-like behavior and the liquid ability to trade risk parameters.
That meant the disruptions in 2007 were far harder for traditional monetary mechanics to reach and alleviate. Again, in terms of the same example from above, a security and a paired hedge position that suddenly deviates from expected variance requires additional hedging components to bring it back in line; absent that ability, the bank suffers a greater capital charge, which has the practical effect of reducing leverage available in other parts of the balance sheet (vega and so on). That may be several steps detached from what is commonly thought of as a bank run, but it is the very same pattern carried out across multiple dimensions - especially when additional hedging comes at a far higher price, or, in the worst cases of a run, is completely unavailable (as it was in the worst parts of the crisis).
I have made the argument that the Fed would potentially have been far more effective in 2008 underwriting the monolines' combined CDS portfolios, writing new protection swaps itself, than with any of its traditional liquidity measures that inevitably amounted to total failure. In very broad terms, what happened in 2007 and 2008 was that dealer capacity for issuing risk transformations of this kind, consistent with VaR, simply vanished. A lot of that was self-reinforcing due to the incestuous nature of interbank dealing to begin with; if you have VaR problems as whole asset classes violate deviation parameters and safety margins, then so does the risk counterparty you are counting on to write protection to calm your own deviations (and often that counterparty was expecting you to take on risks from his balance sheet at the very same time!).
That was the issue, ultimately, of AIG and why that firm among all the others was singled out (both in terms of devastating collateral calls and as the focus of the intense and often misunderstood bailout) in the days after Lehman. The Fed could no longer be lender of last resort because there was nobody left to lend to; instead it was forced to belatedly absorb tremendous risk capacity onto its own balance sheet, often illegally, and then start to recycle risk chains through bureaucratic and very inefficient means (ultimately just the QE's).
The dealers have remained wary of money dealing more broadly since then, meaning that the Fed has for the most part been stuck in the middle of the financial situation ever since. However, VaR has not ended, nor has the banking system moved to some other form of money-like features with which to conduct broad and efficient global operation of the eurodollar standard. Instead, the financial system meekly expresses quite finite liquidity capacity because of this arrangement, with small and seemingly insignificant disruptions becoming global events (October 15, 2014, being the most prominent example).
Since October 2014, there appears to be another looming withdrawal of dealer capacity and willingness to perform the necessary money-like functions of risk transformation that are required for the wholesale, eurodollar system to maintain itself. There was a sudden and violent departure of risk capacity, and thus liquidity, on December 1, which led directly to the crashes of the ruble, the real and the Swiss franc in the first weeks of that month.
The Swiss National Bank's actions on January 15 this year are still conventionally framed exclusively in terms of the euro, when in reality the Swiss problem was always the "dollar." Despite heavy losses and near-demise in 2008, the Swiss banking system remains heavily allocated to both dollar assets and liabilities, but by tying the franc to the euro in 2011 the SNB left the banking system exposed to large deviations, including in calculated risk measures, of the franc against the dollar through the euro. While from the outside that might seem a net neutral proposition (running a matched book of dollar assets balanced against dollar liabilities), it was actually far more problematic due to how liabilities and assets actually function (under VaR, rollover risk on the liability side cannot be balanced by dollar assets proportionally becoming "more valuable" due to a rising dollar). The downside is exponentially greater through the derivative structure on the liability side.
That meant the SNB "had" to act on January 15 to sever the franc from the euro and gain more relevant perspective on the "dollar" - accepting all that it cost, financially and economically, because of the existential dangers of not doing so. The rising "dollar" tracing back to last June is not truly an exchange valuation but rather a very strict "run" on internal financing mechanics traceable back to the VaR structure of money-like foundations for wholesale banking.
Indeed, data accumulated since that time points exactly in that direction, though for reasons that may be somewhat surprising. The Treasury Department's TIC figures, which show changes in custodied and reported US Treasuries and dollar-denominated assets more broadly, stand in as a quite useful proxy for these eurodollar facets. For example, a reduction in "official" (meaning foreign central bank or government account) holdings of US Treasuries is not selling in the strictest sense, but rather a mobilization of "reserves" with which to alleviate a local "dollar" financing problem.
The latest TIC data, through April 2015, confirms that central banks have not just mobilized massive reserves, they have done so at an unprecedented rate consistently for the past six months, tracing back to the aftermath of October 15. Starting in December, "official" sources have sold (mobilized) a net $77 billion in dollar-denominated assets; there is simply nothing on that scale in the historical accounts, not even in the worst parts of 2008.
The reason for that huge central bank presence is the retreat of bank balance sheets. The TIC series also calculates a measure of banks reporting their own "dollar" liabilities. In December, bank liabilities in "dollars" dropped by the largest amount on record: -$241 billion. That followed an unusual drop of $11 billion in November (the middle month of each quarter, since the 2013 taper "tightening", typically sees an "inflow"). Further, balance sheet expansion in the leading months has been severely reduced, as April's "inflow" was just $70 billion compared to $126 billion in January and $173 billion in October. In short, as expected from the behavior of both the "dollar" and foreign central banks, bank balance sheets are contracting.
What we don't know is what form those balance sheet contractions are taking, as there are a multitude of ways in which reduced liabilities can be amplified in terms of systemic leverage or leverage reduction. Given, however, the desperate position of the SNB by January 15 (not to mention the Russians, the Brazilians and the Chinese), we can reasonably assume that "dollar" leverage in all the math, in all the dimensions, is retreating. Even the federal funds rate, which had led a placid and mostly irrelevant existence under the four QE's, woke up enough on December 1 to jump and remain a few basis points higher (as with LIBOR, interbank rates in eurodollars have been rising since December 1).
There may be a tendency to ascribe all of this recent balance sheet detachment to expectations about the Fed winding down QE and then ending ZIRP; a natural impulse given the misconceptions more generally about what constitutes liquidity and even a "dollar." There may be some of that being incorporated into greater volatility calculations, and thus VaR, if only upon the contours right now, but I believe this is much deeper and more serious (permanent) than just a policy shift.
Just over a week ago, HSBC announced its intent to cut down its operations by an enormous proportion, proposing asset sales and employee reductions of about one-fourth. Plans call for 50,000 jobs to be eliminated (either through attrition or as part of those system sales) along with nearly $290 billion in risk-weighted assets. The bank, essentially, is admitting its operations are too big and inefficient.
That move follows several shakeups all through the banking sector globally, particularly among the European behemoths that formed the bedrock of the eurodollar system as it was built in the 1990's and 2000's. Deutsche Bank removed both its co-chairs under near-shareholder revolt, as the bank's restructuring is not showing enough progress; the bank has now been described as "unwieldy" as well as unprofitable. Credit Suisse sacked its long-time CEO, Brady Dougan, one of the few who actually survived 2008. The reason: he resisted significant downsizing of the bank's global operations.
These were firms that prior to 2007 were all about size and scope, expansion anywhere and everywhere; and the "markets" cheered every acquisition and balance sheet development. The eurodollar standard itself was built upon that trend, as what made size so favorable also made the asset bubbles, with VaR and vega at the heart of it all, balancing leverage as money and capital. Now these same banks are "unwieldy" and in desperate need of shearing?
It seems quite clear that the global banking system is again finding itself short of capacity as the dealers, what is left of them, scramble for the exits. What started in August 2007 was an irreparable separation between what was (dealer-based, multiple dimensions), what is (central bank-based, reduced dimensions) and what will be (nobody yet knows, which is precisely the problem). Unfortunately, rather than hang on for an orderly handoff between the second and third steps or transitions, banks are again, in echoes of 2007, taking their part and leaving without much to fill in the wholesale gaps. This is not to say that we are on the road to repeating 2008, only that the conditions in liquidity, half of what took us down then, may already be as bad. If there were some less benign ignition now, a spark of selling like the one that was the other half of the panic run, what support would there be for orderly pricing?
The answer to that, given already by October 15, December 1 and January 15, isn't very encouraging. To a great extent, Americans are both sheltered and wholly unaware, but the rest of the world is very much alerted to the continued downside of the eurodollar standard. Stocks may be at or near record highs (though broader stock indices, such as the NYSE composite, have gone nowhere since the "dollar" started to rise), but Brazil is in a state of total economic and financial chaos while China flirts with what was never thought possible (growth at Great Recession levels, a massive housing imbalance and now a stock bubble that in some ways puts the dot-coms to shame). There was a "dollar" system somewhat in place, largely before the middle of 2013, which supported all those but no longer does.
The encouragement of quantification was an enormous transformation of money into more than just derivatives and hedging. In so many ways, we are still trying to figure it all out, which is why there is grave concern about the state of global "dollar" liquidity in 2015 even if Janet Yellen refused to acknowledge it publicly. What is clearer is that the eurodollar transformation continues toward an unknown settled state somewhere at some point, and that we will all have to, ironically, take the risks that accompany that uneven conversion even if they don't readily show up in variance calculations.