
It really boggles the mind if you stop and think about it. During the worst economic contraction in modern history, the United States in particular saw millions upon millions thrown out of work in a relentless destruction of income potential. The ranks of the employed crashed to an unimaginable degree, with various measures of unemployment citing an ultimate peak of perhaps 25% to 30%, or even more, of all available workers.

And yet, while payrolls were being decimated, the estimated real wage skyrocketed. Yes, you read that correctly: the more millions left for the charity of soup lines before starvation, the higher the real wage rate climbed. This head-scratching result even has a fancy name: SRIRL, or short-run increasing returns to labor.

We don’t, nor should we, associate rising wages (SRIRL) with the chaotic annihilation of the equally fundamental economic function of aggregate output. Normally, during every other time period but this worst case, wages move procyclically; that is, they fall as businesses struggle in an economic downturn, and rise as the economy recovers.

The evidence from the early 1930s, however, is pretty clear and unimpeachable (even though little data-collection and tabulation infrastructure was in place at the time, there are several widely available datasets, each recounting this phenomenon). According to one of those, the real wage rate increased by an enormous, and enormously confounding, 16% during 1930 and 1931 – the same years when some measures of the unemployment rate skyrocketed from around 3% in 1929 to upwards of 20% and more entering 1932.

Real wages then stayed high for the rest of the decade despite the economy’s lack of recovery and its massive, depression-within-a-depression setback in 1937 and 1938.

How?

Economists have been debating the data, and its implications, almost from the time it happened. Some have attributed the result to technology shocks. Others, Keynesians mostly, have rather convincingly argued (as I have, from time to time) for labor hoarding: firms don’t fire everyone they otherwise might, instead holding on to some additional labor during downturns, keeping a reserve of workers so that when recovery arrives there is no delay and no added cost of training new hires to get busy.

Either way, while nominal wage rates did decline in most American industries during the Great Collapse, they fell far less than overall prices did. Thus, the real wage jumped by a sizable figure, projecting higher relative labor costs onto firms in the aggregate; therefore, SRIRL.
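The arithmetic is straightforward: the real wage is just the nominal wage deflated by the price level. A minimal sketch, using made-up index numbers chosen only to land in the neighborhood of the cited 16% figure (they are not the historical series):

```python
# Illustrative only: hypothetical index numbers showing how the real wage
# can rise even as nominal wages fall, so long as prices fall faster.

nominal_wage_1929 = 1.00   # nominal wage index, 1929 = 1.00 (hypothetical)
nominal_wage_1931 = 0.90   # nominal wages cut ~10% (hypothetical)

price_level_1929 = 1.00    # price index, 1929 = 1.00 (hypothetical)
price_level_1931 = 0.78    # prices down ~22% (hypothetical)

# Real wage = nominal wage / price level
real_wage_1929 = nominal_wage_1929 / price_level_1929
real_wage_1931 = nominal_wage_1931 / price_level_1931

pct_change = (real_wage_1931 / real_wage_1929 - 1) * 100
print(f"Real wage change: {pct_change:+.1f}%")  # prints roughly +15.4%
```

Workers who kept their jobs through a 10% pay cut still ended up earning more in real terms, which is precisely the burden firms faced on the other side of the ledger.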

Originally, Robert Lucas and Leonard Rapping challenged the prevailing view of their time, around 1969, by trying to match these observations with others in the financial channels – interest rates, primarily. They ended up making their variable for labor supply a positive function of the real wage and the real interest rate, but a negative function of the expected future real wage.
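In reduced form, that relationship can be sketched as below; the notation is my own illustrative shorthand for the signs just described, not Lucas and Rapping’s actual econometric specification:

```latex
% Illustrative shorthand only; the signs follow the description above,
% not Lucas and Rapping's exact equations.
\[
N^{s}_{t} = f\big( \underbrace{w_t}_{(+)},\; \underbrace{r_t}_{(+)},\; \underbrace{E_t[w_{t+1}]}_{(-)} \big)
\]
```

Here N^s_t stands for labor supplied, w_t for the current real wage, r_t for the real interest rate, and E_t[w_{t+1}] for the expected future real wage; the intertemporal logic is that workers withhold labor today whenever they expect tomorrow’s real wage to be meaningfully higher.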

By these econometric equations, the suddenly unemployed were perfectly happy to remain that way if they anticipated wages would be much higher down the road. This would explain not only the puzzle of SRIRL but also another of the Great Depression’s monstrous features: long-run unemployment.

Alas, it makes mathematical sense but no logical sense. Were unemployed workers, precariously dependent upon both charity and the rise of New Deal government doles, content to just sit home and wait for the maximum real wage rate to show itself years, even a decade, later? Obviously not.

Many have argued that it must have been just that kind of government intervention which explains the vast disparity: what seemingly amounts to an intractable labor shortage, given the gap between persistently elevated levels of unemployment and these rising, obstinately high real wage rates. From that perspective, it surely seemed as if businesses might have been struggling to recruit labor from off the sidelines where it had been parked during the contraction.

Several modern studies disfavor the hypothesis. Among the first (and most cited), John Joseph Wallis and Daniel Benjamin (Public Relief and Private Employment in the Great Depression, 1981) found “results strikingly at odds with the notion that federal relief programs produced lower employment in the private sector in the 1930’s. It would appear that relief served only to redistribute wealth toward persons who otherwise would have been unemployed and without a source of market income.” [emphasis added]

In fact, the New Deal itself was designed just for this purpose. It was never “stimulus”, certainly not in any modern sense; rather, the ultimate goal was to do something with this oceanic reservoir of unemployed labor. Keynesian in nature, its programs were meant to generate government-based incomes for government-based work as a substitute for the private work that had catastrophically disappeared.

The data we have today shows that, again, this was the case; the economy, contrary to much of current popular belief, did not actually recover (though that can scarcely be solely blamed on any failure of the New Deal). Total private hours worked fell sharply during the contraction, but then fell even more following FDR’s imposition of what was really a mishmash of various experimental programs (together known as the New Deal).

All this together suggests that whatever the government did and tried, the private economy failed to reignite at any time along the way. And then partway through, the added devastation of the 1937-38 re-depression.

This, then, may have introduced another sort of friction in the labor market akin to a liquidity preference; that is, perhaps laborers who had lost their jobs, been unable to find new work, and therefore landed on the government’s relief payroll simply stuck it out on relief, fearing the gross and palpable uncertainty prevalent in the private economy.

As one such unfortunate soul said in a 1940 Census Survey, “Why do we want to hold onto these [relief] jobs?...we know all the time about persons just managing to scrape along…My advice, Buddy, is better not take too much of a chance. Know a good thing when you got it.”

Hysteresis.

Workers may have demanded a weird sort of uncertainty wage premium (again, akin to a liquidity premium) on top of what they might otherwise have gotten from private employment, knowing how precarious private employment had become. Prospective employers, then, would have had to scrape together enough cash on hand or credit to pay up to these demands, given that they’d be hiring ahead of any serious recovery in revenues.

Because of that, their own level of confidence in that recovery would have had to reach an even higher threshold before laying out such a huge commitment – even if they had already participated in hoarding workers ahead of time. In fact, the more labor that may have been hoarded, the more confidence would have been needed before meeting the market-clearing wage, now including this uncertainty premium.

But if recovery was a very real prospect, as it may have seemed at times, especially in 1934-35 before the 1937 setback, what else might have been holding them back from seriously reaching into this huge reservoir of labor market slack?

Indeed, we can turn to someone like Ben Bernanke for a big clue. Writing in 1983 (Non-Monetary Effects of the Financial Crisis in the Propagation of the Great Depression), the future Federal Reserve Chairman (the same one of later infamy, having promised Mr. Friedman in 2002 that the Federal Reserve would “never do it again”, meaning cause another serious depression, before allowing that very thing to happen scarcely five years later) built upon the work of Milton Friedman and Anna Schwartz to identify one key channel by which such depressionary factors could propagate and hold across such an unusually large expanse of time.

Intermediation breaks down.

“The basic premise is that, because markets for financial claims are incomplete, intermediation between some classes of borrowers and lenders requires nontrivial market-making and information-gathering services. The disruptions of 1930-33…reduced the effectiveness of the financial sector as a whole in performing these services.”

During the decade of the Great Depression, those services never returned, no matter what the Federal Reserve or federal government came up with. The entire banking sector had shifted into survival mode, whereby the very thought of extending risky credit in the form of illiquid loans (or the like) was entirely eschewed in favor of holding only the safest and most liquid forms of assets, including debt (the basis for the interest rate fallacy).

As banks indulged this very clear, very destructive liquidity preference, the vital intermediation function of the whole economy broke down. Thus, going further than Friedman and Schwartz: money destruction created a perverse set of incentives that also destroyed intermediation.

Deflationary money becomes the foundation for impaired intermediation, which disfavors if not excludes illiquid credits and creditors. Those borrowers, at their weakest in the downturn, are then unable to muster the monetary and financial resources to pay the market-clearing wage for labor that has grown reasonably more expensive for decidedly non-inflationary reasons.

Everything I’ve written above should sound at least vaguely familiar to you as we look back on 2021 ahead of 2022’s imminent arrival, with our current Federal Reserve (along with its compatriots around the world) turning increasingly “hawkish” for stated reasons of inflationary wage pressures to go along with purported real economy expectations.

Never once has anyone at the Fed (or in the mainstream) attempted to reckon with the massive contraction of 2020, which went along with GFC2; both took place within an era of no recovery from the first GFC of 2007-09, a prolonged stretch that had already exhibited these same symptoms of deficiency, especially this breakdown in intermediation, long before COVID.

We’re left instead to sort out these seemingly contradictory consequences from a web of misunderstanding: banks currently do not want to lend, but they’ll buy up high-grade corporates and government debt by the boatload, any amount at practically any price, so long as it is safe and liquid.

Google or Apple can basically name the price for issuing their own debt, as if they needed any more, and “markets” will snap up every last dime at lower and lower yields (the other part of the interest rate fallacy). Yet, for every dollar raised by big-name corporates via bond or otherwise, what we never see is credit reaching anyone else – which is just what they never saw in the thirties.

Rather than complain about the lack of bond vigilantism as government debts quite naturally skyrocket as an unintended corollary, realize what this means: intermediation has again stopped where only the highest perceived-quality obligors are “eligible” for borrowed money and credit, leaving vast swaths of the real economic landscape hidden in their incapacity.

As the labor market attempts to recover from a second major disruption, here again we hear constantly about a “labor shortage”, even some anecdotes about rising wages. However, we know without fail that, as in the thirties, wages haven’t actually gone up nearly enough to clear the market of its “shortage.”

On the contrary, like 2018, it is increasingly clear businesses are unwilling or, far more likely and also like the thirties, unable to pay the clearing wage/income to account for all these things, including what may be another rise in labor’s uncertainty premium (one which began with overly “generous” unemployment benefits).

Some have finally come to terms with how the Federal Reserve isn’t really a central bank, how the Fed’s bank reserves aren’t really useful money, and thus how inflation due from QE would only be possible through some nebulous sentiment channel. And that’s such a low probability as to not be worth considering (unless you like stocks).

But – they say – the federal government’s upped ante in 2020 and earlier in 2021 was the game-changer for inflation; therefore, even if QE isn’t money printing (and it isn’t), Uncle Sam’s efforts alone will unleash the long-dormant inflation genie.

Drawing upon actual historical and relevant precedent, no, the federal government isn’t the deciding factor so many have made it out to be. The New Deal led to no such consequence (in addition, see: Japan), producing instead all these other deflation-consistent outcomes top to bottom, financial economy to real economy.

The federal government – at best – redistributes what’s available, as it once did in the New Deal’s vast programming. Another way of saying this: the feds step in and attempt their own form of intermediation that, historically speaking, just isn’t up to the task; government is nowhere near a substitute for the banking system in this crucial respect.

And if “what’s available” to be redistributed is simply impaired and not gaining, or not gaining fast enough, then inflation – real inflation, apart from any temporary supply shocks (see: 2010 and 2011; or 1934 and 1935) – is simply out of the realm of possibility.

“Labor shortage.” Falling not rising interest rates. Growth scare. More hard times for consumers as workers, maybe even what might seem like SRIRL. An ocean of hidden slack, even as the unemployment rate falls once more primarily because of the lack of participation. The government’s dirty boot print all over everything.

Each a symptom of the same thing we’ve witnessed countless times before.

This year began with such promise, I suppose: “stimulus” and vaccines, perhaps a plausible way out of this mess. Reality, though, intruded very quickly and went the opposite way even as the CPI numbers rose. Thus, the year ends very, very differently from how it began, but it’s not like we haven’t seen all this before.

Too many times. 

Jeffrey Snider is the Head of Global Research at Alhambra Partners. 

