Is There a Monetary Story Behind the College Bubble?

It is hard to know exactly how to interpret the plight of the modern college student. The term "lost generation" isn't exactly original, having been used to describe grads during the three "jobless" recoveries of the interest rate targeting age, but the current predicament is much more dire and unrelenting. At least the graduate population after the dot-com recession had a housing bubble to look forward to, as the current iteration of asset inflation holds little such "hope."

The labor force participation rate for people 25 and older with a Bachelor's degree peaked around 1995 at 81%. After the 2001 recession it stood at about 79%, then fell to 78% during the 2002-03 "jobless" recovery. From there it held steady throughout the housing bubble fiasco. At the official end of the Great Recession in the middle of 2009, the participation rate had dropped to 77%, and it has fallen at the steepest pace in the years since - it is now only 75%.

As bad as those results are, and they stand in direct conflict with statistics that show an improving economy, they are not significantly different across the labor spectrum. The participation rate for those 25 and older with no college at all peaked in 1997 around 66%, about 15 percentage points below those with a Bachelor's degree. By 2002, the rate had fallen to about 64%, a 14-point spread to college grads. By the middle of 2009, the spread was again at 15 points. The participation rate for those with no college is down all the way to an astonishingly low 58.5%, but that progression has largely been in tandem with the college grad cohort.

I suppose it offers some evidence of the value of a college education, but it is far less clear about what that value might be. A staff report by two economists at the Federal Reserve Bank of New York estimated that in 2010 only 62.1% of college graduates were employed in a job that required a degree. And of that 62.1%, only 27.3% were in a job that matched their degree or major.

There are a few ways to interpret those results, including the idea that college can at least provide an advantage in the labor force, even if the job isn't the one you envisioned. But overall, given the trends in post-secondary education, I think there is another way to look at these figures.

It is no secret that there is growing angst and dissatisfaction with college as a value proposition. In the course of trying to discern whether college, as the educational process exists today, is worth the outlay, there have been a number of attempts to quantify the results of what is essentially a costly certification process. This is partially a result of parents making valuation judgments on outlays for their children, and a consequence of those parents recalling their own involvement. The popular perception of the college "experience" has so shifted in the past generation (perhaps two) that such doubts are no longer controversial at all, particularly as parents today mostly fall within that generational timeline.

In 2011, authors Richard Arum and Josipa Roksa published a book, Academically Adrift: Limited Learning on College Campuses, that tracked 2,300 four-year college and university students and their academic trajectories. If you are really interested, you can read their methodologies for controlling as many variables as possible, including academic proficiency upon entry, but, as you might surmise from the title, the results they gathered were entirely predictable and very much align with popular perceptions of the current state of college learning.

Among the most interesting findings, Arum and Roksa estimated that 45% of students "did not demonstrate any improvement in learning" in their first two years of study. Worse, 36% of students failed to improve after four years. Worse yet, among those who actually made gains, the gains were on the whole minimal. As part of their conclusions, the authors describe most undergraduates as "drifting through college without a clear sense of purpose..."

I don't believe it would be controversial to state that such a conclusion, if valid, is in direct opposition to how college education was viewed in the 1960's or before, for example. That would suggest some kind of paradigm shift in the intervening decades. Anyone with even a passing knowledge of what has taken place in education knows where such a dramatic change came from - money.

In 1965, total "on-budget" support of post-secondary education from the federal government totaled just under $1.2 billion ($8.7 billion in constant 2012 dollars). By 1980, that expense had grown more than eightfold to $11.1 billion. And while there was no Guaranteed Student Loan program in 1965, by 1980 there were $4.6 billion in loans outstanding with the federal government's guarantee. That was the first wave (or second, if you count the GI Bill for WWII vets) of money into the system, and with it came an influx in the student population.

There would be only modest growth in college money throughout the 1980's and 1990's. Beginning in 2000, there was an explosion. In that year, the total on-budget expense was about $15 billion. By 2012, the federal government was spending just under $70 billion. The Guaranteed Student Loan program, later renamed the Federal Family Education Loan program, expanded from $22.7 billion in 2000 to $67 billion in 2009, when it was closed and folded into the Direct Loan program. None of these figures include state expenditures on their own public universities and loan programs.

Total student debt, at around $300 billion at the end of 2004, is now greater than $1.1 trillion - the so-called student debt bubble. And it has nearly all been moved under the auspices of the federal government. Since 2009, it has been, by far, the largest single source of credit growth in the US economy.

Between 1990 and 2000, the number of 18 to 24-year olds grew 1.7% (not per year; in total) to 27.3 million. That is not a perfect proxy, given that there has been a non-trivial trend toward non-traditional students and programs for them, but it is at least a solid approximation and basis for comparison. By 2010, that age cohort had grown to 30.7 million, a decadal increase of 12.6%.

Clearly, there was some positive momentum in the 2000's for college enrollment due to population and demographics. But that does not account for the sheer size of the college boom and bubble. Total enrollment at Title IV participating institutions in the fall of 1993 was 14.3 million. By the fall of 2000, total student enrollment had risen modestly to about 15.7 million (+9.7%), which is consistent with the modest growth in dollar outlays and borrowings for college at that time. Given the low population growth of that decade, it also represents a nearly student-for-student increase in the proportion of "kids in college." But where there were moderating factors in the 1990's, by the fall 2010 semester there were 21.6 million students enrolled - a massive 37.6% increase.
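The arithmetic behind that comparison is simple to verify. A minimal sketch, using only the enrollment and population figures cited above (the helper function name is my own, not from any source):

```python
def pct_growth(start, end):
    """Total (not annualized) percentage growth from start to end."""
    return (end - start) / start * 100.0

# Enrollment at Title IV institutions, in millions (figures cited above)
enroll_1993, enroll_2000, enroll_2010 = 14.3, 15.7, 21.6

# Modest 1990's growth, against just +1.7% in the 18-24 population
nineties = pct_growth(enroll_1993, enroll_2000)   # ~9.8%

# The 2000's boom: roughly triple the cohort's +12.6% population gain
aughts = pct_growth(enroll_2000, enroll_2010)     # ~37.6%
```

The point of the exercise is that enrollment growth in the 2000's ran about three times the demographic tailwind, where the 1990's increase was far more proportionate to the money flowing in.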

To think that has not played a role in the "quality" of educational outcomes is naïve in the extreme. If one tended toward the cynical, one might even suggest that, given the widespread and growing dissatisfaction with college now, colleges and universities simply invented courses and programs to keep these students busy and enrolled, regardless of eventual career paths and student expectations. That such an arrangement would coincide with the sudden appearance of "demand" alongside an atrocious enlargement of the pool of "money" dedicated to paying for that demand is simply logical.

If someone showed up at your door tomorrow with an immense amount of cash seeking some product or service, it is not unexpected that you would take the cash and worry later about fulfilling those needs. It's even more likely when the delivery of that service is nearly devoid of any kind of direct tie, and therefore accountability, between the ultimate payer and the consumer (like healthcare?). That would certainly offer an explanation for the rise in identity studies, popular culture classes and obscure theory, as well as the ascension of often-generic "business" degrees as the most prevalent. With the advent of such courses of study, students might "drift" without a "clear sense of purpose", but at least their checks would clear.

Was the rapid rise in college students, particularly in the 2000's, a result of rising demand for college, or of the massive increase in money for college? There is a bit of a chicken-and-egg problem here, but I think it can be unpacked by relatively simple means. To put it another way: did the increase in the availability of funding, including loans, create demand for college services that would not otherwise have occurred? I think the answer is most definitely affirmative.

Whether or not that is a "good" outcome is very much debatable. It is entirely possible, as the research I have cited here and the volumes of study that exist elsewhere suggest, that the end result of this monetary intrusion into colleges has been to take unready or poorly matched kids and push them in a direction that leaves them both ill-suited for a successful outcome and deeply indebted for their troubles. That would mean the influx of money simply oversaturated what would "normally" be the pool of college-ready adults. In this rush to get everyone educated, we may be creating far more college grads than the marketplace needs, as the FRBNY study concludes - leading to an epidemic of underemployment and thus a growing dissatisfaction with the entire process. The Great Recession and the conspicuous lack of recovery are playing a role here, but these trends predate that event and have only been strengthened by it; that is especially apparent in the federal government's annexation of student lending as a means to "stimulate."

That is perhaps the most powerful evidence in favor of the argument that money is creating demand here, and not simply satisfying an existing need. This financial intrusion, in its most basic form, overrides the prices and value relationships that would otherwise signal to market participants against taking on college burdens. If potential college entrants knew that they would end up in a job that didn't require a diploma (and the FRBNY study says that almost 40% don't, at least at their current job; left undiscovered is the role of college education in the overall career trajectory as people progress and advance), is it not likely that they might at least re-evaluate options and preferences? If you are told at the outset that you will be stuck in a job waiting tables, to use an admittedly extreme and simplistic example, you might at least pause and re-assess how valuable the college "experience" really is (or at least your parents might).

Of course there are those who attend without careers and jobs in mind, but the large majority are ultimately seeking gainful employment, chasing their own perceptions of their own abilities on the path to prosperity. And in evaluating those signals, stripped of the monetary clutter, perhaps the idea of meaningful experience vs. formal education becomes more relevant, perhaps even skewing away from the expense of post-secondary education (or any formal education, for that matter). High proportions of college attainment are, obviously, a recent phenomenon. You can make a valid argument that globalization and the "21st century workforce" demand higher educational attainment, but that is not at all conclusive, nor is it in full alignment with even recent precedent.

While I certainly believe that this is a valuable and necessary discussion on its own merits and for its own purposes, my interests lie in a tangential relationship. I think there is enough here to question the "settled" idea of monetary neutrality. If we think about this college "bubble" as it relates to economics in general, and aggregate demand in particular, it follows all the main orthodox themes. The introduction of money "creates" demand that then fosters further activity, including job growth, so that in the short run the system appears better and more robust for the efforts.

There is more than a little truth to that assertion, particularly in the college and university example used here. Not only did the population of university students explode in the 2000's, so did the number of employees at these institutions. In 1997, there were 2.8 million staff members, including 1.8 million "professionals" (faculty, administration, grad assistants, etc.). For the fall semester in 2011, there were 3.9 million employees (+39.3%). Total faculty employment rose 53% to 1.5 million; the number of employees in administration rose 56%. The only class of employees not to grow so substantially was the "nonprofessional" class of janitors and maintenance workers (+0.7%).

It is the dream "stimulus" package of all time: an entire system of economic advancement that creates an increase in aggregate demand and jobs, all backed by monetary prowess. But what would happen if the credit or government largesse were turned off tomorrow? It would collapse almost totally, because demand is held artificially high by that same monetary "prowess."

That would seem to bolster the argument in favor of monetary neutrality, except that the entire education system is not neutral for having undergone such monetary-driven expansion. That is true not only in terms of the additions to labor resources and physical facilities, including athletics to the point that college football players want to unionize, but also in the behavior of society itself. There is now almost shame not only in not going to college but even in considering against it. That largely rests upon the idea that everyone should go because the money is available for it. In other words, college is certainly the way to prosperity for some people, but only recently has the idea that college is the only way to achieve financial security and success become so widespread.

Further, other behavioral changes lie in the system orienting away from the primary goal of individual suitability and attainment, tending instead toward self-perpetuation. Such alteration disrupts the market-based signals that would otherwise create the highest potential efficiency in the employment pool as a whole. There is a reason that the aggregate size of the "nonprofessional" staff, who have little voice in setting educational expenditures, did not grow while the ranks of the "professionals" exploded beyond even the pace of student enrollment. That is the very character of inflation itself, as individual agents become captured by the inorganic intrusion of money rather than responding to market signals to do something more sustainable and ultimately more productive. Once that becomes entrenched, the system no longer responds at all to market signals of waste and inefficiency, instead resisting even minor evaluations and reform.

That is why monetarism is far more than just wasted resources and malinvestment. Those are bad enough on their own, but they are so often accompanied by these changes in behavior that go unnoticed - the hidden and high costs of using so much central authority. That is especially true when all that is taught during such "schooling" is that there are no alternatives, and that this is the best that can be achieved. In a word, it is sclerosis; as potent an economic depressant as has ever been identified.


Jeffrey Snider is the Chief Investment Strategist of Alhambra Investment Partners, a registered investment advisor. 
