$67B Video Game Industry Crushes Monetarist Mysticism

Most people probably do not realize how far the video gaming craze went in the early 1980's. Atari was the king of the second-generation consoles, and the explosive success of the Atari 2600 convinced competitors that the market was a worthwhile risk. Barriers to entry were low, as digital technology and productivity gains were driving down the costs of equipment and even development. When Atari introduced its expected blockbuster E.T. The Extra-Terrestrial, the game spent only six weeks in development.

Undoubtedly that rush contributed to the game's spectacular failure, as digital and computer products usually require sufficient time and attention to develop. But the years leading up to that point amounted to a kind of open-ended industry, almost pure competition for what was believed to be an unending growth trajectory. Even Quaker Oats (yes, the oatmeal folks) got into video games, starting its US Games subsidiary in 1982; it produced fourteen forgettable titles before closing down in 1983.

That was the year of the great crash. A saturation of games with no clear business purpose or strategy, other than to sell as many titles as possible, left the industry vulnerable to what conventional economists might call a demand shock. The failure of the E.T. title, along with so many other junk games, left retailers unable to meet overly optimistic sales projections - projections that always seem to extend in a straight line (or a parabola). Unable to move units, retailers sent them back to the producers.

In the desert near Alamogordo, New Mexico, Atari is believed to have buried several million copies of unsold E.T. cartridges, though several accounts suggest that was just an urban legend. The great crash of 1983, however, was not. Atari made it through, but barely. In addition to the Quaker Oats exit, Magnavox (the company that had pioneered what was, by most definitions, the first-generation console, the Odyssey) and Coleco followed. A company called Imagic, the third-party developer of Atlantis, was forced to pull out of a planned IPO the day before launch. It was eventually liquidated in 1986.

Only a few businesses survived the 1983 shakeout, and those that did were forced to innovate and develop in order to recapture what was obviously enormous potential. The third generation of consoles, beginning with Nintendo's NES, changed the course of consumer electronics.

Nintendo itself had undergone numerous business variations in its century-long history. The company began as a playing card business in Japan in 1889. Its first major reinvention came in 1959, when it acquired a license from Disney to produce cards featuring Disney characters. No longer limited to gambling and gaming, Nintendo was off into the toy and children's games business.

That too faltered by the late 1960's, and technology advances had a lot to do with it. By 1973, Nintendo was the first to license the Magnavox Odyssey in Japan, putting it into the video gaming business. Looking to get into the fast-growing arcade business, it developed a game called EVR Race and later hit gold with Donkey Kong.

Donkey Kong was developed by Shigeru Miyamoto. Not only did Miyamoto's creation help Nintendo weather the crash of 1983; he would also go on to produce and develop titles like Super Mario Bros. and The Legend of Zelda, games that, combined with the NES, grew the industry into far more than a toy niche. Keeping Miyamoto meant prospering.

It was not that way at other companies. Even though Atari was a leader in game consoles in the 1970's, its engineers were never given credit for the titles they developed. Atari's business model placed far more emphasis on the equipment, believing that was what drove sales growth - as if gamers would play just any title because it was on the 2600. That might have seemed true until blockbuster titles like Pac-Man and, later, Activision's Pitfall! demonstrated that the games themselves drove sales.

In 1979, several Atari engineers made a value judgment. They were getting paid about $20,000 per year in salary, which was more than a pittance, but saw that their titles brought in roughly 60% of the $100 million in revenue for the company. Atari gave them no credit for the games and paid no bonuses, commissions or royalties. They did what any free marketer would do; they found some capital and started their own firm.

The former Atari programmers David Crane, Larry Kaplan, Alan Miller and Bob Whitehead teamed with a music industry manager, Jim Levy, and a venture capitalist, Richard Muchmore, to found Activision. That company would become the world's first third-party game developer. By concentrating on titles developed to be licensed across various hardware platforms, Activision showed that it was games that drove sales, not platforms. The Pong world had been superseded.

Like Nintendo, Activision developed enough successful games to weather the great crash of 1983. In doing so, it solidified the third-party business model for the industry, revolutionizing not only the way gaming would move forward but also changing how value was constructed and captured beyond that industry.

The gaming industry itself was an offshoot - and certainly an underappreciated and misunderstood one at that - of the overall computer/digital revolution of the 1950's through the 1970's. It was emblematic of other subsectors as well, such as telecom, where technology created and nurtured in the preceding decades was finally unlocked to build entirely new businesses and industries that had not existed before. In fact, these new industries could hardly have been conceived at the time the revolutionary innovations were in their infancy.

That is the essence of capitalism and the hope it brings for a rising living standard. It is often called productivity, but that label leaves something out of the picture, such as how these infant offshoots take the form they do. The gaming industry is interesting, particularly as it relates to Activision's creation, in that there is always a balancing act between costs and the risks taken for future growth. These businesses cannot survive without some attention to their cost structure, but, as Atari demonstrated, they cannot place too much emphasis on it without creating a value mismatch so severe that it destroys their ability to harness the real assets of the firm: in this case, the designers.

The marketplace is decidedly messy, particularly when competition is so ubiquitous and without shelter. We often think of the entrepreneur of that age as a lone wolf in the garage, like Bill Gates, creating something brand new from scratch. But that is not really how it works. Innovation is a chain of incremental progress that suddenly makes a jump under conditions that are absolutely unknowable in advance.

Samuel Morse, for example, is still widely credited for inventing the telegraph, but he did no such thing. The first crude version was introduced by Samuel Soemmering in 1809 in Bavaria, with improvements by Harrison Dyar in the United States in 1828. A few years earlier, William Sturgeon in England came up with a workable design for the electromagnet, which was furthered by Joseph Henry in the US in 1830 when an electric impulse was sent over a mile by wire to strike a bell. Morse combined all of them into a commercially viable product - a leap to be sure, but not isolated.

Only the marketplace could create the telegraph, just as only the marketplace could turn nascent, open-ended digital interactivity into a $67 billion-per-year industry (per 2012 Forbes estimates).

But there is a striking contrast here that perhaps goes unnoticed on casual review. Morse's final stage of that particular innovation chain marked the stirrings of what would culminate in the internet - the telecom revolution. It was certainly primitive by today's standards, but it came to define the value of information in relation to time. That was productivity whose worth as an essential piece of modern life is hardly disputed.

Video games, on the other hand, are an expensive toy, fit only for recreation (sure, there are some educational uses, but that is not what generates the $67 billion). Some would say that is entirely wasted effort, particularly with so many other problems in the world. Others, given how they perceive the workings of the global economic system, consider these innovations exploitative. It makes no sense to them to use scarce resources for our own amusement when so many others "suffer" for it. For that reason, among others, more control is often demanded over the marketplace to make sure we don't "waste" resources.

The additional element of centralized control, however, only ensures waste. The great video game crash of 1983 might seem to fit that description, but only in a very narrowly construed sense of history and proportion. The crash was a necessary experience, one that unleashed what the industry became. Without the crash, do we get the NES, extending all the way to the eighth-generation consoles and games about to be unleashed? The answer is almost assuredly no.

The larger lesson here, if it needs to be called that, is about this "wasting" of resources and economic control. The industry did not spring to life of its own accord; innovators saw an opportunity to fill a need, or desire, that had always existed. In fact, it took market actions and disarray to create the perfect set of circumstances for the industry to grow at just the right time. There were attempts at gaming as early as the 1940's, but they never amounted to much because businesses and individuals in the marketplace saw that the right conditions for sustained success did not yet exist (factors like television saturation and disposable income).

Primarily, we are looking at a near perfect example of how supply leads to demand, and not the other way around. The creation of new technology creates demand; there is no mechanism for the opposite. It takes visionaries and risk-takers to peer into the distant future, to surmise what markets might make of new technology and innovation, and then to act on those instincts, taking risks to make them real. The products themselves always have to come first - nobody in 1972 could have credibly asserted that the pioneering efforts of the Magnavox Odyssey would lead to $67 billion per year, but that was the potential its pioneers were seeing. They just had to build it first, and then survive the constant reshaping and the dynamic forces it unleashed (for the good).

Beyond that, a failure of demand is not a net economic negative over time. What if Paul Volcker had decided the video game industry was too valuable, too much in our national interest, to let fail as it did? The Volcker Fed might have tried to "stimulate" demand for video games in the general way that it does, by "stimulating" the hell out of everything around it, thereby giving the public debt-driven "money" to actually buy millions of those wretched E.T. copies. It would have made little positive difference, because the crash was actually valuable in the long run; any additional support for those terrible games would simply have delayed the inevitable crash.

Monetarism and centralization don't solve moribund economic malaise because the supply side must develop first. Demand will always follow. In that way, innovation is like the process of smaller firms driving employment gains - the gazelles. There is absolutely no way to prejudge which firms are going to drive growth; only the market and all its messiness can do so. Just as governments or agencies cannot and should not pick winners, they should also not try to save all of them from bankruptcy or liquidation. Failure is a vital part of the market process, as it fine-tunes business senses and focuses the attention of innovation in the most realistic and plausibly sustainable directions. Demand-type programs and tendencies have the opposite effect, like a beam of light passing through a prism and diffusing in every direction.

Worse, government or fiscal programs simply entrench the inefficiency. A Keynesian approach in 1983 might have had the government buy all those E.T. copies with borrowed money, cheapened by an early appearance of Greenspanism, to "step in" for the demand "shock." Politicians could have proclaimed a victory in keeping the industry alive during "tough" times, performing their economic duty as they saw it. However, all that would have done was destroy the market for games, since there would have been much less incentive to improve on the horrible crop of titles that crippled it in the first place. With a built-in market, which might become permanent with the right mix of lobbying dollars, the impulse to drive new demand through changing supply and innovation withers into a diminished priority. That effect is entirely proportional to the scale of intervention, and it usually comes with an implicit expectation of reduced competition.

I'm sure that any criticism of this counterfactual will point out the element of silliness in it - the government bailing out the video game industry through fiscal and monetary measures. But as with Plato's just city in the Republic, what is true for the small is true for the large. Using the video game industry as an example may be silly, but is it all that different from what happens on an economy-wide scale? All the problems we surmise from such silly examples are highly evident in the current economy, including the inability, despite trillions and trillions spent on both fiscal and monetary policy, to generate even a marginal recovery from the Great Recession.

Worse, the fact that the economy has yet to experience a recovery is itself significant, since orthodox economics posits symmetry in the economic cycle. The absence of full recovery after five years more than suggests something very much amiss. There is no stability in business, and trying to achieve it only succeeds in introducing debilitating sclerosis. After decades of "stimulating" demand in the name of such stability (in the 1990's it was even billed as "defeating" the business cycle, as if that were a positive goal), perhaps the innovative impulse and narrowing of market focus have been too diffused to create the processes that robust economic conditions demand. Unlike Atari after the Christmas holiday of 1982, the current economy is being held back by policy demanding greater and greater production of the unplayable E.T. game.


Jeffrey Snider is the Chief Investment Strategist of Alhambra Investment Partners, a registered investment advisor. 
