Moore's Law Has Well Exceeded Moore's Expectations

In 1971, Intel's first microprocessor, the 4004, contained 2,300 transistors, the tiny digital switches that are the chief building blocks of the information age. Between 1971 and 1986, Intel sold around one million 4004 chips. Today, an Apple A8 chip, the brains of the iPhone 6 and iPad, contains two billion transistors. In just the last three months of 2014, Apple sold 96 million iPhones and iPads, putting nearly 200 quadrillion (200,000,000,000,000,000) transistors into the hands of people around the world.

This is the fruit of Moore's law, which turns 50 years old this week. In the early 1960s, silicon integrated circuits were new and expensive. The U.S. military was the chief customer. Gordon Moore, a young physical chemist at Fairchild Semiconductor, however, saw a much bigger future: "I wanted to get the idea across that this was going to be the way to make inexpensive electronics." Writing for the 35th anniversary issue of Electronics in 1965, Moore extrapolated just a few data points and said the number of transistors on a chip could double annually for the next 10 years, reaching 65,000 by 1975. This rapid progress, he wrote, "will lead to such wonders as home computers - or at least terminals connected to a central computer - automatic controls for automobiles, and personal portable communications equipment." In 1970, Carver Mead, a Caltech physicist and Moore collaborator, dubbed the new curve of silicon scaling Moore's law.

Despite the apparent success of Moore's prophecy, however, information technology is increasingly viewed as either impotent or dystopian. Many think Moore's law has been repealed, that the pace of chip performance improvements substantially slowed beginning around 2005. The National Academy of Sciences has called it a "crisis in computing performance." Information technology more generally, says economist Robert Gordon, has not lived up to the hype: it was never as powerful as the Agricultural and Industrial Revolutions, or even the age of plumbing, electricity, and air conditioning - and in any case it has peaked and is now waning.

Others argue just the opposite - that information technology is so dangerously powerful that robots will steal most jobs. If, that is, artificial intelligence doesn't first wipe out the human race.

Sorry to disappoint, but the case for information technology is much brighter than either the pessimistic or catastrophically optimistic scenarios. (See our new report: Moore's Law: A 50th Anniversary Assessment.)

It's true that chip clock speeds leveled off around 2005 to avoid excessive heat. To compensate, however, Intel and other firms started designing chips with, among other innovations, multiple processor cores. Steve Oliner, a former Federal Reserve economist now with the American Enterprise Institute, shows that annual chip performance gains over the last decade have actually continued at 32%, near the five-decade average of 40%. A microprocessor in 2013 was thus 1.5 million times more powerful than Intel's first in 1971.
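Those growth rates compound quickly, and the headline claim is easy to sanity-check. A minimal sketch, using only the figures cited above (32% and 40% annual gains, and the 42-year span from 1971 to 2013):

```python
# Compound annual performance gains, per the figures cited above.
post_2005_rate = 0.32   # annual gain over the last decade
long_run_rate = 0.40    # five-decade average annual gain

# Even the "slower" post-2005 pace is a ~16x gain over ten years.
decade_gain = (1 + post_2005_rate) ** 10

# 42 years (1971-2013) at the 40% long-run average compounds to
# roughly 1.4 million, in line with the "1.5 million times" figure.
total_gain = (1 + long_run_rate) ** 42

print(f"10-year gain at 32%/yr: {decade_gain:.0f}x")
print(f"42-year gain at 40%/yr: {total_gain:,.0f}x")
```

The point of the exercise is that exponential growth hides in modest-sounding annual percentages: a one-decade "slowdown" from 40% to 32% still multiplies performance sixteenfold.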

In addition, Google, Amazon, and others spent the last decade linking tens of thousands of chips and disks to deliver "warehouse scale computing" - or supercomputing for the masses. Parallelism on chips was thus matched and augmented with parallelism across Internet data centers, now accessible via two billion smartphones. In 1965, the only computers were large, centralized mainframes at universities and large corporations that were still fed by punch cards. A small number of lucky workers, professors, and students had to wait in line to access scarce computer time. Today, the centralized computers of our era - those in "the cloud" - generate 4.7 zettabytes (4.7 × 10²¹ bytes) of data per year.

The economic benefits have likewise been underestimated. The government's producer price index (PPI), for example, says microprocessor prices have virtually stopped declining. But Oliner and his colleagues believe this reflects a quirk in Intel's pricing strategy; they estimate that the actual annual price decline for 2000-13 was 44 percent. According to the official government measure, $100 worth of computing power in 2000 could be purchased for $1.40 in 2013, which sounds impressive. But by Oliner's estimate, the actual 2013 cost may be just 5 cents ($0.05). By this measure, consumers can purchase 28 times more computing power per dollar than the official data suggest.
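The two price paths are straightforward to reconstruct. A minimal sketch, using only the figures cited above (a 44% annual decline over the 13 years from 2000 to 2013, against the official $1.40 endpoint):

```python
# Start from $100 of computing power in 2000 and apply Oliner's
# estimated 44%-per-year price decline over 13 years.
official_2013_cost = 1.40                    # the PPI-based figure
oliner_2013_cost = 100 * (1 - 0.44) ** 13    # compounds to ~$0.05

print(f"Oliner's estimated 2013 cost: ${oliner_2013_cost:.2f}")

# Ratio of the two paths: about 26x before rounding; the 28x figure
# divides $1.40 by the rounded 5-cent cost.
print(f"Ratio: {official_2013_cost / oliner_2013_cost:.0f}x")
```

As with chip performance, a seemingly small change in the assumed annual rate (44% versus the official decline) compounds into an order-of-magnitude gap by the end of the period.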

Dale Jorgenson of Harvard, using new data from the Bureau of Economic Analysis, shows that over the last 40 years nearly all the gains in total factor productivity (or technical innovation) have come from information technology and that IT accounts for between 50% and 70% of all productivity growth. In 2007, William Nordhaus of Yale, looking back two centuries, found that the labor cost of computing (hourly income per unit of computation) had declined by a factor of 73 trillion. Based on his numbers, I estimate that since Gordon Moore plotted those first few data points the labor cost of computing has fallen by a factor of roughly one trillion.

It is no coincidence the microchip, software, and Internet industries have been the most explosively innovative. The digital world, unlike other sectors, has flourished in a policy environment free of heavy handed regulation.

Moore's law unleashed relentless spirals of innovation, lower prices, surprising consumer demand, more capacity, and further invention. These learning curves depended upon a freewheeling environment where entrepreneurs could both iterate rapidly and also take big risks on unproven technologies and business models.

In the previous generation, Bill Gates exploited Intel's newly abundant transistors and created the vast PC market, which dramatically expanded Intel's markets and volumes and thus allowed it to invest in new designs and chip factories. Today's digital economy depends upon similar virtuous circles among broadband, app, device, cloud, and content firms.

In the next five years, silicon transistors will shrink to just 7 nanometers, approaching fundamental atomic limits and signaling an apparent end of Moore's law - for real this time. But as Gordon Moore recalls, "there always seemed to be a barrier two or three generations away." Experiments with a variety of new devices and materials, from 3D chip stacking to spintronics to carbon nanotubes, are yielding a surprising number of encouraging pathways to the future.

The single biggest threat to this virtuous circle of Moore's law and the information economy is the Obama Administration's new Internet regulations, just published on Monday. Over the past several decades, Moore's law enabled the construction of a largely unregulated, parallel communications platform, which transcended the government-imposed scarcity of monopoly telecom. Between 1990 and 2015, Internet traffic grew from 1 terabyte per month, about the size of your PC hard drive, to 75 million terabytes per month. But now, in an inexplicable reversal of a wildly successful, multi-decade, bipartisan policy, Washington is imposing the old rules on the new network.
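The growth rate implied by those traffic figures is worth making explicit. A minimal sketch, using only the two endpoints cited above (1 terabyte per month in 1990, 75 million terabytes per month in 2015):

```python
# Internet traffic: 1 TB/month (1990) to 75,000,000 TB/month (2015).
growth_factor = 75_000_000 / 1
years = 2015 - 1990

# Compound annual growth rate over the 25-year span: traffic
# roughly doubled every year, for 25 years straight.
cagr = growth_factor ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.0%}")
```

Sustained annual doubling for a quarter century is itself a Moore's-law-style curve, which is the article's point: the network grew up alongside the chips that carry it.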

Fortunately, America still has genius renegades dreaming up new inventions in spite of Washington. Maybe they will have to build yet another network out of Washington's reach. "It's extremely important that there's a living example of people's belief in the future," summed up Carver Mead. "That's really down deep what Moore's law is about."


Bret Swanson, a visiting fellow at the American Enterprise Institute and a U.S. Chamber Foundation scholar, is president of Entropy Economics LLC. 
