AI Promised Magic, Then Wall Street Did the Math
It’s late afternoon in Menlo Park. Mark Zuckerberg’s senior infrastructure team is camped out in one of Meta’s glass-walled conference rooms. The slide deck peers back from the screen. Zuck, as everyone knows him, appears from 2,500 miles away, on the lanai of his 1,400-acre compound on Kauai. Palm trees tilt in the trade winds. The Pacific catches the low afternoon light. He doesn’t need a virtual background.
Zuckerberg leans toward the camera and asks a question. The faces in the room look perplexed. Then he notices the red mute icon, unmutes himself, and starts again.
“What do we actually need to build,” he asks, “if we want to win AI?”
No one comments on the Hawaiian scenery. That’s not how Zuckerberg rolls. To him, a question is like a pool when your house is on fire. You dive in. As he’s said in other contexts, he runs Meta with a “core army” of 25–30 people he tries to include in everything — non-hierarchical, no small talk. 
The head of infrastructure glances at his notes, then back at the screen.
“What we need, Zuck,” he says, “is to build a city.”
For a second, the line just hangs there, like the time an engineer showed up in socks and sandals. Then the infrastructure chief explains that he does not mean city as a metaphor. Meta has already told investors it expects to spend between $64 billion and $72 billion next year, mostly on AI infrastructure. That’s roughly 40% of last year’s revenue, comparable to the annual budget of a small G-20 country. It’s also the part that has Wall Street reaching for antacids.
The immediate culprit is hundreds of thousands of GPUs — the specialized chips at the heart of the whole experiment. In the old world, the CPU was the lone clever accountant; GPUs, built to do thousands of simple calculations at once, were the clerks in the back office. AI quietly flipped that. Now the clerks are running the show.
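A loose sketch of that flip, in plain Python rather than anything resembling Meta’s actual stack: the same million small multiplications done one at a time, then issued as a single batch, which is the shape of work GPUs are built to spread across thousands of cores at once.

```python
import numpy as np

# The "accountant": one careful calculation at a time, CPU-style.
values = np.random.rand(1_000_000)
serial_total = 0.0
for v in values:
    serial_total += v * 1.07  # pretend each element needs a small adjustment

# The "clerks": the same million multiplications issued as one batch,
# the pattern GPUs are designed to execute in parallel.
batch_total = (values * 1.07).sum()

print(round(serial_total, 2), round(batch_total, 2))  # same answer, very different hardware fit
```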
Nvidia’s H100, the Rolls-Royce of AI chips, sells for something like $25,000 to $30,000. A single high-end training cluster can contain tens of thousands of the buggers. Before anyone pours a slab of concrete, a top-tier cluster can run $3–5 billion in hardware. Lead times run nine to fifteen months. You don’t click “Buy Now”; you join a queue that stretches around the globe. The rest doesn’t get prettier, especially if you happen to be a Wall Street analyst.
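The back-of-the-envelope version, using only the figures quoted above; the 100,000-GPU cluster size is an illustrative assumption, not a disclosed Meta number:

```python
# Rough cluster hardware math from the ranges quoted above -- illustrative only.
gpus_per_cluster = 100_000            # "tens of thousands" at the high end
price_per_gpu = (25_000, 30_000)      # quoted H100 street price range, USD

low = gpus_per_cluster * price_per_gpu[0]    # $2.5 billion
high = gpus_per_cluster * price_per_gpu[1]   # $3.0 billion

print(f"GPUs alone: ${low/1e9:.1f}B to ${high/1e9:.1f}B")
# Networking, storage, power distribution and cooling push the total
# toward the $3-5 billion figure before any concrete is poured.
```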
The International Energy Agency projects that electricity demand from data centers, AI, and crypto will double by 2026 to over 1,000 terawatt-hours a year — think Japan’s current power use. The existing grid was not designed for a social-media company to roll up demanding a gigawatt for itself. So Meta needs dedicated substations and long-term power contracts.
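A quick conversion shows why a single campus moves the needle; the one-gigawatt, always-on load is the article’s hypothetical, not a disclosed figure:

```python
# How much annual energy a single always-on gigawatt-scale campus implies.
campus_power_gw = 1.0                    # hypothetical dedicated campus load
hours_per_year = 8_760
campus_twh = campus_power_gw * hours_per_year / 1_000   # about 8.8 TWh per year

iea_projection_twh = 1_000               # IEA's 2026 figure for data centers, AI, crypto
print(f"One such campus: ~{campus_twh:.1f} TWh/yr, "
      f"about {campus_twh / iea_projection_twh:.1%} of the projected global total")
```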
Then all that data has to move. Training large models at scale requires low-latency fiber between these campuses. That means new long-haul routes — trenching along highways, boring beneath rivers, negotiating every right-of-way. Depending on terrain and permitting, high-capacity fiber can run from tens of thousands of dollars to well over $1 million a mile. The supply chain has also become a bottleneck. Lead times stretch from months to two or three years.
“This is,” the infrastructure chief says finally, “infrastructure on the level of a major metropolis, built from scratch.”
Zuckerberg gives a small nod. As the palm trees move in the background, the thought balloon is almost visible: for eighteen months, the market treated AI like a gold rush — free money for whoever staked the first claim. Up close, the story looks more like utility construction. And utilities, as investors know, don’t move at internet speed or trade at internet multiples.
Step back for a moment: there is nothing wrong with AI technology or the dreams attached to it. Not long ago, AI felt like a magic word on Wall Street. Nvidia briefly joined the rarefied club of companies worth more than $3 trillion. Every CEO mentioned “AI” on earnings calls the way executives in the late 1990s mentioned “the internet.” Then the mood shifted. It took only a few rumblings to suggest that this isn’t an amusement ride; it’s an earthquake.
On one Microsoft call, management talked about “prioritizing” AI projects. At Alphabet, quarterly capex for servers and data centers suddenly jumped past $22 billion, with CFO Anat Ashkenazi signaling a full-year spend in the $85–90 billion range, “overwhelmingly” for AI infrastructure. Meta raised its own 2025 capex forecast into that $64–72 billion band and watched its stock drop by double digits the next day. In one particularly rough stretch, big tech names saw roughly $750 billion in market value erased in a matter of days as investors tried to re-price the AI spending boom. The market had started to do the math: how much is this going to cost us?
When people reach for an AI metaphor, they still talk about the gold rush. It’s not quite right, but there is resonance. Gold never fully justified its mythology. There was not all that much of the shiny stuff, and most of it ended up on wrists and fingers. The real story was in the second word — rush. Money flowed to whoever could move faster — lay rails, string telegraph wires, build hotels, open banks, print newspapers. Gold was a hobby; speed was a business model.
AI is staging a variation.
The “artificial” part is where we get into trouble. Most of the cost sits in the physical infrastructure needed to train and run modern models. The “intelligence” part is where the long-term value lives, the ability to embed systems inside search, commerce, logistics, medicine, creative tools — every broken workflow that clogs modern life.
The artificial part is trenching. Intelligence is how your doctor reads an image, how your bank flags a suspect transaction, how your phone answers a question in plain English. Those sound like nice-to-haves; they turn out to be very big deals. McKinsey, in its more optimistic modeling, estimates that generative AI could eventually add $2.6 trillion to $4.4 trillion a year to the global economy — roughly the equivalent of adding another Germany every year.
Psychologists have a name for the way we misread curves like these. Daniel Kahneman, the late Nobel laureate, spent a career mapping unfortunate mental shortcuts. Among them is exponential growth bias — the stubborn tendency to mistake compounding for a straight line.
History is filled with such parables, the best known of which is the chessboard legend recorded by the Persian scholar al-Biruni nearly a thousand years ago. A Brahmin counselor, asked to name his reward for wise advice, requested a single grain of rice on the first square of a chessboard, the amount doubling on each of the sixty-four squares. The king agreed, thinking he had outsmarted a court philosopher. The treasury discovered that by the thirty-second square the bill had reached four billion grains; by the sixty-fourth, it exceeded the entire rice harvest of the medieval world. It is what the Wall Street analyst keeps saying: do the math.
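The arithmetic behind the legend is short enough to check directly:

```python
# Grains owed on the chessboard: 1 on the first square, doubling each square.
total_through_32 = sum(2**k for k in range(32))   # squares 1 through 32
total_through_64 = sum(2**k for k in range(64))   # all 64 squares

print(f"{total_through_32:,}")   # 4,294,967,295 -- the "four billion grains"
print(f"{total_through_64:,}")   # 18,446,744,073,709,551,615 -- more rice than the medieval world grew
```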
Investors make the same mistake. On the way up, we convince ourselves the rise is sustainable. On the way down, we feel betrayed. In both cases, we mistake straight lines for eternity.
This is why the MAAG firms — Microsoft, Amazon, Apple, and Google — occupy such an odd position. They’re not just racing to build bigger models. They’re racing to weave whatever emerges into daily habits: shopping, messaging, maps, word processing, video, payments, health records. Their upside is measured in how much of everyday life they can quietly refactor.
A McKinsey partner put it to me more directly over coffee: every technological revolution begins in infinite possibility and ends in finite spreadsheets. That’s why Meta’s stock, which had sprinted ahead of most valuation models, suddenly had to absorb the reality of that $64–72 billion capex band. A strategist at one large fund framed it this way: “We priced in infinity, and then someone handed us a tape measure.”
Which brings us back to Menlo Park.
On the screen, tropical light spills across Zuckerberg’s shoulder as the head of infrastructure walks through the terrain. Around him, the war room looks less like a tech startup and more like the cabinet of a mid-sized country. Zuckerberg ponders. He doesn’t blink or flinch.
“So the future,” he says at last, amused, “depends on how fast we can build a city.”
From far enough away, you can still tell it as a fairy tale. AI will change the world. But the euphoria phase is ending. The construction phase has begun. The question he’s really asking is how long investors are willing to hold their breath while the walls go up.

