Originally posted on Data Center POST.
A Vision for the Next Era of Compute from Structure Research’s Jabez Tan
Framing the Future of AI Infrastructure
At the infra/STRUCTURE Summit 2025, held October 15–16 at the Wynn Las Vegas, Jabez Tan, Head of Research at Structure Research, opened the event with a forward-looking keynote titled “Where Is AI Taking Data Centers?” His presentation provided a data-driven perspective on how artificial intelligence (AI) is reshaping digital infrastructure, redefining scale, design, and economics across the global data center ecosystem.
Tan’s session served as both a retrospective on how far the industry has come and a roadmap for where it’s heading. With AI accelerating demand beyond traditional cloud models, his insights set the tone for two days of deep discussion among the sector’s leading operators, investors, and technology providers.
From the Edge to the Core – A Redefinition of Scale
Tan began by looking back just a few years to what he called “the 2022 era of edge obsession.” At that time, much of the industry believed the future of cloud would depend on thousands of small, distributed edge data centers. “We thought the next iteration of cloud would be hundreds of sites at the base of cell towers,” Tan recalled. “But that didn’t really happen.”
Instead, the reality has inverted. “The edge has become the new core,” he said. “Rather than hundreds of small facilities, we’re now building gigawatts of capacity in centralized regions where power and land are available.”
That pivot, Tan emphasized, is fundamentally an economic one: capacity goes where cost, energy, and land access converge. It reflects how hyperscalers and AI developers are prioritizing efficiency and scale over proximity, redefining where and how the industry grows.
The AI Acceleration – Demand Without Precedent
Tan then unpacked the explosive demand for compute since late 2022, when AI adoption began its steep ascent following the launch of ChatGPT. He described the industry’s trajectory as a “roller coaster” marked by alternating waves of panic and optimism—but one with undeniable momentum.
The numbers he shared were striking. NVIDIA’s GPU shipments, for instance, have skyrocketed: from 1.3 million H100 Hopper GPUs in 2024 to 3.6 million Blackwell GPUs in just the first three months of 2025, nearly a threefold jump in shipments. “That translates to an increase from under one gigawatt of GPU-driven demand to over four gigawatts in a single year,” Tan noted.
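Those gigawatt figures can be roughly sanity-checked with simple arithmetic. The sketch below is illustrative only and not from the keynote: the per-GPU wattages are assumptions broadly in line with publicly cited accelerator power ratings, and real facility demand would also include CPUs, networking, cooling, and PUE overhead.

```python
# Back-of-the-envelope check of the GPU-to-gigawatt figures above.
# Per-GPU wattages are assumptions, not numbers from the keynote.

H100_WATTS = 700          # approximate Hopper-class accelerator draw (assumption)
BLACKWELL_WATTS = 1_200   # approximate Blackwell-class accelerator draw (assumption)

hopper_gw = 1_300_000 * H100_WATTS / 1e9          # 2024 Hopper shipments
blackwell_gw = 3_600_000 * BLACKWELL_WATTS / 1e9  # Q1 2025 Blackwell shipments

print(f"Hopper-era GPU power:    ~{hopper_gw:.1f} GW")     # ~0.9 GW, i.e. under one gigawatt
print(f"Blackwell-era GPU power: ~{blackwell_gw:.1f} GW")  # ~4.3 GW, i.e. over four gigawatts
```

Under those assumptions the shipment counts line up with Tan’s framing of under one gigawatt of GPU-driven demand growing to more than four.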
Tan linked this trend to a broader shift: “AI isn’t just consuming capacity, it’s generating revenue.” Large language model (LLM) providers like OpenAI, Anthropic, and xAI are now producing billions in annual revenue directly tied to compute access, signaling a business model in which infrastructure equals monetization.
To continue reading, please click here.