AI’s demands are driving changes across the data center market. Users are scrambling for space, utilities are struggling to supply necessary power allotments, and communities are working to assimilate new data centers into their landscapes. Tom Traugott explores how data center operators are pulling parties together while providing high-quality, high-density spaces that will drive tomorrow’s digital needs.
The US data center market is slated to more than double by 2030, growing from 17 GW to more than 35 GW of demand. While staggering in scale, this growth fits the trajectory of the last three decades, in which each new wave of technology has driven significant increases in demand for server space and power. The Internet, telecom, and dotcom booms of the 1990s were followed by the rise of enterprise data centers in the 2000s and of hyperscale data centers and cloud computing architectures in the 2010s.
The 2020s now have their next evolutionary growth driver: generative AI, with advanced machine learning (ML) at its core. While ML concepts date back to the 1940s and 1950s, until relatively recently it was cost-prohibitive to train ML models, and the datasets being processed were far smaller: roughly 1,000 data points in 1990, ~1 million in 2000, ~100 million in 2010, and finally crossing ~10 billion in 2020.
The Heart of the Growth Engine
Modern cloud and GPU offerings can train billion-parameter models on these massive datasets quickly and cost-effectively, which has supercharged the field, led to tangible advances in AI, and fueled demand for still more compute capacity. Generative AI applications like ChatGPT have done the most to capture the public’s imagination, but demand for AI goes beyond apps that create images, write essays, and speed up software development. Companies across industries are expanding their use of AI for business insights, and investment in AI infrastructure is being driven not only from the top by the biggest tech providers but also by upstart GPU clouds like CoreWeave and Lambda.
AI is not only pushing the need for bigger data centers; it is also driving an emphasis on higher-density environments. The issue is the size of the workloads themselves. AI models run on a new class of computing: powerful GPU-driven systems that sit outside traditional cloud architectures. Training an AI model means crunching huge amounts of data, which demands enormous amounts of compute and additional power and forces data center providers to make density their top priority when constructing new spaces and retrofitting old ones.
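The density shift is easy to see with back-of-the-envelope arithmetic. The figures in the sketch below (server counts per rack, GPU wattage, host overhead) are illustrative assumptions chosen for the example, not vendor specifications:

```python
# Back-of-the-envelope rack power density.
# All wattages and counts below are illustrative assumptions.

def rack_power_kw(servers_per_rack, gpus_per_server, watts_per_gpu, host_overhead_w):
    """Total rack draw in kW: GPU draw plus per-server host overhead."""
    per_server_w = gpus_per_server * watts_per_gpu + host_overhead_w
    return servers_per_rack * per_server_w / 1000

# A traditional enterprise rack of CPU-only servers at ~400 W each.
cpu_rack = rack_power_kw(servers_per_rack=20, gpus_per_server=0,
                         watts_per_gpu=0, host_overhead_w=400)

# A rack of four 8-GPU AI training servers, assuming ~700 W per GPU
# plus ~2 kW of host overhead per server.
ai_rack = rack_power_kw(servers_per_rack=4, gpus_per_server=8,
                        watts_per_gpu=700, host_overhead_w=2000)

print(cpu_rack, ai_rack)  # 8.0 kW vs 30.4 kW
```

Even with fewer servers per rack, the GPU rack draws several times the power of the traditional one, which is why cooling and power delivery per rack, not just total floor space, now dominate design decisions.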
ABOUT THE AUTHOR
Tom Traugott spearheads EdgeCore’s advancement in emerging technologies and currently focuses on how AI reshapes the global data center ecosystem. Engaging with technologists, thought leaders, and community stakeholders, Traugott ensures EdgeCore remains at the forefront of innovation, sustainability, and geographic expansion—fulfilling the company’s commitment to deliver safe, future-ready solutions for the world’s leading cloud and technology providers.
With a career spanning from the post–dot-com era through the evolution of wholesale colocation, hyperscale cloud, and now AI-driven infrastructure, Traugott brings decades of insight to the industry. Before joining EdgeCore, he served at Amazon Web Services, where he oversaw strategy and execution for new and existing regions across EMEA, APAC, and the Americas. Previously, he co-led Cassidy Turley’s (now Cushman & Wakefield’s) Data Center Advisory Practice, and earlier served as Regional VP of Sales at CoreSite Realty Corporation.
Traugott earned his BA in Social Studies with honors from Harvard College.