There are four distinct opportunities we see AI bringing to the table in 2025 and beyond.
THE IMMEDIATE OPPORTUNITY
In 2024, businesses were still training their AI models, whereas 2025 is the year when more businesses push inference out into the market.
Training is when an AI model is taught to perform a specific task by exposing it to relevant data. Inference is when a trained model is deployed and used to make predictions or decisions on new, or unseen, data. This shift from training to inference signals the movement from “building” solutions to “using” them.
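To make the distinction concrete, here is a minimal sketch of the two phases using scikit-learn; the synthetic dataset and model choice are illustrative assumptions, not a reference to any particular deployment.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Training: teach the model a task by exposing it to relevant (labeled) data.
X, y = make_classification(n_samples=1_000, n_features=20, random_state=0)
X_train, X_new, y_train, _ = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression(max_iter=1_000).fit(X_train, y_train)

# Inference: the deployed, trained model predicts on new, unseen data.
predictions = model.predict(X_new)
print(predictions[:10])
```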
IDC forecasts that 32 percent of all AI spending within businesses will be in GenAI by 2028, a five-year compound annual growth rate (CAGR) of 60 percent from 2024. This increase in investment signals that inference is evolving into more of an organizational strategy.
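For readers less familiar with the metric, a compound annual growth rate simply compounds a fixed percentage year over year. The short sketch below shows the arithmetic with an arbitrary index value of 100; the index is a placeholder, not IDC's underlying spend figure.

```python
# CAGR arithmetic: value_n = value_0 * (1 + rate) ** n
base_index = 100.0  # arbitrary starting index, not a real spending figure
rate = 0.60         # a 60 percent compound annual growth rate

for year in range(1, 6):
    print(f"Year {year}: index = {base_index * (1 + rate) ** year:.0f}")
```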
We’ll see data architecture change once businesses start moving from training to inference. Right now, training happens in secondary markets, where power is more affordable, and many businesses do their data transfer and training in isolation. Once businesses move into inference, that work can no longer happen in isolation.
We expect a sudden shift from businesses training in isolation to moving inference into major metro areas, as close to end users’ locations as possible. The better connected and positioned the deployment, the lower the latency, the faster the response, and the better the AI user experience.
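As a rough illustration of the latency point: light in optical fiber propagates at roughly 200,000 km/s, so every 100 km of one-way fiber distance adds about 1 ms of round-trip time before any routing or processing overhead. The distances in the sketch below are hypothetical examples.

```python
# Back-of-the-envelope fiber latency: RTT ~= 2 * distance / propagation speed.
FIBER_SPEED_KM_PER_MS = 200.0  # ~200,000 km/s expressed per millisecond

def fiber_rtt_ms(one_way_km: float) -> float:
    """Ideal round-trip time over fiber, ignoring routing and queuing delays."""
    return 2 * one_way_km / FIBER_SPEED_KM_PER_MS

for km in (10, 100, 1_000):  # metro edge vs. a distant secondary market
    print(f"{km:>5} km away -> ~{fiber_rtt_ms(km):.1f} ms round trip")
```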
THE SUSTAINABILITY OPPORTUNITY
Businesses today balance sustainability and environmental goals across their operational initiatives—including how they manage their data infrastructure.
For large or medium businesses that have some on-premises infrastructure and want to move to higher densities, onsite or managed data centers are unlikely to be as efficient as colocation. One of the sustainability measures more data centers are employing is liquid cooling.
Liquid cooling technologies help manage heat dissipation by cooling chips directly. As demand increases, we’ll see more data centers pre-designed with chilled water plants.

Liquid cooling falls into multiple categories. These include direct liquid cooling (DLC), where chilled water loops directly to the chips, and active rear-door heat exchangers; highly dense workloads may use both simultaneously. Immersion is another option, yet it may not reach wide adoption because it’s hard to support, especially in a mixed-customer data center.
But the changing cooling market needs to be receptive to customer demands. Data center providers need to be able to offer options to customers who want part, or all, of their workload water-cooled, or who still want full air cooling.
In addition to cooling, as data loads increase, especially with new AI initiatives, businesses will need affordable energy options. This need is why we see new data centers built with geographic considerations for power and energy availability. In the future, they may move closer to natural resources to harness hydropower or geothermal energy.
THE GOVERNANCE OPPORTUNITY
Sovereign AI dovetails with the concept of data sovereignty we see today, where the laws of the region in which data is collected and stored govern that data. Initiatives like the EU’s General Data Protection Regulation or the California Consumer Privacy Act govern regional data, but they have not yet stretched to regional AI models with specific governance of their own.
Considering rising cyberattacks from malicious threat actors and hackers, sovereign AI also plays a role in national defense. If a country builds up its AI capabilities and manages proprietary and private data, that’s something it will want to keep close to the chest and not outsource to another country.
Ultimately, sovereign AI is still in its infancy, but we may see more countries investing in proprietary AI models in 2025.
THE BIGGEST OPPORTUNITY
Businesses with sensitive, private, or proprietary data, such as pharmaceutical and financial services businesses, don’t want their data exposed, which is a real possibility when using public AI models. They will use hybrid or private AI models to keep datasets protected, either by partnering with a private AI solution provider or by building in-house.
While the cost of private installations for training and inference may not be an issue for larger businesses, smaller businesses may not be able to afford it. That’s where they’ll invest in public or hybrid AI: training in the cloud, then pulling the model back to run inference in their own data centers.
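One common way to implement this “train in the cloud, infer locally” pattern is to serialize the trained model as an artifact and load it wherever inference runs. The sketch below uses scikit-learn and joblib purely for illustration; the file name and model are assumptions, not a prescribed toolchain.

```python
import joblib
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# In the cloud: train on the full dataset, then serialize the model artifact.
X, y = make_classification(n_samples=5_000, n_features=20, random_state=0)
model = LogisticRegression(max_iter=1_000).fit(X, y)
joblib.dump(model, "model.joblib")  # artifact shipped back on premises

# On premises: load the artifact and serve inference close to users and data.
local_model = joblib.load("model.joblib")
print(local_model.predict(X[:5]))
```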
We’ll likely see more businesses look to a hybrid AI model in 2025. Private AI will be a critical part of the demands for AI factories and sovereign AI, too.
PlatformDIGITAL® is the home of AI, and we have the core understanding and expertise required for both GenAI and high-performance compute to be successful.
ABOUT THE AUTHOR
Tony Bishop is responsible for Digital Realty’s global growth strategy and plays a central role in prioritizing the enterprise business segment, championing company platform strategy, and supporting integrated partner solutions. This work includes the programmatic launch of platform and solution capabilities that position PlatformDIGITAL® as the leading data center platform and global meeting place for enterprises and service providers.
Prior to joining Digital Realty, Bishop served as Vice President of Global Vertical Strategy and Marketing at Equinix, where he was responsible for creating the global growth strategy for enterprise and service provider markets through vertical solution development. Previously, Bishop served at 451 Research as Chief Strategy Officer focused on digital infrastructure research.
Earlier in his career, Bishop was Managing Director, Global Head of Enterprise Datacenter Operations and Strategy at Morgan Stanley, where he led the global team that defined and implemented the data center transformation program. He also served as an advisor to NuCypher, is a second-degree fellow with Infrastructure Masons, and is the author of Next Generation Datacenters: Driving Extreme Efficiency & Effective Cost Savings.