What’s Next In AI Evolution?

A CTO’s perspective

There are five distinct trends we anticipate will drive AI’s evolution during 2025 and beyond.

DISRUPTION ACCELERATES

We can expect to see a year's worth of AI innovation happening in each quarter, as the virtuous cycles of AI feed back into advanced AI model design. Groundbreaking changes are emerging from every direction, and each one has a multiplier effect on what used to be state-of-the-art.

There are three vectors for innovation: hardware, software, and networking. We keep a pulse on these areas to ensure we continue to be a foundation of support to our customers through digital transformation, cloud migration, and AI. Software in particular has seen the most rapid rate of change and is one of the key areas for both technology acceleration and the potential for disruption.

PRODUCTIZATION DELIVERS DIFFERENTIATION

Over the next 12 months or so, the businesses that figure out how to productize their technology, build a financial model around it, and align themselves with strategic partners will begin to pull away from pure research and "cool factor." They will have a better chance of differentiating themselves against the competition.

AI monetization won't come from direct model sales but rather from application software enablement. We believe AI's rapid evolution will make colocation a more appealing solution for businesses. The successful AI-forward companies will be building up their capabilities to support the unprecedented demand headed toward them. Data center products will be increasingly in demand, with many competing requests for scarce resources.

INTELLIGENCE WILL BE FREE

Across the market, we've seen a rapid reduction in the cost required to perform a given workload as more efficient infrastructure takes AI abilities from concept to production. In a way, we're all becoming cyborgs, extending our intelligence into our tools.

Future success will be based not on raw skill but on how well you integrate these new tools into your personal and professional lives.

To quote Sam Altman, Chief Executive Officer of OpenAI: “We used to put a premium on how much knowledge you had collected in your brain, and if you were a fact collector, that made you smart and respected. And now I think it’s much more valuable to be a connector of dots than a collector of facts […]. If you can synthesize and recognize patterns, you have an edge.”

EFFICIENCY WILL RULE

As AI capabilities begin to normalize, the focus will shift to how efficiently AI workloads can run. Tokens are the fundamental unit of work processed by an AI model. Tokens per watt per dollar (a framework that takes into account the energy cost of intelligence) will become the new standard for whether a company delivers AI services above or below market levels. Measuring efficiency this way quantifies intelligence in terms of its energy and dollar cost.
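As a minimal sketch of how such a metric might be compared across deployments, the arithmetic is straightforward. All figures below are hypothetical, chosen purely for illustration, and the function name is an assumption, not an industry-standard API:

```python
# Illustrative sketch: comparing two hypothetical deployments by
# tokens per watt per dollar. All numbers are invented for illustration.

def tokens_per_watt_per_dollar(tokens: float, watts: float, dollars: float) -> float:
    """Tokens produced, divided by average power draw (W) and total run cost ($)."""
    return tokens / (watts * dollars)

# Hypothetical one-hour inference runs on two clusters.
legacy = tokens_per_watt_per_dollar(tokens=1_000_000, watts=10_000, dollars=50)
dense = tokens_per_watt_per_dollar(tokens=6_000_000, watts=25_000, dollars=80)

print(f"legacy: {legacy:.2f} tokens/W/$")  # 2.00
print(f"dense:  {dense:.2f} tokens/W/$")   # 3.00
```

On these made-up figures, the denser deployment draws more power and costs more in absolute terms, yet still wins on the efficiency metric because its token throughput grows faster than its energy and dollar costs.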

Success will require both choosing the correct infrastructure and understanding the ramifications of deploying it. The ability to deploy the newest AI hardware, which will increasingly require water cooling and power densities 10–15x those of traditional data center workloads (for example, as supported by high-density colocation), will be critical.

AI WILL DRIVE CONNECTIVITY

As AI becomes more deeply embedded in our personal and professional lives, the focus of AI will shift from training (creating the models) toward inference (using those models) and agents (autonomous action). Unlike training, the performance of inference is affected by the latency between the end user and the AI inference cluster, as well as by latency to the datasets queried for relevance.

This process requires considerable front- and back-end connectivity, with considerations to proximity for workload deployment. We can expect network service providers to continue to invest in new long-haul and metro routes to support these workloads.

To meet growing demand as AI integrates further into enterprise software and consumer applications, we want to enable software-driven AI advancements, much like cloud compute and storage deployments, which require larger, denser power and bulk interconnection capabilities.

The ability to be flexible and take advantage of this rapid pace of change will be critical for both service providers and consumers. Highly performant solutions like ServiceFabric® will enable businesses to rapidly bring AI technology to their data, rather than the other way around.

With AI agents, we need to consider the latency between the AI cluster and end systems. Solutions like Private AI Exchange (AIPx) will become increasingly important to ensure performant workloads, dataset scaling, privacy and security, as well as access to an AI ecosystem of partner applications and networks. All these elements will exist across a hybrid of public and private (hybrid AI) infrastructure and networks, consumed through on-demand models. Networks will also become increasingly intelligent, able to monitor performance, cost, and security across AI workloads and data flows, and to adjust their configuration based on those requirements.

ABOUT THE AUTHOR

Chris Sharp has over 20 years of experience in the technology industry, with an extensive background in developing technology strategies in global markets. He has a deep knowledge of the data center sector and is well positioned to expand technical innovation at Digital Realty.

Before his current role, he was responsible for cloud innovation at Equinix, where he led the development of innovative cloud services solutions and developed new capabilities enabling next-generation, high-performance exchange and interconnection solutions to facilitate broad commercial adoption of cloud computing on a global basis.

Previously, Sharp held leadership positions at top network and colocation providers, including Qwest Communications, MCI / Verizon Business, and Reliance Globalcom.