CTO predictions for 2025: A year of AI evolution
Chris Sharp, Chief Technology Officer, Digital Realty
Disruption accelerates
We anticipate seeing a year’s worth of AI innovation happening each quarter, as the virtuous cycles of AI create a feedback loop into advanced AI model design. Groundbreaking advances are emerging from every direction, and each one has a multiplier effect on what used to be state-of-the-art.
From our perspective, we watch three vectors for innovation: hardware, software, and networking. We keep a finger on the pulse of these areas to ensure Digital Realty remains a foundation of support for our customers through digital transformation, cloud migration, and AI. Software has had the most rapid rate of change and is one of the key areas of technology acceleration and potential disruption.
Productization delivers differentiation
In 2025, companies that figure out how to productize their technology, build a financial model around it, and align themselves with strategic partners will begin to pull away from those trading on pure research and ‘cool’ factor. They will have a better chance at differentiation against the competition.
We believe that AI monetization won’t come from direct model sales, but rather from application software enablement. Think Microsoft Copilot, but applied across broader use cases. I recently met with a leading ERP solutions provider to discuss this in detail, given their role as a workflow company undergoing transformation by AI agents.
According to IDC, the “heightened adoption of AI within datacenters will make third-party colocation facilities even more appealing...”1 Successful AI-forward companies will be building up their capabilities to support the unprecedented demand headed toward them. Demand for data center products will keep growing, with many competing requests for scarce resources.
Intelligence will be FREE
Across the market, we’ve seen a rapid drop in the cost of performing a given workload as more efficient hardware and software take AI capabilities from concept to production. In a way, we’re all beginning to turn into cyborgs, extending our intelligence into our tools.
Some predict that future success will be based not on how much skill you have, but on how well you integrate these new tools into your daily and professional life.
To quote Sam Altman: “We used to put a premium on how much knowledge you had collected in your brain, and if you were a fact collector, that made you smart and respected. And now I think it’s much more valuable to be a connector of dots than a collector of facts that if you can synthesize and recognize patterns, you have an edge.”
Efficiency will rule
As AI capabilities begin to normalize, the focus will be on how efficiently AI workloads can run. Tokens per watt per dollar will be the new standard for whether a company delivers AI services above or below market levels. Tokens are the fundamental unit of work processed by an AI model. Measuring efficiency this way quantifies intelligence in terms of energy cost.
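A tokens-per-watt-per-dollar comparison can be sketched in a few lines. The deployment figures and energy price below are illustrative assumptions, not vendor benchmarks:

```python
# Hypothetical sketch: comparing AI serving efficiency as tokens per watt
# and tokens per dollar of energy. All figures are illustrative assumptions.

def tokens_per_watt(throughput_tps: float, power_w: float) -> float:
    """Tokens generated per second, per watt of power drawn."""
    return throughput_tps / power_w

def tokens_per_dollar(throughput_tps: float, power_w: float,
                      price_per_kwh: float) -> float:
    """Tokens generated per dollar of energy cost."""
    energy_kw = power_w / 1000.0                          # kW drawn
    cost_per_second = energy_kw * price_per_kwh / 3600.0  # $ per second
    return throughput_tps / cost_per_second

# Two hypothetical deployments serving the same model.
legacy = {"throughput_tps": 1_000, "power_w": 10_000}  # older accelerators
modern = {"throughput_tps": 4_000, "power_w": 12_000}  # denser, liquid-cooled

price = 0.10  # $/kWh, illustrative grid rate

for name, d in (("legacy", legacy), ("modern", modern)):
    tpw = tokens_per_watt(d["throughput_tps"], d["power_w"])
    tpd = tokens_per_dollar(d["throughput_tps"], d["power_w"], price)
    print(f"{name}: {tpw:.2f} tokens/s/W, {tpd:,.0f} tokens per dollar")
```

Under these assumed numbers, the denser deployment draws more absolute power yet delivers several times more tokens per watt and per dollar, which is the trade-off the metric is designed to surface.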
Success will require both choosing the right hardware and understanding the ramifications of how to deploy it. The ability to deploy the newest AI hardware, which will increasingly require water cooling and power densities 10-15x those of traditional data center workloads (for example, as supported by High-Density Colocation), will be critical.
AI is connected
As AI becomes more deeply embedded in our personal and professional lives, the focus of AI will shift from training (creating the models) toward inference (use of those models) and agents (autonomous action). Unlike training, the performance of inference is impacted by the latency between the end-user and the AI inference cluster, as well as all the datasets that are queried for relevance.
This requires considerable front-end and back-end connectivity, with careful consideration of proximity when deploying workloads. We can expect network service providers to continue to invest in new long-haul and metro routes to support these workloads.
To meet the growing demand as AI increasingly integrates into enterprise software and consumer applications, we want to enable software-driven AI advancements, much as cloud compute and storage deployments do, which requires larger, denser power and bulk interconnection capabilities.
The ability to be flexible and take advantage of this rapid pace of change will be critical for both service providers and consumers. Highly performant solutions such as ServiceFabric™ will enable enterprises to rapidly bring AI technology to their data, instead of the other way around.
With agents, we need to consider the latency between the AI cluster and the end-systems where agents take action. Solutions like AIPx will become increasingly important to ensure performant workloads, dataset scaling, privacy and security, and access to an AI ecosystem of partner applications and networks.
All of this will span hybrid public and private (Hybrid AI) infrastructure and networks, delivered through on-demand consumption models. Networks will also become increasingly intelligent, able to monitor performance, cost, and security across AI workloads and dataflows and to adjust their configuration based on these requirements.
To learn more about how you can harness AI technology to create value, read our latest whitepaper AI for IT Leaders: Deploying a Future-Proof IT Infrastructure.
1 IDC, Worldwide Datacenter Colocation Services Forecast, 2024-2028