AI Datacentre Tech Trends from Vertiv Report

From News Desk

Datacentre innovation continues to be shaped by macro forces and technology trends related to AI, according to a report from Vertiv, a provider of critical digital infrastructure. The Vertiv Frontiers report, which draws on expertise from across the organisation, details the technology trends driving current and future innovation, from powering up for AI to digital twins to adaptive liquid cooling.

“The datacentre industry is continuing to rapidly evolve how it designs, builds, operates and services datacentres, in response to the density and speed of deployment demands of AI factories,” said Vertiv chief product and technology officer, Scott Armul. “We see cross-technology forces, including extreme densification, driving transformative trends such as higher voltage DC power architectures and advanced liquid cooling that are important to deliver the gigawatt scaling that is critical for AI innovation. On-site energy generation and digital twin technology are also expected to help to advance the scale and speed of AI adoption.”

The Vertiv Frontiers report builds on and expands Vertiv’s previous annual datacentre trends predictions. It identifies the macro forces driving datacentre innovation: extreme densification, accelerated by AI and HPC workloads; gigawatt scaling at speed, with datacentres now deployed rapidly and at unprecedented scale; the datacentre as a unit of compute, with the AI era requiring facilities to be built and operated as a single system; and silicon diversification, meaning datacentre infrastructure must adapt to an increasing range of chips and compute.

The report details how these macro forces have in turn shaped five key trends impacting specific areas of the datacentre landscape.

1.    Powering up for AI

Most current datacentres still rely on hybrid AC/DC power distribution from the grid to the IT racks, which involves three to four conversion stages, each introducing some inefficiency. This approach is under strain as power densities increase, largely driven by AI workloads. The shift to higher voltage DC architectures enables significant reductions in current, conductor size and the number of conversion stages, while centralising power conversion at the room level. Hybrid AC and DC systems remain pervasive, but as full DC standards and equipment mature, higher voltage DC is likely to become more prevalent as rack densities increase. On-site generation and microgrids will also drive adoption of higher voltage DC.
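As a rough back-of-the-envelope sketch (not drawn from the report, and using assumed figures for rack power, feeder resistance and power factor), the snippet below shows why higher distribution voltage matters: for a fixed rack power, current scales as I = P/V (or P/(√3·V·PF) for three-phase AC), and resistive feeder loss scales as I²R, so higher voltage means lower current, smaller conductors and lower losses.

```python
# Illustrative only: assumed rack power and feeder resistance, not figures from Vertiv.
import math

RACK_POWER_W = 120_000          # hypothetical high-density AI rack
FEEDER_RESISTANCE_OHM = 0.002   # assumed round-trip conductor resistance
POWER_FACTOR = 0.95             # assumed for the AC case

def ac3_current(p_w: float, v_ll: float, pf: float = POWER_FACTOR) -> float:
    """Line current for a three-phase AC feed: I = P / (sqrt(3) * V_LL * PF)."""
    return p_w / (math.sqrt(3) * v_ll * pf)

def dc_current(p_w: float, v: float) -> float:
    """Current for a DC feed: I = P / V."""
    return p_w / v

scenarios = {
    "415 V three-phase AC feed": ac3_current(RACK_POWER_W, 415),
    "54 V DC rack busbar": dc_current(RACK_POWER_W, 54),
    "+/-400 V (800 V) DC distribution": dc_current(RACK_POWER_W, 800),
}

for label, amps in scenarios.items():
    loss_w = amps ** 2 * FEEDER_RESISTANCE_OHM  # I^2 * R resistive loss
    print(f"{label:35s} {amps:8.0f} A   ~{loss_w:8.0f} W feeder loss")
```

The exact numbers are arbitrary; the point is simply that, at constant power, raising the distribution voltage cuts current roughly in proportion and cuts I²R losses with its square.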

2.    Distributed AI

The billions of dollars invested into AI datacentres to support large language models (LLMs) to date have been aimed at supporting widespread adoption of AI tools by consumers and businesses. Vertiv believes AI is becoming increasingly critical to businesses, but how, and from where, those inference services are delivered will depend on the specific requirements and conditions of each organisation. While this will impact businesses of all types, highly regulated industries, such as finance, defence and healthcare, may need to maintain private or hybrid AI environments via on-premise datacentres, due to data residency, security or latency requirements. Flexible, scalable high-density power and liquid cooling systems could enable that capacity to be added through new builds or by retrofitting existing facilities.
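As a purely hypothetical illustration of the kind of placement decision described here (none of the field names, thresholds or rules come from Vertiv), a simple policy might route an inference workload to on-premise, hybrid or public-cloud capacity based on residency, sensitivity and latency constraints:

```python
# Hypothetical sketch of an inference-placement policy; attributes and thresholds are assumptions.
from dataclasses import dataclass

@dataclass
class Workload:
    data_residency_required: bool   # e.g. regulated finance/defence/healthcare data
    sensitive: bool                 # data that must not leave controlled premises
    max_latency_ms: float           # end-to-end latency budget

def placement(w: Workload) -> str:
    """Pick where an inference service runs under simple, assumed rules."""
    if w.sensitive or w.data_residency_required:
        return "on-premise / private AI environment"
    if w.max_latency_ms < 20:       # tight latency budgets favour nearby capacity
        return "hybrid: regional edge or colocation"
    return "public cloud AI service"

print(placement(Workload(data_residency_required=True, sensitive=False, max_latency_ms=100)))
print(placement(Workload(data_residency_required=False, sensitive=False, max_latency_ms=10)))
print(placement(Workload(data_residency_required=False, sensitive=False, max_latency_ms=200)))
```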

3.    Energy Autonomy Accelerates

Short-term on-site energy generation capacity has been essential to the resiliency of most standalone datacentres for decades. However, widespread power availability challenges are now creating the conditions for extended energy autonomy, especially for AI datacentres. Investment in on-site power generation, via natural gas turbines and other technologies, does have several intrinsic benefits but is primarily driven by power availability challenges. Technology strategies such as Bring Your Own Power (and Cooling) are likely to be part of ongoing energy autonomy plans.

4.    Digital Twin-Driven Design and Operations

With increasingly dense AI workloads and more powerful GPUs comes a demand to deploy these complex AI factories at speed. Using AI-based tools, datacentres can be mapped and specified virtually via digital twins. The IT and critical digital infrastructure can be integrated, often as prefabricated modular designs, and deployed as units of compute, reducing time-to-token by up to 50%. This approach will be important to efficiently achieving the gigawatt-scale buildouts required for future AI advancements.
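At its simplest, mapping and specifying a facility virtually means keeping a structured model of the site against which proposed deployments can be checked before anything is built. The toy sketch below (assumed classes, fields and limits, far simpler than any commercial digital twin) validates a rack layout against hall-level power and liquid-cooling budgets:

```python
# Toy digital-twin-style capacity check; all structures and numbers are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class RackSpec:
    name: str
    power_kw: float
    liquid_cooling_kw: float

@dataclass
class HallModel:
    power_budget_kw: float
    cooling_budget_kw: float
    racks: list[RackSpec]

    def validate(self) -> list[str]:
        """Return any capacity violations for the proposed layout."""
        issues = []
        total_power = sum(r.power_kw for r in self.racks)
        total_cooling = sum(r.liquid_cooling_kw for r in self.racks)
        if total_power > self.power_budget_kw:
            issues.append(f"power over budget: {total_power} kW > {self.power_budget_kw} kW")
        if total_cooling > self.cooling_budget_kw:
            issues.append(f"cooling over budget: {total_cooling} kW > {self.cooling_budget_kw} kW")
        return issues

hall = HallModel(
    power_budget_kw=2_000,
    cooling_budget_kw=1_800,
    racks=[RackSpec(f"ai-rack-{i:02d}", power_kw=120, liquid_cooling_kw=110) for i in range(18)],
)
print(hall.validate() or "layout fits the modelled hall")
```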

5.    Adaptive, Resilient Liquid Cooling

AI workloads and infrastructure have accelerated the adoption of liquid cooling; conversely, AI can also be used to further refine and optimise liquid cooling solutions. Liquid cooling has become mission-critical for a growing number of operators, and AI could provide ways to further enhance its capabilities. In conjunction with additional monitoring and control systems, AI has the potential to make liquid cooling systems smarter and even more robust by predicting potential failures and effectively managing fluids and components. This trend should lead to increasing reliability and uptime for high-value hardware and the associated data and workloads.
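As one hedged example of what predicting potential failures could look like in practice (a deliberately simple statistical check, not anything described in the report), coolant telemetry such as pump differential pressure can be screened with a rolling z-score so that drift or sudden deviation is flagged before it becomes a failure:

```python
# Minimal anomaly screen for coolant telemetry; signal names and thresholds are assumptions.
from collections import deque
from statistics import mean, pstdev

def rolling_zscore_alerts(samples, window=20, threshold=3.0):
    """Flag readings that deviate strongly from the recent rolling baseline."""
    history = deque(maxlen=window)
    alerts = []
    for i, value in enumerate(samples):
        if len(history) == window:
            mu, sigma = mean(history), pstdev(history)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                alerts.append((i, value))   # candidate pump/loop fault for investigation
        history.append(value)
    return alerts

# Synthetic pump differential-pressure trace (kPa): stable operation, then a sudden drop.
trace = [150.0 + 0.5 * (i % 3) for i in range(40)] + [120.0, 118.0, 115.0]
print(rolling_zscore_alerts(trace))
```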

Disclaimer – Views are those of the spokespersons and this website doesn’t necessarily endorse them. Readers are urged to use their own discretion while making a decision about using this information in any way. There has been no monetary benefit to the Publisher/Editor/Website Owner for publishing this post and the Website Owner takes no responsibility for the impacts of using this information in any way.
