Nvidia has partnered with hardware infrastructure vendor Vertiv to provide liquid cooling designs for future data centers built as AI factories.
AI factories are specialized data centers built around AI applications rather than traditional line-of-business applications such as databases and ERP. They rely heavily on GPUs in ultra-dense configurations, which generate far more heat than traditional CPUs.
Because of this, the move to liquid cooling has accelerated out of necessity: air cooling is effective only up to roughly 30 kW per rack, and dense GPU-filled racks go well beyond that point, where air simply can't keep up.
With their joint design, Vertiv and Nvidia are offering liquid cooling support for up to 132 kW per rack. The architecture aims to optimize deployment speed, performance, resiliency, cost, energy efficiency and scalability for current- and future-generation data centers.
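To see why the jump from roughly 30 kW to 132 kW per rack forces a change in cooling medium, the rough heat-balance sketch below (Q = ṁ·c_p·ΔT) compares the airflow versus water flow needed to carry away a rack's heat load. The temperature deltas and fluid properties are illustrative assumptions for this back-of-the-envelope estimate, not figures published by Vertiv or Nvidia.

```python
# Back-of-the-envelope comparison of the airflow vs. coolant flow needed to
# remove a given rack heat load, using Q = m_dot * c_p * delta_T.
# The delta-T values and fluid properties are illustrative assumptions only.

CP_AIR = 1005        # J/(kg*K), specific heat of air
CP_WATER = 4186      # J/(kg*K), specific heat of water
RHO_AIR = 1.2        # kg/m^3, air density at room conditions
RHO_WATER = 1000.0   # kg/m^3, water density

def mass_flow_kg_s(heat_w: float, cp: float, delta_t_k: float) -> float:
    """Mass flow rate needed to carry away heat_w watts at a given delta-T."""
    return heat_w / (cp * delta_t_k)

def airflow_m3_s(heat_w: float, delta_t_k: float = 12.0) -> float:
    """Volumetric airflow (m^3/s), assuming a 12 K air temperature rise."""
    return mass_flow_kg_s(heat_w, CP_AIR, delta_t_k) / RHO_AIR

def water_flow_l_min(heat_w: float, delta_t_k: float = 10.0) -> float:
    """Water flow (litres/min), assuming a 10 K coolant temperature rise."""
    return mass_flow_kg_s(heat_w, CP_WATER, delta_t_k) / RHO_WATER * 1000 * 60

if __name__ == "__main__":
    for rack_kw in (30, 132):
        q = rack_kw * 1000
        print(f"{rack_kw:>3} kW rack: "
              f"~{airflow_m3_s(q):.1f} m^3/s of air  or  "
              f"~{water_flow_l_min(q):.0f} L/min of water")
```

Under these assumptions, a 132 kW rack would need on the order of 9 m³/s of air, far beyond what rack-level fans can practically move, while a direct liquid loop would need well under 200 L/min of water.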
The reference architecture is a hybrid liquid- and air-cooling infrastructure that simplifies and accelerates deployment of AI workloads in new and existing data centers and enables standardization across sites. Through the use of preconfigured modules and factory integration, Vertiv claims it can deliver AI-critical infrastructure up to 50% faster than onsite builds.
“New data centers are built for accelerated computing and generative AI with architectures that are significantly more complex than those for general-purpose computing. With Vertiv’s world-class cooling and power technologies, Nvidia can realize our vision to reinvent computing and build a new industry of AI factories that produce digital intelligence to benefit every company and industry,” Nvidia CEO Jensen Huang said in a statement.
The two companies have been working together since March 2024, when Vertiv became a Solution Advisor: Consultant partner in the Nvidia Partner Network (NPN), providing wider access to Vertiv’s experience and full portfolio of power and cooling solutions.
Read more about liquid cooling
- Data center liquid cooling market heats up: Liquid cooling technologies for data centers are transitioning from niche options deployed in specific market segments to mainstream applicability, according to research firm Dell’Oro Group.
- Pros and cons of air, liquid and geothermal systems: Whether it’s to save money, reduce carbon emissions, comply with regulations or accommodate high-powered AI workloads, enterprises are looking to operate more energy-efficient data centers.
- Schneider Electric shares liquid cooling guidelines: As high-density data centers continue to add AI workloads, there’s growing interest in liquid cooling, thanks to its ability to transfer heat more efficiently than air cooling.
- Data centers warm up to liquid cooling: AI, machine learning, and high-performance computing are creating cooling challenges for data center owners and operators. As rack densities increase and temperatures rise, more data centers are finding ways to add liquid cooling to their facilities.
- Accelsius offers liquid cooling without a data center retrofit: Accelsius, a relative newcomer in the liquid cooling market, has launched its NeuCool dual-phase, direct-to-chip liquid cooling technology, which is designed to be deployed without having to do a massive retrofit of your data center.
- ZutaCore launches liquid cooling for advanced Nvidia chips: The HyperCool direct-to-chip system from ZutaCore is designed to cool up to 120 kW of rack power without requiring a facilities modification.
- Is immersion cooling ready for mainstream? Liquid cooling started as a fringe technology but is becoming more common. Proponents hope the same holds true for immersion cooling.
- Supermicro has a new liquid-cooled server for AI: Supermicro’s new server uses a self-contained liquid cooling system, and vendors outside of the IT infrastructure market are showing interest in the technology.
Source: Network World