Liquid Cooling for AI Data Centers – Schneider Electric

Schneider Electric, a leader in the digital transformation of energy management and automation, has published White Paper 133, “Navigating Liquid Cooling Architectures for Data Centers with AI Workloads.” The paper examines liquid cooling methods and their applications in contemporary data centers, particularly those handling high-density AI workloads.

Demand for AI is growing exponentially. As a result, the data centers that support it are producing significant heat, especially those housing AI servers equipped with accelerators for training large language models and processing heavy inference workloads. Managing that heat makes liquid cooling increasingly necessary to maintain reliability, sustainability, and optimal performance.

In the white paper, Schneider Electric helps IT managers and data center operators navigate the intricacies of liquid cooling by providing concise answers to key questions about system design, installation, and operation.

Understanding Liquid Cooling Architectures

The authors, Paul Lin, Robert Bunger, and Victor Avelar, divide liquid cooling for AI servers into two primary categories: direct-to-chip cooling and immersion cooling. Across the paper's twelve pages, they outline the components and functions of a coolant distribution unit (CDU), which controls temperature, flow, pressure, and heat exchange within the cooling system.

“AI workloads present unique cooling challenges that air cooling alone cannot address,” said Robert Bunger, Innovation Product Owner, CTO Office, Data Center Segment, Schneider Electric. “By demystifying liquid cooling architectures, our white paper aims to arm data center operators with the knowledge they need to plan liquid cooling deployments with confidence. We want to give data center managers practical information so they can maximize the performance of their cooling systems. By understanding the trade-offs and advantages of each design, operators can improve the performance and efficiency of their data centers.”

The paper describes six common liquid cooling topologies, combining various CDU types and heat rejection methods. It also offers guidance for choosing the optimal solution based on deployment size, speed, energy efficiency, and other criteria.

The growing need for AI processing capacity, and the accompanying increase in thermal loads, has made liquid cooling an essential part of data center architecture. The white paper also covers related industry trends, including the demand for greater energy efficiency, compliance with environmental regulations, and the shift toward sustainable operations.

“As AI continues to drive the need for advanced cooling solutions, our white paper provides a valuable resource for navigating these changes,” Bunger said. “We are committed to helping our customers achieve their high-performance goals while improving sustainability and reliability.”

Providing the Industry with AI Data Center Reference Designs

The recent partnership between Schneider Electric and NVIDIA to optimize data center infrastructure for AI applications makes this white paper especially pertinent and urgent.

This collaboration combined Schneider Electric's expertise in data center infrastructure with NVIDIA's advanced AI technologies to produce the first publicly available AI data center reference designs.

The reference designs give data center operators effective ways to manage high-density AI workloads, setting a new benchmark for AI deployment and operation.
