
THE UNIVERSITY OF MÜNSTER DEPLOYS ZUTACORE'S WATERLESS, TWO-PHASE LIQUID COOLING

HyperCool® Delivered the Most Computing Power for the Money; Eliminated the Risk of Water Leakage and Reduced Energy Costs

Case Study

With 43,000 students, the University of Münster is one of the largest universities in Germany. The University’s Center for Information Technology (CIT) is its main IT center, responsible for all aspects of IT infrastructure, for communication and media technology, and for media competence. Like every other higher education and research institution in the world, the University needs more compute power, but faces the challenge of providing it while controlling heat, energy consumption, and floorspace footprint. Traditional air cooling systems have reached their limits for the latest generation of processors, which consume up to 400 watts each, leading the University to investigate cutting-edge liquid cooling technologies.

HPC Systems Lie at the Heart of CIT

The University is located in the middle of North Rhine-Westphalia, Germany, and spans multiple locations spread over several buildings. The University’s HPC system has its own building with dedicated power and cooling infrastructure. The latest expansion of the system is an ASUS HPC system consisting of 44 dual-socket nodes with AMD EPYC 9654 96-core processors and 768 GB RAM, plus 4 dual-socket nodes with AMD EPYC 7773X 64-core processors and 4 TB RAM.

The University Evaluated Several Liquid Cooling Technologies

Once the University determined that liquid cooling was the only option for cooling their IT server infrastructure, they started evaluating various liquid cooling technologies such as direct-to-chip and immersion. The ZutaCore® HyperCool® waterless direct-to-chip liquid cooling solution came out the clear winner for both performance and environmental benefits to the University, as the solution:

  • Uses no water: HyperCool circulates a heat-transfer fluid, which eliminates the risk of water leakage or corrosion.
  • Requires little to no modification to existing real estate, power, or cooling systems: As a self-contained, self-regulating system, HyperCool triples the processing capacity of highly dense computing environments while using less than 50 percent of the energy and half the space of conventional cooling systems.
  • Reduces energy costs: HyperCool cuts cooling power by 80 percent while cooling the most powerful processors of 2,800 watts or more.
  • Enhances reliability: Consistent, effective temperature management reduces wear and tear on computing equipment, increasing longevity and reliability.
  • Offers long-term scalability: Liquid cooling can be scaled to meet the demands of growing data centers, HPC clusters, or AI workloads, ensuring long-term viability and performance.

According to Holger Angenent, Leader of the e-science infrastructure group at CIT, “By deploying ZutaCore’s direct-to-chip liquid cooling technology, we were able to increase the density of servers in a rack, while also using less electrical power for fans. Also, in comparison to a water-based solution, ZutaCore technology enables us to work with a larger delta T and higher temperatures in the liquid.”

Easy Installation, Stellar Results

The implementation of HyperCool in the University’s HPC server room was quick and easy, with components shipped to the University and assembled in its facility. According to Jürgen Hölters, Deputy Head of CIT, “After the system was completely assembled, we ran a few tests and it was clear that the ZutaCore system fit all of our needs and provided the performance, energy savings, and reliability that were so important to us. And after running the system for months with no problems, we are extremely confident in our selection of HyperCool.”

Jürgen also highlighted the energy efficiency of the ZutaCore system, noting that slower fan speeds allowed the University to reduce its energy costs. In addition, because CPU temperatures were lower, the CPUs themselves ran more efficiently. With CPUs consuming approximately 400 watts each, the team could use the HPC system in more efficient ways, since it could now install CPUs with much more computing power. The more efficient CPUs also save rack space, because fewer nodes are needed for the same computing power. According to Jürgen, “This would have never been possible with air-cooled systems, and the system is also much quieter due to the reduction in fan noise.”
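One reason slower fans translate into outsized energy savings is the fan affinity laws: fan power scales roughly with the cube of rotational speed. A minimal sketch of that relationship (the 70 percent speed figure is an illustrative assumption, not a number from the case study):

```python
def fan_power_fraction(speed_fraction: float) -> float:
    """Fan affinity laws: power scales roughly with the cube of fan speed."""
    return speed_fraction ** 3

# Running fans at an assumed 70% of full speed draws roughly a third of full power:
print(round(fan_power_fraction(0.7), 2))  # → 0.34
```

This cubic relationship is why even a modest fan-speed reduction, enabled by moving the heaviest heat loads into the liquid loop, can cut fan energy substantially.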

Scalability in the Future

With the success of its first HyperCool deployment, the University plans to consider HyperCool for all future needs – opening a way to accommodate the skyrocketing power requirements of urgently needed HPC resources within limited data center real estate. And as compute power increases, moving ever more heat off the processors, the University will also consider using the HyperCool system to re-use this heat for other applications, including heating University rooms and buildings. As Jürgen pointed out, “HyperCool’s ability to enable 100 percent heat re-use is another significant advantage over other liquid cooling technologies, particularly when you are dealing with power-hungry processors and GPUs. The possibilities to redirect that heat can deliver a significant environmental impact not only to the University, but to the planet itself as we strive globally for net zero emissions.”