POWER AND COOLING ARE BECOMING INTEGRAL PARTS OF IT SOLUTION DESIGN IN THE DATA ROOM, AND WE ARE SEEING A BLURRING OF BORDERS BETWEEN IT AND FACILITIES TEAMS.
INDUSTRY WATCH
AI is already transforming people's everyday lives, with use of tools such as ChatGPT, virtual assistants, navigation applications and chatbots on the upswing. And just as it is transforming every industry, it is also beginning to fundamentally change data centre infrastructure, driving significant changes in how high-performance computing is powered and cooled.
To put this into perspective, consider that a typical IT rack used to run workloads of five to 10 kilowatts (kW), and racks running loads higher than 20 kW were considered high-density. AI chips, however, can require around five times as much power, and five times as much cooling capacity, in the same space as a traditional server. So we are now seeing rack densities of 40 kW per rack, and even more than 100 kW in some instances.
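The arithmetic behind those density figures can be sketched as follows. This is a back-of-envelope illustration only: the per-server wattage and servers-per-rack counts are hypothetical placeholders, while the five-times multiplier comes from the figures above.

```python
# Back-of-envelope rack-density check using the article's ~5x multiplier.
# Per-server draw and servers-per-rack are illustrative assumptions.

TRADITIONAL_SERVER_KW = 1.0  # assumed draw of one traditional server, kW
AI_MULTIPLIER = 5            # AI chips need roughly 5x the power in the same space

def rack_density_kw(servers_per_rack: int, per_server_kw: float) -> float:
    """Total electrical load of a fully populated rack, in kW."""
    return servers_per_rack * per_server_kw

traditional = rack_density_kw(10, TRADITIONAL_SERVER_KW)
ai = rack_density_kw(10, TRADITIONAL_SERVER_KW * AI_MULTIPLIER)

print(f"Traditional rack: {traditional:.0f} kW")  # ~10 kW, the legacy norm
print(f"Same rack, AI hardware: {ai:.0f} kW")     # ~50 kW, well into high-density
```

The same multiplier applied to denser configurations is what pushes racks past the 40 kW and 100 kW marks mentioned above.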
This will require extensive capacity increases across the entire power train, from the grid to the chips in each rack. It also means that, because traditional air cooling cannot handle the heat generated by GPUs running AI calculations, the introduction of liquid-cooling technologies into the data centre white space, and eventually the enterprise server room, will be a requirement for most deployments.
The investments needed to upgrade the infrastructure that powers and cools AI hardware are substantial, and navigating these new design challenges is critical. The transition will not happen quickly: data centre and server room designers must look for ways to make power and cooling infrastructure future-ready, allowing for the future growth of their workloads.
To absorb the massive amount of heat generated by hardware running AI workloads, two liquid-cooling technologies are emerging as the primary options:
Direct-to-chip liquid cooling
Cold plates sit atop the heat-generating components, usually chips such as CPUs and GPUs, to draw off heat. A pumped single-phase or two-phase fluid carries heat from the cold plates out of the data centre, exchanging heat but not fluid with the chip. This can remove between 70 and 75% of the heat generated by equipment in the rack, leaving 25 to 30% to be removed by air-cooling systems.
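The 70-75% capture figure implies a residual air-cooling load that still has to be designed for. A minimal sketch of that split, using the article's percentages and an assumed 100 kW AI rack:

```python
# Split a rack's heat load between direct-to-chip liquid cooling and
# residual air cooling, using the 70-75% capture range quoted above.

def cooling_split(rack_kw: float, liquid_fraction: float) -> tuple[float, float]:
    """Return (liquid-cooled kW, air-cooled kW) for a given rack load."""
    liquid = rack_kw * liquid_fraction
    air = rack_kw - liquid
    return liquid, air

for fraction in (0.70, 0.75):
    liquid, air = cooling_split(100.0, fraction)  # assumed 100 kW AI rack
    print(f"{fraction:.0%} capture: {liquid:.0f} kW to liquid, {air:.0f} kW to air")
```

Even at the top of the range, a 100 kW rack leaves roughly 25 kW of heat for the room's air-cooling systems, which is more than an entire traditional high-density rack used to produce.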
Rear-door heat exchangers
Passive or active heat exchangers replace the rear door of the IT rack with heat-exchange coils through which fluid absorbs the heat produced in the rack. These systems are often combined with other cooling systems, either as a strategy to maintain room neutrality or as part of a transitional design that starts the journey into liquid cooling.
While direct-to-chip liquid cooling offers significantly higher-density cooling capacity than air, it is important to note that there is still excess heat that the cold plates cannot capture. This heat will be rejected into the data room unless it is contained and removed through other means, such as rear-door heat exchangers or room air cooling.
Because power and cooling are becoming such integral parts of IT solution design in the data room, we are seeing a blurring of the borders between IT and facilities teams, something that can add complexity when it comes to design, deployment and operation. Thus, partnerships and full-solution expertise rank as top requirements for smooth transitions to higher densities.
Challenges for Africa
One of the main challenges when considering cooling solutions for data centres is the fact that servers need to be kept within certain temperature and humidity limits to function optimally. At the same time, infrastructure
Wojtek Piorko , Managing Director Africa , Vertiv
Jonathan Duncan , Technical Director Africa , Vertiv
www . intelligentcio . com INTELLIGENTCIO AFRICA 65