Embracing Direct-to-Chip cooling for an energy-efficient AI era

Josh Claman, CEO, Accelsius, argues that data centres are the plumbing for all things digital and that the demand for power isn't diminishing any time soon. Consequently, operators should be adopting more sustainable cooling solutions to make every watt count.

Josh Claman, CEO, Accelsius

The AI boom has accelerated innovation and breakthroughs, bringing change to almost every aspect of people's lives and promising to continue doing so. It has enhanced and streamlined billions of processes across practically every industry, from healthcare to education.

However, this technology does not come without downsides. AI's growing appetite for power has driven a steep increase in electricity and water usage within data centres, and current infrastructure cannot withstand the ever-increasing demand without change.

AI chips and workloads already demand more power than previous generations of hardware, and data centres are unprepared for the surge in electricity AI will require. Data centres already spend about 40% of their power allocation on inefficient air-cooling infrastructure for conventional chips and workloads, and the higher computing needs of AI chips will force them to draw even more energy from the grid. This is the core issue: it is no longer a question of 'if' we will run out of power, but 'when'.

Recent data shows that current data centre racks average about 7kW of power, but when integrating AI hardware, power consumption can jump to anywhere from 30-100kW or more per rack. That increase of at least four-fold, and in some cases more than ten-fold, adds strain on power grids and creates additional heat inside data centres.
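
As a rough back-of-envelope check using the figures quoted above (a minimal sketch; the rack values are illustrative averages, not measurements):

typical_rack_kw = 7                          # average draw of a conventional rack, per the figures above
ai_rack_kw_low, ai_rack_kw_high = 30, 100    # range quoted for racks with AI hardware

low_multiple = ai_rack_kw_low / typical_rack_kw    # ~4.3x
high_multiple = ai_rack_kw_high / typical_rack_kw  # ~14.3x
print(f"AI racks draw roughly {low_multiple:.1f}x to {high_multiple:.1f}x the power of a typical rack")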

To put this in perspective, in 2022, data centres and AI used around 460TWh of electricity. That usage is expected to more than double by 2026, with data centres set to consume over 1,000TWh at the current pace of technological development. That total is roughly equal to Japan's entire annual electricity consumption; the growth alone amounts to adding another country's worth of demand.
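
Expressed as a quick calculation (the 2026 figure is the projection cited above, not a measurement):

usage_2022_twh = 460        # estimated data centre and AI electricity use in 2022
projected_2026_twh = 1000   # projected use by 2026 at the current pace
growth_factor = projected_2026_twh / usage_2022_twh   # ~2.2x, i.e. more than double
added_twh = projected_2026_twh - usage_2022_twh       # ~540TWh of additional demand
print(f"Roughly {growth_factor:.1f}x the 2022 usage, adding about {added_twh}TWh of new demand")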

Data centre experts predict that the US alone will need 18-30GW of new capacity over the next five to seven years to handle the growing demand, and current infrastructure is not ready for this surge. AI chips generate substantial heat and require significant power to run, so if data centres do not update their processes and internal hardware, they will further strain the US's aging power grid.

To their credit, tech companies have attempted to offset their energy and water usage by investing heavily in renewable energy. They have signed Power Purchase Agreements with solar and wind farm operators and purchased renewable energy certificates, which pay for renewable energy to be generated elsewhere.

Unfortunately, data centres cannot rely entirely on renewables: they require a consistent power source to stay running, something intermittent wind and solar generation cannot provide on its own. There is hope for the future in nuclear and geothermal sources, although neither is yet a readily available option for most operators. And even if we generate enough power, upgrading transmission and distribution systems remains a significant challenge.

Data centres must make crucial changes toward sustainability, and this responsibility falls directly on operators, who must build and maintain efficient, sustainable facilities and practices. As power becomes the constraining resource, they must ensure that every watt is allocated to compute, not cooling.

The inevitable growth in data centres has operators looking at alternative cooling solutions. With Google revealing that its data centres consumed nearly 4.5 billion gallons of water in 2021, and with cooling accounting for almost half of current data centre power, a shift in approach is becoming increasingly necessary.

Traditional air-cooling methods are no longer enough. They consume too much of a data centre's power and are not sustainable, with a typical facility using as much water as three average-sized hospitals to meet its cooling and computing needs.

As data centres grow in size, computing power and number, the top two concerns are the sharp increases in water and electricity usage, both of which can be addressed by switching to liquid cooling.

Data centres have previously relied on standard air-cooling; however, this method is not sustainable and can no longer remove the heat produced by AI chips and servers.

This necessitates a shift to better cooling technology that applies the coolant directly to each chip.

The biggest issue with previous solutions is two-fold: they use a less efficient medium for removing heat (liquid is roughly 6x better at removing heat than air), and they rely on brute force to cool the server and rack as a whole rather than targeting the chips, which are the source of the heat.

By cooling the chips effectively, data centres can keep running efficiently without fear of outages while optimising every watt of power and litre of water used. Direct-to-Chip liquid cooling targets the chips directly, removing heat before it spreads to the rest of the server and rack. This shift allows data centres to free up nearly 20% of their total energy that was previously spent on cooling and redirect it to compute.
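
To make that arithmetic concrete, here is a minimal sketch for a hypothetical 10MW facility (the facility size is an assumption; the 40% air-cooling share comes from the figures above, and the roughly 20% post-switch share is implied by the claim of freeing nearly 20% of total energy):

facility_mw = 10.0          # hypothetical total facility power (assumption)
air_cooling_share = 0.40    # share of power spent on air cooling, per the figures above
d2c_cooling_share = 0.20    # approximate share remaining after Direct-to-Chip cooling

freed_mw = facility_mw * (air_cooling_share - d2c_cooling_share)
print(f"About {freed_mw:.1f}MW of a {facility_mw:.0f}MW facility is freed up for compute")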

Through Direct-to-Chip cooling, data centres can also optimise their space, fitting more servers per rack while decreasing water and energy consumption. Compared with air-cooling, Direct-to-Chip cooling offers a fit-for-purpose answer to energy and water consumption challenges, providing an estimated 50% savings in energy costs and reducing water usage to zero.

When it comes to data centre sustainability, energy and water usage are the top concerns. Data centres consume enormous amounts of both, often earning a reputation for being bad for the environment. But data centres are the plumbing for all things digital, so if we plan to keep using the Internet, we need them to step up their sustainability efforts with Direct-to-Chip liquid cooling. With waterless liquid cooling, data centres can minimise water usage, dedicate every watt to computing and maximise the utilisation of each server rack.
