The true role of data centres in our digital future is becoming apparent, as environmental and commercial demands increase. Steven Carlini, Vice President of Innovation and Data Center for Schneider Electric’s Secure Power division, explores the new trends set to impact the sector in 2024 and beyond.
As we look out into a future that appears to have more variables by the minute, making any kind of prediction is hard. However, there are two things that seem pretty certain. First, we will have to do more to combat climate change and second, we will have to change the way we do – and support – business because of the increasing impact of Artificial Intelligence (AI).
The common thread between these two apparently divergent, but pressing, priorities is the unsung hero of our increasingly digital world: the data centre.
Data centres will not only be critical in supporting the digital tools and services that will allow us to implement the technologies to combat climate change across the manufacturing, transportation, buildings and power generation industries; they will also be the foundation on which the next phase of the digital economy will be built, facilitating the adoption of AI in operations and service offerings.
Few could have failed to notice the noise and heat (no pun intended) generated from the recent COP28 summit. Many have expressed dismay that this is the 28th such conference on something that should have been comprehensively addressed by now, but I digress.
A key commitment made at this year’s conference was reported by Reuters: the governments of 118 countries have pledged to triple the world’s renewable energy capacity by 2030, in a bid to reduce the use of fossil fuels in energy generation and thus drastically cut the associated greenhouse gas (GHG) emissions.
This is a massive undertaking, but one that is achievable and one in which data centres, and other large energy users, can make a significant contribution.
In recent years, various trends have driven data centre designers and operators to develop their facilities to be more energy self-sufficient. This has sometimes been due to constrained energy supply, particularly in urban areas, and sometimes due to rising costs of connections. In fact, there is mounting evidence to suggest that planned developments that were viable just a couple of years ago are being reconsidered due to stubbornly high interest rates and attendant costs.
Demand side management
All of these drivers have meant that many data centre designers and operators are moving toward facilities that can operate at severely reduced demand-side energy consumption, or be entirely self-sufficient, for sustained periods. This can be achieved through various means, ranging from something as simple as replacing diesel in existing generators with a greener alternative such as hydrotreated vegetable oil (HVO), through to full microgrid operation.
A key characteristic of microgrids is that they can operate either independently of a wider, national grid, or collaboratively with it and other microgrids. When operating collaboratively under an electricity trading scheme, often governed by smart contracts, data centres can leverage their critical power infrastructure to provide grid balancing services, which are essential when high levels of variable renewable energy (VRE) sources are employed.
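To make the idea concrete, the following is a deliberately simplified sketch of the kind of dispatch rule a grid-collaborative microgrid controller might apply each interval. All thresholds, names and the three modes are hypothetical illustrations, not any real Schneider Electric or grid-operator API.

```python
# Illustrative sketch only: a simplified dispatch rule for a data-centre
# microgrid that can draw from the national grid, run independently, or
# export stored energy as a grid balancing service. All values hypothetical.

from dataclasses import dataclass

NOMINAL_HZ = 50.0          # European grid nominal frequency
LOW_FREQ_THRESHOLD = 49.8  # below this, the grid is short of generation
MIN_RESERVE_KWH = 500.0    # energy the facility keeps for its own ride-through

@dataclass
class MicrogridState:
    grid_frequency_hz: float     # live measurement at the grid connection
    battery_kwh: float           # stored energy available (UPS/battery plant)
    onsite_generation_kw: float  # e.g. solar or HVO-fuelled generators

def dispatch(state: MicrogridState) -> str:
    """Decide the microgrid's operating mode for the next interval."""
    surplus = state.battery_kwh - MIN_RESERVE_KWH
    if state.grid_frequency_hz < LOW_FREQ_THRESHOLD and surplus > 0:
        # Grid is under-frequency: export spare stored energy to balance it.
        return "export"
    if state.onsite_generation_kw > 0 and surplus > 0:
        # Healthy grid and healthy reserves: operate independently ("island").
        return "island"
    # Otherwise fall back to drawing from the wider grid.
    return "import"

print(dispatch(MicrogridState(49.7, 900.0, 0.0)))    # grid needs support -> export
print(dispatch(MicrogridState(50.0, 900.0, 250.0)))  # self-sufficient -> island
print(dispatch(MicrogridState(50.0, 400.0, 0.0)))    # low reserves -> import
```

A real controller would of course weigh live energy prices, contracted balancing obligations and battery health, but the core pattern, switching between importing, islanding and exporting based on grid and on-site state, is what lets a data centre act as a grid asset rather than only a load.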
If the world is to achieve the tripling of renewable energy sources used in grids globally, then the incorporation of such sources will need to ramp up significantly towards 2030. Digital management systems such as DCIM, critical power infrastructure and the ability to function as a microgrid with energy flows in both directions will allow data centre designers, owners and operators to facilitate an accelerated transition to renewables, and particularly VRE, that will see those ambitions achieved.
The other major trend that is foreseeable, if not fully predictable, is the impact of AI on business, consumers and the ICT industry as a whole. Data centres are already changing under the demands of AI, and we at Schneider Electric have done a lot of work to understand those changes and provide insights on how to optimise facilities. Our White Paper 110, entitled The AI Disruption: Challenges and Guidance for Data Center Design, explains the relevant attributes and trends of AI workloads, describes the resulting data centre challenges and provides guidance on addressing them for each physical infrastructure category, including power, cooling, racks and software management.
However, the impact of AI is far greater than just infrastructure or management. There has been a growing realisation that moving resources closer to where they are needed is a sound approach to many of today’s digital challenges. Compute power, data processing and analysis, and now AI, are being moved to the Edge. Distributed IT, or Edge Computing, has been implemented in sectors such as retail and finance, and manufacturing will increasingly deploy Edge Computing to enable the growing use of the Industrial Internet of Things (IIoT), as well as automation and more. The next 12 months or so will be when everyone starts talking about the need for Edge AI. This AI at the Edge will support not just optimisation of infrastructure and operations; it will also be key in supporting enterprise applications.
Moving to the Edge
In many cases, as already seen in the likes of retail, the day-to-day data itself is not as important as the insights it contains, which must be extracted quickly to be of value. Data processing has arguably been moving from the core to the Edge of networks over the last decade or so, especially for Big Data applications, where the result is key and the raw data less so. A raft of developments, from processor technologies to 5G and Wi-Fi 6 high-capacity networks, have enabled more and more applications to be placed at the Edge, providing vital speed and capability where it is needed. When these Edge implementations run AI algorithms in a neural network, they can be referred to as Edge AI. Additionally, with video a central Edge AI component, a vision processing unit (VPU) is preferred in many of these applications to accelerate Machine Learning and AI algorithms and better support image processing (or computer vision), delivering higher speed at lower power.
This kind of Edge AI has numerous benefits: it not only reduces data traffic to centralised infrastructure, which otherwise risks succumbing to data gravity, but also provides intelligence from data faster than before. This in turn can feed into AI optimisation of operations, with better-quality inferences from fresh data taken directly from where it is produced.
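The traffic-reduction point above can be sketched in a few lines. In this hypothetical example, an edge node runs inference locally on raw sensor readings and forwards only the extracted insight, rather than the raw data, to the central data centre; the model is a stand-in threshold detector, and all names are illustrative.

```python
# Illustrative sketch only: why Edge AI cuts upstream traffic. A hypothetical
# edge node runs inference locally and uploads only insights, not raw data.

import json

def run_inference(reading: list[float]) -> dict:
    """Stand-in for a local neural-network model (e.g. running on a VPU):
    here, a trivial threshold 'detector' over a window of sensor values."""
    peak = max(reading)
    return {"anomaly": peak > 0.9, "peak": round(peak, 2)}

def edge_node(raw_readings: list[list[float]]) -> tuple[int, int]:
    """Process raw data at the Edge; transmit only the interesting insights.
    Returns (bytes of raw data produced, bytes actually sent upstream)."""
    raw_bytes = sum(len(json.dumps(r)) for r in raw_readings)
    insights = [run_inference(r) for r in raw_readings]
    # Only insights flagged as anomalous leave the site.
    upload = [i for i in insights if i["anomaly"]]
    sent_bytes = len(json.dumps(upload))
    return raw_bytes, sent_bytes

readings = [[0.1, 0.2, 0.95], [0.3, 0.1, 0.2], [0.05, 0.4, 0.3]]
raw, sent = edge_node(readings)
print(f"raw: {raw} bytes, sent upstream: {sent} bytes")
```

Real deployments would batch, compress and secure the uplink, but the design choice is the same: inference happens where the data is produced, and only its results travel to centralised infrastructure.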
Edge AI has the potential to hark back to one of the early benefits of the Internet in decentralisation. Building on trends such as Blockchain, Web3 and the Metaverse, Edge AI can enable architectures which are inherently resilient, self-optimising and highly efficient.
Central role for data centres
Data centres have a central role to play in future demands of the digital world. Not only can they host AI-enhanced applications and services that can increase efficiency and provide the transparency to enable other industries and sectors to decarbonise, they can directly contribute to the acceleration of renewable energy adoption to achieve the pledges made by 118 governments at COP28.
Additionally, data centres are evolving to be able to provide an optimised foundation for the increasing AI workload demand from businesses of all sizes. Data centres truly are the unsung heroes of our digital future.