EkkoSense’s Dr Stu Redshaw explains why it’s time for data centre operations to ditch their spreadsheets and start paying attention to much more granular, real-time thermal and power measurements.
Ask any operations team if their data centre rooms are fully performance-optimised and the answer will invariably be yes. However, despite the best efforts of FM and IT teams, the reality is that even today’s best-run data centres have cooling, power and space issues that should be resolved.
This is an industry-wide challenge, particularly as powering data centre cooling now accounts for around 30% of a data centre’s overall operating cost. And with research suggesting that just a third of installed data centre cooling equipment actually delivers active cooling benefits, there’s clearly a requirement for organisations to look again at their data centre’s performance if they’re to successfully remove thermal risk, achieve cooling energy savings and release further IT capacity.
Unfortunately, very few data centres have access to the kind of real-time core power and thermal metrics that they need to make informed performance optimisation decisions about their critical facilities. Instead, many still rely on their original CAD floor layouts to calculate power and cooling requirements, while those that do regularly collect thermal data often consign it to inflexible spreadsheets that are rarely either accessible or up-to-date when critical information is needed.
This highlights a dilemma at the heart of data centre optimisation. Do you choose to lock down your initial data centre set-up – in the hope of maintaining reliability and performance? Or do you recognise that your IT estate will need frequent attention as it evolves to keep pace with changing business requirements? The answer should be clear, but delivering on it will require true, real-time visibility of your critical heartbeat operational data.
This challenge was brought home last month when I visited a data centre that was undergoing a significant upgrade. They were updating around 15% of their existing racks and were adding 30 new ones over the next few weeks. Given the scale of change, how could they hope to ensure the right thermal, power and capacity balance if they had no access to current performance data?
Getting serious about sensing
So, if you’re serious about optimising data centre performance then you really need to know what’s happening right now – not what was going on yesterday or last week. It’s only when data rooms are carefully mapped with the appropriate thermal data fields – right down to an individual rack or server level – that operations teams can gain a true understanding of overall data centre performance, and whole new levels of cooling efficiency become possible.
To address this, organisations need to work out how to build rack-level detailed maps of their data centre estate that display all their cooling, power and thermal performance in real-time. It’s only by combining this kind of granular cooling and thermal data with smart monitoring and analysis software that they can access the intelligence required to enable informed performance optimisation decisions to be made.
Unfortunately, less than 5% of data centres currently gather this kind of precision data, as it requires a far greater networked mesh of sensors to accurately capture not just temperatures, but also energy usage, heat outputs and airflows.
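To make the idea of rack-level granularity concrete, here is a minimal sketch of how a single rack-level sensor sample might be modelled. The field names, units and schema are purely illustrative assumptions – they are not EkkoSense’s actual data model – but they show the kind of per-rack thermal, power and airflow detail the mesh of sensors would need to capture.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class RackReading:
    """One sensor sample for a single rack (hypothetical schema)."""
    rack_id: str
    timestamp: datetime
    inlet_temp_c: float   # cold-aisle intake temperature
    outlet_temp_c: float  # hot-aisle exhaust temperature
    power_kw: float       # measured IT load on the rack
    airflow_m3h: float    # airflow through the rack, cubic metres/hour

    def delta_t(self) -> float:
        """Temperature rise across the rack, a basic cooling-health signal."""
        return self.outlet_temp_c - self.inlet_temp_c

# Example sample for one rack
reading = RackReading(
    rack_id="A-01",
    timestamp=datetime.now(timezone.utc),
    inlet_temp_c=21.5,
    outlet_temp_c=33.0,
    power_kw=6.2,
    airflow_m3h=900.0,
)
print(reading.delta_t())  # temperature rise across the rack, in degrees C
```

A stream of records like this, one per rack per sampling interval, is what turns a static CAD floor plan into a live, queryable picture of the room.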
Turning critical data into meaningful intelligence
Given that so much data operator and facilities management time is taken up with tracking data centre thermal performance and ensuring that the right cooling, power and space strategies are in place, it’s easy to see why many data centre teams view the process as an administrative burden rather than a positive activity.
We’ve been working to address this challenge through the application of our SaaS-powered monitoring, management and optimisation software. However, we knew that we also had to make the process as accessible as possible for data centre operations staff.
That’s why we’re helping operations staff by replacing complex spreadsheets with easy-to-use 3D visualisations that show exactly what’s going on in real-time. Innovative room builder technology means that modelling your room is as simple as dragging and dropping critical rack and cooling assets into place, while the latest ‘what if?’ simulations also help you to experiment safely with different rack, power and cooling layouts to find your optimal M&E configuration.
3D visualisation is just the start
True data centre optimisation requires a proven, safe process that’s based on thousands of real-time sensors and expert spatial models that combine to remove the uncertainty from data centre cooling and power. Having access to rack-level data provides exactly the data platform needed for the kind of software-enabled real-time decision-making and scenario planning capabilities that organisations need to optimise critical facilities.
The next stage is for the software to actively support data centre teams with advice about what they can physically do to improve their centre’s performance. Drawing on sensor inputs and software analysis, our software now advises customers about data centre floor tiles or grilles that need changing, recommends immediate setpoint adjustments in the coldest parts of the data centre room, and tracks Air Handling Units to highlight and suspend those that aren’t actually doing any active cooling.
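One plausible heuristic behind the AHU recommendation above is worth illustrating: a unit whose return-air temperature barely exceeds its supply-air temperature is moving air without removing much heat. The sketch below is an assumption-laden simplification, not the actual analysis; the threshold and field names are invented for illustration.

```python
def flag_idle_ahus(ahus, min_delta_t=2.0):
    """Return the IDs of AHUs whose return/supply temperature difference
    falls below a threshold - units circulating air without doing much
    active cooling, and therefore candidates for suspension."""
    return [
        ahu["id"]
        for ahu in ahus
        if (ahu["return_temp_c"] - ahu["supply_temp_c"]) < min_delta_t
    ]

# Two example units: one clearly removing heat, one effectively idle
ahus = [
    {"id": "AHU-1", "supply_temp_c": 18.0, "return_temp_c": 27.0},
    {"id": "AHU-2", "supply_temp_c": 21.0, "return_temp_c": 21.8},
]
print(flag_idle_ahus(ahus))  # ['AHU-2']
```

In practice such a rule would also weigh fan power, airflow and room-level redundancy before suspending anything, but even this crude delta-T check shows how sensor data becomes an actionable recommendation.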
Working from a common performance data set
Central to effective data centre performance optimisation is the requirement for data centre managers and their IT and facilities management teams to have access to the same core set of performance information. By providing a common data framework for managing their critical estate, different teams can rely on a trusted, single point of information that’s ready to support decision-making and improve collaboration and workflows. Instead of waiting for sign-off by other teams, a data centre’s power, IT and cooling specialists can all work together from a common 3D data set.
This combination of effective 3D visualisation, Internet of Things (IoT) sensors, a common data set and analytics offers a powerful tool for helping organisations drive energy efficiency and risk reduction in data centres. It’s a great example of Gartner’s ‘Digital Twins’ approach, where intelligent modelling and analytics are deployed to help organisations monitor and control their increasingly complex assets and processes.
Gartner sees this Digital Twins model as a great way for organisations to disrupt traditional CAD model-based management, drawing on innovations such as 3D modelling and IoT sensors to provide continuous data updates from the data centre floor.
Supporting changing data centre demographics
Data centre teams will also quickly come to rely on their sensor- and software-enabled Digital Twins model as their operations continue to evolve, particularly as they come to terms with managing their expanding edge and hyperscale activities. It’s a model that’s well-suited to changing data centre demographics, and particularly to the knowledge expectations of younger employees.
The next industry generation will become increasingly comfortable with the concept of managing their data centre operations using exactly the kind of real-time enabled 3D operational models outlined above.
And, perhaps critically, they’ll also be empowered to make active optimisation decisions based on this trusted data centre digital model. Indeed, they may succeed in achieving what their predecessors have long hoped for – being able to navigate their data centre careers without having to resort to outdated CAD drawings and inflexible departmental spreadsheets.