The increasing cost of emission taxes applied to data centres is driving owners to improve efficiency. To do so, many have turned to computational fluid dynamics (CFD) to model airflow and optimise cooling. While the science behind CFD makes it well suited to modelling data centres, a model can quickly become outdated as workloads change.
Dave Wolfenden of Heatload explains why models must be verified and use real-time data to stay valid.
The rise of Computational Fluid Dynamics in the data centre
Wolfenden believes that CFD on its own is not enough. It is akin to a motor racing team creating a new aerodynamic part and then bolting it to the car in the hope that it will deliver a race-winning performance. While it may provide some benefits, they will be severely limited in scope. Worse still, they are likely to lead to a range of other decisions that can degrade overall performance.
Use heat loads to test models
The use of heat loads is nothing new. An increasing number of companies already use them to test the initial design of the data hall. The problem is that designers do not use them universally, nor are they regularly used during refurbishment. Wolfenden believes data centre designers are missing the point: heat loads not only validate their designs and models but also provide a better baseline and a library of designs that can speed up the development of future data centres.
Heat and cooling are directly related to workload
An example of the problem is the introduction of virtualisation. Workloads changed from being contained on individual servers to running anywhere in the data centre. This created the opportunity to move high heat loads to areas where there was adequate cooling. Automating the process, however, meant that workloads were prioritised for resources rather than for heat-load balancing.
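The placement decision described above can be sketched as a simple heuristic. This is an illustrative assumption of how such logic might look, not Heatload's method; the rack names, capacities and the greedy headroom rule are all hypothetical:

```python
# Hypothetical sketch: place a workload on the rack with the most cooling
# headroom. Rack data and the headroom rule are illustrative assumptions.

def place_workload(racks, workload_heat_w):
    """Pick the rack whose cooling headroom best absorbs the workload's heat.

    racks: list of dicts with 'name', 'cooling_capacity_w', 'current_heat_w'.
    Returns the chosen rack's name, or None if no rack has enough headroom.
    """
    candidates = [
        (r["cooling_capacity_w"] - r["current_heat_w"], r["name"])
        for r in racks
        if r["cooling_capacity_w"] - r["current_heat_w"] >= workload_heat_w
    ]
    if not candidates:
        return None
    # Greedy choice: the rack with the largest remaining headroom.
    return max(candidates)[1]

racks = [
    {"name": "A1", "cooling_capacity_w": 12000, "current_heat_w": 11000},
    {"name": "B2", "cooling_capacity_w": 12000, "current_heat_w": 6000},
]
print(place_workload(racks, 2000))  # B2 has 6000 W of headroom
```

The point of the sketch is the priority inversion the article describes: an orchestrator optimising purely for workload placement would ignore the headroom check entirely, which is why heat-load balancing has to be part of the same decision.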
Returning to Wolfenden's motor racing analogy, this is the equivalent of testing the aerodynamic components on the car during a test session. It delivers accurate data on how the components perform under real conditions, which enables designers to improve their models further.
Moving beyond test loads to real-time data
Data from sensors in the racks and aisles will provide information on airflow and on air temperatures, both hot and cold. That data is fed into the model to check where it predicts heat will build up and to make it more accurate with real-time measurements. If linked to orchestration software, the data can also be correlated with workloads. This has the advantage of providing data that can be used for predictive analysis of future cooling needs.
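Feeding sensor readings back into the model amounts to comparing what the CFD model predicts with what the sensors actually measure. A minimal sketch of that check follows; the sensor location names and the 2 °C tolerance are illustrative assumptions:

```python
# Hypothetical sketch: compare CFD-predicted hot-aisle temperatures with
# live sensor readings and flag locations where the model has drifted.
# Location IDs and the 2 degC tolerance are illustrative assumptions.

def model_drift(predicted_c, measured_c, tolerance_c=2.0):
    """Return locations where |measurement - prediction| exceeds the tolerance,
    mapped to the size of the discrepancy in degrees Celsius."""
    return {
        loc: round(measured_c[loc] - predicted_c[loc], 1)
        for loc in predicted_c
        if loc in measured_c
        and abs(measured_c[loc] - predicted_c[loc]) > tolerance_c
    }

predicted = {"aisle1/rack3": 32.0, "aisle2/rack1": 28.5}
measured = {"aisle1/rack3": 35.5, "aisle2/rack1": 29.0}
print(model_drift(predicted, measured))  # {'aisle1/rack3': 3.5}
```

Locations that repeatedly show up in the drift report are exactly the places where the model has fallen behind the real workload distribution and needs re-validation.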
Sensors inside servers can also provide a lot of vital data. For example, they can report CPU temperatures, which indicate how much processing is being done. With the increase in analytics being done in-memory, this will show where certain workloads are running and the power and heat they generate.
Information from PSUs will also enable a greater understanding of energy utilisation across the data centre. It will show where power draw is getting dangerously close to the maximum capacity of particular racks, and where there is little to no power draw, revealing under-utilised hardware.
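The two conditions just described, racks near capacity and racks drawing almost nothing, can be expressed as a simple classification over PSU readings. The 90% and 10% thresholds here are illustrative assumptions, not figures from the article:

```python
# Hypothetical sketch: classify racks by PSU power draw relative to capacity.
# The 90% "near capacity" and 10% "under-utilised" thresholds are
# illustrative assumptions.

def classify_racks(readings, high=0.9, low=0.1):
    """readings: dict of rack name -> (draw_w, capacity_w).

    Returns two lists: racks drawing close to their maximum capacity,
    and racks whose low draw suggests under-utilised hardware.
    """
    near_capacity, under_utilised = [], []
    for rack, (draw_w, capacity_w) in readings.items():
        ratio = draw_w / capacity_w
        if ratio >= high:
            near_capacity.append(rack)
        elif ratio <= low:
            under_utilised.append(rack)
    return near_capacity, under_utilised

readings = {"A1": (9500, 10000), "B2": (400, 10000), "C3": (5000, 10000)}
print(classify_racks(readings))  # (['A1'], ['B2'])
```

In practice the same classification feeds both concerns the article raises: the near-capacity list is a cooling and safety issue, while the under-utilised list is a cost and consolidation opportunity.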
All of this informs not only the CFD models but also the longer-term models around data centre design and utilisation. IT managers can now see just how effectively they are utilising resources and what that level of use costs.
For more information visit www.heatload.co.uk