Marrying CFD with real-time monitoring


29 August 2016
Power and cooling are, for many data centre owners, their biggest operational expenditure (Opex). It is not just the cost of the energy itself that drives Opex.

It is also the increasing cost of emission taxes applied to data centres. To help improve efficiency, data centre owners have turned to computational fluid dynamics (CFD) to model airflow and optimise cooling. While the science behind CFD makes it well suited to modelling data centres, a model can quickly become outdated as workloads change.

Dave Wolfenden of Heatload explains why models must be verified and use real-time data to stay valid.

The rise of Computational Fluid Dynamics in the data centre

CFD is used in a wide variety of industries to understand fluid flows. Sophisticated algorithms and analysis show how a fluid, in this case air, moves. This has made CFD important to industries such as aerospace and automotive as they look to improve aerodynamics. Racing teams at major motor races use CFD to see how effective new parts on a car are during the early practice sessions. While those same parts will already have been modelled and tested in a wind tunnel, the real-time race data is used to tune the models.

Wolfenden believes that CFD on its own is not enough. It is akin to a motor racing team creating a new aerodynamic part, then bolting it to the car and hoping it will deliver race-winning performance. While it may provide some benefits, they will be severely limited in scope. Worse still, it is likely to lead to a range of other decisions that can degrade overall performance.

Use heat loads to test models

One way to improve models is to introduce heat loads into the data hall to simulate the type of workload expected. The resulting measurements can be captured and compared against the model to identify where the model begins to diverge from reality. To stay with the racing analogy above, this is the equivalent of using a wind tunnel to test aerodynamic components before putting them on a race car.
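
As a rough illustration, the divergence check can be as simple as the Python sketch below; the rack positions, temperatures and tolerance are hypothetical values for the example, not figures from Heatload or its models.

# A minimal sketch of comparing heat-load test measurements against CFD
# predictions. The readings and predicted values below are invented for
# illustration; in practice both would come from the test rig and the
# CFD model export.

# Predicted and measured inlet temperatures (degrees C) per rack position.
predicted = {"rack_a1": 22.5, "rack_a2": 23.1, "rack_b1": 24.0, "rack_b2": 27.5}
measured = {"rack_a1": 22.9, "rack_a2": 25.8, "rack_b1": 24.3, "rack_b2": 31.2}

TOLERANCE_C = 2.0  # how far the model may drift before it is flagged

def find_divergence(predicted, measured, tolerance):
    """Return positions where the model diverges from the test data."""
    diverging = {}
    for position, predicted_temp in predicted.items():
        delta = measured[position] - predicted_temp
        if abs(delta) > tolerance:
            diverging[position] = delta
    return diverging

for position, delta in find_divergence(predicted, measured, TOLERANCE_C).items():
    print(f"{position}: model off by {delta:+.1f} C - review this zone")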

The use of heat loads is nothing new. An increasing number of companies already use them to test the initial design of the data hall. The problem is that designers do not use them universally, nor are they regularly used during refurbishment. Wolfenden believes data centre designers are missing the point: heat loads do not just validate designs and models, they also provide a better baseline and a library of designs that can speed up the development of future data centres.

Heat and cooling are directly related to workload

No matter how efficient the design model appears and how well it has performed under test conditions, it is only when designers apply real workloads that it can be truly validated. Using real workloads is a significant challenge for designers. Hardware, software and workloads change over the life of a data hall, and a model can be outdated before contractors install any hardware. When hardware is replaced, it is possible to import the technical data from the vendor to update the model, which helps improve both the model and the way engineers configure the data centre. The bigger problem is software and the underlying workloads.

An example of the problem is the introduction of virtualisation. Workloads changed from being contained on individual servers to running anywhere in the data centre. This created the opportunity to move high heat loads to areas where there was adequate cooling. Automating the process, however, meant that workloads were prioritised for resources rather than for heat load balancing.

Returning to Wolfenden's motor racing analogy, this is the equivalent of testing the aerodynamic components on the car during a test session. It delivers accurate data on how the components work under real conditions, which enables designers to improve their models further.

Moving beyond test loads to real-time data

There are several sources of evidence that can be used to help drive models in a live data centre. It is important to take advantage of the tsunami of sensors that have appeared inside the data centre over the last 20 years. The sensors are inside servers, storage devices, switches, power units, racks and aisles. So what data can you use, and how?

The data from sensors in the racks and aisles provides information on airflow and on air temperature, both hot and cold. This data is fed into the model to check where its heat predictions hold and to make it more accurate with real-time readings. If linked to orchestration software, the data can also be correlated with workloads. This has the advantage of providing data that can be used to carry out predictive analysis of future cooling needs.
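
As a rough illustration, correlating rack sensor readings with orchestration data might look like the Python sketch below; the rack names, temperatures and workload counts are invented for the example, and a real deployment would pull them from its monitoring and orchestration systems rather than hard-coded values.

# A minimal sketch, on assumed data, of correlating hot-aisle exhaust
# temperature with the number of workloads the orchestration layer has
# placed on each rack.

from statistics import correlation  # Python 3.10+

# Hot-aisle exhaust temperature (degrees C) and workload count per rack.
exhaust_temp = {"rack_a1": 31.0, "rack_a2": 34.5, "rack_b1": 29.8, "rack_b2": 38.2}
workloads = {"rack_a1": 12, "rack_a2": 18, "rack_b1": 9, "rack_b2": 27}

racks = sorted(exhaust_temp)
temps = [exhaust_temp[r] for r in racks]
loads = [workloads[r] for r in racks]

# A strong correlation suggests workload placement is a usable predictor
# of where cooling demand will rise next.
print(f"temperature/workload correlation: {correlation(loads, temps):.2f}")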

Sensors inside servers can also provide a lot of vital data. For example, they can provide information about CPU temperatures, which shows how much processing is being done. With the increase in analytics being done in-memory, this gives information on where certain workloads are running and the power and heat they generate.

Information from PSUs will also enable a greater understanding of energy utilisation across the data centre. It will show where power draw is getting dangerously close to the maximum capacity in particular racks, and where there is little to no power draw, indicating under-utilised hardware.
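
A simple way to picture this is the Python sketch below, which flags racks drawing close to their limit or sitting almost idle; the rack capacity, thresholds and PSU readings are assumed values for illustration only.

# A minimal sketch of checking PSU readings against rack power capacity.
# The figures are illustrative assumptions, not values from the article.

RACK_CAPACITY_KW = 10.0
HIGH_WATERMARK = 0.90  # flag racks drawing more than 90% of capacity
LOW_WATERMARK = 0.10   # flag racks drawing less than 10% as under-utilised

psu_draw_kw = {"rack_a1": 9.4, "rack_a2": 5.6, "rack_b1": 0.7, "rack_b2": 8.1}

for rack, draw in sorted(psu_draw_kw.items()):
    utilisation = draw / RACK_CAPACITY_KW
    if utilisation > HIGH_WATERMARK:
        print(f"{rack}: {utilisation:.0%} of capacity - dangerously close to the limit")
    elif utilisation < LOW_WATERMARK:
        print(f"{rack}: {utilisation:.0%} of capacity - likely under-utilised hardware")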

All of this informs not only the CFD models but also the longer-term models around data centre design and utilisation. IT managers can now see just how effectively they are utilising resources and the cost of that level of use.

Conclusion

Modelling a data centre is a vital part of any design. Failing to update that model with real-time data when it is available means the model not only becomes ineffective but can also incur considerable extra costs.

For more information visit www.heatload.co.uk