Data centre cooling comes at a cost, and appears at odds with the environmental and energy saving aims of a business. John Grenville, Managing Director of ECEX, explains how the cooling sector can help maintain the delicate balance between operational effectiveness and energy efficiency.
Information technology is the fastest moving sector served by the air conditioning and refrigeration industry, which places significant pressures on designers and constructors of close control facilities such as data centres.
With the power of computer servers continuing to grow at an exponential rate, the development of more sophisticated equipment to keep this technology cool is key. Indeed, arguably the biggest constraint on the growth of computer power in business is not the technology itself, rather it is the cooling for cutting-edge IT developments such as blade servers.
Blade servers have emerged as the biggest single technological issue in close control applications like data centres. The increase in this sort of high-tech, mission-critical equipment – and the fact that it tends to operate 24 hours a day, seven days a week – places enormous pressure on close control air conditioning suppliers to offer ever higher cooling capacities, better reliability and greater efficiencies.
Suppliers have responded to this challenge by providing a number of solutions, some based on conventional close control units and others on brand new technology to enable high-density computer facilities to be kept at the correct temperatures in an energy efficient way.
However, it is not only manufacturers that must grapple with increasing IT demands. Contractors and consultants also face a real challenge – how to design and build cooling processes that will guarantee the performance and reliability of mission-critical computers while, at the same time, balancing dramatic increases in cooling requirements with energy efficiency.
There are essentially three media for cooling IT equipment that produces high heat loads – air, water, or carbon dioxide (CO2). The key to specifying the right one is to keep it simple and offer best value, so the cooling medium needs to be matched to the equipment loads and rack density.
Air has long been the preferred option for engineers, but it is limited in terms of cooling output to around 7kW per cabinet depending on the layout, space constraints and airflow strategy. Water is more efficient at cooling computers than air; however, water conducts electricity, and IT operators can be deterred from using a solution that mixes water with electronics.
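That 7kW ceiling follows from basic sensible-heat arithmetic: the airflow a cabinet needs grows in proportion to its heat load. The sketch below illustrates this with the standard relation Q = m·cp·ΔT; the 12K air temperature rise and the air properties are assumed, typical values, not figures from this article.

```python
def airflow_for_load(load_kw, delta_t_k, cp_j_per_kg_k=1005.0, rho_kg_per_m3=1.2):
    """Volumetric airflow (m^3/s) needed to remove a sensible heat load.

    Uses Q = m_dot * cp * delta_T, with typical room-air properties
    (cp ~1005 J/kg.K, density ~1.2 kg/m^3) - illustrative assumptions.
    """
    m_dot = load_kw * 1000.0 / (cp_j_per_kg_k * delta_t_k)  # mass flow, kg/s
    return m_dot / rho_kg_per_m3                            # volume flow, m^3/s

# A 7kW cabinet with an assumed 12K air temperature rise needs
# roughly 0.48 m^3/s (about 1,740 m^3/h) of cooling air:
flow = airflow_for_load(7.0, 12.0)
```

Doubling the load doubles the required airflow, which quickly becomes impractical within a fixed cabinet footprint – hence the interest in water and CO2 for higher densities.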
CO2 is a significant greenhouse gas and a major contributor to global warming. Ironically, though, using CO2 as a cooling medium for computer systems might save energy and therefore reduce the carbon emissions that drive global warming.
The best way to strike a balance between the need to cater for higher heat loads and energy efficiency is to take a holistic approach to the problem. People tend only to talk about getting the heat from the servers out of the room, but it is also important to go a step further and consider how to get the heat out of the building too.
If the data centre cooling is a chilled water system, for example, it might be appropriate to specify a free cooling chiller. This means that, on days when there is a low ambient temperature, the cooling can be provided by a direct air-to-water exchange without running any energy consuming compressors.
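The changeover logic of a free cooling chiller can be sketched as a simple comparison of ambient temperature against the chilled water setpoint. The setpoint, the 3K heat-exchanger approach and the mode names below are illustrative assumptions, not a specific manufacturer's control strategy.

```python
def cooling_mode(ambient_c, chilled_water_setpoint_c, approach_c=3.0):
    """Pick an operating mode for a free cooling chiller.

    approach_c is the assumed minimum temperature difference the
    air-to-water heat exchanger needs in order to reject heat.
    """
    if ambient_c <= chilled_water_setpoint_c - approach_c:
        return "free cooling"          # air-to-water exchange only, compressors off
    elif ambient_c < chilled_water_setpoint_c:
        return "partial free cooling"  # ambient air pre-cools; compressors trim the rest
    else:
        return "mechanical cooling"    # compressors carry the full load

# Example: a 14 degC chilled water setpoint on an 8 degC day
mode = cooling_mode(8.0, 14.0)  # "free cooling" - no compressor energy
```

In a UK climate, ambient conditions sit below typical chilled water setpoints for much of the year, which is why free cooling can deliver substantial compressor-hour savings.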
It doesn’t matter if you have the most efficient close control unit available if you attach it to an antiquated chiller. Therefore, it is absolutely essential to consider external cooling technology as part of the energy and reliability equation.
Filtration is key
Cooling towers and air handling units (AHUs) draw in air from the outside which is potentially full of debris including dust, leaves and pollen, all of which can clog system filters. To ensure these outdoor units work to their optimum capacity, filters must be kept clean and in good working order – a simple step that can offer big savings on fuel bills.
This is essentially because lower airflow will result in a unit working harder to meet its cooling targets, leading to higher energy consumption and increasing the likelihood of breakdown. One influential study, for example, revealed that when condenser flow rate is reduced by 20% in mechanical and absorption chillers, full load energy usage is increased by 3%.
Best practices in building maintenance and operations can reduce HVAC usage by 10 to 20%, with poor maintenance increasing energy usage by 30 to 60%.
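To put those percentages side by side, the sketch below applies them to a single hypothetical baseline. The 100,000kWh annual figure is an assumed placeholder; the multipliers come from the study figures quoted above (a 3% penalty for a 20% condenser flow reduction, a 30–60% penalty for poor maintenance, a 10–20% saving from best practice).

```python
baseline_kwh = 100_000  # hypothetical annual HVAC consumption for illustration

# 20% condenser flow reduction -> +3% full-load energy use (study figure above)
fouled_condenser_kwh = baseline_kwh * 1.03

# Poor maintenance can raise usage by 30-60%; take the lower bound
poorly_maintained_kwh = baseline_kwh * 1.30

# Best-practice maintenance can cut usage by 10-20%; take the upper bound
well_maintained_kwh = baseline_kwh * 0.80

# Even on the conservative ends of both ranges, the gap between a
# well-maintained and a poorly maintained plant is 50,000kWh a year here
spread_kwh = poorly_maintained_kwh - well_maintained_kwh
```

The point of the exercise is the spread: maintenance quality alone can swing annual HVAC consumption by half the baseline in this illustration.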
By stopping airborne particulates from entering a system in the first place, cooling is kept constant, the risk of breakdown is reduced and maintenance requirements are cut. This can be achieved with external pre-filtration media, a solution that can improve the operational efficiency of existing cooling equipment.
Examples such as ECEX Air Intake Screens offer payback through reduced energy consumption in a very short timeframe – a welcome prospect for building managers facing high cooling bills, which account for a large proportion of a data centre’s annual budget, particularly during periods of raised ambient temperatures in the summer. Reduced energy consumption also equates to reduced strain on chillers, condensers, AHUs and cooling towers, cutting the risk of breakdown and downtime.
Whichever close control air conditioning system is specified, it is fundamentally important to follow one cardinal rule – don’t take chances. Ensure the system maintains the integrity and resilience of both the computers and the data centre. There can, after all, be no compromise in terms of the availability of this mission-critical equipment; if it fails because the cooling is inadequate or unreliable then people’s livelihoods are on the line.
Maintaining the operational/energy cost balance
Engineers are called upon to juggle lifecycle and energy costs as they are faced with an array of computer servers that are changed and rearranged almost continually. To make matters worse, ever greater computing power is being squeezed into ever smaller spaces. Up to 45% of the energy cost of a data centre can relate to HVAC, with 3% of the UK’s total electrical energy usage estimated to be attributable to data centres.
Anything that can be done to mitigate the voracious appetite of data centre cooling for power and energy has to be a good thing. Working with AMEY, ECEX recently conducted a controlled, long-term field trial to accurately measure the energy saving benefits of using ECEX Air Intake Screens, an external pre-filtration system that traps airborne debris before it enters a building.
At Westminster City, AHUs were monitored, first without ECEX Air Intake Screens and then with them. Over a period of 31 days, the addition of Air Intake Screens resulted in a power consumption fall of almost 5% and a reduction in CO2 emissions of 296.33kg. This meant the ECEX Air Intake Screen payback was just 13 months on energy alone.
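The trial figures scale up straightforwardly. The sketch below annualises the 31-day CO2 saving pro rata and shows the payback arithmetic in general form; the screen cost and monthly saving passed to the payback function are hypothetical placeholders, not figures disclosed by the trial.

```python
# Figures from the 31-day Westminster monitoring exercise
trial_days = 31
co2_saved_trial_kg = 296.33

# Simple pro-rata annualisation of the CO2 saving (~3,489kg/year)
co2_saved_per_year_kg = co2_saved_trial_kg * 365 / trial_days

def payback_months(screen_cost_gbp, monthly_energy_saving_gbp):
    """Months until cumulative energy savings cover the installed cost."""
    return screen_cost_gbp / monthly_energy_saving_gbp

# Hypothetical example: a 1,300 GBP installation saving 100 GBP/month
# pays back in 13 months, matching the trial's energy-only payback period
example = payback_months(1300.0, 100.0)
```

Pro-rata scaling assumes the trial month is representative of the whole year; in practice savings will be larger in summer, when ambient temperatures and cooling loads peak.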
Air Intake Screens also reduced maintenance because the AHUs didn’t have to work so hard. Filters needed replacing less frequently, further reducing costs and potentially pushing the return on investment of an ECEX Air Intake Screen installation to less than four months.
In buildings with several AHUs, cooling towers, or other HVAC equipment that takes in air from outside, energy, money and carbon savings stack up favourably.