Focus - Data centre cooling

Data centre cooling efficiency is becoming increasingly important, and getting the design strategy right is the key to achieving it. RAC reports

In the next decade, data centre energy consumption is expected to increase significantly as computing becomes ever more prevalent, and with this expansion comes the need for ever more efficient cooling.

According to an EU report, server energy consumption will stand at 23.5TWh (trillion watt-hours) in 2011, up from 14.7TWh in 2006, a dramatic 60 per cent increase.

Alan Beresford, managing director at data centre solutions company Ecocooling, says: “The figures for consumption are huge; for example, a recently opened data centre in Swansea consumes 90 megawatt-hours every hour.”

Mr Beresford believes we will see further pressure on companies to reduce server energy consumption arriving sooner rather than later.

“We’ve already seen the government’s Carbon Reduction Commitment (CRC), which is aimed at improving energy efficiency and cutting emissions in large public and private sector organisations, introduced.

“And though it’s currently limited to organisations that consumed more than 6,000 megawatt-hours (MWh) per year of half hourly metered electricity during 2008, it wouldn’t surprise me if that figure is lowered in the future, forcing people to have a look at data centre energy consumption.”

Mr Beresford also points to the current European data centre Code of Conduct, which comprises voluntary measures to reduce energy consumption. Introduced in 2009, the code is seen as a precursor to more stringent legislation in the coming years, as the EC sees the technology as a heavy and inefficient consumer of energy.

This could mean the introduction of a maximum Power Usage Effectiveness (PUE) level. PUE is a standard measure of data centre efficiency: the ratio of the total energy consumed by the facility to the energy consumed by the IT load alone. A data centre with a PUE of 1.5, for example, draws 1.5kWh in total for every 1kWh delivered to the IT equipment.

Few centres currently operate below 1.5, and many older facilities touch 2.5, so permitted levels could well be required to drop significantly.
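
As a minimal sketch of the arithmetic (the energy figures below are hypothetical):

# Minimal sketch of the PUE calculation; the energy figures are hypothetical.

def pue(total_facility_kwh: float, it_load_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT load energy."""
    return total_facility_kwh / it_load_kwh

# Example: a facility drawing 3,000MWh a year to support a 2,000MWh IT load.
total_kwh = 3_000_000
it_kwh = 2_000_000

print(f"PUE  = {pue(total_kwh, it_kwh):.2f}")   # 1.50
print(f"DCiE = {it_kwh / total_kwh:.0%}")       # 67%, the reciprocal as a percentage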


Data Centres: Designing for demand

Correct system design methodology is essential in order to reduce operating costs, says Paul Oliver of Airedale

There is no one ideal solution for data centre cooling. The actual demands of a data centre fluctuate on a daily basis and are nearly always lower than the maximum design figures.

A new data centre is likely to run at a fraction of the maximum design and may take several years to approach the installed capabilities. Some never reach their maximum design parameters.

Like a car manufacturer’s consumption figures, which only indicate the vehicle’s efficiency at pre-certified conditions, demand in a data centre varies greatly according to seasonal factors, time of day and infrastructure.

Server technology development is raising the thermal envelopes of data centres. Traditionally, a rack rejecting 2kW of energy would have an air-off temperature (AOT) of around 28 deg C. A fully populated 42U rack may now reject up to 30kW of energy, at an AOT of up to 45 deg C. This has greatly increased both the cooling capacity needed and the temperatures involved.
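
As a rough sensible-heat sketch of why both airflow and temperature rise so sharply (the 22 deg C server inlet temperature is an assumption, not a figure from the article):

# Rough sensible-heat airflow estimate: Q = V x rho x cp x dT
# Assumes standard air (rho ~1.2 kg/m3, cp ~1.005 kJ/kg.K) and a
# hypothetical server inlet temperature of 22 deg C.

RHO_CP = 1.2 * 1.005  # ~1.21 kJ/(m3.K)

def airflow_m3s(heat_kw: float, inlet_c: float, air_off_c: float) -> float:
    """Airflow needed to carry heat_kw away at the given temperature rise."""
    return heat_kw / (RHO_CP * (air_off_c - inlet_c))

print(f"2kW rack,  AOT 28 deg C: {airflow_m3s(2, 22, 28):.2f} m3/s")   # ~0.28 m3/s
print(f"30kW rack, AOT 45 deg C: {airflow_m3s(30, 22, 45):.2f} m3/s")  # ~1.08 m3/s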


Measuring efficiency

The accepted measure of efficiency for a chiller is ESEER (European Seasonal Energy Efficiency Ratio). Under ESEER, the average chiller loading works out at 54 per cent of full load, and the ambient temperatures at which it is measured represent only 7 per cent of typical UK ambient hours.
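
For reference, ESEER is a weighted average of the energy efficiency ratio (EER) at four Eurovent part-load points, and the 54 per cent figure is simply the load-weighted average of those points; the EER values in the sketch below are hypothetical.

# ESEER as a weighted average of EER at four part-load/ambient points
# (Eurovent weightings). The EER values below are hypothetical chiller data.

POINTS = [  # (load fraction, ambient deg C, weighting)
    (1.00, 35, 0.03),
    (0.75, 30, 0.33),
    (0.50, 25, 0.41),
    (0.25, 20, 0.23),
]

eers = {1.00: 2.8, 0.75: 3.6, 0.50: 4.4, 0.25: 4.9}  # hypothetical EER at each load

eseer = sum(w * eers[load] for load, _, w in POINTS)
avg_load = sum(w * load for load, _, w in POINTS)

print(f"ESEER ~ {eseer:.2f}")                              # ~4.20
print(f"Load-weighted average loading: {avg_load:.0%}")    # 54%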

Whilst this loading profile may match comfort cooling applications, few if any data centres will have these loading characteristics.

Because data centres now demand higher densities at higher operating temperatures, the traditional chilled water temperatures of 7/12 deg C are no longer appropriate, and we are more likely to use chilled water at around 14/19 deg C.

Selecting a chiller on its ESEER rating alone, without considering its actual annual energy efficiency, including its ability (or not) to meet the cooling demand with concurrent free-cooling, is not appropriate. In the following example, two chillers are run at 7/12 deg C:

1. 400kW conventional chiller, ESEER of 4.1, cost £35k, energy consumption £53k p.a.

2. 400kW free-cooling chiller, ESEER of 4.0, cost £48k, energy consumption £28k p.a.

The increased cost of the free-cooling chiller, £13k, is paid off in less than two years.

However, when run at 14/19 deg C, the free-cooling chiller is repaid in six months.
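
As a rough illustration of the payback arithmetic behind these figures (the annual savings below are hypothetical placeholders, since the achievable saving depends on water temperatures and free-cooling hours):

# Simple-payback sketch for the chiller comparison above. The annual savings
# are placeholders: the real saving depends on chilled water temperature and
# the free-cooling hours available.

def payback_years(capital_premium_gbp: float, annual_saving_gbp: float) -> float:
    """Years to recover the extra capital cost from annual energy savings."""
    return capital_premium_gbp / annual_saving_gbp

premium = 48_000 - 35_000  # free-cooling chiller costs £13k more up front

for saving in (7_000, 13_000, 26_000):  # hypothetical annual savings (GBP)
    print(f"Annual saving £{saving:,}: payback {payback_years(premium, saving):.1f} years")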

Efficient design

Resistance to air movement in data centres is a significant negative. More air means higher velocities, more resistance and an increase in the power requirement of the system fans, which are a significant consumer of power in the data centre.

EC fans consume power in relation to the cube of their speed, so a fan running at 80 per cent speed uses 51 per cent of the power of one running at full speed. A good quality EC fan fitted to an 80kW chilled water CRAC unit will cost approximately £3.4k p.a. to run. A poor quality fan, badly selected and applied, would increase the running cost by over 75 per cent, to around £6k p.a.

It is usual to supply CRAC units on an N+1 basis, but historically the standby unit has remained dormant for long periods. In a system comprising four run units and one standby unit, it is much more efficient to run all five units at 80 per cent airflow than four units at 100 per cent. This alone cuts each fan’s power draw by around 49 per cent, and total fan power by roughly a third, as the sketch below shows.
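
A quick check of the cube-law arithmetic, assuming airflow scales linearly with fan speed at a fixed system resistance:

# Fan affinity law sketch: power scales with the cube of speed (and, at a
# fixed system resistance, speed scales with airflow).

def relative_power(speed_fraction: float) -> float:
    """Fan power as a fraction of full-speed power."""
    return speed_fraction ** 3

print(f"One fan at 80% speed: {relative_power(0.8):.0%} of full power")  # ~51%

# Four units at 100% airflow vs five units sharing the same airflow at 80% each.
four_at_full = 4 * relative_power(1.0)
five_at_80 = 5 * relative_power(0.8)
print(f"Total fan power saving: {1 - five_at_80 / four_at_full:.0%}")    # ~36%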

With the increased employment of hot/cold aisle containment systems, it becomes a natural development to supply the cold aisle with a volume of air equal to that demanded by the servers.

This can be achieved using ‘constant pressure control’, which ensures that the servers are neither starved of air nor over-pressurised. Too little cool air and the servers overheat; too much and the server fans can mechanically malfunction.

The system adapts to load requirements and supplies cooling air to match the collective server demand. Data centre Power Usage Effectiveness (PUE) and Data Centre infrastructure Efficiency (DCiE) can both be improved.
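
A much-simplified sketch of the constant pressure control idea, not any vendor’s implementation; the setpoint, gain, limits and sensor reading are hypothetical:

# Much-simplified constant pressure control loop: trim CRAC fan speed to hold
# a small positive pressure difference between cold aisle and room.
# Setpoint, gain and limits are hypothetical values for illustration only.

SETPOINT_PA = 5.0      # target cold-aisle differential pressure (Pa)
GAIN = 0.02            # proportional gain: speed fraction per Pa of error
MIN_SPEED, MAX_SPEED = 0.3, 1.0

def next_fan_speed(current_speed: float, measured_dp_pa: float) -> float:
    """One proportional control step: raise speed if pressure is low, lower it if high."""
    error = SETPOINT_PA - measured_dp_pa
    speed = current_speed + GAIN * error
    return max(MIN_SPEED, min(MAX_SPEED, speed))

# Example: servers ramp up, aisle pressure sags to 2 Pa, so the fans speed up.
print(f"New fan speed: {next_fan_speed(0.8, 2.0):.0%}")  # 86%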


Integrated free-cooling

Rising temperatures in the data centre give greater opportunities for free-cooling. Free-cooling is available whenever the ambient temperature is below the room operating temperature, which is around 98 per cent of the ambient year in the UK (London). The best free-cooling chiller systems combine concurrent free-cooling and mechanical cooling, enabling free-cooling to be captured whenever the ambient is below the return water temperature.

Non-concurrent free-cooling chillers require a very low ambient temperature to operate: whenever free-cooling cannot deliver 100 per cent of the required capacity, free-cooling is sacrificed and completely replaced by mechanical cooling. Some free-cooling systems will operate for less than 2.5 per cent of the UK year and will actually cost more to run than a conventional chiller system, yet they tick the ‘free-cooling’ box and their actual running efficiencies are often overlooked.
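
A rough sketch of how free-cooling availability could be estimated from hourly ambient temperatures; the temperature data and return water temperatures below are hypothetical placeholders, not real weather records:

# Rough estimate of concurrent free-cooling availability: count the hours in
# which the ambient is below the return water temperature. The ambient data
# and water temperatures here are hypothetical placeholders.

import random

random.seed(1)
# Placeholder for 8,760 hourly ambient temperatures (deg C) for a UK-like year.
hourly_ambient_c = [random.gauss(10.5, 6.0) for _ in range(8760)]

def free_cooling_fraction(ambient_c: list[float], return_water_c: float) -> float:
    """Fraction of hours in which some free-cooling can be captured."""
    return sum(t < return_water_c for t in ambient_c) / len(ambient_c)

print(f"Return water 12 deg C: {free_cooling_fraction(hourly_ambient_c, 12):.0%} of hours")
print(f"Return water 19 deg C: {free_cooling_fraction(hourly_ambient_c, 19):.0%} of hours")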

Bespoke solution

When increased temperatures are combined with medium/high density loads, concurrent free-cooling, variable-speed fans and clever, dynamic controls to create an intelligent, bespoke system, real end-user benefits can be achieved in reduced power usage and operational costs of the data centre.

The high-efficiency ECHO IT Cooling System is such a system. Comprising an ACE (Active Cabinet Exhaust) unit, a CRAC unit, a concurrent free-cooling chiller and a control system, ECHO ensures that air at the right temperature, in the correct quantity and at the correct pressure is presented to the server inlet, enabling the server to breathe efficiently. By varying the air volume, the ECHO system operates not only with air volumes 50 per cent lower than traditional cooling systems but also much more efficiently, with elevated water temperatures that allow up to 95 per cent free-cooling.

The ECHO system takes advantage of all the energy-saving opportunities currently available. Compared with a modern conventional downflow chilled water system, energy reductions in the region of 56 per cent can be achieved, and the increased capital cost of the system is paid back in less than a year.

Paul Oliver is Airedale’s sales director


Carbon Trust data centre design service

The Carbon Trust (CT) has launched a data centre design service, which it claims can reduce energy costs by up to 50 per cent.

Hugh Jones, solutions director at the Carbon Trust, said: “Along with the ever increasing demand for storage comes an ever increasing demand for energy to power and cool UK data centres. Low carbon design in new build and refurbishment projects has the potential to unlock hundreds of millions of pounds in energy bills each year.

“Now that the design service has been successfully road tested in what is expected to be one of the largest data centres in the world, we are very keen to offer our experience to other developers.”

That project is the multi-billion pound Lockerbie Data Centre in Scotland, which will accommodate 50,000 server racks, with a peak power of 300 megawatts.

The CT-designed data centre is said to emit 200,000 tonnes less CO2 than a conventional centre of the same size.

It incorporates advice from the CT on maximising the natural flow of air to keep components cool, as well as using local renewable energy sources and reusing the heat generated.

David King, Lockerbie data centre project director, said: “When complete, each of the data centre’s forty modules will use around half as much energy as a conventional data centre of the same size.”
