When it's hot in the city

From a data centre operator’s perspective, ensuring that the cooling system stands up to the extremes of temperature in today’s urban environment is absolutely critical, says Roger Keenan

We should all be well aware by now of how much modern business depends on the computing and data communications technology operating from data centres. From email to IP-based telephony, from electronic document storage to contact management through Facebook or cloud-based CRM systems, we all live in a world dependent on modern IT systems.

They, in turn, depend on the physical data centre environment in which they operate. That environment is essentially two things – reliable power supplies and a constant temperature.

Many organisations – probably most, in fact – operate their own IT and communications systems on their premises. In the old days, when IT was an adjunct to the business, that worked just fine. In today’s world, it can also be just fine, but only if it is properly and fully specified, installed, operated and maintained.

Temperature is difficult to deal with, and it is the most common cause of failures.

Often, equipment in a company’s computer area will work well most of the time, but as summer comes and temperatures rise, unexplained and erratic problems start to appear as cooling systems that are not matched to the thermal load reach capacity.

In extreme cases, such issues can lead to loss of data and corrupted databases, which may take many months to sort out.

As the outside temperature goes up, the cooling systems have to work harder and harder to keep pace. Such problems may not become evident until there is a really hot day.

London has a benign, temperate climate, without the major swings in temperature seen in, say, New York.

For most of the year, the outside temperature is below the ideal operating temperature of most electronic equipment, which is typically around 22 to 24 deg C.

There can be significant exceptions.

The highest temperature ever recorded in London was 37.6 deg C on 10 August 2003.

Although London is typically 2 deg C warmer than its surroundings, the highest temperature ever recorded in the UK was set on that same day, 10 August 2003, at Gravesend in Kent: 38.5 deg C.

In that hot summer, more than 2,000 people died as a result of the heat.

The heat is on

When such extremes occur, faults, failures and overloads compound and cascade. Air conditioning is an example: as the outside temperature goes up, it has to do more and more work, and so it consumes more and more electrical power.
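As a rough illustration only, with assumed figures rather than data from any real site, the relationship can be sketched in a few lines of Python: a chiller’s coefficient of performance (CoP) falls as the outside air warms, so the electricity needed to remove the same IT heat load rises.

    # Illustrative sketch with assumed numbers: the chiller's coefficient of
    # performance (CoP) is taken to fall linearly as ambient temperature rises.
    def chiller_draw_kw(heat_load_kw, ambient_c, cop_at_20c=4.0, cop_loss_per_deg=0.1):
        cop = max(cop_at_20c - cop_loss_per_deg * (ambient_c - 20.0), 1.0)
        return heat_load_kw / cop

    for ambient in (20, 25, 30, 35):
        print(ambient, "deg C ->", round(chiller_draw_kw(100.0, ambient), 1), "kW")
    # With these assumed figures, 100 kW of IT heat needs about 25 kW of
    # electricity at 20 deg C but about 40 kW at 35 deg C.

The exact numbers will vary with the plant installed; the point is that the electrical load climbs just when the grid is under the most strain.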

When a new air conditioner is installed, there is no obligation on anyone to tell the electricity supply company about it, so no one knows about the extra load until a really hot day, when the power company’s breakers overload and trip out.

An example was the failure in July 2006 that shut down Oxford Street, much to the fury of the retail community.

That has a huge effect on companies whose IT equipment is on their premises and who have neither diesel generators nor duplicated cooling systems. If demand outstrips supply, the electricity company will selectively switch off areas to bring the two back into balance.

Unless you happen to be on the same supply as the Olympic Park when the Olympics are running, that puts everyone at potential risk.

So what is to be done? There are two main paths to resilience and reliability. One is to build a complete data centre area onsite to modern standards.

That means uninterruptible power supply (UPS) systems, which must be duplicated for reliability; onsite diesel generators, which must also be duplicated; and a complete cooling system, specified to cool the maximum load expected during the lifetime of the equipment, installed, maintained and duplicated so that cooling can continue without interruption during faults or maintenance.

The other solution is to move the equipment to a professional colocation data centre, where all this is handled daily, where duplicated diesel generators are the norm, and where duplicated cooling and connectivity are everyday matters.

That is what most businesses that have done the analysis are doing, and it is why the data centre industry continues to grow.

Roger Keenan is managing director at the City Lifeline London data centre
