Data Centre Cooling - The Issues

Whether you are concerned with hot spots and intermittent equipment failures, reducing capital costs by eliminating the need for additional cooling units, or extending the life cycle of your current Data Centre, KoldLok® products and services can help. Properly cooled Data Centres not only work more efficiently but also save valuable company funds. Virtually all of the electrical power a server rack draws is converted into heat. For a typical rack of computer equipment this is about 1.5 kW, but with the current trend towards Blade technology, heat output has risen to nearly 15 kW per rack.
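
To put those figures in context, here is a minimal sketch (not a KoldLok tool) of the standard sensible-heat rule of thumb relating a rack's heat load to the airflow needed to remove it. The 20 °F supply-to-exhaust temperature rise is an illustrative assumption.

```python
# Sketch of the standard sensible-heat airflow calculation:
#   CFM required ≈ 3.16 * watts / delta_T_F
# where delta_T_F is the allowed air temperature rise across the rack.

def required_airflow_cfm(heat_load_watts: float, delta_t_f: float = 20.0) -> float:
    """Airflow (CFM) needed to remove a sensible heat load at a given
    supply-to-exhaust temperature rise in degrees Fahrenheit."""
    BTU_PER_WATT_HR = 3.412       # 1 W = 3.412 BTU/hr
    SENSIBLE_HEAT_FACTOR = 1.08   # BTU/hr per CFM per deg F (standard air)
    return heat_load_watts * BTU_PER_WATT_HR / (SENSIBLE_HEAT_FACTOR * delta_t_f)

for rack_kw in (1.5, 15.0):
    cfm = required_airflow_cfm(rack_kw * 1000)
    print(f"{rack_kw:>4.1f} kW rack -> ~{cfm:,.0f} CFM at a 20 F rise")
```

Running this shows why Blade racks change the picture: a 1.5 kW rack needs roughly 240 CFM of cold air, while a 15 kW rack needs nearly 2,400 CFM, ten times as much air delivered through the same raised floor.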

If the Data Centre is not being cooled efficiently, the effects of these higher temperatures on the Data Centre environment are alarming. Thermal stress, intermittent "ghost" faults, reduced throughput, lower reliability from improperly cooled equipment and an increased chance of static discharge can all be brought on by higher temperatures. The density of servers and computer equipment in mission-critical Data Centres has been increasing and is expected to continue to rise. Even if you have plenty of cooling capacity within the Data Centre, hot spots and other heat-related problems can still arise.

So what can be done?

  1. Permanently seal all unnecessary air escape routes (bypass air). This includes cable cutouts behind and underneath products or racks, as well as penetrations in the subfloor, walls, ceiling or any other openings in the raised floor. Sealing these openings increases underfloor static pressure and gets the cold air where you need it. KoldLok floor grommets are specifically designed to seal cable cutouts in a variety of applications, and sealing each cable cutout can save about 2 kW of cooling (see the leakage sketch after this list).
  2. Remove any perforated tiles that are in the hot aisle (the aisle behind the server cabinets). Perforated tiles in hot exhaust aisles cool the hot air before it reaches the cooling units, which reduces the cooling capacity of those units.
  3. This latent cooling also reduces the humidity within the Data Centre, which in turn can affect the performance of your servers. Because latent cooling removes moisture from the air, you must continually re-humidify, an often overlooked extra cost. In short, latent cooling both reduces the useful output of your cooling units and adds an ongoing re-humidification bill.
  4. In some cases, equipment placed too close to the cooling units may get no airflow from the perforated tiles at all, due to high-velocity airflow under the raised floor. In Data Centres with short underfloor plenum heights, the high-velocity airflow close to the cooling units can actually reverse the flow of air through nearby perforated tiles. This phenomenon pulls ambient air from the Data Centre down into the underfloor plenum.
  5. Switching to a proper Hot Aisle/Cold Aisle configuration with sufficient underfloor pressure and minimal bypass air will maximise the room’s cooling efficiency. It will also reduce the risk of hot exhaust air from one cabinet flowing into the air intake of adjacent cabinets.
  6. Positioning the cooling units so that the aisles run parallel to them creates an Airflow Plume. This will help minimise exhaust re-circulation and high-velocity discharge, and maximise the airflow (CFM) through each perforated tile.
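
To make point 1 concrete, here is a hedged back-of-the-envelope sketch of how much conditioned air can escape through a single unsealed 8″ x 5″ cable cutout, using the generic orifice-flow equation Q = Cd · A · sqrt(2·ΔP/ρ). The discharge coefficient and the underfloor static pressure below are illustrative assumptions, not KoldLok data.

```python
# Estimating bypass air escaping through one unsealed rectangular cutout
# with the standard orifice equation Q = Cd * A * sqrt(2 * dP / rho).

import math

def cutout_leakage_cfm(width_in: float, height_in: float,
                       static_pressure_in_wg: float = 0.05,
                       discharge_coeff: float = 0.61) -> float:
    """Approximate airflow (CFM) lost through a rectangular floor cutout.
    Default pressure and discharge coefficient are illustrative assumptions."""
    PA_PER_IN_WG = 249.09     # pascals per inch of water gauge
    AIR_DENSITY = 1.2         # kg/m^3, standard conditions
    M2_PER_IN2 = 0.00064516   # square metres per square inch
    CFM_PER_M3S = 2118.88     # CFM per cubic metre/second

    area_m2 = width_in * height_in * M2_PER_IN2
    dp_pa = static_pressure_in_wg * PA_PER_IN_WG
    velocity = math.sqrt(2.0 * dp_pa / AIR_DENSITY)   # m/s through the opening
    return discharge_coeff * area_m2 * velocity * CFM_PER_M3S

print(f"~{cutout_leakage_cfm(8, 5):.0f} CFM escapes per unsealed 8\" x 5\" cutout")
```

Under these assumptions, each unsealed cutout bleeds roughly 150 CFM of conditioned air into places that do nothing to cool equipment, which is exactly the bypass air that sealing eliminates.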

If you sealed just twenty 8″ x 5″ cutouts with KoldLok floor grommets, the cooling capacity regained would be enough to cool another blade rack. Seal 400 such cutouts and you would regain enough cooling capacity to cool twenty 14 kW Blade Server racks, or to eliminate five 20-ton / 10,000 CFM cooling units!
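
As a quick consistency check of those claims, the sketch below derives per-cutout figures from the aggregate numbers quoted above; these derived values are estimates, not vendor data.

```python
# Per-cutout regain implied by the aggregate figures quoted above.

CUTOUTS = 400
BLADE_RACK_KW = 14.0      # one Blade Server rack
RACKS_COOLED = 20         # racks covered by sealing 400 cutouts
UNIT_CFM = 10_000         # airflow of one 20-ton cooling unit
UNITS_ELIMINATED = 5

kw_per_cutout = RACKS_COOLED * BLADE_RACK_KW / CUTOUTS   # ~0.7 kW each
cfm_per_cutout = UNITS_ELIMINATED * UNIT_CFM / CUTOUTS   # ~125 CFM each

print(f"Implied regain per sealed cutout: ~{kw_per_cutout:.1f} kW, "
      f"~{cfm_per_cutout:.0f} CFM")
print(f"20 cutouts -> ~{20 * kw_per_cutout:.0f} kW, "
      f"about one {BLADE_RACK_KW:.0f} kW blade rack")
```

The aggregate figures imply roughly 0.7 kW and 125 CFM regained per cutout, a more conservative planning number than the ~2 kW upper figure in point 1, and consistent with the ~150 CFM leakage estimate sketched earlier.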
