According to Gartner, cooling accounts for anywhere from 35 percent to as much as 50 percent of the electrical energy consumed in a conventional data centre, compared with 15 percent in best-practice ‘green’ data centres.
“Virtually all data centres waste enormous amounts of electricity using inefficient cooling designs and systems,” said Paul McGuckin, research vice president at Gartner. “Even in a small data centre, this wasted electricity amounts to more than 1 million kilowatt hours annually that could be saved with the implementation of some best practices.”
The overriding reason for the waste in conventional data centre cooling is the unconstrained mixing of cold supply air with hot exhaust air. “This mixing increases the load on the cooling system and the energy used to provide that cooling, and it reduces the efficiency of the cooling system by reducing the delta-T (the difference between the hot return temperature and the cold supply temperature). Maintaining a high delta-T is a core principle of efficient cooling,” McGuckin said.
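To make the delta-T point concrete, here is a minimal Python sketch. The 100 kW heat load and air properties below are illustrative assumptions, not Gartner figures; the point is that for a fixed IT load, the airflow the cooling system must move, and hence its fan energy, climbs as mixing erodes the delta-T.

```python
# A minimal sketch of why a high delta-T matters. Sensible cooling
# delivered by an airflow is Q = m_dot * c_p * delta_T, so for a fixed
# IT heat load the required airflow (and hence fan energy) falls as
# delta-T rises. All figures below are illustrative assumptions.

AIR_DENSITY = 1.2   # kg/m^3, near sea level at room temperature
CP_AIR = 1.005      # kJ/(kg*K), specific heat of air

def required_airflow_m3s(heat_load_kw: float, delta_t_k: float) -> float:
    """Airflow (m^3/s) needed to remove heat_load_kw at a given delta-T."""
    mass_flow = heat_load_kw / (CP_AIR * delta_t_k)   # kg/s
    return mass_flow / AIR_DENSITY

load_kw = 100.0  # hypothetical IT heat load
for dt in (6.0, 10.0, 14.0):  # delta-T in kelvin (= degrees C)
    print(f"delta-T {dt:>4.1f} K -> {required_airflow_m3s(load_kw, dt):.1f} m^3/s")

# Mixing hot exhaust into the supply stream lowers the effective delta-T,
# pushing the required airflow (and fan power) up for the same load.
```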
Gartner has identified 11 best practices which, if implemented, could save millions of kilowatt hours annually.
Plug Holes in the Raised Floor — Most raised-floor environments exhibit cable holes, conduit holes and other breaches that allow cold air to escape and mix with hot air. This single low-tech retrofit can save as much as 10 percent of the energy used for data centre cooling.
Install Blanking Panels — Any unused position in a rack should be covered with a blanking panel, which manages airflow by preventing the hot air leaving one piece of equipment from entering the cold-air intake of other equipment in the same rack. Where panels are used effectively, supply air temperatures drop by as much as 22 degrees Fahrenheit, reducing the electricity consumed by fans in the IT equipment and potentially alleviating hot spots in the data centre.
Co-ordinate CRAC Units — Older computer room air-conditioning (CRAC) units operate independently when cooling and dehumidifying the air, so one unit can end up humidifying the very air that a neighbouring unit is dehumidifying. These units should either be tied together with newer control technology so that their efforts are coordinated, or be relieved of humidification duties altogether, with that responsibility handed to a newer, dedicated system.
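The sketch below illustrates the coordination idea in Python. It is a hypothetical supervisory rule, not any vendor's control scheme: the set point, deadband and sensor values are all assumptions. Instead of each unit chasing its own humidity reading, the fleet acts once on the average.

```python
# Hypothetical sketch of the coordination problem: uncoordinated CRACs
# each chase their own humidity set point, so one can humidify while a
# neighbour dehumidifies. A simple supervisory rule acts on the average
# reading instead. Sensor values and thresholds are assumptions.

from statistics import mean

def coordinated_humidity_action(unit_rh_readings: list[float],
                                target_rh: float = 45.0,
                                deadband: float = 5.0) -> str:
    """Issue one fleet-wide action from the average relative humidity."""
    avg_rh = mean(unit_rh_readings)
    if avg_rh < target_rh - deadband:
        return "humidify"      # one dedicated unit adds moisture
    if avg_rh > target_rh + deadband:
        return "dehumidify"
    return "hold"              # inside the deadband: do nothing, save energy

# Local readings differ, but the fleet takes a single consistent action:
print(coordinated_humidity_action([38.0, 47.0, 52.0]))  # -> "hold"
```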
Improve Underfloor Airflow — Older data centres typically have constrained space underneath the raised floor that is used not only to distribute cold air but also as a run for data and power cables. Many old data centres have accumulated such a tangle of these cables that airflow is restricted, so the underfloor should be cleaned out to improve it.
Implement Hot Aisles and Cold Aisles — In traditional data centres, racks were set up in what is sometimes referred to as a ‘classroom style’, where all the intakes face in a single direction. This arrangement causes the hot air exhausted from one row to mix with the cold air being drawn into the adjacent row, thereby increasing the cold-air-supply temperature in uneven and sometimes unpredictable ways. Newer rack layout practices instituted in the past 10 years demonstrate that organising rows into hot aisles and cold aisles is better at controlling the flow of air in the data centre.
Install Sensors — A small number of individual sensors can be placed in areas where temperature problems are suspected. This modest investment in instrumentation can provide insight into the thermal dynamics behind suspected data centre temperature problems.
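As a rough illustration of what that instrumentation yields, here is a Python sketch of a polling loop that flags suspected hot spots. The sensor names, the 27 C threshold and the read_sensor stand-in are all assumptions; real deployments would query whatever interface the sensors expose (SNMP, Modbus, IPMI and so on).

```python
# A minimal sketch of turning a few temperature sensors into insight:
# log readings and flag suspected hot spots. read_sensor() is a stand-in
# for a real sensor query; everything here is illustrative.

import random
import time

def read_sensor(sensor_id: str) -> float:
    """Placeholder for a real sensor query; returns degrees Celsius."""
    return random.uniform(20.0, 32.0)

SENSORS = ["cold-aisle-A1", "cold-aisle-A4", "rack-17-top"]  # assumed names
HOT_SPOT_C = 27.0  # assumed alert threshold for a cold-aisle intake

for _ in range(3):                      # a real collector would run forever
    for sid in SENSORS:
        temp = read_sensor(sid)
        flag = "  <-- investigate" if temp > HOT_SPOT_C else ""
        print(f"{sid}: {temp:.1f} C{flag}")
    time.sleep(1)                       # poll interval; tune for the room
```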
Implement Cold-Aisle or Hot-Aisle Containment — Once a data centre has been organised around hot aisles and cold aisles, improved separation of cold supply air and hot exhaust air through containment becomes an option.
Raise the Temperature in the Data Centre — Many data centres are run colder than their equipment requires. Raising the supply temperature set point, even modestly, reduces the work the cooling plant must do and yields direct energy savings.
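For a back-of-the-envelope sense of scale, the Python sketch below applies a commonly cited rule of thumb of roughly 4 percent of chiller energy saved per degree Fahrenheit of set-point increase. Both that figure and the baseline consumption are assumptions for illustration, not Gartner numbers.

```python
# Back-of-the-envelope sketch of set-point savings. The ~4% of chiller
# energy saved per degree Fahrenheit raised is a commonly cited rule of
# thumb; treat it (and the baseline below) as assumptions.

SAVINGS_PER_DEG_F = 0.04          # assumed fraction of chiller energy
chiller_kwh_per_year = 2_000_000  # hypothetical baseline chiller usage

for raise_f in (2, 4, 6):
    saved = chiller_kwh_per_year * SAVINGS_PER_DEG_F * raise_f
    print(f"Raise set point {raise_f} F -> ~{saved:,.0f} kWh/year saved")
```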
Install Variable Speed Fans and Pumps — Traditional CRAC and CRAH units contain fans that run at a single speed. Emerging best practice suggests that variable speed fans be used whenever possible.
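The reason variable speed pays off so handsomely is the fan affinity laws: airflow scales linearly with fan speed, but fan power scales with its cube, so a modest speed reduction yields an outsized power saving. The short Python sketch below is purely illustrative of that relationship.

```python
# Why variable speed matters: by the fan affinity laws, airflow scales
# linearly with fan speed but power scales with its cube. Illustrative only.

def fan_power_fraction(speed_fraction: float) -> float:
    """Fraction of full-speed power drawn at a given speed fraction."""
    return speed_fraction ** 3

for speed in (1.0, 0.9, 0.8, 0.7):
    print(f"{speed:.0%} speed -> {fan_power_fraction(speed):.0%} power")

# 80% speed -> ~51% power: running fans just fast enough for the load,
# instead of at a single fixed speed, roughly halves fan energy here.
```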
Exploit ‘Free Cooling’ — ‘Free cooling’ is the general name given to any technique that cools air without the use of chillers or refrigeration units. The two most common forms of free cooling are air-side economisation and water-side economisation. The amount of free cooling available depends on the local climate, and ranges from approximately 100 hours per year to more than 8,000 hours per year.
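To see where a site falls in that range, one would tally the hours in which outdoor conditions permit economisation. The Python sketch below does this against a synthetic year of hourly dry-bulb temperatures; the 18 C threshold and the generated series are assumptions, and a real study would use local weather data (and wet-bulb temperature for water-side economisation).

```python
# Sketch of estimating air-side free-cooling hours from hourly outdoor
# dry-bulb temperatures. The threshold and synthetic data are assumptions.

import math

ECONOMISER_LIMIT_C = 18.0  # assumed: below this, outside air can cool alone

def synthetic_hourly_temps_c() -> list[float]:
    """Stand-in for a year of hourly weather data (8,760 points)."""
    return [10 + 12 * math.sin(2 * math.pi * h / 8760)   # seasonal swing
            + 4 * math.sin(2 * math.pi * (h % 24) / 24)  # daily swing
            for h in range(8760)]

temps = synthetic_hourly_temps_c()
free_hours = sum(1 for t in temps if t <= ECONOMISER_LIMIT_C)
print(f"Estimated free-cooling hours/year: {free_hours} of {len(temps)}")
```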
Design New Data Centres Using Modular Cooling — Traditional raised-floor-perimeter air distribution systems have long been the method used to cool data centres. However, mounting evidence strongly points to the use of modular cooling (in-row or in-rack) as a more-energy-efficient data centre cooling strategy.
“Although most users will not be able to implement all 11 best practices immediately, all users will find at least three or four that can be implemented immediately in their current data centres,” said McGuckin. “Savings in electrical costs of 10 to 30 percent are achievable through these most readily available techniques. Users committed to aggressively implementing all 11 best practices can achieve annual savings of 1 million kilowatt hours in all but the smallest tier of data centres.”