A computer room was once filled with environmentally sensitive equipment bearing names such as “mainframe,” “disk drive,” “data cell,” and “impact printer.” In the 1960s, many leading research institutions were just receiving their first mainframe computers, and punch cards served as a computer’s memory and input.

In 1988, after graduating with a degree in mechanical engineering, I started working as an applications engineer for the world’s leading manufacturer of environmental control equipment for data centers. At that time, we solved issues that revolved around concerns such as humidity, vapor barriers, and “printer dust.” We made chillers with integral pumps that served mainframes through rubber hoses located beneath the raised floor. Computer room air conditioning (CRAC) units had ultraviolet light humidifiers with flushing cycles, electric reheat, four steps of compressor cooling, condensate pumps, floor stands, and turning vanes.

Special unit-mounted controls were developed to keep the CRAC units from “fighting” each other, since some units could be in full cooling mode while others were humidifying, dehumidifying, or reheating. This was all state-of-the-art, as the computer equipment and peripherals were environmentally sensitive. High temperatures caused equipment malfunctions and failures, low temperatures could cause condensation inside equipment, low humidity could cause electrostatic discharge (ESD), and high humidity could make printing problematic. Selling computer room air conditioning equipment was a big business and very lucrative. I had heard that some of the company’s sales representatives had become millionaires during the 1970s.

 

EVOLVING GUIDELINES

For several decades, the conceptual approach to maintaining the data center environment had remained virtually unchanged. Recently, ASHRAE, IT equipment manufacturers, and others have been busy redefining the data center environment. The industry once pegged the ideal data center relative humidity between 45% and 50% and the ideal temperature from 68°F to 72°F. The idea was that “cooler is better” when it came to computer rooms. CRAC units were designed to control temperature to within ±1°F and humidity to a tight band.

In 2004, ASHRAE published vendor-neutral guidelines for various classes of data centers. The environmental requirements for Class 1 data centers, the most stringent class, specified a temperature range of 68°F to 77°F and 40% to 55% rh, which met typical IT manufacturers’ warranty requirements.

In 2008, the data center classifications were revised and expanded along with the environmental requirements. For Class 1 data centers, the temperature range became 64.4°F to 80.6°F (18°C to 27°C) and the humidity range became a 42°F dewpoint to 60% rh. Refer to Table 1 for definitions of the ASHRAE data center classifications.

One purpose of the 2008 guidelines was to expand the environmental requirements to allow for airside economizer operation in order to save energy without compromising the reliability of IT equipment. The guidelines had a recommended environmental range and an allowable range.

The allowable range is much wider and is the range in which IT vendors test their equipment. The sentiment of the guideline authors was that data center designers and operators felt obligated to design for and operate within the recommended range instead of taking full advantage of the allowable range, the rationale being that failure rates of IT equipment would be minimized by staying within the recommended range. However, IT manufacturers’ data support the conclusion that the equipment failure rate does not increase substantially as the environmental temperature increases.

In order to clarify its energy savings goals and to encourage design and operation that take full advantage of free cooling, ASHRAE Technical Committee 9.9 published a white paper in 2011. The white paper renamed the Class 1 data center as Class A1; the recommended and allowable environmental ranges stayed the same. The allowable range for a Class A1 data center is 59°F to 90°F and 20% to 80% rh (62.6°F maximum dewpoint). The energy-saving implications of moving the design and operation of data centers from the recommended to the allowable range are substantial. Refer to Table 2 for the ASHRAE thermal guidelines.

 

ENERGY CONSUMPTION

Data processing facilities require air conditioning on a 24/7/365 basis to address the high internal heat loads generated and to maintain a proper environment for computer equipment. Mechanical cooling methods using refrigeration machinery and air-handling equipment can easily consume 35% to 40% of a data center’s total power consumption.

The standard data center efficiency metric is power usage effectiveness, or PUE, which is simply the ratio of the data center’s overall power consumption to its IT equipment power consumption. Many data centers in operation today may have a PUE as high as 2.0, which means that half of the energy is consumed by the supporting infrastructure, largely to maintain the specified temperature and humidity.
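As a minimal illustration of the arithmetic (the function and the kW figures below are hypothetical examples, not measurements from any particular facility):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by IT power."""
    return total_facility_kw / it_equipment_kw

# A hypothetical facility drawing 2,000 kW overall to support a 1,000 kW IT load:
print(pue(2000.0, 1000.0))  # 2.0 -- half the power goes to cooling and other overhead
```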

Migrating from the ASHRAE-recommended to allowable operating range provides a great opportunity to improve the PUE of new data centers. This is due to several factors. First, refrigeration equipment operates more efficiently at elevated room temperatures. Second, an expanded relative humidity range permits humidifiers to operate less frequently or to be eliminated altogether. Lastly, a wider range in both the temperature and humidity operating limits allows the concept of the airside economizer to be introduced into the data center environment.

The ideal PUE of 1.0 would require that the only energy used in the data center is for computation, which is not practical. However, the use of airside economizers by some companies has led to claims of PUE values in the range of 1.2 to 1.4 and even lower.

 

AIRSIDE ECONOMIZERS

An airside economizer, or free cooling, is a process that uses outdoor air, when conditions permit, to cool a facility or process in lieu of refrigeration. The process is not actually free, since fan energy is involved, but because no refrigeration equipment is used, energy consumption is drastically reduced. Cool outdoor air is brought into the facility, picks up heat from within the space, and is then relieved to the exterior.

Historically, for many climates, the economics of airside economizers could not be justified for data centers due to the high cost of adding humidification during winter months. With an ASHRAE allowable relative humidity floor of just 20%, the use of outdoor air for cooling becomes economical for more operating hours and in more climates. Likewise, the allowable temperature range for Class A1 data centers of 59°F to 90°F expands the airside economizer operating hours.
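To make that decision concrete, here is a minimal sketch of an economizer availability check, assuming the Class A1 allowable limits quoted above (59°F to 90°F drybulb, 20% to 80% rh, 62.6°F maximum dewpoint). A real control sequence would add deadbands, sensor fault handling, and lockouts:

```python
def economizer_available(drybulb_f: float, rh_pct: float, dewpoint_f: float) -> bool:
    """True if outdoor air falls inside the Class A1 allowable envelope
    quoted in this article (a simplified check, not a full control sequence)."""
    return (59.0 <= drybulb_f <= 90.0
            and 20.0 <= rh_pct <= 80.0
            and dewpoint_f <= 62.6)

print(economizer_available(72.0, 35.0, 43.0))  # True -- free cooling available
print(economizer_available(95.0, 40.0, 68.0))  # False -- mechanical cooling required
```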

As an example, for a typical ASHRAE Climate Zone 5 location, the annual hours of economizer operation can increase six-fold by moving from the ASHRAE Class 1 recommended to allowable environmental envelope. There is also a philosophy that endorses pushing the envelope further by assuming that brief periods of operation beyond the allowable environmental limits may not significantly increase the failure rate of IT equipment. Could we be entering the age of a compressor-less data center?

 

ADIABATIC COOLING

For the faint of heart, there is another approach to extending free-cooling operation without passing beyond the ASHRAE allowable environmental envelope. This approach employs the concept of adiabatic cooling, otherwise known as evaporative cooling. Adiabatic cooling equipment can be categorized as direct, indirect, or a combination of the two known as indirect/direct. All of these exchange sensible heat for latent heat through the evaporation of water.

In direct adiabatic cooling, water evaporates into the airstream, which lowers its drybulb temperature and raises its humidity. The wetbulb temperature remains essentially constant, which means the enthalpy also remains nearly constant. The drybulb temperature of the air leaving the cooling apparatus approaches the wetbulb temperature of the incoming air.
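This behavior can be quantified with the media’s saturation effectiveness; a minimal sketch follows, assuming a 90% effectiveness (an assumed value, typical of the rigid media described below), with illustrative conditions:

```python
def direct_evap_leaving_drybulb(drybulb_f: float, wetbulb_f: float,
                                effectiveness: float = 0.90) -> float:
    """Leaving drybulb of a direct adiabatic cooler: the supply air
    approaches the entering wetbulb temperature in proportion to the
    media's saturation effectiveness."""
    return drybulb_f - effectiveness * (drybulb_f - wetbulb_f)

# Example: 95°F drybulb / 65°F wetbulb outdoor air
print(direct_evap_leaving_drybulb(95.0, 65.0))  # 68.0°F supply air, no compressors
```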

Direct adiabatic coolers should not recirculate indoor air. The direct contact can be achieved either by an extended wetted-surface material or with a series of sprays. The wetted-surface media can be either random or rigid. Random media is usually aspen wood or an absorbent plastic fiber or foam. Face velocities are limited to about 250 fpm. Rigid media is composed of sections of corrugated material such as fiberglass, plastic, or cellulose. The media is cross-corrugated to increase the mixing of the air and water. Face velocities for rigid media are normally in the range of 400 to 600 fpm.
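Those face-velocity limits drive the media sizing; a quick sketch of the arithmetic (the 50,000 cfm airflow is a made-up example):

```python
def media_face_area_sqft(airflow_cfm: float, face_velocity_fpm: float) -> float:
    """Required evaporative media face area at a given face velocity."""
    return airflow_cfm / face_velocity_fpm

print(media_face_area_sqft(50_000, 250))  # 200.0 sq ft of random media
print(media_face_area_sqft(50_000, 500))  # 100.0 sq ft of rigid media
```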

Spray nozzles can also be used for direct adiabatic cooling; the equipment is often referred to as an air washer. A series of nozzles is followed by an eliminator section that removes entrained water droplets. Direct evaporative cooling has the added benefit of removing particulate and gaseous contaminants from the airstream, although it should not be relied upon for primary filtration.

There are two main methods of indirect evaporative cooling. The first uses a cooling tower to evaporatively cool water that is then circulated through a heat exchanger to cool the process airstream. The other passes secondary wetted air through one side of an air-to-air heat exchanger to remove heat from the process airstream on the other side. Heat pipes and rotary heat wheels can also be used for the indirect process, and a two-stage indirect/direct process can be employed as well.

Advantages of the adiabatic cooling approach include a quick restart of the cooling systems in the event of a power outage, as well as the opportunity to economically place the HVAC systems on emergency power, since the loads are much smaller.

A hybrid approach can also be considered for maintaining the data center environment, in which an airside economizer and adiabatic cooling are coupled with refrigeration (chilled water or direct expansion) for times when outdoor conditions demand it. Of course, the first cost of this approach will be greater than implementing either an adiabatic cooling system or a traditional one that employs refrigeration alone. Besides reducing energy consumption, deploying these newer environmental control concepts frees up valuable floor space inside the data center for IT equipment. Because outside air is used for cooling, locating the HVAC equipment outdoors is the most expedient solution, and roof- and grade-mounted HVAC equipment appear to be the trend.

 

CONCLUSIONS

The initiative of scrutinizing especially energy-intensive building types, such as laboratories and hospitals, has been extended to data processing facilities. The work of ASHRAE, members of the IT equipment industry, and others has led to a consensus on the recommended and allowable environmental envelopes, which in turn has allowed data center designers and operators to explore new methods to control the critical environment and save energy. The industry should embrace these changes and follow the lead of those companies that are now controlling the data center environment without the use of refrigeration by employing methods such as airside economizers and adiabatic cooling.


 

SIDEBAR: ASHRAE DATA CENTER ENVIRONMENTAL CLASSES

Compliance with a particular environmental class requires full operation of the equipment over the entire allowable environmental range, based on non-failure conditions.

Class A1: Typically a data center with tightly controlled environmental parameters (dew point, temperature, and relative humidity) and mission critical operations; types of products typically designed for this environment are enterprise servers and storage products.

Class A2: Typically an information technology space or office or lab environment with some control of environmental parameters (dew point, temperature, and relative humidity); types of products typically designed for this environment are volume servers, storage products, personal computers, and workstations.

Class A3/A4: Typically an information technology space or office or lab environment with some control of environmental parameters (dew point, temperature, and relative humidity); types of products typically designed for this environment are volume servers, storage products, personal computers, and workstations.

Class B: Typically an office, home, or transportable environment with minimal control of environmental parameters (temperature only); types of products typically designed for this environment are personal computers, workstations, laptops, and printers.

Class C: Typically a point-of-sale or light industrial or factory environment with weather protection and sufficient winter heating and ventilation; types of products typically designed for this environment are point-of-sale equipment, ruggedized controllers, or computers and PDAs.

 

a.  Classes A1, A2, B and C are identical to 2008 classes 1, 2, 3 and 4.  These classes have simply been renamed to avoid confusion with classes A1 through A4. The recommended envelope is identical to that published in the 2008 version.

b.  Product equipment is powered on.

c.  Tape products require a stable and more restrictive environment (similar to Class A1). Typical requirements: minimum temperature is 59.0°F, maximum temperature is 89.6°F, minimum relative humidity is 20%, maximum relative humidity is 80%, maximum dewpoint is 71.6°F, rate of change of temperature is less than 9°F (5°C) per hour, rate of change of humidity is less than 5% rh per hour, and no condensation.

d.  Product equipment is removed from its original shipping container and installed but not in use (e.g., during repair, maintenance, or upgrade).

e.  A1 and A2 – Derate maximum allowable drybulb temperature 1.8°F (1°C) per 300 m above 950 m (see the sketch following these notes).

A3 – Derate maximum allowable drybulb temperature 1.8°F (1°C) per 175 m above 950 m.

A4 – Derate maximum allowable drybulb temperature 1.8°F (1°C) per 125 m above 950 m.

f.  9°F (5°C) per hour for data centers employing tape drives and 36°F (20°C) per hour for data centers employing disk drives.

g.  With a diskette in the drive, the minimum temperature is 50.0°F.

h.  The minimum humidity level for Classes A3 and A4 is the higher (more moisture) of the 10.4°F dew point and the 8% relative humidity. These intersect at approximately 77.0°F drybulb. Below this intersection, the dew point (10.4°F) represents the minimum moisture level; above it, the 8% relative humidity is the minimum.

i.  Moisture levels lower than 33°F dp, but not lower than 14.0°F dp or 8% rh, can be accepted if appropriate control measures are implemented to limit the generation of static electricity on personnel and equipment in the data center. All personnel and mobile furnishings/equipment must be connected to ground via an appropriate static control system. The following items are considered the minimum requirements (see Appendix A of the white paper for additional details):

1.  Conductive materials

a.  Conductive flooring

b.  Conductive footwear on all personnel who go into the data center, including visitors just passing through

c.  All mobile furnishings/equipment must be made of conductive or static-dissipative materials

2.  During maintenance on any hardware, a properly functioning wrist strap must be used by any personnel who contact IT equipment.
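As an illustration of the altitude derating in note e, here is a minimal sketch, assuming the 2011 allowable maximum drybulb temperatures of 89.6°F (A1), 95°F (A2), 104°F (A3), and 113°F (A4); the helper function is hypothetical and applies the derating linearly:

```python
def derated_max_drybulb_f(ashrae_class: str, altitude_m: float) -> float:
    """Maximum allowable drybulb after the altitude derating in note e:
    1.8°F (1°C) per derating interval above 950 m, applied linearly."""
    # (sea-level maximum allowable drybulb in °F, derating interval in m)
    limits = {"A1": (89.6, 300), "A2": (95.0, 300),
              "A3": (104.0, 175), "A4": (113.0, 125)}
    max_db_f, interval_m = limits[ashrae_class]
    if altitude_m > 950:
        max_db_f -= 1.8 * (altitude_m - 950) / interval_m
    return max_db_f

# A Class A1 data center at 1,550 m altitude:
print(round(derated_max_drybulb_f("A1", 1550), 1))  # 86.0°F
```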