“2011 Thermal Guidelines for Liquid Cooled Data Processing Environments” creates data center classes for liquid cooling that can enable fulltime economizers for a number of applications in many climates, according to Don Beaty, chair of the Publications Subcommittee of ASHRAE’s Technical Committee (TC) 9.9, Mission Critical Facilities, Technology Spaces and Electronic Equipment.
“2011 Thermal Guidelines for Liquid Cooled Data Processing Environments” can be downloaded for free from the ASHRAE TC9.9 website at www.tc99.ashraetcs.org.
The increasing heat density of modern electronics is stretching the ability of air to adequately cool the electronic components within servers as well as the data center facilities that house these servers. To meet this challenge, the use of direct water or refrigerant cooling at the rack or board level is now being deployed. This trend of increasing heat densities combined with the interest in energy and waste heat recovery created the need for liquid cooling guidelines to help bridge the gap between IT equipment design and data center facility design, according to Beaty.
Five liquid cooling classes have been created:
• W1 – Facility Water Supply Temperature of 2°C to 17°C
• W2 – Facility Water Supply Temperature of 2°C to 27°C
• W3 – Facility Water Supply Temperature of 2°C to 32°C
• W4 – Facility Water Supply Temperature of 2°C to 45°C
• W5 – Facility Water Supply Temperature of > 45°C
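The class boundaries above can be expressed as a simple lookup. The sketch below, which is illustrative and not part of the ASHRAE white paper, maps a facility water supply temperature to the most restrictive class whose range covers it:

```python
def liquid_cooling_class(supply_temp_c: float) -> str:
    """Return the most restrictive liquid cooling class (W1-W5)
    covering the given facility water supply temperature in degrees C.

    The class names and temperature bounds follow the list above;
    this function itself is an illustration, not from the paper.
    """
    if supply_temp_c < 2.0:
        raise ValueError("below the 2 degree C lower bound of classes W1-W4")
    # Upper bounds of classes W1 through W4, in ascending order.
    for name, upper_bound in (("W1", 17.0), ("W2", 27.0),
                              ("W3", 32.0), ("W4", 45.0)):
        if supply_temp_c <= upper_bound:
            return name
    return "W5"  # anything above 45 degrees C

print(liquid_cooling_class(15.0))  # W1
print(liquid_cooling_class(30.0))  # W3
print(liquid_cooling_class(50.0))  # W5
```

A warmer class (higher W number) tolerates warmer supply water, which widens the range of climates where economizer (chiller-less) operation is feasible.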
In addition to the classes, the white paper provides insight into other considerations for liquid cooling, including condensation, operation, water flow rates, pressure, velocity, and quality, as well as information on interface connections and infrastructure heat rejection devices.
This white paper follows an earlier white paper released in May 2011, “2011 Thermal Guidelines for Data Processing Environments – Expanded Data Center Classes and Usage Guidance,” which addressed air cooling in data centers and created new data center environmental classes that expanded the opportunity for chiller-less data centers (fulltime economizers).