When it comes to data center environments, both the temperature/humidity parameters and the conventional wisdom about designing for them have been, well, a little narrow. Look over the results of advanced research and then dig out the owner’s manual of your client’s servers, and you may be ready to make the leap to an equally effective, much less wasteful design.

I have three boys. The eldest is a risk taker. The other two, not so much. This distinction played out on my roof a few years ago when I was putting up the Christmas lights. My oldest son, Andy, had been climbing up and helping me for years. But this year, my middle son, Connor, worked up the nerve to join us up there. He was fine on the ladder, but as soon as he stepped onto the roof he dropped spread-eagle, frozen in fear.

As he cried, I tried to comfort him, explaining that his panic was driven by years of evolution telling him that walking on an inclined surface 15 ft above the ground wasn’t the smartest thing to be doing. In essence, his own common sense was kicking in and giving him the signal to stay low.

When Andy heard this explanation, he instantly recognized that he didn’t share his brother’s fear of falling and, in turn, had an advantage over his younger sibling. And like any good big brother, he broke into a jig, on the roof, and in a voice of superiority, taunted his brother with, “I ain’t got no common sense, I ain’t got no common sense…” His thoughtfulness was only exceeded by my pride in his mastery of the English language.

Stepping outside a comfort zone, either literally or figuratively, is not easy. Staying with the herd is always simpler than bucking a trend. Consensus and compromise are consoling. Incremental change is preferable to a revolution.

But what happens when things are happening so fast that you really don’t have the luxury of time? When technology and global drivers like environmental sustainability and the cost of energy are shifting so rapidly, how comfortable can you really allow yourself to be? Where is the edge, and do you dare cross it?

STATUS QUO TO-AND-FRO

About ten years ago, if you asked a serious data center HVAC designer what the normal design conditions should be in a data hall, the response would likely have been 72°F ± 2° at 50% rh ± 5%. Further, these conditions would have likely been measured at the return of the computer room air conditioning (CRAC) unit, thus representing a room average. This tight control band gave birth to a “precision air conditioning” market segment and constant-volume CRAC units with integrated humidifiers and reheat sequences. The energy costs associated with these units were (and are) relatively high, but the precision required outweighed the savings desired.
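To see why that precision carries such an energy penalty, here is a minimal sketch (in Python, with the setpoints quoted above but a hypothetical sequence, not any particular vendor’s) of the control logic a legacy constant-volume CRAC runs when asked to hold that band at its return:

```python
# Hypothetical legacy "precision" CRAC control sketch: hold 72F +/- 2F and
# 50% +/- 5% rh at the return. Deadbands and actions are illustrative only.

TEMP_SP, TEMP_DB = 72.0, 2.0      # deg F setpoint and half-deadband
RH_SP, RH_DB = 50.0, 5.0          # % rh setpoint and half-deadband

def crac_actions(return_temp_f: float, return_rh_pct: float) -> list[str]:
    """Return the energy-consuming actions a tight-band CRAC would take."""
    actions = []
    if return_temp_f > TEMP_SP + TEMP_DB:
        actions.append("cool")
    elif return_temp_f < TEMP_SP - TEMP_DB:
        actions.append("reheat")                 # electric or hot-gas reheat
    if return_rh_pct > RH_SP + RH_DB:
        actions.append("dehumidify (cool + reheat)")
    elif return_rh_pct < RH_SP - RH_DB:
        actions.append("humidify")               # integrated humidifier
    return actions or ["coast"]

# A return condition only slightly off setpoint already triggers paid-for work:
print(crac_actions(74.5, 44.0))   # ['cool', 'humidify']
```

Hold the band that tight and you are paying for reheat and humidification on days when the equipment, per its own spec sheet, could not care less.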


In 2004, ASHRAE’s prolific TC 9.9 issued the first guide in its essential Datacom Series, providing recommended temperature and humidity ranges. The recommended bands were 68°F to 77°F and 40% to 55% rh. And, although these ranges were still relatively tight, the clarification that these were inlet conditions at the equipment and not room averages was the first tug at the “typical HVAC design” house of cards.

In 2008, TC 9.9 issued a seminal whitepaper, and the recommended temperature range became 64.4°F to 80.6°F. The humidity reference also changed from a relative range to an absolute one: a 41.9°F to 59°F dewpoint. ASHRAE’s allowable range remained at 59°F to 90°F, with 20% to 80% rh and a maximum dewpoint of 63°F.
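Because that 2008 change swapped a relative floor for an absolute (dewpoint) one, it is worth seeing how the two relate. The quick check below uses the Magnus approximation for saturation vapor pressure; it is a back-of-the-envelope psychrometric sketch, not a substitute for the full chart:

```python
import math

def sat_vp_hpa(t_c: float) -> float:
    """Saturation vapor pressure (hPa) via the Magnus approximation."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def rh_from_dewpoint(dry_bulb_c: float, dewpoint_c: float) -> float:
    """Relative humidity (%) implied by a dry bulb and a dewpoint."""
    return 100.0 * sat_vp_hpa(dewpoint_c) / sat_vp_hpa(dry_bulb_c)

f_to_c = lambda f: (f - 32.0) / 1.8

# The 41.9F dewpoint floor at the two ends of the recommended dry-bulb band:
for db_f in (64.4, 80.6):
    rh = rh_from_dewpoint(f_to_c(db_f), f_to_c(41.9))
    print(f"{db_f}F dry bulb, 41.9F dewpoint -> about {rh:.0f}% rh")
# Roughly 44% rh at 64.4F, but only about 25% rh at 80.6F: the absolute limit
# is far less restrictive at warmer inlet temperatures than a fixed rh floor.
```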

What is fascinating about all of this is that the typical vendor ranges were, and continue to be, around 50°F to 95°F and 20% to 80% rh.

When you lay all of these ever-expanding criteria one on top of the other on a psych chart (Figure 1), you have to ask: Why is ASHRAE still recommending that we stay in such a relatively tight box when the equipment we are trying to serve can operate in such a huge band?
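If you want that comparison in something other than a picture, here is a rough classifier that places a measured server-inlet condition (dry bulb plus dewpoint) inside the envelopes just described. The boundaries are the figures quoted above, treated as simple boxes; a real evaluation still belongs on the psych chart:

```python
import math

def sat_vp(t_c: float) -> float:          # Magnus approximation, hPa
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def rh_pct(db_c: float, dp_c: float) -> float:
    """Relative humidity implied by a dry bulb and a dewpoint."""
    return 100.0 * sat_vp(dp_c) / sat_vp(db_c)

f2c = lambda f: (f - 32.0) / 1.8

def classify(inlet_db_f: float, inlet_dp_f: float) -> str:
    """Place a server-inlet condition in the envelopes discussed above."""
    rh = rh_pct(f2c(inlet_db_f), f2c(inlet_dp_f))
    if 64.4 <= inlet_db_f <= 80.6 and 41.9 <= inlet_dp_f <= 59.0:
        return "inside the ASHRAE recommended envelope"
    if 59.0 <= inlet_db_f <= 90.0 and 20.0 <= rh <= 80.0 and inlet_dp_f <= 63.0:
        return "inside the ASHRAE allowable envelope"
    if 50.0 <= inlet_db_f <= 95.0 and 20.0 <= rh <= 80.0:
        return "inside a typical vendor spec only"
    return "outside all three envelopes"

print(classify(75.0, 50.0))   # recommended
print(classify(85.0, 55.0))   # allowable (and vendor), but not recommended
print(classify(92.0, 55.0))   # only the vendor spec tolerates this inlet
```

Run a year of inlet data through something like this and the gap between “recommended” and what the hardware will actually accept becomes hard to ignore.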

METHINKS IT'S GROUPTHINK

Let me preface everything I am about to say by stating I am a huge proponent of ASHRAE. As a proud member and advocate of the Society, I recognize that without ASHRAE our industry would be muddled and lost. It provides direction, clarification, and, most importantly, it provides a forum for industry consensus.

But its very strength can be a weakness when viewed in the context of rapidly evolving design paradigms. ASHRAE is a volunteer organization whose technical committees are composed of folks with day jobs and differing priorities. Whatever the committees create must be vetted both within the committee itself and by the public at large. By design, nothing happens quickly. And while progress is almost always ensured, it will be deliberate, not dramatic.

Therefore, when it comes to the rapid redefinition of what a data-center environment is or should be, it should come as no surprise that the recommended ranges are well within the groupthink safety zone. In essence, it’s an official path of least resistance.

Unfortunately, there is a disconnect on this path.

On the one hand, ASHRAE states that facilities should be designed and operated to target the recommended range. Then it states that equipment should be designed to operate within the extremes of the allowable operating environment. But if the point is to support the equipment, why would I hold the facility to a tighter standard than the equipment itself? Recall that all of these conditions are applied at the equipment inlet anyway, so what exactly would the “facility” be by that definition?

Ultimately, it’s all academic. As I have stated in previous articles, mission-critical conditioning is process-oriented, not facility-oriented. If gear can function in a tent in the parking lot (which has been done, by the way), we don’t care, as long as the equipment is operating and the mission supported.

So while ASHRAE guidance is helpful as a super-safe reference point, the design temperature and humidity limits to be applied should be determined based on the equipment and the operational protocols of the client.

SO WHAT'S THE DEAL WITH HUMIDITY ANYWAY?

In previous articles, we have talked about temperatures in the data center ad nauseam. But there are arguably just as many questions regarding humidity requirements as there are regarding temperature. Why has the humidity range historically been so tight in data centers, and even now, why do you get design recommendations that seem to exceed equipment requirements? Like everything, history plays a part.

In the ASHRAE Datacom Series publication, “Best Practices for Datacom Facility Energy Efficiency,” there is a fascinating appendix describing the historical environmental and operating differences between telecom facilities and datacom facilities. In a nutshell, telecom facilities were seen as rugged settings incorporating more robust equipment requiring less operational sophistication (think telephone switching stations). Meanwhile, datacom facilities were high-end spaces with sensitive equipment that required precise and complex support.

One could argue that through the 1980s and possibly the 1990s, this “rugged low-tech vs. sophisticated high-tech” distinction remained valid. But now the stuff in either a datacom or telecom room is pretty similar. However, the folks who grew up in their respective industries carry with them their own preconceived notions of what their spaces should look and feel like. So even though telecom guys rarely worried about humidity, datacom guys did ... and still do.

If you ask most datacom folks why humidity matters on the low end, they will almost always default to electrostatic discharge (ESD) concerns. On the high end, they may allude to non-condensing specs on their equipment or corrosion. But in both cases, most of the time, their opinions are dated and more anecdotal than empirical.

DISCHARGING THE ESD HVAC MYTH

ESD is real. ESD can wreak havoc on equipment and data center operations. ESD is bad. So, obviously you need to stop worrying about ESD.

Huh?

Seriously, you can’t completely forget about ESD, but trying to solve the problem by manipulating humidity levels is wrongheaded. Think of it this way: While low humidity may facilitate ESD, it is not the root cause of the phenomenon.

I marvel at how prolific some in our industry are. Just when I start to think I’m earning my keep professionally, I come across a guy like Mark Hydeman of Taylor Engineering in Alameda, CA, and I wonder why I even get up in the morning. Mark is one of those guys who manages to design, lecture, consult, write, and research, all while running one of the most respected design shops in the country. Bottom line: He is a connected and bright guy.

Mark has been working this ESD issue for quite a while and advocating quite eloquently for less humidity control in data centers. In a recent article, Mark touched on the ESD issue and reached the conclusion that while ESD is a genuine concern, the solution lies in the rating and testing of equipment along with the incorporation of personnel grounding protocols. By understanding that the riddle is electrical in nature and not mechanical (grounding versus humidification), the solution to the conundrum is not just more appropriate; it also requires zero energy.

This conclusion is based in part on the recommendations of the Electrostatic Discharge Association (ESDA) and its standard ANSI/ESD S20.20-2007, which eliminated humidification as a primary control for prevention of ESD events.

Look, if there are a bunch of guys so committed to ESD that they formed an association, then I’m going to listen to them in lieu of the data center curmudgeon who regales me with tales of data center blackouts caused by a single carpet shock on that cold, dry day back in ’82.

HOW HIGH IS TOO HIGH?

So how about the other end of the spectrum? How high can the moisture content be before you have issues? When ASHRAE TC 9.9 issued its revised guidance in 2008, the issue of conductive anodic filament (CAF) growth on circuit boards was raised, along with corrosion and wear due to contaminants and moisture. Go ahead, read the paper. On this particular topic, it should make your head swim.

When folks start talking about issues of voltage gradients and ionic impurities of electronics at the microscopic level, they have likely lost the duct-and-pipe crowd. On the whole, we are macro guys, not micro. So what’s a conscientious engineer to do? Again, I suggest you listen to the experts.

The very smart folks at Lawrence Berkeley National Laboratory (LBNL), in their Technical Best Practices for Data Centers Energy Management, state in part, “Humidification specifications and systems have often been found to be excessive and/or wasteful in datacenter facilities. A careful, site specific design approach to these energy-intensive systems is usually needed to avoid energy waste.”

Then, their first recommendation regarding mechanical humidification is:

Design Systems to the Actual Equipment Requirements

What a fascinating and humbling concept. Instead of trying to reverse engineer the electronics and back into the appropriate design conditions, why not simply provide the environmental conditions required by the manufacturer? One has to believe they know the equipment better than a bunch of tin jockeys. And, before you stress out about varying requirements from multiple vendors, take heart in the fact that the industry is pretty consistently working inside that 20% to 80% rh range mentioned earlier.
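As a sketch of what “design to the actual equipment requirements” looks like in practice, the snippet below intersects a handful of vendor environmental specs and lets the tightest limits set the design band. The vendor names and numbers are placeholders; the real ones come out of the owner’s manuals for the gear on your client’s floor:

```python
# Hypothetical vendor environmental specs (deg F dry bulb, % rh at the inlet).
# These numbers are placeholders; pull the real ones from each owner's manual.
vendor_specs = {
    "server_vendor_A":  {"temp": (50.0, 95.0),  "rh": (20.0, 80.0)},
    "storage_vendor_B": {"temp": (59.0, 90.0),  "rh": (20.0, 80.0)},
    "network_vendor_C": {"temp": (41.0, 104.0), "rh": (10.0, 85.0)},
}

def design_envelope(specs: dict) -> dict:
    """Intersect the vendor envelopes: the tightest limits win."""
    return {
        "temp": (max(s["temp"][0] for s in specs.values()),
                 min(s["temp"][1] for s in specs.values())),
        "rh":   (max(s["rh"][0] for s in specs.values()),
                 min(s["rh"][1] for s in specs.values())),
    }

print(design_envelope(vendor_specs))
# {'temp': (59.0, 90.0), 'rh': (20.0, 80.0)} -- the band the process actually
# needs, before any margin the client's operating protocol deliberately adds.
```

The point is the process, not the arithmetic: the envelope falls out of the equipment, and any additional margin is a conscious choice in the client’s operating protocol, not a default.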

RECOMMENDED PATH FORWARD

We could reinvent the wheel here, but smarter people have already developed recommended practices. You are encouraged to review and understand LBNL’s guidance, which includes:
  • Design systems to actual equipment requirements

  • Eliminate over-humidification and/or dehumidification

  • If and when required, use efficient centralized humidification technology

When you embrace this guidance, you will see that the benefits go beyond humidity control. The expanded ranges also open up more hours of airside economizer operation, which may allow you to save even more energy.
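As a rough illustration of that payoff, the sketch below counts the hours in a placeholder weather record when outdoor air could be supplied directly, first against the old 72°F ± 2° mindset and then against an equipment-driven band. In practice you would feed it 8,760 hours from a TMY file for the site, and a real economizer analysis would also credit mixing and evaporative trim:

```python
# Rough count of economizer-eligible hours for two supply-air bands, given a
# list of hourly outdoor (dry bulb F, rh %) pairs. The list below is a
# placeholder; load a full year of site weather data for a real study.
hourly_outdoor = [(45.0, 70.0), (62.0, 55.0), (78.0, 40.0), (88.0, 35.0)]

def economizer_hours(weather, temp_band, rh_band=(20.0, 80.0)):
    """Hours when outdoor air could be supplied directly (no mixing modeled)."""
    t_lo, t_hi = temp_band
    rh_lo, rh_hi = rh_band
    return sum(1 for t, rh in weather if t_lo <= t <= t_hi and rh_lo <= rh <= rh_hi)

legacy = economizer_hours(hourly_outdoor, (70.0, 74.0))     # 72F +/- 2F mindset
equipment = economizer_hours(hourly_outdoor, (59.0, 90.0))  # equipment-driven band
print(legacy, equipment)   # 0 vs. 3 of the 4 sample hours qualify
```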

I also think it would be worth your while to visit Taylor Engineering’s website (www.taylor-engineering.com) and their publications link in particular, to learn more about this topic and others that affect datacom and traditional building designs. Further, an internet search on Christian Belady, Microsoft’s efficiency guru, should take you to a safe vantage point from which you can view the cutting edge.

As we have said before, your design should be a reflection of the problem you are trying to solve. And if your data center problem statement includes an equipment criterion of only 20% to 80% rh, then humidity control is not that complicated, regardless of the confusion out there regarding what is allowable and what is recommended.

Returning to the example of my son Andy: if I recommended that he not jump off a railroad trestle into a river, but I suggested it was allowable, the boy would jump (and he did, by the way). In a similar sense, when it comes to data center humidity control, or the lack thereof, you should acknowledge the ambiguity in the standards and then embrace your inner Andy: Exercise common sense, but then, jump!