Although I’m writing this in December, you will likely be reading it in January, and hopefully you will be at McCormick Place where the 2012 AHR Expo is being held this year. Every three years, the Big Show comes to Chi-town. And while I love Chicago, you have to admit that having your signature conference in the [Very Cold and] Windy City in the throes of winter defies conventional wisdom.

In fact, as my wife pointed out the one year she came along, leave it to a bunch of engineers to hold their most important industry event during the coldest time of the year on the banks of Lake Michigan when the beaches of any number of warm ocean destinations beckon. I must confess I didn’t have a logical counter-argument for my lovely bride.

And speaking of conventional wisdom, you might have your very own Oprah-esque “ah-ha” moment as you briskly walk to your shuttle bus, and conclude that a data center with a direct outdoor air (OA) economizer would really make sense in a climate such as this. But caution, my datacom compadres, there is a vocal “close that darn window” chorus out there that has a logical counter-argument.

So in this article, we will take a moment to understand this argument before, like Cubs fan Steve Bartman in that fateful Game 6 of the 2003 National League Championship Series, you become a hapless victim of your own singular focus.

 

SOME DATA CENTER OLD-TIME RELIGION

Before we begin, I want to lay out what I consider the givens of modern data center operation and design. I have concluded that at every opportunity, I must be an evangelist for common-sense data center design, and, for me, the data center design absolutes are:

 

• Recognize the recommended

• Advance the allowed

• Separate the streams

 

Recognize the recommended. The “recommended” are the “ASHRAE Recommended Thermal Guidelines.” The recommended ranges shouldn’t be a matter of debate any longer. If you are not designing a new data center to operate within, or optimizing an existing data center to eventually function inside, the latest dewpoint limits of 42°F and 59°F and temperature range of 64°F to 81°F, you’re just not being serious. Stop hedging your bets and go all in already.

Advance the allowed. The new frontier can be found at the limits set by the “ASHRAE Allowed Thermal Guidelines.” When you can get to relative humidity levels of 20% to 80% and a maximum dewpoint of 62°F, along with temperatures between 59°F and 90°F, you will have figuratively flung open the window of energy-saving opportunities, and quite literally opened the free cooling window on almost the entire planet. This will be the genesis of truly net zero (energy and water) data centers. So stop circling the wagons and be a Sooner instead.

Separate the streams. And lastly, stop kidding yourself and keep the hot air hot and the cold air cold by keeping the streams separate. It doesn’t matter how you do it, but do it physically, not theoretically. Arrows on drawings are just lines on paper. The only way to absolutely guarantee that airstreams won’t mix is to provide a physical demarcation. And when air doesn’t mix, you can bank on your systems’ efficiencies and avoid overdesign and wasteful safety factors. Simply put, you can’t push the envelope unless and until your envelope has edges. So embrace boundaries.
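To put the first two of those absolutes in more concrete terms, here is a minimal sketch, in Python, of the kind of check a free-cooling-hours study (or an economizer control sequence) would run an hour of outdoor air through. The limits are simply the ranges quoted above; the function name and the simplifications (no altitude correction, no equipment-class distinctions, no additional humidity caps) are mine, so treat it as an illustration of the two envelopes rather than a substitute for the published guidelines.

    # A minimal sketch: classify one outdoor air (OA) condition against the
    # recommended and allowed envelopes described above. Limits are the ranges
    # quoted in this article; verify against the current ASHRAE guidelines and
    # the equipment class that actually applies before using.
    def envelope(drybulb_f, dewpoint_f, rh_pct):
        """Return 'recommended', 'allowed', or 'outside' for one OA condition."""
        if 64.0 <= drybulb_f <= 81.0 and 42.0 <= dewpoint_f <= 59.0:
            return "recommended"
        if 59.0 <= drybulb_f <= 90.0 and dewpoint_f <= 62.0 and 20.0 <= rh_pct <= 80.0:
            return "allowed"
        return "outside"

    # A mild afternoon lands in the recommended range; a hotter, drier one
    # still fits the allowed range, so the free cooling window stays open.
    print(envelope(68.0, 50.0, 52.0))  # -> 'recommended'
    print(envelope(85.0, 60.0, 43.0))  # -> 'allowed'
    print(envelope(95.0, 70.0, 45.0))  # -> 'outside'

Run a check like that over a typical meteorological year for Chicago and you can count your free cooling hours before you draw a single duct.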

 

A TALE OF TWO STUDIES

“It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness…,” so wrote Charles Dickens in A Tale of Two Cities. And even though he was speaking of pre-revolutionary France, the same can be said today about expanded ambient conditions in the data center and the proposed use of 100% OA economizers in the technical space.

On the one hand, we are given the expanded thermal guidelines that make it possible to cool with OA even during the warmest times of the year. But then we have another, much less referenced, white paper from ASHRAE that throws a bit of cold water on the whole economizer parade.

In 2009, ASHRAE issued guidance on particulate contamination within the data center that seemed to discourage the use of excessive amounts of OA, implying that too much OA could lead to equipment failures. Two common modes of IT equipment failure due to environmental contamination are copper creep corrosion on printed circuit boards and corrosion of the silver terminations in miniature surface-mounted components (Figures 1 and 2).

In an almost teeter-totter fashion, these two seminal papers have been issued and updated: the Thermal Guidelines issued in 2008 and updated in mid-2011, and the Contamination Guidelines issued in 2009 and updated in late 2011. But what is even more interesting (and perhaps telling) is the difference in the amount of hype and press the two have received in the HVAC marketplace of ideas.

The thermal guidelines have been hawked like a traveling medicine man’s elixir, with no fewer than a dozen related articles in the major trade magazines and numerous conference presentations. Heck, there was even advance hoopla leaked prior to its official unveiling in May of last year.

The contamination guidance, on the other hand, was issued in relative quiet. For some of us, if it weren’t for the “help” of filter vendors and manufacturers of certain types of AHU technology pointing it out, it might have been missed altogether. And I’m not slighting the messenger here; these folks have a legitimate claim to a legitimate paper. But it seems the paper’s own authors treat it as the less substantial treatise.

To reinforce this observation, I refer you to an article in the December issue of the ASHRAE Journal wherein the authors (both associated with TC 9.9) tout the increased potential for economizers and then offer a seemingly innocuous yet ominous caveat two pages later:

 

“Moisture may exacerbate corrosion problems in data centers where corrosive gases are present. Different locations around the world have different ambient levels of potentially corrosive pollutants, and the designer must be cognizant of these pollution levels on a site-specific basis. Even when ambient levels are known to be low, the potential still exists for short-term transient conditions of potentially corrosive pollutants of which the designer must be aware. [emphasis added]”

 

I’ve got a challenge for you: try telling the data center operator you’re working with that your design may lead to “short-term transient conditions” and see if you get a Christmas card next year.

 

BETWEEN A ROCK AND A SOFT PLACE?

So what’s the deal? Is ASHRAE quiet because corrosion is a simple matter that can be handled relatively easily? Or could it be there aren’t enough hard data to back up the perceived threat, so instead of picking a fight they soft-pedal it? Well, depending on your OA politics, you can probably interpret it either way.

The 2011 Contamination Guidelines waffle a bit on easy vs. hard, stating in part, “Data center contamination and its corrosive effects can be identified by well defined and relatively easy means.” Then they counter with, “…direct measurement of gaseous contamination levels is difficult and is not a useful indicator of the suitability of the environment for IT equipment.”

Then they go on to offer, “… a low-cost, simple approach to monitoring the air quality in a data center…” using copper and silver foil coupons (Figures 3 and 4). But then they reverse course, restating one of the more controversial statements from the 2009 version: specifically, that “…for data centers with higher gaseous contamination levels, gas-phase filtration of the inlet air and the air in the data center is highly recommended…”

So contamination is easy to identify but difficult to measure directly? And I can use a simple coupon approach to measure it, but in data centers with higher levels I should still add gas-phase filtration, which at best is unfamiliar to most facilities designers and at worst is an unknown.
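To give the coupon approach a little more shape, here is a rough sketch of how monthly coupon results are commonly read against the ISA-71.04 severity classes the contamination paper leans on. Full disclosure: the band limits below (copper reactivity steps of roughly 300, 1,000, and 2,000 angstroms per month, plus a silver limit of about 200 angstroms per month for the cleanest class) are my own recollection of the standard and the 2011 update, so check them against the source documents before you stake a design on them.

    # A rough sketch: map copper and silver coupon reactivity (angstroms of
    # corrosion film growth per month) to an ISA-71.04-style severity class.
    # The thresholds are my recollection of the standard and the 2011 ASHRAE
    # update, not a quotation of either; verify before relying on them.
    def severity_class(copper_a_per_month, silver_a_per_month):
        """Classify one month of coupon exposure."""
        if copper_a_per_month < 300 and silver_a_per_month < 200:
            return "G1 (mild)"
        if copper_a_per_month < 1000:
            return "G2 (moderate)"
        if copper_a_per_month < 2000:
            return "G3 (harsh)"
        return "GX (severe)"

    # Example: copper looks fine at 250 angstroms/month, but a silver coupon
    # at 450 angstroms/month knocks the site out of the cleanest class, which
    # is where the "higher gaseous contamination levels" conversation starts.
    print(severity_class(250, 450))  # -> 'G2 (moderate)'

Simple enough on paper, but it assumes you already have a data center to hang coupons in.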

And you have to ask: Do I carry out the coupon protocol before or after I build my data center and run the systems? And if I need to determine the potential for contamination before I design and build, are there any credible data out there to guide me?

LBNL thinks that ASHRAE got it wrong, that it’s all anecdotal evidence and conjecture, and said as much in its own white paper, wherein it reached the following conclusion:

 

“… The white paper recommendation that gaseous contamination should be monitored and that gas phase filtration is necessary for data centers with high contamination levels is not supported…We are concerned that data center owners will choose to eliminate air economizers…since there are implications that contamination could be worse if air economizers are used. This does not appear to be the case in practice or from the information presented…[emphasis added]”

 

CHICKENS, PIGS AND WEASELS

A pig and a chicken are walking down the road and the chicken says, “Hey, Pig, I was thinking we should open a restaurant!” The pig replies, “Maybe, but what would we serve?” The chicken responds, “How about ham and eggs?” The pig thinks for a moment and then says, “No, thanks. I’d be committed, but you’d only be involved!”

In a project or a design process there are folks who are like the pig and are totally committed to the project and accountable for its outcome; they have skin in the game. And then there are those who, like the chicken, consult and are informed by the process and its outcomes. Not surprisingly, I think consensus-driven organizations like ASHRAE are very smart chickens.

When I’m stumped, I look to folks who are smarter than me and have more real-world experience than I do. And in a case like this, where equipment and operations are on the line, I look to guys who are the pork equivalent in our fable. In short, I look to users for my wayfinding.

Now, we all know that there is no such thing as the typical user or client. Each is unique and has differing demands and requirements, so in turn, the answer to the OA question cannot be a simple yes or no. It all comes down to the weaseliest of weasel words … it depends. So here is this weasel’s advice, based on the clients I have worked with and their risk aversion.

Internet-based service providers (probably yes). The business model for most of the operations on the internet is based on maximized virtualization and server utilization. Their success is based on speed and uptime like everyone else’s, but their model is unique in that they are light years ahead of the rest of the IT market and their redundancy is in the e-ether, not necessarily the bricks and mortar. In the simplest terms, when they lose a server, they shift the application to another server instantly.

For this reason and others, they can afford to lose servers without affecting their product delivery. That doesn’t mean they can lose an entire data center, but it does mean that they manage risk differently. In turn, an OA economizer is not a hard sell for these folks, which is evident in all the press you read about the Microsofts, eBays, and Googles of the world.

Intelligence community, 9-1-1, and DoD (probably no). They say that a picture is worth a thousand words, so in your mind’s eye, recall the billowing cloud of dust that surged from Ground Zero when the World Trade Center towers fell. It’s the lethal potential of that menacing debris cloud, of an agent released into the atmosphere, or of smoke billowing from an explosion or wildfire that makes a direct OA economizer less tenable for facilities that must remain up and running without interruption.

There are options, such as providing systems that can operate with OA during normal operation but have sufficient plant capacity to operate on 100% recirculation in an emergency. But there is such a premium associated with protecting all of the additional OA intakes (blast, security, and agent detection) and the redundant cooling systems that usually it just isn’t worth it. These are the applications where waterside economizers and indirect airside economizers make the most sense.

Enterprise data centers such as government, banking, and commercial business (why not?). Here is where the real opportunity lies. This is where I would suggest that you start with the OA economizer as the basis of design and then prove it can’t work. There will be pushback, especially from the old guard, but in most cases I would counter that the evidence that the typical data center operator has anything to fear from OA is sketchy.

I base part of this on the fact that the risk factors in these segments do not approach those in the intelligence community. Banks will say they can’t do it, but banks have done it (see Deutsche Bank). I also base it on the fact that LBNL, in a study using data collected from data centers operated by Digital Realty Trust, NetApp, the IRS, Cisco, and others, concluded that data centers utilizing OA economizers have no more corrosion issues than traditional data centers.

 

IN CONCLUSION

Let’s not forget the data center design absolutes: recognize the recommended; advance the allowed; and separate the streams.

With that said, ASHRAE has issued thermal and corrosion guidelines that appear to be counter to each other. But their dissemination in the marketplace and further analysis by others in the industry help us to see that the dusty dog with zinc whiskers has a bark that is much worse than its bite.

OA economizers have immense potential for saving energy, but like any technology, there is risk that must be weighed against the reward. And while indirect OA economizers and waterside economizers can in many cases approach (or even beat) direct OA economizers performance-wise, you just can’t beat the simplicity of fresh air when applied in its purest form.

I hope this article provides enough information and references so that you can see beyond your previous horizon. And at the risk of using one too many Chicago references, I will quote Ferris Bueller, who said, “Life moves pretty fast. If you don’t stop and look around once in a while, you could miss it.” Same goes for design.

Danke Schoen. ES

 

CITED WORKS

1. ASHRAE. “2011 Thermal Guidelines for Data Processing Environments – Expanded Data Center Classes and Usage Guidance.” Developed by ASHRAE Technical Committee 9.9. 2011.

2. ASHRAE. “Gaseous and Particulate Contamination Guidelines for Data Centers.” Developed by ASHRAE Technical Committee 9.9. 2009.

3. Steinbrecher, R. and R. Schmidt. “Data Center Environments - ASHRAE’s Evolving Thermal Guidelines.” ASHRAE Journal. 53(12):42-49, 2011.

4. Han, Shehabi, Coles, Tschudi, Gadgil, and Stein. “Should Data Center Owners be Afraid of Air-side Economizer Use? – A Review of ASHRAE TC 9.9 White Paper titled Gaseous and Particulate Contamination Guidelines for Data Centers.” LBNL. 2009.

5. Dunnavant, K. “Indirect Air-Side Economizer Cycle - Data Center Heat Rejection.” ASHRAE Journal. 53(03):44-54, 2011.

6. http://www.banking-on-green.com/en/content/news_3604.html

7. Han, Price, Coles, Tschudi, and Gadgil. “Air Corrosivity in U.S. Outdoor-Air-Cooled Data Centers is Similar to That in Conventional Data Centers.”  LBNL. 2011.