Why are you an engineer? Did you love math and science as a kid and simply follow your bliss? Or were you pushed into it by your guidance counselor even though your heart was somewhere else? Are you in HVAC because the thought of picking a fan gave you goose bumps, or did you take the first decent engineering job you were offered out of college?

We have all arrived at this particular waypoint via different paths. And we will all take diverse routes to our next and final destinations on this journey we call a career. But it’s reasonable to suggest there are a few touchstones we can all look to for guidance as we find our way.

The reason I am feeling particularly philosophical today is that catching up on my technical journal backlog left me with a been-there-done-that feeling: a sense that all of the recent breakthroughs in data center HVAC design have been presented and discussed, and now we are just arguing the finer points while we preach to the already converted.

Air-side economization is now accepted as a viable design approach, whereas just a few years ago it was anathema. The benefits of relatively higher temperatures in the data center have been debated but are now generally acknowledged. Compressorless central plants can be found all across the country, not just in a few psychrometrically blessed bastions like the great Northwest. And the modular conversation is more about form factor than systems.

That’s not to say all of the questions have been asked and answered, or that data center design has reached its zenith. It’s just that the last decade has seen so many advances that a quiet year or two feels empty and awkward.

And when you consider that we live in a data-dependent age, where life as we know it would grind to a halt without all those humming servers around the globe, more breakthroughs are certainly on the horizon.

 

Being Open to the Next Wave

Now, I’m not egotistical enough to believe I will be an architect of this new frontier. But I would like to think I will recognize it and, perhaps more importantly, be prepared for it when it appears. Readiness will require a presence in the marketplace, but also a particular state of mind. Recognition will require an open one.

If we are convinced that ours is the only way, then it is highly unlikely we will see a shift when it occurs. All we really “own” in the consulting business is our knowledge and our perceived expertise. So a certain level of confidence is a must. Stay true to the fundamentals, but be prepared to adapt your game. 

 

“Change your opinions, keep to your principles; change your leaves, keep intact your roots.”

— Victor Hugo, author

If you aren’t open to (or preferably seeking out) new ideas, then you run the risk of missing the wave. At best, you simply lose your competitive edge and have to play catch-up. But the far more damaging prospect is being tagged as an obstacle to progress and rendered intellectually obsolete.

A machine can become obsolete. The same goes for a tool or a particular process. In fact, planned obsolescence is a basic tenet of modern capitalism (see GM and Alfred P. Sloan in the 1930s). But a designer should never be allowed to become an antique.

 

The Paradox of Evolutionary Simplicity

Evolution is a process of continuous change from a lower, simpler state to one that is higher and more complex. The definition comes from biology, but the process is certainly evident in our technology.

For example, the seminal ENIAC computer looks more like a massive loom from a Dickensian sweatshop than anything you might see in a data center today (Figure 1). But as computers changed, the facilities they occupied and the systems that served them changed as well.

And as the systems evolved, they were true to the definition: They were better, and they were certainly more complicated. In the mission critical sector circa 2000, the field of “precision cooling” was ironically out of control. Systems were becoming more and more complicated, and controls increasingly proprietary. But was more complex really better?

 

“Everything should be made as simple as possible, but not simpler.”

— Albert Einstein, theoretical physicist

 

Einstein said simply what Occam’s razor says more precisely: One should not make unnecessary assumptions, and the answer to a problem is often the simplest one.

In HVAC systems, these seemingly mutually exclusive tenets have somehow combined, and we have actually observed the paradoxical concept of evolutionary simplicity.

Steam has given way to hot water, pneumatic controls to DDC, and dual-duct to VAV. Yes, the individual devices have become more complex (e.g., an electronic controller is more sophisticated than a pneumatic thermostat), but the application of the components is far simpler.

And because data center systems are subject to the same principles, just when it seemed a critical HVAC system couldn’t get more complicated, something revolutionary occurred.

 

Markets and Physics

In 2004, ASHRAE issued Thermal Guidelines for Data Processing Environments, and things really got interesting. While only 43 pages in length, this new guide would redefine the way new data centers look and feel. The game-changer was the expanded environmental parameters allowed in the data center and where those conditions would be monitored.

In the legacy model, the air conditioning systems were designed to maintain the room temperature around 72°F. But now that the critical temperature was to be measured at the server inlet, the room temperature was irrelevant. Further, the temperature leaving the rack would be much higher, more like 90°F.
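The arithmetic behind that shift is plain sensible heat. Here is a minimal sketch in Python; the 1.085 standard-air factor is a common rule of thumb, and the rack load and temperature rises are assumed values for illustration, not figures from the guideline itself:

```python
# Sensible heat carried by standard air: Q (Btu/h) = 1.085 * CFM * delta_T (F)
BTU_PER_KW = 3412  # 1 kW of IT load rejects roughly 3,412 Btu/h

def required_cfm(it_load_kw, inlet_f, outlet_f):
    """Airflow needed to carry an IT load at a given rack temperature rise."""
    delta_t = outlet_f - inlet_f
    if delta_t <= 0:
        raise ValueError("outlet must be warmer than inlet")
    return it_load_kw * BTU_PER_KW / (1.085 * delta_t)

# Legacy, room-based design: ~72F supply and a modest ~10F rise across the rack.
legacy = required_cfm(10, 72, 82)    # ~3,145 CFM for an assumed 10-kW rack

# Inlet-based design: 75F at the server inlet, ~90F leaving the rack.
expanded = required_cfm(10, 75, 90)  # ~2,097 CFM, roughly a third less air

print(f"legacy: {legacy:,.0f} CFM  expanded: {expanded:,.0f} CFM")
```

The wider the allowable temperature rise, the less air (and fan energy) the same load requires, which is one reason the inlet-based approach rewrote the economics of the mechanical plant.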

But hey, what about that simple evolution mumbo jumbo? Right now there are a number of folks scratching their heads and wondering just exactly what I may be smoking. I have just stated that systems design inevitably becomes simpler, yet many of you are currently struggling with a generation of cooling systems more complicated than ever before.

Some of these systems require black-box technology to optimize the integration of CRACs with in-row coolers, hot- and cold-aisle separation, chimney cabinets, and floor tiles with VAV dampers… all while using more sensors in more places and ever more complicated programming and algorithms. How can this be characterized as simple? Well, it can’t.

 

“In design, there are two resource pools available: The laws of physics and the products of the market. The designer of excellence works with the former, and the designer of mediocrity works with the latter.”

— Bill Coad, past president of ASHRAE

 

This complicated hybrid is what you get when you try to use the wrong tool for the job. The reality is that with 75°F entering air temperatures, we aren’t talking cooling anymore … at least not the way we used to.

With the changes in the definition of an acceptable data center environment, the products in the marketplace are no longer appropriate. That fact should have forced designers to re-examine the laws of physics involved. But so far, only the bravest have peeked out from under the tent of conventional wisdom.

Using the Open Compute Project¹ as our basis of design for modern data centers, the mechanical system becomes primarily a heat removal scheme (Figure 4). There are variations on this theme (indirect versus direct evaporative cooling, higher temperature waterside economizers instead of direct air, etc.), but one constant is the glaring lack of a refrigeration cycle.
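To see how the physics, rather than the product catalog, decides feasibility, consider the direct evaporative variant. A minimal sketch follows (in Python; the media effectiveness, design-day conditions, and inlet limit are assumed, illustrative values, not Open Compute or ASHRAE specifications):

```python
# Direct evaporative cooling: supply air approaches the outdoor wet bulb.
# supply_db = db - effectiveness * (db - wb); a perfect media section
# (effectiveness = 1.0) would deliver air at the wet-bulb temperature itself.

def evap_supply_f(outdoor_db_f, outdoor_wb_f, effectiveness=0.85):
    """Dry bulb leaving a direct evaporative media section (deg F)."""
    return outdoor_db_f - effectiveness * (outdoor_db_f - outdoor_wb_f)

INLET_LIMIT_F = 80.6  # assumed server-inlet limit, 27 C, per the expanded envelope

# An assumed hot-dry design day: 95F dry bulb / 65F coincident wet bulb.
supply = evap_supply_f(95, 65)
verdict = "no compressor needed" if supply <= INLET_LIMIT_F else "trim cooling needed"
print(f"supply air: {supply:.1f} F -> {verdict}")  # 69.5 F -> no compressor needed
```

Wherever the coincident wet bulb stays low enough, the refrigeration cycle simply drops out of the design, which is exactly the constant noted above.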

This new compressorless reality is supported by the leaders on the application side of the equation, including Open Compute and The Green Grid. And it is documented by the research and standards community in ASHRAE’s 2011 updated guidelines.²

 

Path Forward

So we began this article by ruminating on paths. Paths taken, paths not taken, and those we ultimately choose. The fact is that what drives each of us — including the company we work for and the type of work we do — takes us in different directions. A design/builder’s perspective is very different from a consulting engineer’s, which differs just as markedly from a facility manager’s.

The British use the term “middling,” meaning ordinary or run-of-the-mill. To be classified as a middling engineer connotes failed expectations and unmet potential. Do you know anyone who would be comfortable with such a moniker? Would you?

For most of us, change has to be incremental. Our risk model does not allow us to step too far outside of our, our management’s, or our client’s comfort zones. But when we can learn from those on the bleeding edge, we should have the courage to step out onto the leading edge. We do that not by stepping out of our comfort zone but by educating ourselves and expanding it instead. 

 

“The path of least resistance inevitably leads to a design of remarkable mediocrity.”

— Carlson Van Klieve, author and engineer

 

What we do as HVAC engineers isn’t especially glamorous. But that doesn’t mean we can’t drive the science forward. And moving ahead requires a mind open to change, an appreciation for the less complex, and an embrace of the basic tenets of thermodynamics.

Regarding the current state of the art in mission critical HVAC, this perceived lull isn’t a technological plateau but rather a respite before the next great leap in our approaches to and definitions of data center design. The next wave will be here soon, and the next one sooner than the last (Figure 4).

We all should be prepared for it when it comes, and eager to see where it will take us.

 

References

  1. Open Compute Project. http://www.opencompute.org/
  2. ASHRAE. 2011. 2011 Thermal Guidelines for Data Processing Environments – Expanded Data Center Classes and Usage Guidance. ASHRAE Technical Committee 9.9.
  3. Open Compute Project. Data Center Mechanical Specifications. http://www.opencompute.org/assets/open-rack/DataCenter-Mechanical-Specifications.pdf