Kent Hoyos, director of information services, faced a major roadblock in achieving this vision: increasing heat loads in the hospital's data center were threatening current operations and the ability to grow.
Inadequate Infrastructure
"We simply did not have the infrastructure to move forward," Hoyos said. Located in the basement of an aging building, the data center had been designed with a 6-in. raised floor and two 5-ton air conditioners in a redundant configuration. This setup managed heat loads adequately when it was installed, but was quickly outgrown.
"At the time we built the data center, servers were much larger and took up a lot more space than today's servers," Hoyos added. "Plus, the hospital wasn't that dependent on IT. Everything was still paper-based. As we began to add more systems, the heat in the room escalated."
As new higher-density systems were added, heat density climbed even further. "Where we used to put two servers in a rack, now we may have 16 in the same floor space," Hoyos said. And the role of the data center continued to expand as the hospital implemented Siemens' Picture Archiving and Communication System (PACS), an EMC CLARiiON storage array, and an EMC Centera system for long-term archiving. "When those systems came online, it was like starting up a furnace," said Hoyos.
Soon, even with both 5-ton units running continuously, the average data center temperature hovered in the high 80s, with zones that were consistently at 90°F. Temperatures that high take a toll on computer room equipment, and the hospital began to experience equipment failures in key systems, including $30,000 in lab hardware. Hoyos believes the conditions in the room contributed to the failure of 10 pieces of equipment in a little over a year.
During this time, Hoyos was scrambling to find a workable solution to increase the cooling capacity in the room. As a temporary measure, the hospital installed two 1-ton portable cooling units vented through roof hatches. Another 3-ton unit was ducted in through a wall. "We had to run all of the units 24/7 to stay at 92°F," Hoyos said.
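The capacity figures above can be roughed out with a standard conversion: one ton of refrigeration equals 12,000 BTU/hr, or about 3.517 kW. The article does not state the room's actual heat load, so the sketch below only totals the installed cooling capacity.

```python
# Rough capacity arithmetic for the cooling units described in the article.
# One ton of refrigeration = 12,000 BTU/hr, roughly 3.517 kW.
TON_TO_KW = 3.517

def tons_to_kw(tons: float) -> float:
    """Convert tons of refrigeration to kilowatts of cooling capacity."""
    return tons * TON_TO_KW

original_units = [5, 5]       # the two 5-ton precision units (redundant pair)
temporary_units = [1, 1, 3]   # two 1-ton portables plus the 3-ton ducted unit

original_kw = tons_to_kw(sum(original_units))
total_kw = tons_to_kw(sum(original_units) + sum(temporary_units))

print(f"Original capacity: {original_kw:.1f} kW")    # ~35.2 kW
print(f"With temporary units: {total_kw:.1f} kW")    # ~52.8 kW
```

Even with the stopgap units, the room gained only about 50% more capacity, which squares with Hoyos' account of barely holding 92°F.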
Space limitations prohibited bringing in additional precision cooling systems, and Hoyos was concerned that increasing the capacity of the raised floor system would not provide cooling where it was needed most. Hoyos also investigated and rejected a proposed chilled water solution because of concerns about introducing water into the controlled environment.
An Answer
The hospital consulted with Quemars Mazloomian, a principal with TMAD/Hengstler Engineers of Anaheim, CA. He suggested the Liebert XD family of high-density cooling solutions, which delivers high-capacity supplemental cooling of up to 500 W/sq ft by bringing the cooling units close to the source of heat and using a high-efficiency coolant.
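The 500 W/sq ft figure scales linearly with floor area. The article does not give the room's dimensions, so the area in this sketch is purely hypothetical.

```python
# Sizing sketch for the 500 W/sq ft supplemental-cooling figure quoted above.
# The 400 sq ft floor area is an assumption for illustration only.
COOLING_DENSITY_W_PER_SQFT = 500

def supplemental_capacity_kw(floor_area_sqft: float) -> float:
    """Total supplemental cooling, in kW, at 500 W per sq ft of floor space."""
    return COOLING_DENSITY_W_PER_SQFT * floor_area_sqft / 1000

print(supplemental_capacity_kw(400))  # 200.0 kW for a hypothetical 400 sq ft room
```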
After speaking with Liebert cooling specialists and viewing the Liebert XD system in action at Virginia Tech, Pomona Valley installed 20 Liebert XDV units and two Liebert XDP pumping units.
The Liebert XDV is a vertical fan coil unit that mounts to the top of the rack. The unit draws hot air directly out of the rack or from the hot aisle and exhausts cold air into the cold aisle, where equipment air intakes are located.
The XD system uses a special coolant that is pumped as a liquid to the Liebert XDV, where it vaporizes to a gas while absorbing the heat energy of the air. It then returns to the pumping station, where it recondenses to liquid. The fluid phase change greatly enhances the system's efficiency and eliminates the possibility of damage or electrical hazard from liquid in the event of a leak.
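The efficiency claim rests on latent heat: vaporizing a coolant absorbs far more energy per kilogram than merely warming a liquid does. The numbers below are generic textbook values, not Liebert specifications; the assumed latent heat is typical of R-134a-class refrigerants, and the 10 K water-loop temperature rise is likewise an assumption.

```python
# Why a phase-change coolant moves more heat per kilogram than a
# single-phase liquid loop. Generic textbook values, not Liebert specs.
LATENT_HEAT_KJ_PER_KG = 190              # assumed, typical of R-134a-class fluids
WATER_SPECIFIC_HEAT_KJ_PER_KG_K = 4.186  # sensible heat capacity of water
DELTA_T_K = 10                           # assumed supply/return temperature rise

# Heat absorbed per kg by warming water 10 K vs. vaporizing the refrigerant.
sensible_kj_per_kg = WATER_SPECIFIC_HEAT_KJ_PER_KG_K * DELTA_T_K  # ~41.9 kJ/kg
ratio = LATENT_HEAT_KJ_PER_KG / sensible_kj_per_kg

print(f"Phase change absorbs ~{ratio:.1f}x more heat per kg than a 10 K water loop")
```

Less coolant mass flow per kilowatt means smaller lines, and, as the article notes, any leak flashes to gas rather than pooling as liquid near the electronics.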
The Liebert XDP circulates the coolant to the XDV units and ensures the coolant is always above the dew point in the room, eliminating the possibility of condensation.
When the Liebert XD began operation, the average temperature in the data center fell by more than 30°F. "We went from worrying about heat evacuation to looking for our parkas," Hoyos joked. He estimates that the system will pay for itself simply by eliminating equipment failures.