The Datacenter Dynamics event held earlier this week in Dallas flew in the face of the economic downturn and of news media reports of corporate travel restrictions impacting IT. In fact, Chris Collins reported that the nearly 400 registered attendees exceeded last year's total.

Collins was particularly pleased by the morning's presentation on the U.S. Department of Defense Enhanced Use Lease Program. Bob Penn and Patrick Giardina of the U.S. Army Corps of Engineers spoke before a packed room about the program's value to the data center industry. Collins said, "This was the first time this program had been seen anywhere. It just launched on December 2. With 14 million square feet of space and a lot of it in prime locations, it makes quite a value proposition." Look for more details on this program in the coming months.

Look also for a video I took in which DCD CTO Stephen Worn describes highlights of this event and talks about the company's New York event in March.

Microsoft continues to reach out to those who can't make time to attend events or get hands-on experience with containers. In a new blog post, David Gauthier, Data Center Infrastructure Architect, and Christian Belady, Principal Power and Cooling Architect, at Microsoft Corp., write, "Some people got the impression that this announcement was solely about a containerized server room rather than a re-thinking of the entire infrastructure. The goal of Gen 4 is to modularize not only the server and storage components, which a number of companies are already doing, but also to modularize the infrastructure, namely the electrical and mechanical systems. The real innovation is around the commonality, manufacturing, supply chain and integration of these modules to provide a plug-and-play infrastructure along with modularized server environments."

In short, Microsoft wants to make clear that it is overhauling everything about the way the data center environment is designed by taking modularity to the extreme. And the company is putting its heart and budget behind the effort. Read the entire post.

While we are talking about sharing information, Emerson Network Power released its annual Data Center Users Group (DCUG) "Inside the Data Center" report. More than 165 data center, facility, and IT managers participated in the survey, which reported these findings:

Data center criticality is rising. Three out of four respondents to this year's survey said the data center is more critical today than in past years. Many of these are large facilities that have had high-availability goals for years.

Power densities across the room and within the rack are increasing sharply. In doing so, they're pushing facility capacity to the max - and fast. In the last two years, average rack power density has risen from 6 kW to 8 kW. More than 10 percent of respondents said they will be out of data center capacity by the end of this year and a total of 68 percent expect to be at capacity within the next three years.

Despite tight capital equipment budgets, changes to the data center are still planned. Thirty-seven percent of survey participants responded that current economic conditions are impacting their ability to operate or expand their data center. However, in a sign of just how dynamic and critical the data center has become, more than 75 percent of survey respondents are still planning for changes to take place within their facilities.

Data center managers are unwilling to compromise availability for efficiency. While data center managers will take into account corporate initiatives to improve energy usage throughout the data center, availability remains the top priority.

Energy usage strategies are still missing. A comprehensive energy usage strategy remains elusive for a majority of businesses, even though one in four respondents has completed an analysis of the efficiency of their data center equipment. Only 28 percent of survey respondents have a documented strategy to reduce energy usage.

And Symantec weighed in with its top storage trends for 2008.

1. Windows Server 2008 Drives Upgrade, Compatibility Efforts: Microsoft's long-awaited server OS was made available to users in early 2008 with new features for virtualization, security, and performance, creating the typical rush among vendors to offer compatible backup and recovery products. Furthermore, the significant upgrade process created even more need for backup, as users faced possible data loss during the upgrade.

2. Microsoft Hyper-V Creates Competition: Microsoft's virtualization hypervisor, Hyper-V, ratcheted up the competition in the market with its open approach (as opposed to VMware's more proprietary offering). As users continue to move virtualization into production, Symantec believes they will discover that traditional approaches (such as VMware) will limit the ability to maximize assets and reap the benefits of server virtualization.

3. Data Growth Drives Backup Redesign Projects: Gartner and The InfoPro reported that backup redesign was a significant storage initiative for 2008. Backup redesign projects are driven by increasing complexity across the IT infrastructure, a proliferation of point tools to manage backup, and multiple management interfaces. They are also driven by the need to support virtual environments.

4. Backup Moves to Service Model: Limitations in IT resources drove some end users to adopt SaaS models for technologies such as backup, reducing the burden of purchasing, configuring, and maintaining an on-premises solution.

5. Data Center Energy Crisis: According to McKinsey, the total estimated energy bill for data centers in 2010 will be $11.5 billion, up from $8.6 billion in 2007. In 2008, energy costs across all sectors drove IT to look at the cost savings and efficiencies achievable through 'green' data center solutions.
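As a quick back-of-the-envelope check (my arithmetic, not McKinsey's), the figures cited above imply roughly 10 percent compound annual growth in data center energy spend:

```python
# Implied compound annual growth rate (CAGR) from the McKinsey figures
# cited above: $8.6B in 2007 growing to $11.5B in 2010.
bill_2007 = 8.6   # billions of dollars
bill_2010 = 11.5  # billions of dollars
years = 2010 - 2007

cagr = (bill_2010 / bill_2007) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")
```

That works out to about 10 percent per year, which helps explain why energy costs became a board-level concern in 2008.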

6. Disaster Recovery Testing Still Lacking: Despite making improvements in disaster recovery planning efforts in 2008, organizations are still coming up short when testing those plans. According to Symantec's 2008 Disaster Recovery survey, 30 percent of tests fail to meet recovery time objectives, with top reasons for failure including human error and technology failures.

7. Thin Provisioning for Reclaiming Storage: As storage resources become scarce, users are turning to technologies like thin provisioning to make better use of existing storage. Thin provisioning gives organizations the ability to deploy 'thin' storage, reclaiming storage during online migrations and driving operational efficiency. The technology lets users allocate space to servers on a just-enough and just-in-time basis.
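The allocation model described above can be sketched as a toy simulation. This is purely illustrative, assuming a hypothetical `ThinPool` class and not any vendor's actual implementation: volumes advertise a full logical size, but physical capacity is drawn from a shared pool only when data is actually written.

```python
# Toy model of thin provisioning: each server is promised a full-sized
# volume, but physical capacity is consumed only as data is written.
# Illustrative sketch only -- not any vendor's real API.

class ThinPool:
    def __init__(self, physical_gb):
        self.physical_gb = physical_gb  # real capacity in the shared pool
        self.used_gb = 0                # physically consumed so far
        self.logical_gb = 0             # total capacity promised to servers

    def provision(self, size_gb):
        """Promise a volume of size_gb; no physical space is consumed yet."""
        self.logical_gb += size_gb

    def write(self, gb):
        """Writes draw physical space on a just-in-time basis."""
        if self.used_gb + gb > self.physical_gb:
            raise RuntimeError("pool exhausted: add physical capacity")
        self.used_gb += gb

pool = ThinPool(physical_gb=1000)
for _ in range(5):          # five servers each 'see' a 500 GB volume
    pool.provision(500)
pool.write(300)             # only written data consumes real disks

print(pool.logical_gb)      # 2500 GB promised (2.5x oversubscribed)
print(pool.used_gb)         # 300 GB physically consumed
```

The oversubscription in the example (2,500 GB promised against 1,000 GB of real disk) is exactly where the efficiency gain comes from, and also why monitoring actual usage matters in a thin-provisioned environment.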

8. Economy Impacts IT Priorities: As IT budgets are tightened due to the current economic conditions, many organizations are shifting their focus to finding more efficient technologies that manage complexity while reducing resources. There is some speculation that implementing new and innovative technologies will be put on the back burner in favor of those that help IT make better use of resources.

9. Protecting and Managing Virtual Machines Comes Together: According to IDC's first Worldwide Quarterly Server Virtualization Tracker, virtualization technologies have matured significantly over the past year. In fact, virtualization license shipments for the second quarter of 2008 increased 53 percent over the prior quarter and were up 72 percent over the same quarter last year. IT professionals recognize that successful virtualization initiatives utilize tools that manage both physical AND virtual machines together.