I recently found myself mulling over what's really important in a data center because of the ongoing discussion about cloud computing.
And, I think the answer is that there is nothing important in a data center. What's really important is access to the information or data that we want or need at any given time. Even as I write this blog, I am simultaneously downloading a copy of Firefox 3.5.2 from a server located in a data center. And when I finish writing this blog, I'll post it to another server using a widely used type of publishing software called a content management system.
Most users don't care very much about the details of Mozilla's servers, and unless you are selling rack space or managed services, you probably don't care much about the server that hosts this blog.
It is the exchange of information that matters.
Data centers and servers are merely the current technologies we use to support the quick exchange of information and to keep it reliably available 24/7. In short, they are the legacy that old mainframe operators bequeathed to the 21st century.
How long will they be with us? Until something better comes along.
What's going to be better? Something that uses less energy; something that is sustainable; something that more readily provides reliability and uptime; something that is safe from physical and electronic attacks.
Is that the cloud? Maybe. But from where I sit, clouds are not game-changing enough to revolutionize the data center, and not game-changing enough to make the data center truly sustainable.
Sure, the cloud promises to be self-healing, and its distributed nature should make cloud apps relatively impervious to attack, their server hosts less critical to overall reliability, and electrical bills lower because of load shifting.
Both internal and external clouds rely on traditional data center infrastructure, albeit possibly less hardened facilities with lower Tier ratings. These data centers are inherently non-sustainable, and economies of scale in these facilities still come from increased density, which tends to bring increased complexity. Moore's Law seems to dictate that processing power will continue to grow, so ever-smaller chips will become ever more power hungry. Taken as a whole, I think this means that clouds do not mean the end of data centers as we know them, merely some changes if cloud technologies become widely adopted.
Do data centers have their drawbacks? Of course; we're all familiar with the challenges of running them. Clouds will have drawbacks too. Some, like data security vulnerabilities, may be so severe that clouds never supplant current architectures. If so, we will be off to the next big thing.
For now, the cloud seems to offer some potential for improving how we access and share information, and even for lowering the cost per transaction. Still, it is important to remember that the cloud may turn out to be only an improved way to handle data transactions, one that reduces the cost per transaction and increases the volume served without reducing the overall demand on water and energy resources.