Data Center Domination
Strong title, perhaps… but that's the kind of over-the-top messaging I have been practicing now that I am technically on a marketing team. Today is the big day – the culmination of the hard work of a great many people who spent the last several years dreaming, designing, and then building what appears to me to be the most advanced, eco-friendly data center yet conceived… (there I go again). There is a lot to promote here, so bear with me. First and foremost, be sure to tune in today as they stream the grand opening event featuring our own Rebecca Jacoby and John Manville – you can catch it live on the uStream channel: http://www.ustream.tv/ciscotv from 2:30 to 3:15 Central Time (because it's in Texas!). Jimmy Ray recently penned a nice rundown of our visit to this high-impact, low-profile data center in his blog – I thought I would share a couple of my favorite things.
Right off the bat, I think it's important to remember that although there are a great many innovative things to discuss, none of them are ground-breaking in and of themselves. It's the fact that Cisco combined them all in such a unique manner. At the top of the list – mainly because I don't want you to miss it – this is actually a facility you can visit.
1. Go See For Yourself – it's probably obvious why Cisco would make the additional investment it takes to make a facility presentable for tours: we built this thing to run Cisco operations using Cisco technology. Now, you can't just walk in (you could try… but I had trouble getting past security, and I was scheduled), but you can chat up your Cisco account manager and see what is needed. Four hundred UCS servers to get started is still a fraction of the 230,000 square feet, but it is an incredible production-level example of full-scale virtualization and unified fabric in one location. This facility, known as DC2, is in active/active status with Cisco's other North Texas facility, DC1 – close enough for the sub-millisecond latency requirements, but far enough apart for the desired geographic separation. There are some really good short videos and other goodies at the Data Center Texas 2011 site. (TechWiseTV got some very nice behind-the-scenes access just last week, and we will roll out our findings over time. The first installment is our recap of the facility in our next episode on April 28 – watch the techwisetv.com site for that shortly, or make sure you register in our virtual environment.)
2. The Power – assuming my facts are correct, this is a 5.25 MW, Tier 3 data center with two redundant 10 MW feeds, one underground and one from above – neither of which is designed to pull more than around 45% of its capacity, so that failover would not be an issue. Once you get inside the building and put on your hearing protection, you can witness the flywheel generators (each flywheel runs at about 90 dB). This is all done without batteries, of course, which means a longer lifespan and none of the hit from battery disposal and its other challenges.
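For the curious, the failover math above works out nicely. Here is a quick sketch – the 5.25 MW load, 10 MW feeds, and ~45% cap are the figures from this post, but the arithmetic itself is just my own illustrative check, not an official capacity plan:

```python
# Back-of-envelope check of the redundant-feed design described above.
FACILITY_LOAD_MW = 5.25      # design load of the data center
FEED_CAPACITY_MW = 10.0      # each of the two redundant utility feeds
NORMAL_DRAW_FRACTION = 0.45  # neither feed pulls more than ~45% of capacity

# In normal operation the load is shared, so each feed stays well under its cap.
per_feed_normal = FACILITY_LOAD_MW / 2
assert per_feed_normal <= FEED_CAPACITY_MW * NORMAL_DRAW_FRACTION

# If one feed fails, the survivor must carry the entire facility alone.
surviving_feed_utilization = FACILITY_LOAD_MW / FEED_CAPACITY_MW
print(f"Single-feed utilization after failover: {surviving_feed_utilization:.1%}")
```

With these numbers, a single surviving feed sits at roughly half its capacity even with the whole facility on it, which is why failover "would not be an issue."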
This whole notion of a rotary UPS fascinates me. A rotary UPS uses the inertia of a high-mass spinning flywheel to provide short-term ride-through in the event of power loss. The flywheel also acts as a buffer against power spikes and sags, since such short-term power events are not able to appreciably affect the rotational speed of the high-mass flywheel. It is also one of the oldest designs, predating vacuum tubes and integrated circuits, and here it is inside a state-of-the-art data center. Beyond the smoothing effect produced by the stored kinetic energy, the system has a big honkin' diesel engine attached. The flywheel has incredible inertia (it takes over two hours to stop on its own), but if power is missing for more than 10 to 12 milliseconds, that diesel starts running and the data center keeps on humming. (More information is available from the manufacturer.)
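To give a feel for the physics, here is a toy calculation of the kinetic energy a flywheel stores (E = ½Iω²) and how long that could carry a load. The mass, radius, and speed below are made-up round numbers for illustration – they are not the specs of the actual units in this facility:

```python
import math

# Hypothetical flywheel, modeled as a solid cylinder. All figures illustrative.
mass_kg = 5000.0   # assumed flywheel mass
radius_m = 0.5     # assumed radius
rpm = 1800.0       # assumed rotational speed

omega = rpm * 2 * math.pi / 60          # angular speed in rad/s
inertia = 0.5 * mass_kg * radius_m**2   # I = ½ m r² for a solid cylinder
energy_j = 0.5 * inertia * omega**2     # stored kinetic energy E = ½ I ω²

# How long could that energy carry a slice of the load before the diesel starts?
load_w = 1_000_000.0                    # assume a 1 MW slice of the facility load
ride_through_s = energy_j / load_w
print(f"Stored energy: {energy_j/1e6:.1f} MJ, "
      f"ride-through at 1 MW: {ride_through_s:.1f} s")
```

With these invented numbers you get on the order of 10 MJ and roughly ten seconds at 1 MW – orders of magnitude more than the 10–12 ms the diesel needs to take over, which is the whole point of the design.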
3. The Cooling – a number of things jumped out at us upon entering data hall 1 for the first time. It was warm. I had not personally witnessed the environmental difference in a data center following the now-relaxed ASHRAE standards, which allow data centers to operate not at 65 but at 78 degrees. To be honest… that's starting to get a bit warm for me – and I thought networking equipment was picky. That one change alone has produced a HUGE gain for the environment, as data centers everywhere simply changed the thermostat. But it also allows designers building a new facility like this one to do things that would not have made sense in the past.
The first big thing sounded ludicrous to me as a Texas resident: a data center located in Texas designed to use free outside air for cooling… what? Well, it turns out that North Texas has usable air about 65% of a normal year… astounding. I complain about being sweaty at least 80% of the year… but I trust their data. This facility can choose to run as a closed- or open-loop system depending on the weather and subsequently save millions of dollars. We got to stand in one of the (12-plus?) air handlers and saw how huge louvres can slowly open to pull in outside air and push it through the system.
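The open/closed-loop switch is essentially a threshold decision. This sketch shows the general shape of such economizer logic under assumed cutoffs of my own choosing – the real facility's control thresholds and humidity handling are certainly more sophisticated and are not published here:

```python
# Illustrative free-cooling (economizer) decision. Thresholds are invented.
SUPPLY_SETPOINT_F = 78.0   # relaxed ASHRAE-style operating temperature
MAX_OUTSIDE_F = 72.0       # hypothetical cutoff, leaving margin for fan heat
MAX_DEWPOINT_F = 60.0      # hypothetical humidity limit

def use_free_cooling(outside_f: float, dewpoint_f: float) -> bool:
    """Return True if the louvres should open for outside-air cooling."""
    return outside_f <= MAX_OUTSIDE_F and dewpoint_f <= MAX_DEWPOINT_F

# The post's "about 65% of a normal year" translates to a lot of hours:
HOURS_PER_YEAR = 8760
free_hours = 0.65 * HOURS_PER_YEAR
print(f"{free_hours:.0f} of {HOURS_PER_YEAR} hours on free cooling")
```

That 65% works out to roughly 5,700 hours a year in which the chillers can stand down – which is where the "save millions of dollars" comes from.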
*Update! – I threw together a quick little video from my tour with Jim Cribari. Forgive the lower-than-normal quality… I shot this on the fly with my own camera and on-board sound.
Although they still call it a raised floor, there were none in this facility. Everything was hard tile flooring, brilliant white throughout (to get the most out of the LED lights so that fewer of them are needed!), under open ceilings with beautiful cabling and power bus systems dropping 30-amp connectors into each rack. (415 V at the rack instead of 208 V allows for more power without transformers, the energy tax they impose, and all the circuit breakers you would normally need.) Every rack has a chimney that terminates just above the cool-air line. Cold air falls naturally in the cold aisles, equipment fans pull it through the vented rack doors and vent the heated air out the back, where it rises naturally up the chimney and into the open plenum about 30 feet up. Depending on the weather that day, the heated air takes one of two routes: right out of the building, or back through the system to be used again. About as natural as you can get. Very nice.
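The 415 V point is easy to see with the standard three-phase power formula, P = √3 · V · I · PF. The same 30 A connector delivers roughly twice the power at 415 V as at 208 V. The unity power factor below is an illustrative simplification, not a measured figure from this facility:

```python
import math

def three_phase_kw(volts_ll: float, amps: float, pf: float = 1.0) -> float:
    """Real power of a three-phase feed in kW: P = sqrt(3) * V_line-line * I * PF."""
    return math.sqrt(3) * volts_ll * amps * pf / 1000

print(f"208 V / 30 A: {three_phase_kw(208, 30):.1f} kW")  # ~10.8 kW per drop
print(f"415 V / 30 A: {three_phase_kw(415, 30):.1f} kW")  # ~21.6 kW per drop
```

Same connector, same breaker, roughly double the power per rack – and no step-down transformer in the path wasting energy along the way.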
There are so many cool things to talk about – but it's like trying to tell my wife how hard we worked in London recently and how cool it was to see Abbey Road… she did not experience it, so she did not care. I doubt she will ask for a tour of the Allen Data Center, but you sure should. If you read this far… you need to call your account manager. It's worth the trip.
Watch the grand opening today – http://www.ustream.tv/ciscotv from 2:30 to 3:15 Central Time – and/or at least check out the interactive and educational website they have set up: Data Center Texas 2011.