Thursday, July 26, 2007

Reflections on the Data Center in a Box

Recently Jack Lyne, Executive Editor at Site Selection magazine, contacted me regarding Sun Microsystems' new Project Blackbox, colloquially dubbed the “data center in a box.” (Check out his article: “Sun’s Blackbox: A Moveable Feast for Data Centers?”) Jack’s questions led me to reflect on the current rate of adoption I’ve observed for the mobile data center.

While the energy-efficient technology offers the benefit of rapid deployment, for many companies the Blackbox does not provide a feasible alternative to the traditional brick-and-mortar data center. Similar solutions have been equally ineffective: APC’s “data center on wheels” never produced the impact that was desired, despite being a neutral processing environment.

For most companies in the market for this technology, the limitation is not space so much as access to adequate power and cooling. Despite its all-in-one packaging, the Blackbox does not mitigate the need for power and chilled water, which are the two primary cost drivers of any computer room project. At best, the Blackbox is a Tier I data center as defined by the Uptime Institute’s standard, and a Tier I facility can be built just about anywhere for equal or less money.

The Data Center Journal summed up the sentiment quite nicely:

“A mobile data center is nothing new. We have seen APC deliver a mobile data center on wheels. We have seen manufacturers such as iFortress or Rittal’s Lampertz product line which both provide heavy duty and easily constructed mobile data center facilities. ...

“The Sun ‘Data Center in a Box’ provides the industry with another choice that can meet the need of the consumer, but is it needed and will the industry embrace it or will it become a small niche market product? Time will tell.”

Monday, July 16, 2007

PTS Weighs in on Data Center Humidity Issues

Mark Fontecchio’s recent article on data center humidity issues at SearchDataCenter.com not only created buzz in the data center blogs but also generated quite a discussion amongst our team at PTS Data Center Solutions.

Data center humidity range too strict?

While some data center professionals find the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE)’s recommended relative humidity range of 40% to 55% to be restrictive, I think the tight ASHRAE standards have to be adhered to until further research proves otherwise.

PTS’s engineering manager, Dave Admirand, PE, notes that the reason the telecommunications industry is able to operate within a wider humidity range of 35% to 65% is its very strict grounding regimen. In a well-grounded system an electrical charge has no place to build up and is more readily dissipated to ground. Mr. Admirand recalls his days at IBM when the ‘old timers’ would swear by wearing leather-soled shoes (conductive enough to make a connection to the grounded raised floor) and/or washing their hands (presumably to carry dampness with them) prior to entering their data centers, to avoid a shock from discharging the charge built up on themselves onto a surface.

Relative humidity vs. absolute humidity

While I think both relative and absolute humidity should be considered, many in the industry are still designing to and measuring relative humidity. PTS mechanical engineer John Lin, PhD, points out that only two properties of moist air are independent, and data center professionals have to control the air temperature. While we can directly control only one of the humidity values, it is possible to calculate the absolute humidity (humidity ratio) from the air temperature and relative humidity. Therefore, data centers are fine as long as both temperature and relative humidity are within the permissible range.
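To make that relationship concrete, here is a minimal sketch of the calculation Dr. Lin is describing. It assumes the standard Magnus approximation for saturation vapor pressure and sea-level atmospheric pressure; the example numbers are illustrative, not PTS design values.

```python
import math

def saturation_vapor_pressure(temp_c):
    """Saturation vapor pressure over water in Pa (Magnus approximation)."""
    return 610.94 * math.exp(17.625 * temp_c / (temp_c + 243.04))

def humidity_ratio(temp_c, rh_percent, pressure_pa=101325.0):
    """Humidity ratio (kg water vapor per kg dry air) computed from the two
    values we actually measure and control: dry-bulb temperature and RH."""
    p_w = (rh_percent / 100.0) * saturation_vapor_pressure(temp_c)
    return 0.622 * p_w / (pressure_pa - p_w)

# Example: a 72 F (22.2 C) supply at 45% RH, near the middle of the
# ASHRAE 40-55% recommendation.
print(round(humidity_ratio(22.2, 45.0), 4))  # ~0.0075 kg/kg
```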

Coy Stine’s example is right on the mark. The high temperature delta between inlet and outlet air that can be realized in some dense IT equipment may lead to some very low humidity air inside critical electronics, which can lead to electrostatic discharge (ESD). My experience, however, is that I am not encountering ESD-related data loss at the estimated 50-100 data centers I visit each year. This leads me to believe that there is a slight tendency to ‘make mountains out of molehills’ regarding ESD.

After further reflection on Stine’s scenario about the low relative humidity air at the back of the servers, I was reminded again by Mr. Admirand that it won’t make much of a difference, since that air is being discharged back to the CRAC equipment. Furthermore, even if the air is recirculated back to the critical load’s inlet, the absolute moisture content of the air remains constant and the mixed air temperature is not low enough to cause a problem. John Lin contends this is the reason we only control temperature and relative humidity.
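A rough illustration of both points, using the same Magnus approximation as in the sketch above; the 22°C / 45% RH supply and the 20°C rise through the equipment are assumed figures, chosen only to show the shape of the effect:

```python
import math

def saturation_vapor_pressure(temp_c):
    """Saturation vapor pressure over water in Pa (Magnus approximation)."""
    return 610.94 * math.exp(17.625 * temp_c / (temp_c + 243.04))

def relative_humidity_at(temp_c, vapor_pressure_pa):
    """RH (%) of air holding a fixed amount of moisture at a new temperature."""
    return 100.0 * vapor_pressure_pa / saturation_vapor_pressure(temp_c)

inlet_c, inlet_rh = 22.0, 45.0   # assumed supply conditions
p_w = (inlet_rh / 100.0) * saturation_vapor_pressure(inlet_c)  # moisture is fixed

outlet_c = inlet_c + 20.0        # assumed 20 C rise through dense IT equipment
print(round(relative_humidity_at(outlet_c, p_w), 1))  # ~14.5% -- Stine's low-RH air

# Back at the CRAC return, or remixed toward room temperature, the same
# moisture content at a lower temperature is right back in range.
print(round(relative_humidity_at(24.0, p_w), 1))       # ~40%
```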

It’s been our stance at PTS that the most important goal of humidity control is to prevent condensation. The only real danger posed by very warm, high-moisture-content air is that it will condense easily should its temperature drop below the dew point.
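As a quick sanity check on that dew point argument, here is a sketch using the same Magnus form; the 30°C / 70% RH figures are my own illustrative example of warm, moist air:

```python
import math

def dew_point_c(temp_c, rh_percent):
    """Dew point (C) from dry-bulb temperature and RH (Magnus approximation)."""
    gamma = math.log(rh_percent / 100.0) + 17.625 * temp_c / (temp_c + 243.04)
    return 243.04 * gamma / (17.625 - gamma)

# Warm, moist air -- say 30 C at 70% RH -- condenses on any surface colder
# than its dew point, such as an uninsulated chilled-water line or a cold coil.
print(round(dew_point_c(30.0, 70.0), 1))  # ~23.9 C
```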

Separate data center humidity from cooling units?

I have no doubt that R. Stephen Spinazzola’s conclusion that it is cheaper to operate humidity control as a stand-alone air handler is on target. However, experience dictates that the approach is an uphill sell, since the savings are indirect and realized only over time in operating costs. The reality is that the upfront capital cost to deploy these systems is greater, especially in a smaller environment where it is harder to control humidity anyway.

Humidity control is very dependent on the environment for which you are designing a system. In a large data center it is actually easier, because most of the building is presumably a controlled data center environment. For SMEs with tenant-space computer rooms, however, humidity control is much more difficult, since it is dictated by the overall building’s humidity environment. At best, a computer room is a giant sponge; the question is whether it is gaining water from or giving off water to the rest of the building.

The design and construction of a data center or computer room, including its cooling system, should meet the specific environmental needs of its equipment. For now, our approach at PTS Data Center Solutions has been to utilize humidity control within DX units for both small and large spaces. Conversely, we control humidity separately when deploying in-row, chilled water cooling techniques for higher density cooling applications in smaller sites.

For more information on data center humidity issues, read “Changing Cooling Requirements Leave Many Data Centers at Risk” or visit our Computer Room Cooling Systems page.