Mark Fontecchio’s recent article on data center humidity issues at SearchDataCenter.com not only created buzz in the data center blogs but also generated quite a discussion among our team at PTS Data Center Solutions.
Data center humidity range too strict?
While some data center professionals find the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) recommended relative humidity range of 40% to 55% restrictive, I think the tight ASHRAE standards should be adhered to until further research proves otherwise.
PTS’s engineering manager, Dave Admirand, PE, notes that the telecommunications industry is able to operate within a wider humidity range of 35% to 65% because of its very strict grounding regimen. In a well-grounded system an electrical charge has no place to build up and is more readily dissipated to ground. Mr. Admirand recalls his days at IBM when the ‘old timers’ would swear by wearing leather-soled shoes (conductive enough to make contact with the grounded raised floor) and/or washing their hands (presumably to carry dampness with them) before entering the data center, so as to avoid a shock from discharging the charge built up on their bodies onto a surface.
Relative humidity vs. absolute humidity
While I think both relative and absolute humidity should be considered, many in the industry still design to and measure relative humidity. PTS mechanical engineer John Lin, PhD, points out that only two properties of the air are independent, and data center professionals have to control the air temperature. That leaves only one humidity value we can control directly, but the absolute humidity (humidity ratio) can be calculated from the air temperature and relative humidity. Therefore, a data center is fine as long as both temperature and relative humidity are within the permissible range.
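As a concrete illustration, the sketch below computes the humidity ratio from dry-bulb temperature and relative humidity. It uses the Magnus-Tetens approximation for saturation vapor pressure and assumes sea-level atmospheric pressure; neither the formula choice nor the example numbers come from the original discussion.

```python
import math

def saturation_pressure_kpa(t_c: float) -> float:
    """Saturation vapor pressure (kPa) via the Magnus-Tetens approximation."""
    return 0.61094 * math.exp(17.625 * t_c / (t_c + 243.04))

def humidity_ratio(t_c: float, rh_percent: float, p_kpa: float = 101.325) -> float:
    """Humidity ratio (kg water vapor per kg dry air) from dry-bulb temperature and RH."""
    p_w = (rh_percent / 100.0) * saturation_pressure_kpa(t_c)
    return 0.622 * p_w / (p_kpa - p_w)

# Hypothetical example: a 22 degC room at 45% RH, inside the ASHRAE recommended band.
w = humidity_ratio(22.0, 45.0)
print(f"Humidity ratio: {w * 1000:.1f} g/kg dry air")  # roughly 7.4 g/kg
```

In other words, once temperature and relative humidity are both held in range, the absolute moisture content is pinned down as well.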
Coy Stine’s example is right on the mark. The high temperature delta between inlet and outlet air in some dense IT equipment can produce very low relative humidity air inside critical electronics, which can lead to electrostatic discharge (ESD). My experience, however, is that I am not encountering data loss scenarios attributable to ESD at the estimated 50 to 100 data centers I visit each year. This leads me to believe there is a slight tendency to make mountains out of molehills regarding ESD.
After further reflection on Stine’s scenario about the low relative humidity air at the back of the servers, I was reminded again by Mr. Admirand that it won’t make much of a difference, since that air is discharged back to the CRAC equipment. Furthermore, even if the air is recirculated back to the critical load’s inlet, the absolute moisture content of the air remains constant and the mixed air temperature is not low enough to cause a problem. John Lin contends this is the reason we only control temperature and relative humidity.
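A rough sketch of that point, with assumed (not sourced) numbers for supply temperature, relative humidity, and server delta-T: heating the air changes its relative humidity but not its absolute moisture content.

```python
import math

def p_ws_kpa(t_c: float) -> float:
    """Saturation vapor pressure (kPa), Magnus-Tetens approximation."""
    return 0.61094 * math.exp(17.625 * t_c / (t_c + 243.04))

# Assumed supply air: 22 degC at 45% RH. The water vapor partial pressure is set
# by the moisture content, and heating the air does not change it.
p_w = 0.45 * p_ws_kpa(22.0)

# Assume a 15 degC temperature rise through a dense server (22 -> 37 degC).
rh_exhaust = 100.0 * p_w / p_ws_kpa(37.0)
print(f"Exhaust RH at 37 degC: {rh_exhaust:.0f}%")  # roughly 19%, same absolute moisture
```

The exhaust reads very dry, but once that air mixes back down toward supply temperature its relative humidity recovers, because no water was added or removed along the way.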
It has been our stance at PTS that the most important goal of humidity control is to prevent condensation. The only real danger with very warm, high-moisture-content air is that it will readily condense should its temperature drop below the dew point.
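To show the condensation criterion itself, here is a small dew-point check in the same vein; the inverted Magnus-Tetens formula and the example figures are assumptions for illustration, not values from the article.

```python
import math

def dew_point_c(t_c: float, rh_percent: float) -> float:
    """Dew point (degC) from dry-bulb temperature and RH, inverted Magnus-Tetens formula."""
    gamma = math.log(rh_percent / 100.0) + 17.625 * t_c / (t_c + 243.04)
    return 243.04 * gamma / (17.625 - gamma)

# Assumed warm, moist air: 30 degC at 70% RH.
t_dp = dew_point_c(30.0, 70.0)
print(f"Dew point: {t_dp:.1f} degC")  # roughly 24 degC

# Any surface colder than the dew point -- a chilled-water pipe, an uninsulated
# duct, a cold cabinet panel -- will sweat, which is the real risk to manage.
```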
Separate data center humidity from cooling units?
I have no doubt that R. Stephen Spinazzola’s conclusion that it is cheaper to operate humidity control as a stand-alone air handler is on target. However, experience dictates that this approach is an uphill sell, since the savings are indirect and realized only over time as operational savings. The reality is that the upfront capital cost of deploying these systems is greater, especially in a smaller environment, where humidity is harder to control anyway.
Humidity control is very dependent on the environment for which you are designing the system. In a large data center it is actually easier, because most of the building is presumably a controlled data center environment. For SMEs with computer rooms in tenant space, however, humidity control is much more difficult, since it is dictated by the humidity environment of the building as a whole. In effect, a computer room is a giant sponge; the question is whether it is gaining water from or giving off water to the rest of the building.
The design and construction of a data center or computer room, including its cooling system, should meet the specific environmental needs of its equipment. For now, our approach at PTS Data Center Solutions has been to use humidity control within DX units for both small and large spaces. However, we control humidity separately when deploying in-row, chilled-water cooling for higher-density applications in smaller sites.
For more information on data center humidity issues, read “Changing Cooling Requirements Leave Many Data Centers at Risk” or visit our Computer Room Cooling Systems page.