Mark Fontecchio’s recent article on data center humidity issues at SearchDataCenter.com not only created buzz in the data center blogs, but also generated quite a discussion among our team at PTS Data Center Solutions.
Data center humidity range too strict?
While some data center professionals find the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) recommended relative humidity range of 40% to 55% restrictive, I think the tight ASHRAE standards have to be adhered to until further research proves otherwise.
PTS’s engineering manager, Dave Admirand, PE, notes that the reason the telecommunications industry is able to operate within a wider humidity range of 35% to 65% is its very strict grounding regimen. In a well-grounded system, an electrical charge has no place to build up and is more readily dissipated to ground. Mr. Admirand recalls his days at IBM when the ‘old timers’ would swear by wearing leather-soled shoes (conductive enough to make a connection to the grounded raised floor) and/or washing their hands (presumably to carry dampness with them) before entering their data centers, to avoid discharging the static build-up on themselves onto a surface.
Relative humidity vs. absolute humidity
While I think both relative and absolute humidity should be considered, many in the industry are still designing to and measuring relative humidity. PTS mechanical engineer John Lin, PhD, points out that only two properties of the air can be controlled independently, and data center professionals have to control air temperature; that leaves room to control only one of the humidity values. The absolute humidity (humidity ratio) can then be calculated from air temperature and relative humidity. Therefore, data centers are fine as long as both temperature and relative humidity are within the permissible range.
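That calculation is straightforward to sketch. The snippet below is a minimal illustration (not a PTS tool), assuming the common Magnus approximation for saturation vapor pressure and standard atmospheric pressure; it derives the humidity ratio from dry-bulb temperature and relative humidity:

```python
import math

def saturation_vapor_pressure_kpa(temp_c):
    """Saturation vapor pressure of water (kPa), Magnus approximation."""
    return 0.61094 * math.exp(17.625 * temp_c / (temp_c + 243.04))

def humidity_ratio(temp_c, rel_humidity, pressure_kpa=101.325):
    """Humidity ratio (kg of water vapor per kg of dry air) from
    dry-bulb temperature (deg C) and relative humidity (0-1)."""
    p_w = rel_humidity * saturation_vapor_pressure_kpa(temp_c)
    return 0.622 * p_w / (pressure_kpa - p_w)

# Air at 22 C and 50% RH holds roughly 8 g of moisture per kg of dry air.
w = humidity_ratio(22.0, 0.50)
print(f"{w * 1000:.1f} g/kg")
```

In other words, once temperature and relative humidity are held in range, the absolute moisture content is fully determined.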
Coy Stine’s example is right on the mark. The high temperature delta between inlet and outlet air in some dense IT equipment can produce very low-humidity air inside critical electronics, which can lead to electrostatic discharge (ESD). My experience, however, is that I am not encountering data loss scenarios due to ESD at the estimated 50 to 100 data centers I visit each year. This leads me to believe there is a slight tendency to make mountains out of molehills regarding ESD.
After further reflection on Stine’s scenario about the low relative humidity air at the back of the servers, I was reminded again by Mr. Admirand that it won’t make much of a difference, since that air is being discharged back to the CRAC equipment. Furthermore, even if the air is recirculated back to the critical load’s inlet, the absolute moisture content of the air remains constant and the mixed-air temperature is not low enough to cause a problem. Dr. Lin contends this is the reason why we only control temperature and relative humidity.
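The recirculation argument can be checked with the same psychrometric relations. This sketch (again using the Magnus approximation; the temperatures are illustrative, not measured server values) shows that sensibly heating air leaves its moisture content untouched while its relative humidity plunges, which is exactly the server-outlet effect Stine describes:

```python
import math

def p_ws_kpa(temp_c):
    """Saturation vapor pressure of water (kPa), Magnus approximation."""
    return 0.61094 * math.exp(17.625 * temp_c / (temp_c + 243.04))

def rh_after_heating(temp_in_c, rh_in, temp_out_c):
    """RH after sensible heating at constant moisture content:
    the vapor pressure is unchanged, only the saturation pressure rises."""
    p_w = rh_in * p_ws_kpa(temp_in_c)
    return p_w / p_ws_kpa(temp_out_c)

# Illustrative inlet at 22 C / 50% RH heated to 45 C at the outlet:
# RH falls below 15%, yet not a gram of moisture has left the air.
print(f"{rh_after_heating(22.0, 0.50, 45.0):.0%}")
```

Once that outlet air mixes back down to inlet temperature at the CRAC, the relative humidity recovers on its own, which is why controlling temperature and relative humidity at one point suffices.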
It’s been our stance at PTS that the most important goal of humidity control is to prevent condensation. The only real danger of very warm, high-moisture-content air is that it will condense readily should its temperature drop below the dew point.
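That condensation threshold is simply the dew point, which follows from the same temperature and relative humidity measurements. A rough sketch (Magnus approximation once more; the example conditions are hypothetical) for estimating the surface temperature below which moisture will form:

```python
import math

def dew_point_c(temp_c, rel_humidity):
    """Approximate dew point (deg C) from dry-bulb temperature and
    relative humidity (0-1), via the Magnus formula."""
    gamma = math.log(rel_humidity) + 17.625 * temp_c / (temp_c + 243.04)
    return 243.04 * gamma / (17.625 - gamma)

# Air at 22 C / 50% RH condenses on any surface colder than about 11 C,
# e.g. uninsulated chilled-water piping or an overcooled coil.
print(f"{dew_point_c(22.0, 0.50):.1f} C")
```

Keeping every surface the air touches above this figure is the practical test of a humidity control scheme.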
Separate data center humidity from cooling units?
I have no doubt that R. Stephen Spinazzola’s conclusion that it is cheaper to operate humidity control as a stand-alone air handler is on target. However, experience dictates that this approach is an uphill sell, since the savings are indirect and realized only as part of operational savings. The reality is that the upfront capital cost of deploying these systems is greater, especially in a smaller environment where humidity is harder to control anyway.
Humidity control is very dependent on the environment for which you are designing a system. In a large data center it is actually easier, because most if not all of the building is presumably a controlled data center environment. However, for SMEs with tenant-space computer rooms, humidity control is much more difficult, since it is dictated by the overall building’s humidity environment. At best, a computer room is a giant sponge; the question is whether it is gaining water from or giving off water to the rest of the building.
The design and construction of a data center or computer room, including its cooling system, should meet the specific environmental needs of its equipment. For now, our approach at PTS Data Center Solutions has been to utilize humidity control within DX units for both small and large spaces. By contrast, we control humidity separately when deploying in-row, chilled-water cooling techniques for higher-density cooling applications in smaller sites.
For more information on data center humidity issues, read “Changing Cooling Requirements Leave Many Data Centers at Risk” or visit our Computer Room Cooling Systems page.
Welcome!
This blog is brought to you by the consultants and engineers at PTS Data Center Solutions.
PTS Data Center Solutions designs, builds, and operates data centers that are great for companies and their people, but better for the planet. Visit us at www.ptsdcs.com.
Monday, July 16, 2007