Thursday, July 17, 2008

Tips for Handling Data Center Moves and Shortages of Space

Look for PTS Data Center Solutions in the July 11th issue of Processor magazine (Vol.30, Issue 28).

Kurt Marko interviewed me for the feature article, “Need More Data Center Space?: IT Managers Are Faced With Options Ranging From Simple Housekeeping To Major Construction”. Adding data center space can be a complex and costly undertaking. If your data center runs out of room, the basic options are 1) reorganize and consolidate to get the most out of your existing space, 2) upgrade your technology to increase density, 3) call in a contractor to renovate and expand your current facility, 4) add on a data center in a box, or 5) build a bigger, better data center. Marko’s article discusses your options and gives a rundown of the pros and cons of each.

Michael Petrino, vice president at PTS Data Center Solutions, also appears in this issue of Processor. In Bruce Gain’s article, “Data Center Moving Day: There Is No Such Thing As ‘Over Planning’”, Michael shares his thoughts on how to prepare for a data center relocation project. Topics covered include the overall planning process, what to look for when hiring professional movers, the costs of up-time and down-time, transport options, and other complications.

Click on the links above to read the articles, or view the entire issue as a PDF.

Tuesday, July 01, 2008

Data Center Energy Summit 2008

On June 26th, the Silicon Valley Leadership Group (SVLG) held its first Data Center Energy Summit in Santa Clara, CA. The industry event focused on issues involving data center sustainability, energy efficiency and green computing.

In conjunction with Accenture and the Lawrence Berkeley National Laboratory (LBNL), the SVLG also unveiled a report containing real world case studies from its Energy Efficient Data Center Demonstration Project. You can download the report here: http://accenture.com/SVLGreport. Put together in response to the Environmental Protection Agency (EPA)’s report to Congress on data center energy efficiency, the report examines a number of innovative energy-saving initiatives.

Ken Oestreich from Cassatt points out in his blog that the bulk of the projects focused on improving infrastructure. He raises the following point:

My take is that the industry is addressing the things it knows and feels comfortable with: wires, pipes, ducts, water, freon, etc. Indeed, these are the ‘low-hanging fruit’ of opportunities to reduce data center power. But why aren't IT equipment vendors addressing the other side of the problem: Compute equipment and how it's operated?


I agree with Oestreich that methods for reducing the energy consumption of IT equipment definitely need to be explored further, but I think this report is a great step forward for the industry in terms of validating the EPA’s research and providing actionable data. I’m sure we’ll see more regarding IT equipment operations in future research.

As a side note, Data Center Knowledge has set up a calendar to help data center professionals keep track of upcoming industry events. Check it out: DataCenterConferences.com.

Thursday, June 12, 2008

PTS Data Center Solutions Turns 10 Years Old!

Time flies like the wind. Fruit flies like bananas.
-- Groucho Marx

All kidding aside, time really does fly! It's hard for me to believe, but it was a decade ago that we founded PTS Data Center Solutions (known way back when as Power Technology Sales, Inc.). Our goal then, as now, was to provide our clients with unparalleled service and optimal solutions to meet their data center and computer room needs.

As we celebrate the company’s tenth anniversary, I’d like to express my appreciation to our hardworking team of consultants, engineers, designers, field service technicians, IT personnel and business staff, as well as our families, friends, business colleagues and clients for being part of our success.

While founded and headquartered in Franklin Lakes, New Jersey, our firm has experienced significant growth over the years, starting with the opening of our West Coast office in Orange County, California in 2004. Just a few years later, PTS Data Center Solutions completed the expansion and reorganization of our NJ facilities – an accomplishment that doubled the amount of usable office and warehouse space available to our team. We also upgraded our computer room, which hosts PTS' live environment and operates as a demonstration center for potential clients to see our work first-hand.

Over the course of the last decade, we’ve had the pleasure of working with small and medium-sized companies as well as large enterprise organizations across a broad spectrum of industry verticals. We’ve grown to become a multi-faceted turnkey solutions provider, offering services for consulting, engineering, design, maintenance, construction, monitoring and more. One of the more recent additions to our business offerings is our Computational Fluid Dynamics (CFD) Services, which use powerful 3-D CFD software for the design, operational analysis, and maintenance of data centers and computer rooms of all types and sizes.

Our online presence has also grown. We’ve expanded our corporate website several times to provide new resources for our visitors. To help provide our clients and other IT professionals with insights on common data center issues, we began blogging in 2006. (I’d like to thank all of our readers for your comments and ongoing support!) Just a few months ago, we launched our own Facebook Page to help you stay up-to-date with the latest blog posts, our speaking engagements and other upcoming events. And, coming soon, look for me to be a guest blogger for the “World’s Worst Data Centers” contest, sponsored by TechTarget and APC.

This really is an exciting time for everyone at PTS Data Center Solutions. Reaching this milestone is a great achievement for our company and we’re looking forward to what the next ten years have to offer. Here's to the decades ahead!

Friday, May 23, 2008

Article: “Changing The Oil In Your Data Center”

This is just a quick update before everyone heads out for the holiday weekend.

If you haven't already done so, I encourage you to check out the May 16, 2008 issue of Processor magazine (Vol.30, Issue 20). Drew Robb interviewed me for his latest article, entitled “Changing The Oil In Your Data Center.”

Maintenance neglect is an all-too-frequent cause of unplanned data center downtime. This people-related problem stems from improper documentation of the maintenance process, failure to adhere to a set maintenance schedule, and the overlooking of critical systems. In the article, Robb talks with me about the value of implementing a scheduled maintenance plan to ensure reliable data center operations.

He also includes insights from several other data center professionals, including Steven Harris, director of data center planning at Forsythe Solutions Group (www.forsythe.com), and James Rankin, a CDW technology specialist (www.cdw.com). To read the full article, please visit the Processor website at http://www.processor.com/.

Have a safe and happy Memorial Day weekend!

Wednesday, May 07, 2008

The National Data Center Energy Efficiency Information Program

Matt Stansberry, editor of SearchDataCenter.com, posted a blog entry on the National Data Center Energy Efficiency Information Program. I'm echoing his post here because energy efficiency is such a critical issue for the industry.

The U.S. Department of Energy (DOE) and U.S. Environmental Protection Agency (EPA) have teamed up on a project that aims to reduce energy consumption in data centers. In addition to providing information and resources that promote energy efficiency, the National Data Center Energy Efficiency Information Program is reaching out to data center operators and owners to collect data on total energy use.

In the words of the EPA’s Andrew Fanara:

We've put out an information request to anyone who has a data center to ask if you would measure the energy consumption of your data center in a standardized way and provide that to us. That will help us get a better handle on what's going on nationally in terms of data center energy consumption.


Hear the EPA’s Andrew Fanara talk about the program in this video from the Uptime Institute Symposium:
If you’d like to get your data center involved, more information can be found at the EPA's ENERGY STAR data center website and the DOE's Save Energy Now data center website.

Friday, April 25, 2008

CFD Modeling for Data Center Cooling

Computational fluid dynamics (CFD) modeling is a valuable tool for understanding the movement of air through a data center, particularly as air-cooling infrastructure grows more complex. By using CFD analysis to eliminate hot spots, companies can lower energy consumption and reduce data center cooling costs.

Mark Fontecchio at SearchDataCenter.com has written a great new article on the subject, entitled “Can CFD modeling save your data center?”. Fontecchio examines the use of CFD analysis as a tool for analyzing both internal and external data center airflow.

In the article, Carl Pappalardo, IT systems engineer for Northeast Utilities, provides a first-hand account of how CFD analysis helped in optimizing their data center’s cooling efficiency. Allan Warn, a data center manager at ABN AMRO bank, also shares his thoughts on the value of renting vs. buying CFD modeling software. Fontecchio also includes insights from industry experts, including Ernesto Ferrer, a CFD engineer at Hewlett-Packard Co., and from yours truly, Pete Sacco.

For more information on data center cooling, download my White Paper, “Data Center Cooling Best Practices”, at http://www.pts-media.com (PDF format). You can also download additional publications, like Vendor White Papers, from the PTS Media Library.

To learn more about how PTS uses CFD modeling in the data center design process, please visit: http://www.ptsdcs.com/cfdservices.asp.
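To give a flavor of what these tools compute, here is a deliberately tiny numerical model of temperature in a room cross-section. This is nothing like a production CFD package, which solves full 3-D airflow and heat transport; the grid size, temperatures, and the placement of the hot "rack" cells below are all invented for illustration.

```python
import numpy as np

# Toy 2-D steady-state temperature model solved by Jacobi iteration.
# Only an illustration of finding hot spots numerically -- a real CFD
# solver models airflow velocity as well as heat, on a far finer grid.
# Grid size, temperatures, and the "rack" location are all invented.

def solve_temperature(n=20, rack_temp=45.0, wall_temp=18.0, iters=5000):
    t = np.full((n, n), wall_temp)        # room starts at the wall temp
    for _ in range(iters):
        t[5:8, 5:8] = rack_temp           # cells occupied by a hot "rack"
        # each interior cell relaxes toward the mean of its 4 neighbors
        t[1:-1, 1:-1] = 0.25 * (t[:-2, 1:-1] + t[2:, 1:-1] +
                                t[1:-1, :-2] + t[1:-1, 2:])
    t[5:8, 5:8] = rack_temp
    return t

temps = solve_temperature()
hot_spots = np.argwhere(temps > 30.0)     # cells above a 30 C threshold
```

The `hot_spots` list is the toy analogue of what a CFD study delivers: a map of where heat accumulates, which is exactly the information you need to reposition equipment or cooling.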

Tuesday, April 15, 2008

Free White Paper on Relative Sensitivity Fire Detection Systems

Fire detection is a challenge in high-end, mission critical facilities with high-density cooling requirements. This is due primarily to the varying levels of effectiveness of competing detection systems in high-velocity airflow computer room environments.

In a new white paper, PTS Data Center Solutions’ engineers Suresh Soundararaj and David Admirand, P.E. identify and analyze the effectiveness of relative sensitivity-based fire detection systems in a computer room utilizing a high-density, high-velocity, and high-volume cooling system.

In addition to examining the differences between fixed sensitivity and relative sensitivity smoke detection methodologies, Soundararaj and Admirand detail the results of fire detection tests conducted in PTS’ operational computer room and demo center using AirSense Technology’s Stratos-Micra 25® aspirating smoke detector.

The illustrated 13-page white paper, entitled “Relative Sensitivity-based Fire Detection Systems used in High Density Computer Rooms with In-Row Air Conditioning Units,” is available for download on our website in PDF format.

Tuesday, March 25, 2008

Reflections on the DataCenterDynamics Conference

Earlier this month, I had the honor of speaking in two separate sessions at the DatacenterDynamics Conference & Expo in New York City.

My first presentation, "The Impact of Numerical Modeling Techniques on Computer Room Design and Operations," was well received by its 60 or so attendees. Based on audience feedback provided both during and after the presentation, I think people really appreciated the practical examples and case studies of lessons learned since PTS began utilizing 3-D computational fluid dynamic (CFD) software as a tool for designing cooling solutions.

My second session, with co-presenter Herman Chan from Raritan Computer, Inc., was equally well received. Our presentation, "Stop Ignoring Rack PDUs", described the research both our companies have undertaken on rack-level, real-time power monitoring of IT equipment.

As part of our presentation we displayed the results of our power usage study of PTS’ computer room. The data revealed that 58% of the total power consumption of PTS’ computer room is consumed and dissipated as heat by the IT critical load. This compares favorably with figures reported elsewhere in the industry. In the coming months, both Raritan and PTS hope to release a co-written white paper documenting the results of our study.
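That 58% figure can be restated in terms of the now-common PUE metric (total facility power divided by IT load power). A quick sanity check, using a hypothetical total draw (only the 58% fraction comes from our study):

```python
# Back-of-the-envelope PUE from the IT-load fraction.
# The wattage is hypothetical; only the 58% share is from the study.

total_power_kw = 100.0        # total computer room draw (assumed)
it_fraction = 0.58            # share consumed by the IT critical load
it_power_kw = total_power_kw * it_fraction

pue = total_power_kw / it_power_kw
print(round(pue, 2))          # prints 1.72
```

A PUE of roughly 1.7 means that for every watt delivered to IT equipment, about 0.7 watts go to cooling, power distribution losses, and other overhead.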

Overall, the DatacenterDynamics show was even better attended and better sponsored than it was in 2007. I estimate there were some 500-700 people in attendance for the event. If the trend holds true from last year, about 50% of them were data center operators. The balance of the attendance was made up of consultants, vendors, and others.

This show has become my favorite regional data center industry event because of its unique single day format and for the quality of the content provided by its featured speakers (and I’m not just saying that because I’m a presenter). Many shows of this type turn into a commercial for the vendors that pay good money to sponsor the event.

What sets DataCenterDynamics apart is that the event organizers demand that each presentation be consultative in nature. Additionally, they make every effort to review and comment on each presentation before the event. If you haven’t attended this key data center industry event yet, I hope you’ll get the chance to do so in the near future.

Have you attended DataCenterDynamics? What sessions did you find most valuable? Please leave a comment to share your experience.

Tuesday, February 26, 2008

Are the Uptime Institute's Data Center Rating Tiers Out of Date?

Let me start by saying I have the utmost respect for the Uptime Institute’s Pitt Turner, P.E., John Seader, P.E., and Ken Brill and the work they have done furthering the cause of providing some standards to an otherwise standard-less subject like data center design. However, as a data center designer I feel their definitive work, Tier Classifications Define Site Infrastructure Performance, has passed its prime.

The Institute’s tier system has been in use since 1995, which is positively ancient in the world of IT.

In its latest revision, the Uptime Institute’s Tier Performance Standards morphed from a tool that helped IT and corporate decision makers weigh the differences between data center investments into a vehicle for consulting services pushing certification against their standard.

While the data their standard is based upon has been culled from real client experiences, it has been interpreted by only one expert company, ComputerSite Engineering, which works in close collaboration with the Uptime Institute. Surely the standard could be vastly improved with the outside opinions and influence of the many equally expert data center design firms in the field.

Case in point: the Uptime Institute has repeatedly defended the notion that there is no such thing as a partially tier-conforming site (Tier I+, almost Tier III, etc.). They argue that the rating is definitive and that to say such things is a misuse of the rating guide. While I understand the argument that a site is only as good as its weakest link, to claim that a site incorporating most, but not all, of the elements of a tier definition is no better than one incorporating none of them is mathematically and experientially wrong.

PTS’ actual experiences bear this out. Our clients who have all the elements of a Tier II site except for the second generator are clearly better off than those with no UPS or air conditioning redundancy (Tier I). If not for a Tier I+ designation, how do they suggest we account for the vast difference between the real availability of the two sites?
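The mathematical side of that argument is easy to sketch with basic availability arithmetic. The component availabilities below are illustrative only, not measurements from any PTS site, but they show how much real availability moves when a single redundant unit is added:

```python
# Illustrative availability arithmetic -- all numbers are hypothetical.
# A redundant pair fails only when BOTH units fail at once, so:
#   A_pair = 1 - (1 - A)**2

ups_availability = 0.999                      # single UPS (assumed)

single = ups_availability                     # no redundancy (Tier I style)
redundant = 1 - (1 - ups_availability) ** 2   # N+1 pair ("Tier I+" style)

HOURS_PER_YEAR = 8760
downtime_single = (1 - single) * HOURS_PER_YEAR        # ~8.8 hours/year
downtime_redundant = (1 - redundant) * HOURS_PER_YEAR  # well under an hour
```

Two sites separated by orders of magnitude in expected downtime end up with the same flat "Tier I" label under the Institute's all-or-nothing reading, which is exactly the information a partial-tier designation would preserve.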

It is interesting that most data center consulting, design, and engineering companies nationwide utilize elements of the white paper as a communications bridge to the non-facility engineering community, but not as part of their design process. In fact, most have developed and utilize their own internal rating guides.

While I will continue to utilize their indisputable expertise as part of my own interpretation in directing PTS’ clients with their data center investment decisions, I suggest that clients would be wise not to put all of their eggs in the Institute’s basket at this point in time.

What is your outlook on the Uptime Institute’s Tier Performance Standards? Is the four-tier perspective outdated or is it still a meaningful industry standard?

Friday, February 15, 2008

Facebook user? Add yourself as a fan of our blog!

Do you have a Facebook account? If so, you can help spread the word about the Data Center Design blog by joining our newly created Facebook Page. Be among the first to hear about blog updates, speaking engagements and other upcoming events.

Click here to view the Facebook Page for PTS Data Center Solutions and add yourself as a Fan.