Thursday, June 12, 2008

PTS Data Center Solutions Turns 10 Years Old!

Time flies like the wind. Fruit flies like bananas.
-- Groucho Marx

All kidding aside, time really does fly! It's hard for me to believe, but it was a decade ago that we founded PTS Data Center Solutions (known way back when as Power Technology Sales, Inc.). Our goal then, as now, was to provide our clients with unparalleled service and optimal solutions to meet their data center and computer room needs.

As we celebrate the company’s tenth anniversary, I’d like to express my appreciation to our hardworking team of consultants, engineers, designers, field service technicians, IT personnel and business staff, as well as our families, friends, business colleagues and clients for being part of our success.

While founded and headquartered in Franklin Lakes, New Jersey, our firm has experienced significant growth over the years, starting with the opening of our West Coast office in Orange County, California in 2004. Just a few years later, PTS Data Center Solutions completed the expansion and reorganization of our NJ facilities – an accomplishment that doubled the amount of useable office and warehouse space available to our team. We also upgraded our computer room, which hosts PTS' live environment and operates as a demonstration center for potential clients to see our work first-hand.

Over the course of the last decade, we’ve had the pleasure of working with small and medium-sized companies as well as large enterprise organizations across a broad spectrum of industry verticals. We’ve grown to become a multi-faceted turnkey solutions provider, offering services for consulting, engineering, design, maintenance, construction, monitoring and more. One of the more recent additions to our business offerings is our Computational Fluid Dynamic (CFD) Services, which use powerful 3-D CFD software for the design, operational analysis, and maintenance of data centers and computer rooms of all types and sizes.

Our online presence has also grown. We’ve expanded our corporate website several times to provide new resources for our visitors. To help provide our clients and other IT professionals with insights on common data center issues, we began blogging in 2006. (I’d like to thank all of our readers for your comments and ongoing support!) Just a few months ago, we launched our own Facebook Page to help you stay up-to-date with the latest blog posts, our speaking engagements and other upcoming events. And, coming soon, look for me to be a guest blogger for the “World’s Worst Data Centers” contest, sponsored by TechTarget and APC.

This really is an exciting time for everyone at PTS Data Center Solutions. Reaching this milestone is a great achievement for our company and we’re looking forward to what the next ten years have to offer. Here's to the decades ahead!

Friday, May 23, 2008

Article: “Changing The Oil In Your Data Center”

This is just a quick update before everyone heads out for the holiday weekend.

If you haven't already done so, I encourage you to check out the May 16, 2008 issue of Processor magazine (Vol.30, Issue 20). Drew Robb interviewed me for his latest article, entitled “Changing The Oil In Your Data Center.”

Maintenance neglect is an all-too-frequent cause of unplanned data center downtime. This people-related problem stems from improper documentation of the maintenance process, failure to adhere to a set maintenance schedule, and the overlooking of critical systems. In the article, Robb talks with me about the value of implementing a scheduled maintenance plan to ensure reliable data center operations.
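The fix for these process failures is largely organizational, but even a trivial tracking tool beats an undocumented schedule. Here is a minimal sketch of the idea in Python; the systems, document paths, and intervals below are hypothetical examples, not recommendations from the article:

```python
# A minimal preventive-maintenance tracker addressing the three failure
# modes above: undocumented procedures, missed schedules, and overlooked
# systems. All task data here is illustrative.
from datetime import date, timedelta

tasks = [
    # (system, procedure document, interval in days, last completed)
    ("UPS batteries",  "doc/ups-pm.pdf",  90, date(2008, 1, 10)),
    ("CRAC filters",   "doc/crac-pm.pdf", 30, date(2008, 5, 1)),
    ("Generator test", "doc/gen-pm.pdf",  30, date(2008, 3, 15)),
]

def overdue(tasks, today):
    """Return the systems whose maintenance interval has lapsed."""
    return [name for name, _doc, interval, last in tasks
            if today - last > timedelta(days=interval)]

if __name__ == "__main__":
    print(overdue(tasks, date(2008, 5, 23)))  # ['UPS batteries', 'Generator test']
```

Listing every critical system in one place, with its procedure document attached, is what keeps systems from being overlooked in the first place.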

He also includes insights from several other data center professionals, including Steven Harris, director of data center planning at Forsythe Solutions Group (www.forsythe.com), and James Rankin, a CDW technology specialist (www.cdw.com). To read the full article, please visit the Processor website at http://www.processor.com/.

Have a safe and happy Memorial Day weekend!

Wednesday, May 07, 2008

The National Data Center Energy Efficiency Information Program

Matt Stansberry, editor of SearchDataCenter.com, posted a blog entry on the National Data Center Energy Efficiency Information Program. I'm echoing his post here because energy efficiency is such a critical issue for the industry.

The U.S. Department of Energy (DOE) and U.S. Environmental Protection Agency (EPA) have teamed up on a project that aims to reduce energy consumption in data centers. In addition to providing information and resources that promote energy efficiency, the National Data Center Energy Efficiency Information Program is reaching out to data center operators and owners to collect data on total energy use.

In the words of the EPA’s Andrew Fanara:

We've put out an information request to anyone who has a data center to ask if you would measure the energy consumption of your data center in a standardized way and provide that to us. That will help us get a better handle on what's going on nationally in terms of data center energy consumption.


Hear the EPA’s Andrew Fanara talk about the program in this video from the Uptime Institute Symposium:



If you’d like to get your data center involved, more information can be found at the EPA's ENERGY STAR data center website and the DOE's Save Energy Now data center website.

Friday, April 25, 2008

CFD Modeling for Data Center Cooling

Computational fluid dynamics (CFD) modeling is a valuable tool for understanding the movement of air through a data center, particularly as air-cooling infrastructure grows more complex. By using CFD analysis to eliminate hot spots, companies can lower energy consumption and reduce data center cooling costs.

Mark Fontecchio at SearchDataCenter.com has written a great new article on the subject, entitled “Can CFD modeling save your data center?”. Fontecchio examines the use of CFD analysis as a tool for analyzing both internal and external data center airflow.

In the article, Carl Pappalardo, IT systems engineer for Northeast Utilities, provides a first-hand account of how CFD analysis helped in optimizing their data center’s cooling efficiency. Allan Warn, a data center manager at ABN AMRO bank, also shares his thoughts on the value of renting vs. buying CFD modeling software. Fontecchio also includes insights from industry experts, including Ernesto Ferrer, a CFD engineer at Hewlett-Packard Co., and from yours truly, Pete Sacco.

For more information on data center cooling, download my White Paper, “Data Center Cooling Best Practices”, at http://www.pts-media.com (PDF format). You can also download additional publications, like Vendor White Papers, from the PTS Media Library.

To learn more about how PTS uses CFD modeling in the data center design process, please visit: http://www.ptsdcs.com/cfdservices.asp.
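As a back-of-envelope complement to full CFD modeling, the standard sensible-heat equation for air gives a first-pass estimate of how much airflow a given heat load requires. The sketch below uses illustrative numbers, not figures from any PTS project:

```python
# Back-of-envelope airflow check (not a substitute for CFD analysis).
# Uses the standard sensible-heat equation for air near sea level:
#   Q (BTU/hr) = 1.08 * CFM * delta_T (deg F)

def required_airflow_cfm(it_load_kw, delta_t_f=20.0):
    """Airflow (CFM) needed to remove a given IT heat load.

    it_load_kw -- heat dissipated by the equipment, in kilowatts
    delta_t_f  -- server inlet-to-outlet temperature rise, in deg F
    """
    btu_per_hr = it_load_kw * 3412.14  # 1 kW = 3412.14 BTU/hr
    return btu_per_hr / (1.08 * delta_t_f)

if __name__ == "__main__":
    # A hypothetical 5 kW rack with a 20 deg F temperature rise:
    cfm = required_airflow_cfm(5.0)
    print(f"Approx. {cfm:.0f} CFM required")  # roughly 790 CFM
```

A calculation like this tells you how much air a rack needs; only CFD modeling tells you whether that air actually reaches the inlets instead of recirculating as a hot spot.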

Tuesday, April 15, 2008

Free White Paper on Relative Sensitivity Fire Detection Systems

Fire detection is a challenge in high-end, mission critical facilities with high-density cooling requirements. This is due primarily to the varying levels of effectiveness of competing detection systems in high-velocity airflow computer room environments.

In a new white paper, PTS Data Center Solutions’ engineers Suresh Soundararaj and David Admirand, P.E. identify and analyze the effectiveness of relative sensitivity-based fire detection systems in a computer room utilizing a high-density, high-velocity, and high-volume cooling system.

In addition to examining the differences between fixed sensitivity and relative sensitivity smoke detection methodologies, Soundararaj and Admirand detail the results of fire detection tests conducted in PTS’ operational computer room and demo center using AirSense Technology’s Stratos-Micra 25® aspirating smoke detector.

The illustrated 13-page white paper, entitled “Relative Sensitivity-based Fire Detection Systems used in High Density Computer Rooms with In-Row Air Conditioning Units,” is available for download on our website in PDF format.

Tuesday, March 25, 2008

Reflections on the DataCenterDynamics Conference

Earlier this month, I had the honor of speaking in two separate sessions at the DatacenterDynamics Conference & Expo in New York City.

My first presentation, "The Impact of Numerical Modeling Techniques on Computer Room Design and Operations," was well received by its 60 or so attendees. Based on audience feedback provided both during and after the presentation, I think people really appreciated the practical examples and case studies of lessons learned since PTS began utilizing 3-D computational fluid dynamic (CFD) software as a tool for designing cooling solutions.

My second stint, with co-presenter Herman Chan from Raritan Computer, Inc., was equally well received. Our presentation on "Stop Ignoring Rack PDUs" described the research both our companies have undertaken regarding rack-level, IT equipment, real-time power monitoring.

As part of our presentation, we displayed the results of our power usage study of PTS’ computer room. The data revealed that 58% of the total power consumption of PTS’ computer room is dissipated as heat by the IT critical load. This compares favorably with much of the published industry data. In the coming months, both Raritan and PTS hope to release a co-written white paper documenting the results of our study.
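Readers tracking similar numbers can compute the same ratio from their own measurements; the 58% figure corresponds to what is now commonly called DCiE, with PUE as its inverse. The kilowatt values below are illustrative, not PTS' actual measurements:

```python
# Simple efficiency ratios from a power-usage study like the one above.
# DCiE = IT load / total facility load; PUE is its inverse.

def dcie(it_load_kw, total_facility_kw):
    """Fraction of total facility power reaching the IT critical load."""
    return it_load_kw / total_facility_kw

def pue(it_load_kw, total_facility_kw):
    """Power Usage Effectiveness: total power per unit of IT power."""
    return total_facility_kw / it_load_kw

if __name__ == "__main__":
    total, it = 100.0, 58.0  # hypothetical kW figures matching a 58% ratio
    print(f"DCiE = {dcie(it, total):.2f}")  # 0.58
    print(f"PUE  = {pue(it, total):.2f}")   # 1.72
```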

Overall, the DatacenterDynamics’ show was even better attended and better sponsored than it was in 2007. I estimate there were some 500-700 people in attendance for the event. If the trend holds true from last year, about 50% of them were data center operators. The balance of the attendance was made up of consultants, vendors, and others.

This show has become my favorite regional data center industry event because of its unique single day format and for the quality of the content provided by its featured speakers (and I’m not just saying that because I’m a presenter). Many shows of this type turn into a commercial for the vendors that pay good money to sponsor the event.

What sets DataCenterDynamics apart is that the event organizers demand that each presentation be consultative in nature. Additionally, they make every effort to review and comment on each presentation before the event. If you haven’t attended this key data center industry event yet, I hope you’ll get the chance to do so in the near future.

Have you attended DataCenterDynamics? What sessions did you find most valuable? Please leave a comment to share your experience.

Tuesday, February 26, 2008

Are the Uptime Institute's Data Center Rating Tiers Out of Date?

Let me start by saying I have the utmost respect for the Uptime Institute’s Pitt Turner, P.E., John Seader, P.E., and Ken Brill and the work they have done furthering the cause of providing some standards to an otherwise standard-less subject like data center design. However, as a data center designer I feel their definitive work, Tier Classifications Define Site Infrastructure Performance, has passed its prime.

The Institute’s systems have been in use since 1995, which is positively ancient in the world of IT.

In its latest revision, the Uptime Institute’s Tier Performance Standards have morphed from a tool that helped IT and corporate decision makers weigh the differences between data center investments into a case study for consulting services that push certification against their standard.

While the data their standard is based upon has been culled from real client experiences, the analysis of that data has been interpreted by only one expert company, ComputerSite Engineering, which works in close collaboration with the Uptime Institute. Surely, the standard could be vastly improved with the outside opinion and influence of the many equally expert data center design firms that exist.

Case in point: the Uptime Institute has repeatedly defended the notion that there is no such thing as a partially tier-conforming site (Tier I+, almost Tier III, etc.). They argue that the rating is definitive and that to say such things is a misuse of the rating guide. While I understand the argument that a site is only as good as its weakest link, to say that a site incorporating most, but not all, of the elements of a tier definition is no better than one incorporating none of them is mathematically and experientially wrong.

PTS’ actual experiences bear this out. Our clients that have all the elements of a Tier II site except for the second generator are clearly better off than those with no UPS and/or air conditioning redundancy (Tier I). Therefore, if not with a Tier I+ designation, how do they suggest we account for the vast difference between the real availability of the two sites?

It is interesting that most data center consulting, design, and engineering companies nationwide utilize elements of the white paper as a communications bridge to the non-facility engineering community, but not as part of their design process. In fact, most have developed and utilize their own internal rating guides.

While I will continue to draw on their indisputable expertise as part of my own interpretation in directing PTS’ clients with their data center investment decisions, I suggest that clients would be wise not to put all of their eggs in the Institute’s basket at this point in time.

What is your outlook on the Uptime Institute’s Tier Performance Standards? Is the four-tier perspective outdated or is it still a meaningful industry standard?

Friday, February 15, 2008

Facebook user? Add yourself as a fan of our blog!

Do you have a Facebook account? If so, you can help spread the word about the Data Center Design blog by joining our newly created Facebook Page. Be among the first to hear about blog updates, speaking engagements and other upcoming events.

Click here to view the Facebook Page for PTS Data Center Solutions and add yourself as a Fan.

Wednesday, February 13, 2008

Are “free” computer room site assessment services worth the money you pay for them?

It has become commonplace for the myriad of IT and support infrastructure OEMs to offer free site assessment services in an effort to woo clients into purchasing their equipment.

While it was already difficult enough for small- to mid-size design consulting service providers to build credibility and brand-identity in the ultra-competitive world of computer room design, in the past few years these firms have seen some of their most valuable vendor partners become chief competitors.

This is not just a case of sour grapes. The design services provided by most OEMs do their clients a disservice. Clients are usually provided only the part of the picture that suits the manufacturer, and they are forced to fill in the blanks. Unfortunately, the blanks are often not even identified. This leads to some very unhappy bean counters.

One leading power and cooling system manufacturer’s entire go-to-market strategy is based on allowing inexperienced enthusiasts to represent themselves as capable designers by providing them with access to an online configuration tool. Being an expert in its use myself, I can safely say the information it provides is rudimentary at best. Our team at PTS Data Center Solutions uses this tool only for ordering purposes and never for design. These online tools are being used by the manufacturer’s own systems engineers, reseller partners, or end users themselves to try to simplify the inherently complicated subject of computer room support infrastructure design.

The manufacturer’s configuration tool only provides solution recommendations for the equipment they manufacture. Much of the rest of the complete solution is missing, including the infrastructure they don’t sell, the labor to install any of it, and the engineering services to produce the design documentation required to file the necessary permits. Worse, little advice is provided as to the best project delivery methodology. While I would be the first to admit the traditional consulting engineering community has been slow to adapt to the latest design practices, the truth remains that, as a matter of course, changes to facilities still require the services of a licensed engineer. This includes the sizing of the power and cooling infrastructure.

That’s not to say the use of tools doesn’t have its place. Any consultant-recommended solutions should always be based on sound engineering using the latest technologies, such as computational fluid dynamic (CFD) modeling.

Individuals seeking computer room solutions are better served by hiring experienced, licensed, capable design engineers that are well versed in all of the major infrastructure solutions. This ensures that for a moderate amount of money spent in the planning stage you come away with a properly designed project with a well-defined scope, schedule, and budget.

Tuesday, February 05, 2008

PTS’ 2008 Predictions for the Data Center Industry

I consider myself a veteran of the data center design industry. Additionally, I have the good fortune to visit as many as fifty data centers and computer rooms in the course of a year. And while I have seen good ones and bad ones, they all seem to share certain commonalities. As a result of my experiences and research covering a broad scope of concerns, I have compiled a list of the challenges the data center industry as a whole will face over the next few years.

The talent pool of senior level experts is disappearing. Worse yet, as a nation we have not educated tomorrow’s engineers and/or technicians. This severe lack of experts will be an ever present obstacle to sustainable corporate growth due to technology evolution. In turn, this threatens the nation’s overall economic growth and will cause the United States to fall as the technical leader of the world. Our only saving grace will be to embrace the new world order and adapt to global solutions.

The original equipment manufacturers will own the data center design space. This is their best recourse in maintaining an ability to sell their ever improving infrastructure to customers with old, out-dated, ill-prepared facilities. A further prediction is that it will be difficult for these OEMs to provide heterogeneous and not self-serving designs. And even if they can, will clients believe it to be so?

Big surprise: data centers and computer rooms nationwide are running out of power, cooling, and space. Furthermore, due to the high capital cost and the time it takes to undertake a computer room improvement project, many operators will choose not to act. My prediction is that this will lead to business-impacting disruptions for at least 20% of businesses over the next three (3) years.

We will run out of utility power producing capacity as a nation before the technical revolution is over. Furthermore, no amount of ‘green’ building will prohibit this from inevitably happening. Like virtualization has been for processing capacity, ‘greenness’ is only an incremental band aid on the proverbial bullet wound. My prediction is that the U.S. will experience more wide area outages, such as the one in August of 2003, in the near future.

As the saying goes, ‘necessity is the mother of invention’. We had better hope so. My final prediction is that our technological leadership as a nation will be saved not by a band aid application, by a grassroots conservation effort, or by sheer will alone. Ultimately, it will be saved by a sweeping improvement in the efficiency of how power is used by IT infrastructure. Materials research within the semiconductor industry will yield a massive reduction in the power dissipation of IT infrastructure. As a result, companies worldwide will take advantage by refreshing their IT equipment, thus allowing them to survive using their existing aging facilities and support infrastructure.

What is your number one prediction for the industry in the coming years? Whether you’re optimistic or foresee doom and gloom, I would love to hear what you think.

Wednesday, January 30, 2008

2008 Data Center Industry Trends

A recent article from Network World points to security as the dominant issue for the data center design industry in 2008. Potential threats identified by experts include:

  • Malware attacks that piggyback on major events such as the ’08 Olympics or the U.S. presidential election
  • The opportunity for the first serious security exploit in corporate VoIP networks
  • Additional malware vulnerability for users as participation in Web 2.0 continues to grow

Other important issues for 2008 as identified by Network World staff include:
  • The early adoption of 802.11n WLAN technology
  • A shift in IT’s approach to managing mission critical environments as virtualization and green computing are deployed more broadly
  • The growing acceptance of open source technology at the corporate level
  • Tightly controlled budgets as IT spending growth drops (particularly in response to news of economic recession)
  • Increased demand for “IT hybrids” – professionals with both business acumen and technical know-how – as the most sought-after hires

Source: Security dominates 2008 IT agenda

Wednesday, December 26, 2007

Data Center Wish Lists

In the spirit of the holiday season, the folks at SearchDataCenter.com have taken a look at what’s on data center managers’ holiday wish lists. It’s a fun read – check it out when you have a minute.

Here are some of the highlights:

“Extra processor horsepower”

“Information on building new data centers to carry us through the next 20 years and beyond”

“A pill that we could give folks in the physical plant and IT that would give them an understanding of what the data center is and what it takes to operate one under best practices”

And all I thought to ask for was a Nintendo Wii. I’ll have to be more creative next year.

No matter what you’re wishing for this year, the team at PTS Data Center Solutions wishes you a happy holiday season and a fantastic New Year!

Monday, December 17, 2007

Embracing the Expanding Role of IT in Business

I was recently asked by Processor Magazine to answer a few questions about IT’s role in business, and it occurred to me that now might be a perfect time to give a “shout out” to the IT folks out there. A sort of gift, if you will, in the spirit of the season.

First, let me dispel an all too common myth – IT is not just a group of “geeks” typing code all day in the server closet down the hall. Far from it. As technology continues marching forward, IT’s role and its importance to the bottom line continues to grow. And don’t just take my word for it – according to the MIT Sloan Management Review, Information and Information Technology have become the fifth major resource available to executives for shaping an organization, alongside people, money, material and machines.[1] In fact, we’re witnessing all businesses, from large to small, expanding what was traditionally thought of as IT, to a broader corporate responsibility known as Information Systems (IS). This new IS paradigm is responsible for the development and implementation of business processes (BP) throughout an organization. These BP’s are often technology based and therefore the logical domain of the technology leaders of the organization.

IT, or “IS” I should say, is responsible for much more than just fixing uncooperative computers. IS deals with the use of infrastructure, including PCs, servers, storage, network, security, communications, and related software, to manipulate, store, protect, process, transmit, and retrieve information securely. Today, the IT umbrella is quite large and covers many disciplines. IT professionals perform duties ranging from data management, networking, and network security to deploying infrastructure, managing communications, database and software design and implementation, and the monitoring and administration of entire systems.

So what’s my point, you ask? Simply to reinforce the value of IT and help shift the corporate perception of IT as a “necessary evil” to IT as an important value center that can help businesses and employees to accomplish more, with greater accuracy, in less time, while utilizing less company resources. For 2008, I encourage companies to make a New Year’s resolution to embrace IT and look for ways to make the most of this extremely valuable resource.

[1] Rockart et al. (1996), “Eight Imperatives for the New IT Organization,” Sloan Management Review.

Monday, October 29, 2007

Server Cabinet Organization Tips

Just in time for Halloween, check out this classic server room cabling nightmare at Tech Republic. Scary stuff.

Good data center design is a combination of high-level conceptual thinking and strategic planning, plus close attention to detail. Obviously, things like the cooling system and support infrastructure are critical to maintaining an always-available data center, but smaller things like well organized server cabinets can also contribute to the overall efficiency of a data center or computer room. That being said, I thought I’d share a few of our guidelines and best practices for organizing your cabinets.

In no particular order:

1. Place heavier equipment on the bottom, lighter equipment towards the top

2. Use blanking plates to fill equipment gaps to prevent hot air from re-circulating back to the front

3. Use a cabinet deep enough to accommodate cable organization and airflow in the rear of the cabinet

4. Use perforated front and rear doors when using the room for air distribution

5. Make sure doors can be locked for security

6. PTS prefers using a patch panel in each cabinet for data distribution. We typically install it in the top rear U’s, but are experimenting with vertical rear channel patch cable distribution

7. PTS prefers using vertical power strips in a rear channel of the cabinet with short power cords for server-to-power-strip distribution

8. While they are convenient, do not use cable management arms that fold the cables on the back of the server as they impede outlet airflow of the server

9. Don’t use roof fans without front-to-rear baffling. They suck as much cold air from the front as they do hot air from the rear.

10. Monitor air inlet temperature ¾ of the way up the front of the cabinet

11. Use U-numbered vertical rails to make mounting equipment easier

12. Have a cabinet numbering convention and floor layout map

13. Use color-coded cabling for different services

14. Separate power and network cabling distribution on opposite sides of the cabinet

15. PTS often uses the tops of the cabinet to facilitate cabinet-to-cabinet power and data cable distribution

As you can see, the little things do make a difference. And by instituting some or all of these, you’ll be one step closer to 24-7 availability.
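Tip 10 in particular lends itself to automation. The sketch below flags cabinets whose inlet temperature falls outside the commonly cited ASHRAE recommended range of the era (roughly 68 to 77 deg F); the cabinet labels and readings are hypothetical:

```python
# A minimal inlet-temperature check in the spirit of tip 10 above.
# Thresholds follow the commonly cited ASHRAE recommended inlet range;
# sensor labels and readings here are illustrative only.

RECOMMENDED_MIN_F = 68.0
RECOMMENDED_MAX_F = 77.0

def check_inlet_temps(readings):
    """Return the cabinets whose inlet temperature is out of range.

    readings -- map of cabinet label (per your numbering convention,
                tip 12) to inlet temperature in deg F, measured 3/4 of
                the way up the cabinet front (tip 10).
    """
    return [cab for cab, t in readings.items()
            if not RECOMMENDED_MIN_F <= t <= RECOMMENDED_MAX_F]

if __name__ == "__main__":
    sample = {"A01": 72.5, "A02": 79.1, "B01": 70.0}
    print(check_inlet_temps(sample))  # ['A02']
```

Tie a check like this into your monitoring system's alerting and a blanking-plate or recirculation problem (tip 2) announces itself before the servers do.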

Wednesday, October 24, 2007

The Role of Sprinklers in Computer Room Fire Protection

A number of clients have asked us about the viability of replacing their ‘wet’ sprinkler systems with a dry-type fire suppression system, such as FM-200. Not many IT personnel understand the role of water-based fire suppression systems, but all realize the potential for water in the data processing environment to be a “bad thing.”
 
The short answer is that sprinkler systems protect the building, while dry-type systems protect the equipment. In most cases a dry-type system cannot take the place of a sprinkler system; it can only be installed in addition to it. At the end of the day, the local fire inspector is the authority having jurisdiction over what is permissible. This is the reason that pre-action sprinkler systems are primarily used for computer room fire protection.
 
That being said, fire prevention provides more protection against damage than any type of detection or suppression equipment available. For Tier I and Tier II computer rooms, PTS often recommends installing only a pre-action sprinkler system activated by a photo-electric smoke detection system, forgoing dry-type and VESDA systems. We find the most effective strategy is to emphasize prevention and early detection. This allows the client to maximize availability by investing in solutions for areas of higher risk, such as fully redundant power and cooling systems.
 
For more information on fire protection, read our vendor white paper, “Mitigating Fire Risks in Mission Critical Facilities,” which provides a clear understanding of the creation, detection, suppression, and prevention of fire within mission critical facilities, discusses fire codes for information technology environments, and offers best practices for increasing availability.