Thursday, August 31, 2006

Data Centers vs. Computer Rooms: What’s the Difference?

The differences between data center and computer room design don’t amount to a hill of beans for most people. The terms are often used interchangeably, but using them correctly makes a big difference if you’re trying to communicate with a data center design firm or an IT expert. If you want to sound like a pro, it’s important to know what sets data centers and computer rooms apart.

Data centers are designed to provide a secure, power-protected, environmentally controlled space for housing server, network, and computer equipment. As the operating theatre for an enterprise’s network service delivery, a data center may occupy an entire site and building shell.

The design of computer rooms is more limited in scope. A computer room is merely a functional space within a data center. It serves as a secure environment for the equipment and cabling directly related to the critical load. In other words, a computer room’s basic design is that of a collapsed data center where the entrance room is contained within the computer room space.

The easiest way to tell the design of a data center from that of a computer room is by looking at how the space’s functional pieces are put together. A data center is a larger space composed of smaller spaces, such as a computer room, network operations center, staging area and conference rooms.

In either case, data center design and computer room design are both accomplished by identifying the key design criteria for the two main areas of the project focus – the technology infrastructure and services (IT) and the support infrastructure and services (the facility). The key design criteria are:
- Business Objectives (Scope)
- Availability Requirement
- Power and Cooling Density

Site selection is an additional criterion for data center projects. A computer room design project, by contrast, can be as involved as a larger base-building project or as simple as an upgrade of an existing computer room.

Understanding the differences between data centers and computer rooms is the first step on the road to delivering a successful data center or computer room project. The more you know about the elements of a data center, the easier it will be for you to get your design ideas across to others. If you’d like to learn more about this topic or others, we invite you to visit our White Paper archive at http://www.pts-media.com (registration required) or contact us.

Friday, August 25, 2006

Dark Data Centers: Dream or Reality?

I was just contacted by Processor.com for my thoughts on this topic and thought it might be useful to share some information about it with you all here.

If data center operators had only one wish, it would be this: build me a dark data center.

Many of the daily problems that affect data centers have less to do with the design of a facility and more to do with variables introduced by human involvement. In most data centers, the IT staff is not the only group with access to the facility: facility staff, other employees, outside consultants, contractors, and mechanics may enter the data center for a whole host of reasons. As human traffic within the data center increases, so do the risks, the amount of clutter, and the number of potential technical problems.

Despite expert design and planning, people do not always follow preset procedures and may meddle with equipment they are not qualified to use. This is a nightmare for IT professionals: the mistakes are difficult to trace and drain both a business’s money and its IT staff’s time.

The ideal solution is to design a dark data center, a remotely monitored IT environment in which computer systems analyze and correct problems with minimal human involvement. To achieve a completely dark data center, your IT infrastructure, support infrastructure, and software systems need to be autonomous. The majority of companies are nowhere near this point, and most data centers will never be able to run without any human interaction, but technology is quickly taking us closer to this design goal.

Not-So-Dark Data Center Design

Cutting the human element entirely out of the picture may be out of our current reach, but you can reduce foot traffic and the number of unmanaged changes within your data center. “Dim” data center designs are a realistic goal for most companies.

The dim data center approach focuses more on preventative maintenance than on reactive problem solving. The most effective dim data center designs are secure, can independently troubleshoot most problems, can be managed remotely, and implement processes and procedures to control the who, what, where, and when of the events taking place within the space.

Dim data centers remain a sought-after solution for IT professionals and users, and an attainable design goal for most companies. Although the dark data center is still a dream, the dim data center is a happy reality.

Tuesday, August 22, 2006

Data Centers Go to Washington

Data center power and cooling issues are creating quite a buzz on Capitol Hill.

In recent years, the power and cooling costs of the average data center have gone through the roof. Data centers rack up more than $3 billion in energy costs each year. That number is expected to rise dramatically within the next decade as more data centers are built. Adding to the energy drain are factors such as inefficient cooling systems, more powerful servers, and rising energy prices.

Part of the problem is that there isn’t enough energy to go around. This is a bigger issue during the summer months as the nation’s reserve electric capacity is declining. Some utility companies have asked business customers to cut power usage during peak times, even if that means switching from the power grid to a generator.

Congress’ aim is to pass legislation that will help to promote the use of energy efficient technology. This past July, the House passed a resolution that instructs the federal Environmental Protection Agency (EPA) to study the issues surrounding data center power consumption, assess what the industry is doing to develop more energy-efficient technology, and seek incentives that would encourage companies to make the switch.

Now it’s the Senate’s turn to address the issue. In the hope of finding ways to cut back on the amount of power consumed by corporate and federal data centers, the Senate introduced a bill that is nearly identical to the one passed by the House just a few weeks earlier. If the data center power efficiency bill passes, it will go before President Bush for his signature.

Industry Reaction

So far, industry reaction has been more positive than negative. Congressional legislation is seen as an important step in raising national awareness of data center power issues, although some technology professionals are worried that this could lead to unnecessary regulation.

Rather than having the federal government set down rules for how data centers are designed, industry insiders hope the data center power efficiency bill will result in something similar to the Energy Star rating seen on computers and appliances. Such a rating would encourage manufacturers to improve the efficiency of technology without stifling industry growth.

PTS’ Perspective

Whether or not the legislation has a profound impact on the data center industry, there are steps that your organization can take to improve efficiency and save on data center power costs:

- Choose the most energy-efficient data processing equipment. According to AMD’s Tony DiColli, the Opteron processor can consume 27% to 80% less power than its Intel Xeon counterpart.

- Use scalable modular support infrastructure that allows the data center power and cooling infrastructure to grow with the load, rather than over-sizing your data center to compensate for future growth.

- Improve the efficiency of your cooling systems. The cooling load should include both the IT load and the room heat load components including skin loads, lighting, people, outside air sources, and heat dissipated due to inefficiency of power and cooling components.

- Reduce the non-critical load losses of power and cooling components in the data center. These are losses that are independent of the load, such as control logic losses.
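The cooling-load components listed above can be roughed out numerically. Here is a minimal sketch of that arithmetic; every figure (the loads, the 0.1 kW-per-person rule of thumb, the 85% power-path efficiency) is a hypothetical example, not measured data from any particular facility.

```python
# Rough cooling-load estimate built from the component list above.
# All figures are hypothetical examples, not measured data.

def total_cooling_load_kw(it_load_kw, skin_load_kw, lighting_kw,
                          people_count, power_path_efficiency):
    """Sum the IT load and the room heat load components.

    people_count          -- occupants (~0.1 kW of heat each, a common rule of thumb)
    power_path_efficiency -- fraction of input power that reaches the load;
                             the remainder is dissipated as heat in the room.
    """
    people_kw = people_count * 0.1
    # Heat dissipated by inefficiency in the power path (UPS, PDU, wiring).
    power_loss_kw = it_load_kw * (1.0 / power_path_efficiency - 1.0)
    return it_load_kw + skin_load_kw + lighting_kw + people_kw + power_loss_kw

load = total_cooling_load_kw(it_load_kw=100.0, skin_load_kw=8.0,
                             lighting_kw=3.0, people_count=4,
                             power_path_efficiency=0.85)
print(round(load, 1))  # roughly 129 kW for this hypothetical room
```

Note how the power-path inefficiency alone adds nearly 18 kW of heat in this example, which is why reducing those non-critical losses pays off twice: once in the electric bill and again in the cooling bill.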

For an expert analysis of your data center’s power efficiency, consult a data center design firm. By looking at your IT environment as a whole, data center design professionals can weigh the complex interactions of your facility's elements and provide recommendations for improvement.

If you’d like to learn more about data center power efficiency, please read “Electrical Efficiency Modeling of Data Centers” by Neil Rasmussen. A complimentary copy of the White Paper can be downloaded at PTSDCS.com.

Thursday, August 17, 2006

Data Center Solutions to Beat the Summer Heat

This summer, data center managers are sweating – and it’s not just because of the heat. As temperatures hit record highs this year, the increased demands placed on US energy grids have led to brownouts and occasional blackouts. Between the soaring temperatures and the power outages, data center cooling systems and backup generators are getting a real workout.

The situation isn’t likely to improve in the summers to come. As our national energy usage increases, power reliability decreases during peak times. The demand for energy is growing at a much faster rate than our power generation capacity can handle.

Unfortunately, if you can’t take the heat, the solution isn’t “Get out of the data center.” Data centers that don’t want to get burned by future heat waves are investing in some preventative measures. Many businesses are installing extra cooling systems and on-site generators in the hope that these data center solutions will prevent costly downtime.

Heat removal is essential to the proper functioning of data centers, yet poor design and maintenance choices prevent many air conditioning systems from operating at peak efficiency. The availability and reliability of your network services hinge on the continued operation of your precision cooling solutions. If you’re looking for help minimizing the frequency and severity of unexpected downtime, try the following data center solutions:

- Provide redundancy throughout the entire cooling infrastructure by maintaining at least one additional computer room air conditioner (CRAC) unit, pump, and piece of heat rejection equipment for each cooling zone. This configuration is referred to as N+1 redundancy.

- If your cooling system employs on-site thermal storage, such as a chilled water system, consider putting the air handlers inside the computer room on uninterruptible power supply (UPS) power to provide uninterrupted power and cooling to the site.

- To sidestep power outages altogether, size the on-site emergency power generators to handle your system’s cooling as well as power needs.

- Perform regular checks on your CRAC and heat rejection equipment, including inspecting all filters and operating parameters.
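The N+1 rule and the generator-sizing advice in the list above amount to simple sizing arithmetic. The sketch below illustrates both; the zone load, unit capacity, and 25% planning margin are hypothetical numbers chosen for illustration, not recommendations for any specific site.

```python
import math

# Sketch of the N+1 and generator-sizing rules above.
# All numbers are hypothetical examples.

def crac_units_n_plus_1(zone_cooling_load_kw, unit_capacity_kw):
    """Units needed to carry the zone load (N), plus one redundant unit (+1)."""
    n = math.ceil(zone_cooling_load_kw / unit_capacity_kw)
    return n + 1

def generator_size_kw(it_load_kw, cooling_load_kw, margin=1.25):
    """Size the on-site generator for the IT load plus the cooling load,
    with a planning margin (the 25% figure here is a hypothetical factor)."""
    return (it_load_kw + cooling_load_kw) * margin

# A 120 kW zone served by 35 kW CRAC units needs 4 units to carry
# the load, so N+1 calls for 5.
print(crac_units_n_plus_1(120.0, 35.0))   # 5

# A generator covering 100 kW of IT load and 130 kW of cooling load.
print(generator_size_kw(100.0, 130.0))    # 287.5
```

The point of sizing the generator for cooling as well as IT load is the one made above: during an outage, servers that keep running on generator power still produce heat, so a generator that carries only the critical load leaves the room to cook.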

For an expert assessment of your data center’s cooling system, consult a data center design firm. By evaluating your present and future loads, capacity and redundant capacity plans can be created and utilized to keep you cool when the heat turns up.