Data center design still places a premium on security, redundancy and uptime. Yet even as virtualization and cloud computing technologies continue to evolve dramatically, operational efficiency — a holistic approach to creating critical facilities — has grown steadily in importance. One new data center for a major West Coast public utility, now nearing completion, exemplifies this “new norm.” It is innovation-rich, relies on a broad set of sustainable MEP solutions and is on target to achieve LEED Gold certification.
A Sustainable Prototype
This green data center, in fact, will serve as a model for the utility’s own facilities as the first significant building of its kind erected by the company in 20 years. Here, energy conservation is a top priority — a data center designed to demonstrate sustainable technologies along with reliability, performance, scalability and other core design parameters. The utility also aims to set a new benchmark for data centers within its regional market, showcasing the possibilities of efficiency, flexibility and energy savings through use of materials and design.
To realize this new prototype, the utility looked to a project team that included Turner Construction, Callison and Glumac. “We had the freedom and the mandate to step outside of normal data center requirements and do something better and less costly ... more efficiently,” notes Mike Steinmann, PE, a senior electrical engineer in Glumac’s Critical Facilities Group. And to meet strict budget constraints as well as operational and legal requirements, the client and team agreed on a design-build approach — decidedly non-traditional for public projects. As a result, Seattle-based Callison established a flexible design, responsive to the budget-driven changes required during the course of construction while also resulting in a very efficient building.
Innovation by Design
The team’s design concepts focused on two areas: power usage effectiveness (PUE) and water efficiency. Central to this scheme, Glumac specified an indirect evaporative cooling system (IDEC), as an alternative to a conventional chilled water plant, for climate control. Designers also recommended use of a flywheel uninterruptible power supply (UPS) system for energy storage, rather than a traditional bank of electrical/mechanical batteries, to supply power conditioning and as a backup for the critical load downstream of the center’s generator. Other unique elements of the new data center include two rainwater harvesting/reclamation systems and a waste heat recovery system used to warm office spaces in winter.
IDEC represents one of the most tried and true forms of cooling, a concept Glumac first applied to data center design more than three years ago. In combination with air-side economizer air handling units (AHUs), this rooftop IDEC system recirculates air within the building’s whitespace area — essentially using an evaporative process on incoming air to deliver cooler temperatures. This represents a vast departure from traditional data center cooling that relies on a chilled water system with computer room air conditioning (CRAC) units around the perimeter within a raised-floor environment. Removing chillers in favor of free cooling as part of the heat rejection strategy can lead to major capital and operational cost savings, requiring as little as 20 percent of the cooling energy of conventional cooling systems. Additionally, all non-critical areas feature outside air (OSA) economizing units.
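That 20 percent figure can be made concrete with some back-of-the-envelope arithmetic. The efficiency values below are illustrative assumptions for a chilled water plant versus IDEC fans and pumps; the article supplies only the ratio between them, not these absolute figures:

```python
def cooling_power_kw(heat_load_kw, kw_cooling_per_kw_heat):
    """Fan/compressor power needed to reject a given IT heat load."""
    return heat_load_kw * kw_cooling_per_kw_heat

# Assumed efficiency figures, for illustration only.
IT_HEAT_KW = 2_400                              # 2.4 MW critical IT load
chiller = cooling_power_kw(IT_HEAT_KW, 0.30)    # conventional chilled water plant
idec = cooling_power_kw(IT_HEAT_KW, 0.06)       # IDEC fans + pumps, ~20% of above
print(f"chiller plant: {chiller:.0f} kW, IDEC: {idec:.0f} kW "
      f"({idec / chiller:.0%} of conventional)")
```

The absolute kilowatt numbers are placeholders; only the roughly five-to-one ratio reflects the article's claim.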
The flywheel UPS, integrated into the center’s critical power architecture, represents another significant innovation for the project. In the event of a utility disruption or degraded power conditions, this system assumes the entire electrical load. It uses a metallic disk (the flywheel) spinning at high RPM to convert stored mechanical energy back to electrical energy, providing backup power for the 15- to 20-second transition until the center’s generators start. Compared with a conventional UPS installation, these units can be deployed in a denser footprint, contain no hazardous materials (lead or sulfuric acid) and do not require strict temperature controls, operating at up to 104 F. In keeping with the project’s sustainability goals, Glumac promoted this concept to the client, noting successful applications for a public broadcasting station in 2003 as well as a colocation company and several others since then. “This is still unique in the industry, but becoming more mainstream,” notes Steinmann. “In addition to its space-saving advantages, we continue to be struck by the simplicity, ruggedness and efficiency of this technology.”
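Those ride-through times follow directly from the flywheel's kinetic energy, E = ½I(ω_max² − ω_min²), drawn down between full speed and a minimum delivery speed. A minimal sketch, using assumed inertia, speed and load figures rather than the project's actual hardware:

```python
import math

def usable_energy_j(inertia_kg_m2, rpm_max, rpm_min):
    """Usable kinetic energy between full speed and the minimum speed
    at which the unit can still deliver rated power (w in rad/s)."""
    w_max = rpm_max * 2 * math.pi / 60
    w_min = rpm_min * 2 * math.pi / 60
    return 0.5 * inertia_kg_m2 * (w_max**2 - w_min**2)

def ride_through_s(energy_j, load_w):
    """Seconds of backup at a constant electrical load."""
    return energy_j / load_w

# Illustrative figures only, chosen to land in the 15-20 s range.
e = usable_energy_j(inertia_kg_m2=1.0, rpm_max=36_000, rpm_min=18_000)
t = ride_through_s(e, load_w=300_000)  # assumed 300 kW module load
print(f"usable energy: {e / 1e6:.1f} MJ, ride-through: {t:.1f} s")
```

Because energy scales with the square of speed, spinning half-way down already releases three quarters of the stored energy, which is why these units can be compact.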
Strong collaboration between members of the project team made meeting LEED criteria within the design-build framework easier to accomplish. “A data center is usually dominated by the mechanical and electrical equipment,” observes Bill Fetterley, Callison’s program manager on the new data center. “In this case, it called for a balance of design and cooperation among the mechanical, electrical and architectural designers.”
“This is a unique shape for a data center — even for a data hall,” he continues. “We sculpted the spaces to respond to natural air currents and take advantage of how those evaporative coolers function — with cooling accomplished by creating a high bay space over the computing area that allows the air to move up naturally, to circulate back through the coolers and then be brought back into the operating area.”
In addition to 14,700 square feet of critical whitespace, the data center includes a high-end operations center, large tape storage room and support spaces such as build and main distribution frame (MDF) rooms. All critical spaces feature a pre-action fire protection system, with gaseous fire suppression for selected areas along with a central underground fuel storage system to support potential continuous operation of site generators.
Ultimately, this became a budget-driven project, with the client mandating that all new technologies specified lead to cost savings. Given this direction, the project team shrank the final building from the original 100,000 square feet to approximately 80,000 square feet by using mechanical and electrical systems that required less space than conventional chilled water piping and UPS floor units. In all, the team cut nearly $15 million from total construction costs.
Collaborative decision-making throughout design also led to measurable performance improvements. Whereas a traditional data center operates at a PUE of around 2.0, the client asked designers to aim for a PUE of 1.5 or less. As a result of the building’s many innovative features, Glumac estimates the operations will achieve a peak PUE of 1.48 and an annual average PUE of 1.26 — and that total energy performance will beat ASHRAE 90.1 by about 32 percent.
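Since PUE is simply total facility power divided by IT power, those targets translate directly into energy arithmetic. A rough sketch, assuming the full 2.4 MW critical IT load runs continuously year-round (a simplification; actual loads vary):

```python
def pue(total_kw, it_kw):
    """Power usage effectiveness = total facility power / IT equipment power."""
    return total_kw / it_kw

def annual_overhead_kwh(pue_value, it_kw, hours=8760):
    """Yearly non-IT energy (cooling, power conversion, lighting) at a given PUE."""
    return (pue_value - 1.0) * it_kw * hours

IT_KW = 2_400  # the article's 2.4 MW critical IT power
saved = annual_overhead_kwh(2.0, IT_KW) - annual_overhead_kwh(1.26, IT_KW)
print(f"annual overhead energy avoided: {saved / 1e6:.1f} GWh")  # → 15.6 GWh
```

Under that simplifying assumption, dropping from a PUE of 2.0 to the estimated 1.26 annual average avoids roughly 15.6 GWh of overhead energy per year.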
New Benchmark for Data Centers
Could this data center represent the shape of things to come for this dynamic, fast-paced market? Damon Barnett of Turner thinks so. He represented the firm’s Global Critical Facilities group on the project. “Even five years ago, data center design was about reliability and uptime,” he explains. “Efficiency? Not part of the conversation at a significant level. Clients are now driving things in that direction because the cost to operate a data center is so high. Today, designing to criticality has to be balanced with efficiency and the total cost of ownership, particularly with electrical systems, while being smarter about water usage as well.”
Barnett and Fetterley see other distinct trends as well: movement away from raised floors, increases in power density (more kW per rack) and rising temperatures within data centers — from 68 F up to as high as 75 F — driven by improved technologies and new ASHRAE standards for cooling. Scalability — both physical and vertical — represents another evolving trend, due to modular/pod designs for data halls as well as the ability to increase critical power capacity within those halls as an owner’s business needs grow.
“It takes time to make big changes in the industry,” concludes Steinmann of Glumac. “But that’s how we progress in designing these facilities: by doing something and learning from it, observing best practices and continuously improving.”
TIER LEVEL: III
CRITICAL IT POWER: 2.4 MW
TOTAL AREA: 80,000 sf
WHITESPACE AREA: 14,700 sf
MEP ENGINEER: Glumac
CONTRACTOR: Turner Construction Company
ESTIMATED COMPLETION DATE: December 2012