Data Center Power and Cooling Construction: The Mission-Critical Systems That Define Data Center Complexity
Data center construction has distinctive features centered on power and cooling. Servers require continuous power; an outage means lost revenue and lost data. Servers produce heat that must be removed continuously; overheating shuts equipment down or destroys it. Redundant systems provide fault tolerance. 24/7/365 operation leaves no window for scheduled shutdowns, so maintenance must happen with systems live. Tier ratings (I-IV, per the Uptime Institute) define redundancy requirements.
Data center power and cooling systems are complex, specialized, and critical. Construction requires attention to redundancy, commissioning, and operational handoff. This post covers data center power and cooling systems for construction teams.
Uptime Institute Tier ratings:
Tier rating system
- Tier I — basic capacity, 99.671% uptime
- Tier II — redundant components, 99.741%
- Tier III — concurrently maintainable, 99.982%
- Tier IV — fault tolerant, 99.995%
- Each tier has specific requirements
- Certification vs design-only
- Higher tier = more redundancy = more cost
Tier ratings drive design and cost. Tier III is common for commercial colocation; Tier IV is reserved for the highest-criticality facilities. Each tier carries specific requirements for power and cooling topology. Formal certification by the Uptime Institute distinguishes a facility from self-certified tier claims.
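Those availability percentages translate into concrete annual downtime budgets. A minimal Python sketch of the conversion (the percentages are the published tier targets; the hour figures are simple arithmetic, not official Uptime Institute metrics):

```python
# Annual downtime implied by the tier availability targets.
TIER_AVAILABILITY = {
    "Tier I": 99.671,
    "Tier II": 99.741,
    "Tier III": 99.982,
    "Tier IV": 99.995,
}

HOURS_PER_YEAR = 8766  # 365.25 days

def annual_downtime_hours(availability_pct: float) -> float:
    """Hours of downtime per year at a given availability percentage."""
    return HOURS_PER_YEAR * (1 - availability_pct / 100)

for tier, pct in TIER_AVAILABILITY.items():
    print(f"{tier}: {annual_downtime_hours(pct):.1f} h/yr")
```

The spread is dramatic: Tier I allows roughly 29 hours of downtime per year, while Tier IV allows about 26 minutes.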
Data center power path:
Power path
- Utility power (primary)
- Backup generator (diesel typical)
- ATS (Automatic Transfer Switch)
- UPS (Uninterruptible Power Supply)
- Switchgear
- PDU (Power Distribution Unit)
- Rack PDU
- Server power supplies
Power flows through multiple stages, each providing redundancy or conditioning. Utility is the normal source. Generators back up extended outages. The UPS bridges the gap during transitions. Switchgear distributes. PDUs step down and distribute power to the server rows. Each component is a potential failure point; redundancy at each stage addresses this.
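The reliability logic behind this chain can be sketched with basic probability: components in series multiply their availabilities (any one failing fails the path), while fully redundant parallel paths multiply their failure probabilities. The 99.9% per-component figure below is an illustrative placeholder, not real equipment data:

```python
def series_availability(*avails: float) -> float:
    """Series chain: every component must work, so availabilities multiply."""
    result = 1.0
    for a in avails:
        result *= a
    return result

def parallel_availability(path_avail: float, paths: int = 2) -> float:
    """2N-style redundancy: the system fails only if every path fails."""
    return 1 - (1 - path_avail) ** paths

# Five-stage path (utility, ATS, UPS, switchgear, PDU) at 99.9% each:
single_path = series_availability(*[0.999] * 5)    # ~0.9950
redundant = parallel_availability(single_path, 2)  # ~0.99997
print(f"single path: {single_path:.4f}, 2N: {redundant:.5f}")
```

This is why a long series chain of good components still underperforms tier targets until redundancy is layered in.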
UPS provides bridging power:
UPS systems
- Double conversion online — common commercial
- Line interactive — smaller systems
- Battery banks for runtime (minutes to hours)
- Flywheel UPS (alternative to batteries)
- Redundancy (N+1, 2N, 2N+1)
- Maintenance bypass
- Battery monitoring
The UPS bridges a utility outage until the generator starts and stabilizes. Battery banks provide the energy storage (5-15 minutes of runtime is typical for data centers). Double-conversion technology isolates loads from power quality issues. Redundancy configurations increase uptime. A maintenance bypass enables UPS service without affecting the load.
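A rough runtime estimate falls out of battery capacity and load. The inverter efficiency and reserved depth-of-discharge floor below are illustrative assumptions; real sizing uses manufacturer discharge curves:

```python
def ups_runtime_minutes(battery_kwh: float, load_kw: float,
                        inverter_eff: float = 0.95,
                        reserve_fraction: float = 0.2) -> float:
    """Rough bridge time: usable stored energy divided by the IT load."""
    usable_kwh = battery_kwh * (1 - reserve_fraction)
    return usable_kwh * inverter_eff / load_kw * 60

# 500 kWh battery bank carrying a 1,000 kW load:
print(f"{ups_runtime_minutes(500, 1000):.1f} min")
```

That configuration yields around 23 minutes, comfortably above the few minutes needed for generator start and transfer.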
Generators provide extended backup:
Generator considerations
- Diesel typical
- Fuel storage (24-72 hours typical)
- Automatic start on utility loss
- Sizing for full load
- Redundancy (N+1, 2N)
- Load bank testing
- Emission requirements
- Fuel contracts for extended outages
Generators pick up the load after a utility failure while the UPS bridges the transition. They are sized for the full data center load, with fuel capacity for extended outages. Regular testing verifies readiness. Emission regulations (Tier 4 Final for newer units) affect selection. Fuel delivery contracts cover outages that stretch to weeks.
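Fuel storage sizing follows from generator capacity, target runtime, and an approximate burn rate. The 0.07 gal/kWh rate and 10% margin below are illustrative round numbers; real sizing uses the genset's published fuel-consumption curve:

```python
def fuel_storage_gallons(gen_kw: float, runtime_hours: float,
                         gal_per_kwh: float = 0.07,
                         margin: float = 1.10) -> float:
    """Tank capacity for a full-load run of the given duration."""
    return gen_kw * runtime_hours * gal_per_kwh * margin

# 2 MW generator, 48-hour target runtime:
print(round(fuel_storage_gallons(2000, 48)), "gal")
```

A 48-hour target for a 2 MW genset lands in the 7,000+ gallon range, which is why on-site fuel storage is a real architectural and permitting issue.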
Switchgear distributes power:
Switchgear
- Medium voltage incoming
- Transformers to low voltage
- Distribution switchgear
- Paralleling switchgear for generators
- Tie breakers for redundancy
- Protective relays
- Arc flash protection
Switchgear is the heart of power distribution. Medium-voltage service (often in the 15 kV class) is transformed to low voltage for distribution. Paralleling gear synchronizes generators. Tie breakers enable redundant configurations. Arc flash protection is critical for worker safety.
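The step-down from medium to low voltage is why downstream gear gets physically large: current scales inversely with voltage. The standard three-phase current calculation, with 480 V and 0.9 power factor as common US assumptions rather than fixed values:

```python
import math

def three_phase_amps(kw: float, volts: float = 480.0, pf: float = 0.9) -> float:
    """Three-phase line current: I = P / (sqrt(3) * V * PF)."""
    return kw * 1000 / (math.sqrt(3) * volts * pf)

# The same 1 MW load at medium vs. low voltage:
print(f"{three_phase_amps(1000, 13800):.0f} A at 13.8 kV")
print(f"{three_phase_amps(1000, 480):.0f} A at 480 V")
```

A megawatt that rides in on tens of amps at medium voltage becomes well over a thousand amps at 480 V, driving busway, breaker, and conductor sizing.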
PDUs distribute to rows:
PDU systems
- Step-down transformer typically
- Branch circuit breakers
- Monitoring of circuit-level power
- Typical 300-500 kW capacity
- Feed racks or row manifolds
- Redundancy (A+B feeds)
- Modular construction
PDUs distribute power to server rows. Branch monitoring tracks per-circuit load. Redundant A+B feeds enable maintenance without dropping the load. Modular construction supports expansion. PDUs often sit in their own rooms or secured spaces.
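A+B redundancy only works if the surviving feed can absorb the full load when its partner fails. A minimal capacity check, with illustrative numbers:

```python
def ab_feed_survives_failover(load_a_kw: float, load_b_kw: float,
                              feed_capacity_kw: float) -> bool:
    """On loss of either feed, the other must carry the combined load."""
    return (load_a_kw + load_b_kw) <= feed_capacity_kw

# Two 400 kW PDUs each normally carrying 180 kW:
print(ab_feed_survives_failover(180, 180, 400))  # True: 360 kW fits on one feed
print(ab_feed_survives_failover(250, 250, 400))  # False: failover would overload
```

This is why A+B systems are normally run well under 50% per feed; loading both sides to capacity silently defeats the redundancy.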
Cooling removes server heat:
Cooling approaches
- Chilled water (common commercial)
- DX (direct expansion) — smaller facilities
- Air-cooled CRAC units
- Water-cooled CRAH units
- In-row cooling (high density)
- Rear-door heat exchangers
- Immersion cooling (emerging)
Cooling technology varies by density and efficiency goals. Chilled water with CRAH units common in commercial. In-row cooling for high-density applications. Immersion cooling emerging for extreme density. Cooling approach affects architectural and mechanical design.
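Whatever the technology, heat removal translates into airflow. The standard sensible-heat relation for air, CFM = BTU/hr divided by (1.08 x delta-T in deg F), sketched below (the 20 deg F rise is a typical but not universal design assumption):

```python
def rack_airflow_cfm(heat_kw: float, delta_t_f: float = 20.0) -> float:
    """Airflow needed to remove heat_kw at a given air temperature rise."""
    btu_per_hr = heat_kw * 3412  # 1 kW is about 3,412 BTU/hr
    return btu_per_hr / (1.08 * delta_t_f)

# A 10 kW rack at a 20 F rise:
print(round(rack_airflow_cfm(10)), "CFM")
```

Roughly 1,600 CFM for a single 10 kW rack makes clear why high-density rows push designers toward in-row, rear-door, or liquid approaches.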
Data center commissioning is make-or-break. Systems integrated across electrical, mechanical, controls, and BMS must operate together flawlessly. Full-load testing, failure mode testing (intentionally dropping components to verify failover), and integrated operations testing all precede go-live. Rushed commissioning produces data centers that fail in production.
Hot/Cold Aisle
Aisle configuration manages airflow:
Hot/cold aisle
- Cold aisles serve server intakes
- Hot aisles remove server exhaust
- Containment prevents mixing
- Hot aisle containment or cold aisle containment
- Overhead ducts or raised floor
- Temperature and pressure monitoring
- Efficiency benefits substantial
The cold aisle/hot aisle arrangement keeps cold supply air from mixing with hot return air. Containment (walls, doors, ceiling panels) separates the streams further. The efficiency benefit is substantial: it allows higher supply temperatures and less cooling energy. Containment is nearly universal in modern data centers.
Chilled water serves large facilities:
Chilled water
- Chiller plants (air or water cooled)
- Primary/secondary pumping
- Thermal storage tanks sometimes
- Redundant chillers (N+1)
- Cooling tower (for water-cooled)
- Water treatment
- Pipe distribution throughout data center
Chilled water plants provide cooling capacity. Air-cooled chillers for smaller facilities or water-constrained sites. Water-cooled for efficiency with cooling tower. Pipe distribution to CRAH units or cooling infrastructure. Water treatment prevents equipment damage.
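Chilled-water flow follows the rule-of-thumb GPM = tons x 24 / delta-T (deg F), which falls out of water's specific heat. The 12 deg F delta-T below is a common design point, used here as an assumption:

```python
def chilled_water_gpm(tons: float, delta_t_f: float = 12.0) -> float:
    """Flow rate to deliver the given cooling tonnage at a given delta-T."""
    return tons * 24 / delta_t_f

# A 500-ton plant at a 12 F delta-T:
print(chilled_water_gpm(500), "GPM")
```

Designers often push for higher delta-T precisely because it cuts flow, and with it pump energy and pipe size, for the same tonnage.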
Computer room cooling:
CRAC vs CRAH
- CRAC — Computer Room Air Conditioner (DX)
- CRAH — Computer Room Air Handler (chilled water)
- Perimeter or inline configurations
- Humidification and reheat capabilities
- Free cooling (economizer) modes
- Monitoring and controls integration
- Redundant units
CRAC uses DX cooling (refrigerant). CRAH uses chilled water. Both serve as air handlers delivering cooled air to cold aisles. Perimeter or inline placement. Redundancy through N+1 or 2N arrangements. Modern units include economizer modes for efficiency.
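N+1 unit counts fall out of simple division: enough units to carry the design load, plus one spare. A sketch with illustrative capacities:

```python
import math

def units_n_plus_1(load_kw: float, unit_kw: float) -> int:
    """Units needed to carry the load, plus one redundant spare."""
    return math.ceil(load_kw / unit_kw) + 1

# 1,000 kW of heat served by 120 kW CRAH units:
print(units_n_plus_1(1000, 120))  # 9 carry the load, 10 with the spare
```

A 2N arrangement would instead double the base count, which is part of why higher tiers cost so much more in mechanical plant.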
BMS and DCIM integrate:
Controls and monitoring
- Building Management System (BMS)
- Data Center Infrastructure Management (DCIM)
- Power monitoring at all levels
- Cooling monitoring
- Temperature and humidity sensors
- Leak detection
- Security integration
- Alarming
Integrated monitoring spans power, cooling, and security. DCIM tracks asset-level data (which server is on which circuit, and so on). The BMS controls mechanical systems. Leak detection is critical: water and live power are a disastrous combination. Alarming brings operator attention to issues.
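At its simplest, the alarming layer compares sensor readings against configured limits. A toy sketch of that evaluation (sensor names and thresholds are invented for illustration; real BMS/DCIM platforms do far more, including trending and escalation):

```python
def out_of_limit_sensors(readings: dict, limits: dict) -> list:
    """Return names of sensors whose readings fall outside [low, high]."""
    return [name for name, value in readings.items()
            if not (limits[name][0] <= value <= limits[name][1])]

readings = {"cold_aisle_temp_f": 68.0, "humidity_pct": 22.0, "leak_sensor": 0}
limits = {"cold_aisle_temp_f": (64, 80), "humidity_pct": (30, 60),
          "leak_sensor": (0, 0)}
print(out_of_limit_sensors(readings, limits))  # ['humidity_pct']
```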
Extensive commissioning required:
Data center commissioning
- Level 5 commissioning (full integrated)
- Factory acceptance testing
- Site acceptance testing
- Unit-level testing
- System-level testing
- Integrated systems testing
- Failure mode testing
- Owner acceptance
Data center commissioning is more rigorous than for typical buildings. Level 5 commissioning tests integrated operation under full load and simulated failures. Failure mode testing intentionally trips components to verify redundancy. Extensive documentation supports ongoing operations.
Data center construction centers on redundant power and cooling systems supporting 24/7/365 operation. Tier ratings drive redundancy requirements. Power path from utility through generators, UPS, switchgear, and PDUs provides multi-level backup. Cooling through chilled water, CRAC/CRAH, and hot/cold aisle configurations removes server heat. Controls and monitoring via BMS and DCIM. Extensive commissioning including failure mode testing precedes go-live. Construction complexity is high; coordination across electrical, mechanical, controls, and structural is extensive. Contractors with data center expertise deliver these demanding projects; generalists struggle. With cloud and AI driving continued data center demand, the sector continues to grow. Understanding power and cooling systems is foundational for contractors pursuing this specialized market.
Written by
Marcus Reyes
Construction Industry Lead
Spent twelve years running AP at a $120M general contractor before joining Covinly. Lives in the world of AIA G702/G703, retainage schedules, and lien waiver deadlines. Writes about the construction-specific workflows that generic AP tools get wrong.