PUE Baselining: Methods and Resources
Better Buildings Summit
May 27, 2015

Data Center Metering and Resource Guide
Steve Greenberg
Lawrence Berkeley National Laboratory
Better Buildings Summit
Washington, DC
May 27, 2015
Agenda
• Definitions, including Power Usage Effectiveness (PUE)
• Discussion of data center types
• Anticipated scenarios of metering systems and how they integrate with data center types
• Metering methods, including leveraging existing meters or starting from scratch
• Challenges to installing meters and gathering data
• Resources
Definitions
PUE: Power Usage Effectiveness
• The ratio of total energy use to that of the information technology (IT) equipment:
  PUE = Total Data Center Facility Annual Energy Use / IT Equipment Annual Energy Use
• A measure of how efficiently the data center infrastructure uses energy.
• Three measurement levels (1 = Basic, 2 = Intermediate, 3 = Advanced); the focus here is on Level 1.
What PUE is good for: quantifying infrastructure overhead
Data Center Types: 1. Stand-alone
[One-line diagram: building switchgear (meter M1) feeds the data center cooling plant and the UPS or distribution panel serving the IT PDUs (meter M2).]
PUE = M1 / M2
Data Center Types: 2. Embedded
[One-line diagram: building switchgear (meter M1) feeds site cooling and both non-data-center and data center loads; meter M4 covers the data center electrical feed to the UPS or distribution panel serving the IT PDUs (meter M2); a thermal meter (T) measures the data center's share of site cooling.]
PUE = (M4 + data center cooling) / M2
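To make the two metering layouts concrete, here is a minimal Python sketch of the calculations above. The function names and annual kWh readings are hypothetical; in the embedded case, the data center cooling term comes from the thermal meter or from an estimate.

```python
# Level-1 PUE from annual meter totals (kWh), following the two
# metering layouts above. Meter names and readings are hypothetical.

def pue_standalone(m1_total_kwh, m2_it_kwh):
    """Stand-alone center: one meter (M1) captures everything."""
    return m1_total_kwh / m2_it_kwh

def pue_embedded(m4_dc_electrical_kwh, dc_cooling_kwh, m2_it_kwh):
    """Embedded center: M4 covers the data center electrical feed;
    cooling drawn from the site plant is metered (thermal meter T)
    or estimated separately and added in."""
    return (m4_dc_electrical_kwh + dc_cooling_kwh) / m2_it_kwh

# Example with made-up annual totals:
print(round(pue_standalone(1_400_000, 1_000_000), 2))         # 1.4
print(round(pue_embedded(1_150_000, 250_000, 1_000_000), 2))  # 1.4
```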
Data Center Types: 2. Embedded, cont'd
The PUE calculation varies depending on metering.
[One-line diagram: utility and generator feeds pass through an automatic transfer switch to the switchgear and distribution switchgear; distribution panels serve the IT PDUs (M2), data center lighting, and office and other loads (M3, M4); cooling equipment (air-cooled or water-cooled chiller, pumps, cooling tower fans, CRAH units, and the compressor/condenser for DX systems) is metered at M1 and M5, with a thermal meter (T).]
Steps in Metering
1. Planning
o Determine data center type
o Determine existing metering
o Review drawings
o Interview staff/visit site
o Decide on PUE calculation approach
Steps in Metering, cont'd
2. Implementation
o Project initiation
o Defining needs and expectations
o Obtaining buy-in from all stakeholders
o Design (including review cycles)
o Installation
o Integration and configuration
o Commissioning: end-to-end and sum-checking (see the sketch after this list)
o Training
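As a concrete illustration of the sum-checking step, here is a minimal sketch; the meter names, readings, and the 2% tolerance are all hypothetical.

```python
# Commissioning sum-check: verify that submeter readings add up to the
# main meter within a tolerance. Names, values, and the 2% tolerance
# are hypothetical.

def sum_check(main_kwh, submeters_kwh, tolerance=0.02):
    """Return True if the submeters sum to the main meter within
    `tolerance` (fractional error); report the discrepancy otherwise."""
    total = sum(submeters_kwh.values())
    error = abs(total - main_kwh) / main_kwh
    if error > tolerance:
        print(f"Sum-check FAILED: submeters total {total:.0f} kWh vs "
              f"main meter {main_kwh:.0f} kWh ({error:.1%} error)")
        return False
    print(f"Sum-check passed ({error:.1%} error)")
    return True

sum_check(1000.0, {"IT": 700.0, "cooling": 250.0, "lighting": 40.0})
```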
Challenges to Meter Installation and Possible Solutions
• Electrical metering: shut down one system at a time in N+x (redundant) systems
• Electrical metering: wait for scheduled system maintenance
• Thermal metering: use hot-taps or ultrasonic meters
[Figure: PUE measurement-level diagram]
Resources
• Data Center Metering and Resource Guide: https://datacenters.lbl.gov/resources/data-center-metering-and-resource-guide
• PUE: A Comprehensive Examination of the Metric: https://www.thegreengrid.org/en/Global/Content/white-papers/WP49-PUEAComprehensiveExaminationoftheMetric
• Center of Expertise for Energy Efficiency in Data Centers: https://datacenters.lbl.gov/
• Data Center Energy Practitioner (DCEP) Program: https://datacenters.lbl.gov/dcep
Questions?
Steve Greenberg, P.E.
Lawrence Berkeley National Laboratory
(510) 486-6971
Approaches to PUE at the Lawrence Berkeley National Lab: Three Case Studies
Steve Greenberg
Lawrence Berkeley National Laboratory
Better Buildings Summit
Washington, DC
May 27, 2015
Examples of getting to PUE at LBNL data centers
• Building 50A-1156: the hodgepodge
• Building 50B-1275: the case-study king
• Building 59: the many-megawatt supercomputer center
[Figure: LBNL site map]
Lawrence Berkeley National Laboratory Room 50A-1156: “the hodgepodge”
• Decades-old embedded data center in an office building
• 2,450 square feet
• ~100 kW IT load
• Shared AHU for primary cooling on house chilled water
• Standby CRAC with remote air-cooled condenser
• 2' raised floor
• Combination of telecom, house services, and high-performance computing
• Mix of UPS and direct power distribution
Lawrence Berkeley National Laboratory Room 50A-1156, cont'd
Approach to PUE:
• Level 1
• Measured IT
• Data center is embedded, with multiple power and cooling feeds
• There are some existing meters on IT loads
• Identify meter additions needed
• Triage based on cost vs. effect on PUE
• Implement changes
• Calculation will use the measured IT load and estimate HVAC based on system ratings and one-time readings (see the sketch below)
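A minimal sketch of that estimation step, assuming hypothetical nameplate ratings, spot-reading load factors, and duty cycles; a full calculation would also include lighting and other small loads.

```python
# Estimating annual HVAC energy for a Level-1 PUE when HVAC is not
# permanently metered: scale nameplate ratings by a load factor taken
# from one-time spot readings, then annualize. All values hypothetical.

HOURS_PER_YEAR = 8760

# name: (rated kW, load factor from a spot reading, duty cycle)
hvac_equipment = {
    "shared AHU (data center share)": (30.0, 0.65, 1.0),
    "chilled water pumps (share)":    (10.0, 0.50, 1.0),
    "standby CRAC":                   (25.0, 0.80, 0.05),  # rarely runs
}

hvac_kwh = sum(rated_kw * load_factor * duty * HOURS_PER_YEAR
               for rated_kw, load_factor, duty in hvac_equipment.values())

it_kwh = 100.0 * HOURS_PER_YEAR  # ~100 kW measured IT load
pue = (it_kwh + hvac_kwh) / it_kwh  # simplified: IT + HVAC only
print(f"Estimated HVAC: {hvac_kwh:,.0f} kWh/yr; PUE ≈ {pue:.2f}")
```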
Lawrence Berkeley National Laboratory Room 50B-1275: “the case-study king”
• 45-year-old data center
• 5,600 square feet
• ~450 kW IT load
• 7 CRACs, 15 to 30 tons of cooling each, in 2-4 stages
• Down-flow units (raised floor)
• Water-cooled
• Other cooling, including rear doors, enclosed racks, and an AHU
• Numerous case studies
LBNL Room 50B-1275
[Figure: floor plan of the 5,600 sq ft raised-floor IT computing facility showing the 7 CRACs (30-, 20-, and 15-ton units, some off). Images: ANCIS]
Lawrence Berkeley National Laboratory Room 50B-1275, cont'd
Approach to PUE:
• Level 2 (transformer losses measured or estimated)
• Measured IT, HVAC, lighting
• Data center is embedded and has multiple power and cooling feeds
• PUE is already tracked in real time (~1.4) using numerous meters
• Metering needs an update to reflect changes in power and cooling
• Identify meter additions, deletions, and moves needed
• Triage based on cost vs. effect on PUE
• Implement changes
LBNL Room 50B-1275, cont'd
[Figure: electric metering layout]
LBNL Room 50B-1275, cont'd
[Figure: thermal metering layout]
Lawrence Berkeley National Laboratory Building 59: the Computational Research and Theory Facility, “the multi-megawatt supercomputer center”
• Brand-new supercomputer center, embedded
• 142,000 square feet total
• 7 MW IT load to start, then up to 17 MW, then ???
• IT load will dominate the building
• 2 large AHUs for air-cooled loads
• 4 cooling towers with heat exchangers for water-cooled loads
• Water-cooled supercomputers
• Air- and water-side economizers
• Air-side heat recovery for heating offices
• IT loads cooled without compressors
[Figure: LBNL Building 59]
Lawrence Berkeley National Laboratory Building 59, cont'd
Approach to PUE:
• Level 2 (PDU outputs for IT)
• Measured IT, HVAC, lighting
• Data center is embedded, with multiple power and cooling feeds
• PUE will be tracked in real time (~1.06) using hundreds of meters
• Meter location, accuracy, and reporting capability are under review and commissioning
• Identify meter additions needed
• Triage based on cost vs. effect on PUE
• Implement changes
Lawrence Berkeley National Laboratory Building 59, cont'd
[Figure]
Conclusions: PUE determination at LBNL
• Is case-by-case: every center is different
• Takes advantage of existing meters
• Minimizes estimation
• Typically involves numerous meters to resolve energy flow in embedded spaces
Questions?
Steve Greenberg, P.E.
Lawrence Berkeley National Laboratory
(510) 486-6971
Metering for PUE
[Figure: building cross-section labeled HIGH BAY, LABORATORIES, DATA CENTER, and OFFICE]
Steve Hammond, May 2015
NREL – Two Data Center Use Cases

RSF: Enterprise Air-Cooled
• Fully contained hot aisle
• Server fans pull cold air into the servers and exhaust heat to the hot aisle
• PUE 1.1-1.2 in winter, spring, and fall; 1.2-1.4 in summer
• Waste heat used to heat the building
• Economizer and evaporative cooling
• Low-fan-energy design
• ~200 kW, ~2,000 sq ft

ESIF: HPC Liquid-Cooled Data Center
• 95% of the IT heat load goes directly to liquid
• 1 fully contained hot aisle for lower-power, low-density air-cooled equipment
• Annualized average PUE 1.06
• Waste heat used to heat the building
• Evaporative cooling
• VFD motors, low-fan-energy design
• 10 MW, 10,000 sq ft
http://www.nrel.gov/docs/fy12osti/52785.pdf
RSF Enterprise “Air Cooled” Data Center
[Figure]
RSF Enterprise “Air Cooled” Data Center: cooling-mode hours per year
• Dehumidification (44 hours/year): hours when the outdoor air is more humid than the acceptable supply-air criteria; some form of mechanical cooling or dehumidification would be required for these hours.
• Economizer (559 hours/year): hours when the outdoor air can satisfy the supply-air criteria with no additional conditioning.
• Evaporative cooling (984 hours/year): hours when adiabatic humidification/cooling (70% effectiveness) of outside air can meet the supply-air criteria.
• Mixing (1,063 hours/year): hours when the outside air can be mixed with hot-aisle air to meet the supply-air criteria.
• Mixing and humidification (6,110 hours/year): when the outside air is cool and dry, outdoor air can be mixed with hot-aisle air, then adiabatically humidified to the supply-air criteria.
(Together the five modes cover all 8,760 hours of the year.)
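The table implies a simple decision order based on outdoor conditions. Here is an illustrative sketch of that logic; the temperature and dew-point thresholds are placeholders, not NREL's actual supply-air criteria, and only the decision order mirrors the table above.

```python
# Illustrative selection of the RSF cooling mode from outdoor conditions.
# Threshold values are hypothetical stand-ins for the supply-air criteria.

def cooling_mode(outdoor_db_f, outdoor_dp_f):
    SUPPLY_MIN_F, SUPPLY_MAX_F = 65.0, 75.0   # hypothetical supply band
    MAX_DEWPOINT_F = 60.0                     # hypothetical humidity limit
    if outdoor_dp_f > MAX_DEWPOINT_F:
        return "dehumidification"             # too humid: mechanical help
    if SUPPLY_MIN_F <= outdoor_db_f <= SUPPLY_MAX_F:
        return "economizer"                   # outside air used as-is
    if outdoor_db_f > SUPPLY_MAX_F:
        return "evaporative cooling"          # adiabatic cooling of outside air
    if outdoor_db_f > 40.0:
        return "mixing"                       # blend with hot-aisle air
    return "mixing and humidification"        # cool, dry: mix, then humidify

print(cooling_mode(90.0, 55.0))  # -> evaporative cooling
```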
PUE Metering: simple & effective
[Figure]
Air-Cooled Data Center PUE
[Chart: PUE (0.75 to 1.45) and outdoor temperature (-20 to 100 °F) over six weeks of data, mid-November to end of December]
NREL ESIF Liquid-Cooled Data Center: Showcase Facility
• ESIF: 182,000 sq ft research facility
• Includes a 10 MW, 10,000 sq ft data center
• LEED Platinum facility, PUE 1.06
• NO mechanical cooling (eliminates expensive and inefficient chillers)
• Uses evaporative cooling only

Compared to a typical data center:
• Lower CapEx: costs less to build
• Lower OpEx: efficiencies save ~$1M per year in operational expenses

Integrated “chips to bricks” approach.
Data Center Features
• Direct, component-level liquid cooling (75°F cooling water)
• 95-110°F return water (waste heat) captured and used to heat offices and lab space
• Pumps are more efficient than fans
• High-voltage 480 VAC power distribution directly to compute racks (improves efficiency, eliminates conversions)

Utilize the bytes and the BTUs!
Key HPC Data Center Specs
• Warm-water cooling, 24°C (75°F)
  o ASHRAE “W2” category
  o Water is a much better working fluid than air: pumps trump fans
• Utilize high-quality waste heat, +35°C (95°F)
  o +95% of the IT heat load to liquid
• Racks of legacy equipment
  o Up to 10% of the IT heat load to air
• High power distribution
  o 480 VAC; eliminate conversions
• Think outside the box
  o Don't be satisfied with an energy-efficient data center nestled on a campus surrounded by inefficient laboratory and office buildings
  o Innovate, integrate, optimize
Direct Liquid Cooling & Energy Recovery On Display
[Chart: annual energy (GWh/yr) for the NREL-ESIF data center at 1 MW IT load, comparing the data center equipment load with the thermal energy recovered. Energy targets: PUE 1.06, EUE 0.9; current design: PUE 1.05, EUE 0.7.]
“We want the bytes AND the BTUs!”
ESIF HPC Datacenter PUE Calculation
[Figure: NREL ESIF data center energy-efficiency metrics one-line diagram (typical for HPC racks). Electrical power meters (EPMs) on the UPS, mechanical, central-plant, and data center distribution boards break total facility power into lighting & plug power, cooling loads, pump loads, HVAC loads, and IT equipment power; recovered energy beneficially used outside the data center is measured at heat exchanger HX-605A; some small loads are calculated from runtime hours.]

Power Use Effectiveness (PUE): data center energy efficiency is benchmarked using the industry-standard PUE metric, defined as follows:
PUE = Total Facility Power / IT Equipment Power
Total Facility Power = Lighting & Plug Power + Cooling Loads + Pump Loads + HVAC Loads + IT Equipment Power
IT Equipment Power = total power used to manage, process, store, or route data within the data center

Energy Use Effectiveness (EUE): EUE is a metric of recovered energy beneficially used outside of the data center (heating), defined as follows:
EUE = (Total Facility Energy - Recovered Energy) / Total Facility Energy
Total Facility Energy = Lighting & Plug Energy + Cooling Loads + Pump Loads + HVAC Loads + IT Equipment Energy
IT Equipment Energy = total energy used to manage, process, store, or route data within the data center
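A minimal sketch implementing the PUE and EUE definitions above. The load values reuse the first meter readings shown on the next slide; the recovered-energy figure is a hypothetical stand-in.

```python
# PUE and EUE exactly as defined on the slide above. The kW figures
# below are the first ESIF meter readings; `recovered` is hypothetical.

def pue(lighting_plug, cooling, pumps, hvac, it):
    total_facility = lighting_plug + cooling + pumps + hvac + it
    return total_facility / it

def eue(lighting_plug, cooling, pumps, hvac, it, recovered):
    total_facility = lighting_plug + cooling + pumps + hvac + it
    return (total_facility - recovered) / total_facility

loads = dict(lighting_plug=6, cooling=2, pumps=34, hvac=7, it=636)  # kW
print(f"PUE = {pue(**loads):.3f}")                 # 1.077
print(f"EUE = {eue(**loads, recovered=200):.2f}")  # 0.71 with 200 kW recovered
```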
Initial HPC Datacenter PUE Metering*
[Figure: the same ESIF one-line metering diagram as on the previous slide, annotated with the very first meter readings]
First readings (kW): IT equipment 636; lighting, plug load, and misc. 6; evaporative cooling towers 2; pumps 34; fan walls 7.
PUE = (6 + 2 + 34 + 7 + 636) / 636 = 1.077
*Very first meter readings, November 2013.