
Page 1:

CTO Challenge

William Tschudi, wftschudi@lbl.gov, 510-495-2417

February 27, 2008

Page 2:

• Selected LBNL data center findings
• Future bold moves

Page 3:

Benchmarks of energy end use

[Diagram: "Electricity Flows in Data Centers" — power arrives on local distribution lines to the building at 480 V, passes through backup diesel generators, the UPS, and PDUs to the computer racks (the uninterruptible load / computer equipment), with separate branches feeding the HVAC system and the lights, office space, etc.]

UPS = Uninterruptible Power Supply; PDU = Power Distribution Unit

Page 4:

Overall electrical power use

Courtesy of Michael Patterson, Intel Corporation

Page 5:

Your mileage will vary

[Pie charts: measured energy end use in two benchmarked centers]

One center: Data Center Server Load 51%, Data Center CRAC Units 25%, Cooling Tower Plant 4%, Electrical Room Cooling 4%, Office Space Conditioning 1%, Lighting 2%, Other 13%

Another center: Computer Loads 67%, HVAC - Chiller and Pumps 24%, HVAC - Air Movement 7%, Lighting 2%

The percentage of energy actually doing computing varied considerably from center to center.

Page 6:

High-level metric: ratio of electricity delivered to IT equipment

[Bar chart: IT power to total data center power for 25 benchmarked data centers, plotted on a scale of 0.0 to 0.8. Average: 0.57. Higher is better. Source: LBNL benchmarking]
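For readers who want to reproduce the metric, a minimal sketch follows. The facility names and power figures in it are made up for illustration, not taken from the LBNL benchmark data; the metric itself is simply IT equipment power divided by total facility power, and its reciprocal is the now-familiar PUE.

```python
# Minimal sketch of the high-level metric: IT power / total facility power.
# Higher is better; the reciprocal is the PUE. All numbers below are
# hypothetical examples, not the LBNL benchmark measurements.

def it_power_ratio(it_kw: float, total_kw: float) -> float:
    """Fraction of total data center power delivered to IT equipment."""
    return it_kw / total_kw

measurements = {"Center A": (570.0, 1000.0), "Center B": (400.0, 1000.0)}
for name, (it_kw, total_kw) in measurements.items():
    ratio = it_power_ratio(it_kw, total_kw)
    print(f"{name}: ratio = {ratio:.2f} (PUE = {1 / ratio:.2f})")

# The benchmarked average of 0.57 corresponds to a PUE of about 1.75,
# consistent with the 1.74 average shown later on Page 23.
```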

Page 7:

High-level metric: IT power to total data center power

[Same bar chart as Page 6 (average 0.57, source: LBNL benchmarking), annotated: "CTO Challenge – get everyone to this level"]

Page 8:

On-line profiling tool: "Data Center Pro"

Outputs:
• Overall picture of energy use and efficiency
• End-use breakout
• Potential areas for energy efficiency improvement
• Overall energy use reduction potential

Inputs:
• Description
• Utility bill data
• System information: IT, cooling, power, on-site generation

Page 9:

DOE Save Energy Now Data Center program: major program elements

1. Develop and test "DC Pro" software using pilot energy assessments
2. Create consensus metrics
3. Create and publicize Save Energy Now case studies based on pilot energy assessments
4. Create best practice information and a training curriculum
5. Develop a Qualified Specialists program for data centers
6. Create guidelines for "Best-in-Class" data centers within various classes of data centers, including strategies for incorporating distributed generation technologies

Page 10:

Federal Energy Management Program
• Best practices showcased at Federal data centers
• Pilot adoption of Best-in-Class guidelines at Federal data centers
• Adoption of to-be-developed industry standard for Best-in-Class at newly constructed Federal data centers

EPA
• Metrics
• Server performance rating & ENERGY STAR label
• Data center performance benchmarking

Industrial Technologies Program
• Tool suite & metrics
• Energy baselining
• Training
• Qualified specialists
• Case studies
• Certification of continual improvement
• Recognition of high energy savers
• Best practice information
• Best-in-Class guidelines

Industry
• Tools
• Metrics
• Training
• Best practice information
• Best-in-Class guidelines
• IT work productivity standard

Page 11:

Energy assessment tools

Data Center Assessment Output
• Overall energy performance (baseline) of data center
• Performance of IT & infrastructure subsystems compared to benchmarks
• Prioritized list of energy efficiency actions and their savings, in terms of energy cost ($), source energy (Btu), and carbon emissions (Mtons)

IT Module: servers; storage & networking; software

Power Systems: UPS; distribution

Cooling: air management; CRAC/CRAH; AHU; chillers

On-Site Gen: renewables; co-gen

Page 12:

Energy efficiency opportunities are everywhere

Server load / computing operations:
• Load management
• Server innovation

Cooling equipment:
• Better air management
• Better environmental conditions
• Move to liquid cooling
• Optimized chilled-water plants
• Use of free cooling

Power conversion & distribution:
• High voltage distribution
• Use of DC power
• Highly efficient UPS systems
• Efficient redundancy strategies

Alternative power generation:
• On-site generation
• Waste heat for cooling
• Use of renewable energy/fuel cells

Page 13:

HVAC best practices

• Air management
• Air economizers
• Humidification control
• Centralized air handlers
• Low pressure drop systems
• Fan efficiency
• Cooling plant optimization
• Water-side economizer
• Variable speed chillers
• Variable speed pumping
• Direct liquid cooling

Page 14:

Electrical best practices

• UPS systems
• Self-generation
• AC-DC distribution
• Standby generation

Page 15:

Best practices and IT equipment

• Power supply efficiency
• Standby/sleep power modes
• IT equipment fans
• Virtualization
• Load shifting

Page 16:

Best practices: cross-cutting and miscellaneous issues

• Motor efficiency
• Right sizing
• Variable speed drives
• Lighting
• Maintenance
• Continuous commissioning and benchmarking
• Heat recovery
• Building envelope
• Redundancy strategies

Page 17:

Design guidelines for ten best practices were developed

Guides available through LBNL’s website & PG&E’s Energy Design Resources website

Page 18:

CTO Challenge – some bold steps to improve energy efficiency (and save your customers money):

• Broaden recommended and allowable ranges of environmental conditions
• Debunk contamination and ESD fears
• Move to liquid cooling
• Integrate computing equipment and the building
• Minimize power conversion loss, end to end
• Facilitate IT – Facilities – CFO understanding

Page 19:

CTO Challenge

• Broaden recommended and allowable ranges of environmental conditions
  – HVAC energy use can be greatly reduced if higher temperatures can be used for cooling IT equipment (using air or liquid)
  – ASHRAE is addressing this, but not on a scientific basis
  – Operating IT equipment at 80 °F or higher has huge energy implications

Page 20:

Temperature guidelines at the inlet to IT equipment

[Chart: ASHRAE temperature guidelines, scale 40 to 100 °F, showing the ASHRAE allowable maximum and minimum and the ASHRAE recommended maximum and minimum]

Page 21:

Humidity guidelines at the inlet to IT equipment

[Chart: ASHRAE humidity guidelines, scale 0 to 100% relative humidity, showing the ASHRAE allowable maximum and minimum and the ASHRAE recommended maximum and minimum]

Page 22:

CTO challenge: broaden environmental conditions

[Histogram: Berkeley weather, hours per year in each dry-bulb temperature bin from 28 to 112 °F, overlaid with the ASHRAE allowable and recommended maximum and minimum lines]

Berkeley temperature distribution (bins sum to 8,760 hr/yr):

Range (°F)    hr/yr
0-10          0
11-20         0
21-30         3
31-40         504
41-50         2,325
51-60         3,153
61-70         1,542
71-80         846
81-90         297
91-100        75
101-110       15
111-120       0
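Since the bins sum to a full year (8,760 hours), the table can be used directly to estimate how many hours outside air alone could cool the center, which is the point of broadening the environmental ranges. The sketch below makes this concrete; the 80 °F threshold is an assumption carried over from Page 19, not a figure from this chart.

```python
# Berkeley dry-bulb temperature distribution from the table above
# (hours per year in each 10 F bin; the bins sum to 8760, one full year).
bins = {
    (0, 10): 0,     (11, 20): 0,    (21, 30): 3,    (31, 40): 504,
    (41, 50): 2325, (51, 60): 3153, (61, 70): 1542, (71, 80): 846,
    (81, 90): 297,  (91, 100): 75,  (101, 110): 15, (111, 120): 0,
}
assert sum(bins.values()) == 8760

# Assumed threshold: IT inlet air at 80 F or below (per Page 19).
threshold_f = 80
hours = sum(h for (lo, hi), h in bins.items() if hi <= threshold_f)
print(f"{hours} of 8760 hr/yr ({hours / 8760:.0%}) at or below {threshold_f} F")
# ~96% of the Berkeley year could be served by outside-air cooling alone.
```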

Page 23:

Total data center power / IT power

[Bar chart: total power to IT power ratio for the benchmarked facilities, including those labeled OSF 2003 and OSF 2005, plotted on a scale of 1.0 to 3.5. Average of facilities measured: 1.74. A metric of 1.2 is planned for the LBNL supercomputer facility]
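To put the 1.74 average against the 1.2 target in concrete terms, here is a rough back-of-the-envelope calculation; the 1 MW IT load is an assumed example, not a figure from the chart.

```python
# Rough illustration: infrastructure overhead for an assumed 1 MW IT load
# at the measured average ratio (1.74) vs. the planned-facility target (1.2).
it_load_kw = 1000.0  # assumed IT load for illustration

for total_per_it in (1.74, 1.20):
    overhead_kw = it_load_kw * (total_per_it - 1.0)
    print(f"total/IT = {total_per_it:.2f}: {overhead_kw:.0f} kW of overhead")

# 740 kW vs. 200 kW: hitting the 1.2 target would cut the cooling and
# power-conversion overhead by roughly 73% at the same IT load.
```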

Page 24:

CTO Challenge

• Debunk contamination and ESD fears
  – Direct use of outside air for cooling can yield large HVAC savings, but fears of contamination hinder its adoption
  – LBNL studies suggest this should not be a problem
  – Failure data attributable to contamination has been requested; none has been produced
  – ESD is poorly understood

Page 25:

Outdoor measurements: fine particulate matter

[Time-series chart: outdoor particle concentration (mg/m3) over several days at LBNL, NERSC, and Centers 3 through 8, plotted against the IBM standard, the EPA annual health standard, and the EPA 24-hour health standard (also the ASHRAE standard)]

Page 26:

Indoor measurements: fine particulate matter

[Time-series chart: particle concentration (mg/m3) measured inside the same centers (LBNL, NERSC, Centers 3 through 8) over the same period, plotted against the IBM standard, the EPA annual health standard, and the EPA 24-hour health standard (also the ASHRAE standard)]

Page 27:

CTO Challenge

• Move to liquid cooling
  – Liquid can remove 3,500 times as much heat as air
  – Liquid cooling could eliminate (or greatly reduce) the need for chillers
  – Liquid is creeping in now; how do we accelerate it?
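The 3,500x figure can be sanity-checked from textbook volumetric heat capacities; this check is added here, not from the slide. Water carries roughly 4.18 MJ per cubic meter per kelvin, air only about 1.2 kJ.

```python
# Sanity check (not from the slide) of the ~3500x heat-removal claim,
# comparing volumetric heat capacity: specific heat * density.
water = 4186 * 998  # J/(kg*K) * kg/m^3 for liquid water near room temp
air = 1005 * 1.2    # J/(kg*K) * kg/m^3 for air at typical conditions
print(f"water/air volumetric heat capacity ratio: {water / air:,.0f}")
# ~3,465: per unit volume and degree of temperature rise, water carries
# roughly 3,500 times as much heat as air, matching the slide's claim.
```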

Page 28:

CTO Challenge

• Integrate computing equipment and the building
  – Often multiple fans operate in series
  – Air and liquid cooling
  – High delta T is efficient
  – Eliminate boxes
  – Control HVAC from the servers' on-board sensors (demo being planned)

Page 29:

CTO Challenge

• Minimize power conversion loss, end to end (the sketch below shows how the losses compound)
  – On-site generation
  – Distribute high voltage AC or DC
  – Eliminate conversions with use of DC
  – Insist on high efficiency power supplies and UPS
  – Optimize DC conversions in the box
  – AC to the chip?
  – Redundancy
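The motivation for the list above is that losses compound: end-to-end delivery efficiency is the product of every conversion stage in the chain (see the Page 31 diagram). The sketch below uses illustrative stage efficiencies, not measured values.

```python
# Illustrative only: end-to-end efficiency is the product of the stages
# in the conversion chain. These stage values are assumptions chosen to
# show the compounding effect, not measurements from the presentation.
stages = {
    "UPS (AC-DC-AC)": 0.88,
    "PDU / distribution": 0.97,
    "Server power supply (AC-DC)": 0.75,
    "Voltage regulator modules (DC-DC)": 0.85,
}

efficiency = 1.0
for stage, eff in stages.items():
    efficiency *= eff
    print(f"after {stage}: {efficiency:.1%}")

# Ends near 54%: almost half the power is lost before reaching the chips,
# which is why eliminating whole conversion stages pays off so quickly.
```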

Page 30:

[Chart: measured UPS efficiency under redundant operation]

Page 31:

Data center power conversions

[Diagram: the conversion chain from utility power to chip. AC enters the Uninterruptible Power Supply (UPS) — rectifier, battery/charger, inverter, with a bypass path — going AC to DC and back to AC; it then passes through the Power Distribution Unit (PDU) to the server, where an AC/DC multi-output power supply (a PWM/PFC switcher converting unregulated DC to multiple regulated DC voltages: 12 V, 5 V, 3.3 V) feeds DC/DC voltage regulator modules (1.5/2.5 V, 1.1 to 1.85 V) serving the microprocessor, memory controller, SDRAM, graphics controller, I/O, and internal and external drives]

Page 32:

[Chart: UPS factory measurements, typical operation]

Page 33:

[Chart: power supply efficiency, typical operation]

Page 34:

CTO and CFO Challenge

• Facilitate IT – Facilities – CFO understanding
  – There is a disconnect between facilities and IT
  – Operating budget vs. capital budget
  – Operating cost equals or exceeds the capital cost of the IT equipment (a rough illustration follows below)
  – How do we get CFOs engaged?
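One way to engage a CFO is to show the arithmetic. The sketch below is purely illustrative; every input is an assumption except the 1.74 total-to-IT power ratio, which comes from the Page 23 benchmarks.

```python
# Purely illustrative server lifetime-cost arithmetic. All inputs are
# assumptions except the 1.74 facility ratio from the Page 23 benchmarks.
server_price_usd = 3000.0  # assumed purchase price
avg_draw_kw = 0.4          # assumed average server power draw
total_per_it = 1.74        # measured average total power per unit IT power
tariff_usd_per_kwh = 0.10  # assumed electricity price
years = 4                  # assumed service life

energy_cost = avg_draw_kw * total_per_it * 8760 * years * tariff_usd_per_kwh
print(f"4-yr electricity: ${energy_cost:,.0f} vs. purchase ${server_price_usd:,.0f}")

# About $2,440 vs. $3,000: operating cost rivals capital cost, and exceeds
# it at higher tariffs or ratios, so both budgets belong in the same review.
```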

Page 35:

Websites:
http://hightech.lbl.gov/datacenters/
http://www1.eere.energy.gov/industry/saveenergynow/partnering_data_centers.html