
Energy Efficient Data Centres in Further and Higher Education

A Best Practice Review prepared for the Joint Information Services Committee (JISC)

27 May 2009

Peter James and Lisa Hopkinson

Higher Education Environmental Performance Improvement Project, University of Bradford

SustainIT, UK Centre for Economic and Environmental Development


Contents

Introduction.....................................................................................................4
1. Data Centres in Further and Higher Education............................................5
2. Energy and Environmental Impacts of Data Centres.....................................
2.1 Embedded Environmental Impacts......................................................10
2.2 Energy Issues in Data Centres.............................................................10
2.3 Patterns of Energy Use in Data Centres..............................................13
3. Data Centre Solutions – Strategy...............................................................19
4. Data Centre Solutions – Purchasing More Energy Efficient Devices...........20
5. Data Centre Solutions – Changing Computing Approaches........................22
5.1 Energy Proportional Computing...........................................................22
5.2 Consolidation and Virtualisation of Servers.........................................22
5.3 More Energy Efficient Storage.............................................................23
6. Data Centre Solutions – More Efficient Cooling and Power Supply.............25
6.1 More Effective Cooling.........................................................................25
6.2 More Energy Efficient Power Supply....................................................28
6.3 Reducing Ancillary Energy...................................................................30
6.4 Better Monitoring and Control.............................................................30
6.5 New Sources of Energy Inputs.............................................................30
7. Networking Issues......................................................................................32
7.1 The Environmental Impacts of VoIP Telephony...................................32
7.2 Wiring and Cabling..............................................................................33
8. Conclusions................................................................................................33
Bibliography...................................................................................................35


    Introduction

This paper provides supporting evidence and analysis for the discussion of data centres and servers in the main SusteIT report (James and Hopkinson 2009a).

Most university and college computing today uses a more decentralised 'client-server' model. This involves a relatively large number of 'servers' providing services to, and managing networked resources for, an even greater number of 'clients', such as personal computers, which do much of the actual computing 'work' required by users. The devices communicate through networks, both internally with each other, and externally through the Internet. A typical data centre, or 'server room', therefore contains:

Servers, such as application servers (usually dedicated to single applications) […] external networks.

This infrastructure has considerable environmental and financial costs, including those of:

Energy use, carbon dioxide emissions and other environmental impacts from production;

Direct energy consumption when servers and other ICT equipment are used […]

[…] extend the lives of servers and other devices to gain the maximum compensating benefit from the environmental burden created by production. If lower – and if new models of server can be significantly more energy efficient than the ones they are replacing – it would suggest that a more vigorous 'scrap and replace' policy would be appropriate.

As the parallel paper discusses, different estimates have been produced for the embedded:use energy ratio in PCs, ranging from 3:1 to 1:3 (James and Hopkinson 2009b). The paper concludes that it is reasonable to assume a 50:50 ratio in UK non-domestic applications. This is even more likely to be true of servers than PCs, as:

Most operate on a 24/7 basis, and therefore have much higher levels of energy use (per unit of processing activity) than PCs;

The intensity of use is increasing as more servers are virtualised;

The devices are stripped down to the basic activities of processing and storing data, and are therefore less materials- (and therefore energy-) intensive than PCs (this effect may be offset, but is unlikely to be exceeded, by the avoidance of power consumption for peripherals such as monitors, graphics cards, etc.); and

Manufacturers have reduced embedded energy, both through cleaner and leaner production and through greater revalorisation of end of life equipment (Fujitsu Siemens Computers and Knurr 2007).
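As a simple illustration of what such a ratio implies (an added example, not from the original report; the four-year service life is assumed purely for the arithmetic), a 50:50 embedded:use split over a four-year life means the embedded energy equals roughly four years of use-phase energy, so extending service life dilutes the embedded share of the lifecycle total:

    \text{embedded share} = \frac{E_{\mathrm{embedded}}}{E_{\mathrm{embedded}} + T \cdot E_{\mathrm{use/year}}}
    \qquad \text{with } E_{\mathrm{embedded}} = 4\,E_{\mathrm{use/year}}:\quad
    T = 4 \Rightarrow \tfrac{4}{8} = 50\%,\quad T = 6 \Rightarrow \tfrac{4}{10} = 40\%.

This is the arithmetic behind the earlier point about extending equipment lives to amortise the environmental burden of production.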

    2.2 Energy Issues in Data Centres

The energy consumption of data centres has greatly increased over the last decade […] expected to double by 2011.

This high energy consumption of course translates into high energy costs. Even before the 2008 price rises, the Gartner consultancy was predicting that energy

    1"

  • 8/13/2019 55-Data Centres Final 27-5-09

    11/40

costs would become the second highest cost in 70% of the world's data centres by 2009, trailing staff/personnel costs but well ahead of the cost of the IT hardware (Gartner Consulting 2007). This is likely to remain the case, even after the price fallbacks of 2009. This is one reason why Microsoft is believed to be charging for data centre services on a per-watt basis, since its internal cost analyses demonstrate that growth scales most closely to power consumed (Denegri 2008).

Increasing energy consumption creates other problems. A US study concluded that […]

[Table 5 (partial) – breakdown of power draw within a data centre (Emerson): cooling power draw; UPS and distribution losses; building switchgear/transformer; lighting; and power distribution unit (PDU) losses.]


[Table 6 (partial) – power consumption of server components (USEPA): Memory 36W; Disks 12W; Peripheral slots 50W; Motherboard 25W; Total 251W.]


2.3 Patterns of Energy Use in Data Centres

    &ervers re=uire supporting e=uipment such as a poer supply unit 7$&%9< connected

    storage devices< and routers and sitches to connect to netor3s. +ll of these have

    their on poer re=uirements or losses. a!les 5 and - present %& data on these

    from to sources 7Emerson 2""@ %&E$+ 2""!9< ith the first focusing on all poerconsumed ithin server rooms< and the second on the consumption of the servers

    themselves. he fact that< even alloing for their lac3 of compara!ility< Emerson

    estimates server poer consumption to !e much greater than the E$+ illustrates

    some of the difficulties of analysing the topic.

    &ervers also generate large amounts of heat< hich must !e removed to avoid

    component failure< and to ena!le processors to run most efficiently. +dditional

    cooling to that provided !y the server;s internal fans is usually re=uired. he need

    for cooling is increasing as servers !ecome more poerful< and generate larger

    amounts of heat 7I0* lo!al echnology &ervices 2""< see also a!le #9.

    Cooling also helps to provide humidity control through dehumidification.

    Humidification is also re=uired in some data centres and ' as it is achieved !y

    evaporation ' can consume additional energy.

The 'mission critical' nature of many of their applications also means that data centres must have an 'Uninterruptible Power Supply' (UPS) to guard against power failures or potentially damaging fluctuations. One study (Emerson 2007 – see also Table 5) found that:

Only 30% of the energy used for computing was actually consumed in the processor itself; and

ICT equipment accounted for only 52% of the total power consumption of 1123W, i.e. there was a support 'overhead' of cooling, power supply and lighting of 92%.
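To make the relationship between the last two figures explicit (a worked calculation added for clarity, not part of the Emerson study itself), the 92% 'overhead' is simply the non-ICT share of power re-expressed as a proportion of the ICT load:

    \text{overhead} = \frac{P_{\mathrm{total}} - P_{\mathrm{ICT}}}{P_{\mathrm{ICT}}}
    = \frac{1 - 0.52}{0.52} \approx 0.92 = 92\%

so roughly 1.9W must be drawn from the grid for every watt that actually reaches the ICT equipment.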

Although the situation has improved since then, the figures nonetheless demonstrate the potential for improving energy efficiency.

The figures are certainly rather higher than those for many data centres in UK universities and colleges. For example:

The HECToR supercomputing facility at the University of Edinburgh has an overhead of only 39% even on the hottest of days, and this falls to 21% in midwinter, when there is 100% 'free cooling' (see the SusteIT case study and Box 2 in Section 6); and

The University of Sheffield estimates the overhead on its own data centres to be in the order of 40% (Cartledge 2008a).


This apparent divergence between the UK and USA is credible because:

The US sample includes many data centres in much hotter and more humid areas than the UK, which will have correspondingly greater cooling loads;

Energy and electricity prices are higher in the UK than in most parts of the USA, so there are greater incentives for efficient design and use of equipment;

Energy efficiency standards for cooling, power supply and other equipment are generally more stringent in the UK than in most areas of the USA; and

US data centres are also improving – a detailed benchmarking exercise found that energy efficiency measures and other changes had reduced the average overhead from over 90% in 2003 to 63% in 2005 (Greenberg, Mills, Tschudi, Rumsey and Myatt 2006).

[…] Servers typically draw a large proportion of their maximum power even when idle (Fichera 2006). Cooling and UPS equipment also operates fairly independently of computing load in many data centres.

These figures suggest that there is considerable potential to increase the energy efficiency of most data centres, including those in UK further and higher education. Indeed, one US study has suggested that a complete optimisation of a traditional data centre could reduce energy consumption and floor space requirements by 65% (Emerson 2007).

Some means of achieving this are summarised in Table 7 and Box 1, which represent two slightly differing views of prioritisation from a European and a North American source. In broad terms, the options fall into four main categories:

Purchasing more energy efficient devices;


Changing computing approaches;

Changing physical aspects such as layouts, power supply and cooling; and

Modular development.


Box 1 – Reducing Energy Consumption in Data Centres: A Supplier View

Emerson suggest that applying the 10 best practice technologies to data centres – ideally in sequence – can reduce power consumption by half, and create other benefits. These technologies are:

1. Low power processors
2. High-efficiency power supplies
3. Power management software
4. Blade servers
5. Server virtualisation
6. 415V AC power distribution (NB: more relevant to the USA than the UK)
7. Cooling best practices (e.g. hot/cold aisle rack arrangements)
8. Variable capacity cooling: variable speed fan drives
9. Supplemental cooling
10. Monitoring and optimisation: cooling units work as a team.


Category: Air Flow Management and Design
Type: Design – contained hot or cold air
Description: There are a number of design concepts whose basic intent is to contain and separate the cooled supply air from the heated return air on the data floor:
- Hot aisle containment
- Cold aisle containment
- Contained rack supply, room return
- Room supply, contained rack return
- Contained rack supply, contained rack return
This action is expected for air cooled facilities over 1kW per square metre power density.

Category: Temperature and Humidity Settings
Type: Expanded IT equipment inlet environmental conditions (temperature and humidity)
Description: Where appropriate and effective, data centres can be designed and operated within inlet temperature and relative humidity ranges of 5 to 40°C and 5 to 80% RH (non-condensing) respectively, and under exceptional conditions up to +45°C. The current relevant standard is ETSI EN 300 019, Class 3.1.

Category: Free and Economised Cooling
Type: Direct Air Free Cooling
Description: External air is used to cool the facility. Chiller systems are present to deal with humidity and high external temperatures if necessary. Exhaust air is re-circulated and mixed with intake air to avoid unnecessary humidification/dehumidification loads.

Category: As above
Type: Indirect Air Free Cooling
Description: Re-circulated air within the facility is primarily passed through a heat exchanger against external air to remove heat to the atmosphere.

Category: As above
Type: Direct Water Free Cooling
Description: Condenser water chilled by the external ambient conditions is circulated within the chilled water circuit. This may be achieved by radiators or by evaporative assistance through spray onto the radiators.

Category: As above
Type: Indirect Water Free Cooling
Description: Condenser water is chilled by the external ambient conditions. A heat exchanger is used between the condenser and chilled water circuits. This may be achieved by radiators, by evaporative assistance through spray onto the radiators, or by evaporative cooling in a cooling tower.

Category: As above
Type: Adsorptive Cooling
Description: Waste heat from power generation or other processes close to the data centre is used to power the cooling system in place of electricity, reducing overall energy demand. In such deployments adsorptive cooling can be effectively free cooling. This is frequently part of a combined cooling, heat and power system.


    #. Data Centre &olutions ' &trategy

    + strategic approach to data centre energy efficiency is re=uired to ensure that theapproaches adopted< and the e=uipment purchased< meets institutional needs in

    the most cost effective and sustaina!le ay possi!le. Compared to personal

    computing< data centres involve :lumpier; and larger scale investments< and so the

    scope for action ill !e constrained !y circumstances. he 3ey strategic moment is

    clearly hen significant ne investment is !eing planned< for there ill !e maAor

    opportunities to save money and energy consumption !y doing the right thing.

    he 3ey to effective action at this stage ' and a definite help in others ' is effective

    colla!oration !eteen I and Estates !ecause many of the 3ey decisions are around

    physical layout of !uilding< cooling and poer supply< for hich Estates are often

    :suppliers; to I customers. %nfortunately< communication ' or mutualunderstanding ' is not alays good and special effort ill !e needed to try to

    achieve it. he &usteI cases on Cardiff %niversity and Gueen *argaret %niversity

    sho that this can pay off ' in the former case through a very energy efficient data

    centre< and in the latter through perhaps the most advanced application of thin

    client ithin the sector.

    hree 3ey topics then need to !e considered?

    Careful analysis of needs< to avoid over)provisioning@

    E>amination of alternative approaches< such as shared services andvirtualisation@ and

    vercoming !arriers.

    he traditional approach to designing data centres has !een to try and anticipate

    future needs< add a generous margin to provide fle>i!ility< and then !uild to this

    re=uirement. his has the maAor disadvantages of incurring capital and operating

    costs ell in advance of actual need< and higher than necessary energy

    consumption !ecause cooling and poer supply is over)siBed in the early years< and

    an ina!ility to ta3e advantage of technical progress. he E% Code of Conduct 7EC

    8oint /esearch Centre 2""9 and other e>perts 7e.g. ecom!e 2""9 thereforeadvocate more modular approaches< so that ne !atches of servers and associated

    e=uipment can !e installed on an :as needs; !asis. ver)provisioning can also !e

    avoided !y careful e>amination of actual poer re=uirements< rather than

    manufacturer;s claims. 7+lthough on a fe occasions< it may !e that e=uipment

    actually uses more energy and so additional provision is re=uired9.


One option which also needs to be considered today is whether some or all planned data centres can either be outsourced to third party providers, or hosted within common data centres, in which several institutions share a single data centre which is under their control. This could be managed by the institutions themselves […] (the shared centre) is one of the few examples in the sector, but several feasibility studies have been done on additional projects (see below). The main SusteIT report discusses some of the potential sustainability advantages of such shared services (James and Hopkinson 2009a).

Common data centres are made feasible by virtualisation, which breaks the link between applications and specific servers, and therefore makes it possible to locate the latter almost anywhere. The SusteIT survey found that 52% of respondents were adopting this to some degree, and it is important that its potential is fully considered (James and Hopkinson 2009c). The SusteIT case study on virtualisation of servers at Sheffield Hallam University demonstrates the large cost and energy savings that can be realised.

It is also important that all investment decisions are made on a total cost of ownership (TCO) basis, and that every effort is made to estimate the full costs of cooling, power supply and other support activities.
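As a simple illustration of such a TCO comparison (a sketch added here, not taken from the report; the purchase prices, four-year life, PUE of 1.9 and 10p/kWh tariff are all assumed purely for the arithmetic), the facility-level energy overhead can be folded in as follows:

    # Illustrative sketch only: compare servers on a total cost of ownership
    # basis, including cooling and power-supply overheads via an assumed PUE.
    def total_cost_of_ownership(purchase_gbp, avg_power_w, years=4,
                                pue=1.9, tariff_gbp_per_kwh=0.10):
        hours = years * 8760                 # continuous (24/7) operation
        it_kwh = avg_power_w / 1000 * hours  # energy at the server itself
        facility_kwh = it_kwh * pue          # add cooling/power-supply losses
        return purchase_gbp + facility_kwh * tariff_gbp_per_kwh

    # A cheaper but less efficient server versus a dearer, more efficient one:
    print(total_cost_of_ownership(2000, avg_power_w=350))  # higher lifetime cost
    print(total_cost_of_ownership(2500, avg_power_w=250))  # lower lifetime cost

Under these assumptions, even a modest difference in average power outweighs a higher purchase price once the cooling and power-supply overhead is included.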

    4. Data Centre &olutions ) $urchasing *ore Energy

    Efficient Devices

    here is a ide variation in energy efficiency !eteen different servers. Hence

    consumption. hree main options 7hich are not mutually e>clusive9 are availa!le

    at present?

    &ervers hich have !een engineered for lo poer consumption throughdesign< careful selection of components 7e.g. ones a!le to run at relatively hightemperatures9< and other means@

    :Guad)core; servers 7i.e. ones containing four processors ithin the samechassis9@ and

    :0lade servers;ing of incoming cooled air ith armer air 7hich re=uires inputtemperatures to !e loer than otherise necessary to compensate9@

    Dispersal of cooled air !eyond the e=uipment that actually needs to !e cooled@and

    ver)cooling of some e=uipment !ecause cooling units deliver a constantvolume of air flo< hich is siBed to match the ma>imum calculated coolingload ) as this occurs seldom< if ever< much of the cool air supplied is asted.

    +necdotal evidence also suggests that relatively crude approaches to air cooling

    can also result in higher failure rates of e=uipment at the top of rac3s 7here

    cooling needs are greater !ecause hot air rises from loer units9.

    hese pro!lems can !e overcome !y?


Free cooling is especially effective when it is combined with an expanded temperature range for operation. BT now allow their 250 or so sites to operate within a range of 5 to 40 degrees Celsius (compared to a more typical 20-24 degrees Celsius). This has reduced refrigeration operational costs by 75% or more, with the result that they have less than 40% of the total energy demand of a tier 3 data centre, with similar or greater reliability (O'Donnell 2008). Although there remains considerable concern amongst smaller operators about the reliability of such approaches, they are being encouraged by changes in standards, e.g. the TC 9.9 guidelines of ASHRAE (a US body), which increase the operating bands for temperature and humidity.


6.1.3 Using alternative cooling media

Air is a relatively poor heat transfer medium. Water is much more effective, so its use for cooling can greatly reduce energy consumption. Chilled water is used to cool air in many CRAC units, but it can also be used more directly, in the form of a sealed chilled water circuit built into server racks. As the SusteIT case study on Cardiff University shows, this can provide considerable energy efficiency benefits over conventional approaches. A less common and more complex option – but a potentially more energy efficient one, as it can be operated at 14°C rather than the 7°C which is normal with chilled water – is the use of carbon dioxide as a cooling medium, as has been adopted at Imperial College (Trox 2006).
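To put 'much more effective' into rough numbers (an illustrative comparison added here, not a figure from the report), the volumetric heat capacity of water is several thousand times that of air, so a given cooling duty requires a far smaller volume flow:

    \frac{(\rho c_p)_{\mathrm{water}}}{(\rho c_p)_{\mathrm{air}}}
    \approx \frac{1000\,\mathrm{kg/m^3} \times 4.18\,\mathrm{kJ/(kg\,K)}}
                 {1.2\,\mathrm{kg/m^3} \times 1.0\,\mathrm{kJ/(kg\,K)}}
    \approx 3{,}500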

6.2 More Energy Efficient Power Supply

In 2005 the USEPA estimated the average efficiency of installed server power supplies at 72% (quoted in Emerson 2007). However, 90% efficient power supplies are available, which could reduce power draw within a data centre by 11% (Emerson 2007).
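A back-of-envelope check (added for illustration; the 200W load is an assumed example) shows why savings of this order arise. For a fixed IT load, the mains power drawn by a power supply unit is the load divided by the PSU efficiency:

    P_{\mathrm{in}} = \frac{P_{\mathrm{load}}}{\eta}:\qquad
    \frac{200\,\mathrm{W}}{0.72} \approx 278\,\mathrm{W}
    \quad\rightarrow\quad
    \frac{200\,\mathrm{W}}{0.90} \approx 222\,\mathrm{W}

a saving of about 20% at the server itself; the smaller facility-wide figure quoted above reflects the fact that PSU losses are only one part of total data centre consumption.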

Most data centres use a type of UPS called a double-conversion system, which converts incoming power to DC and then back to AC within the UPS. This effectively isolates IT equipment from the power source. Most UK UPSs have a 415V three-phase output, which is converted to a 240V single-phase AC input direct to the server. This avoids the losses associated with the typical US system of stepping down 480V UPS outputs to 208V inputs.

Energy efficiency could be further increased if servers could use DC power directly […] in excess of 10% […]


Box 2 – Free Cooling at the University of Edinburgh

The HECToR supercomputing facility (High End Computing Terascale Resources) generates well over 15kW of heat per rack. Free cooling is used for most of the year, and provides all of the cooling needed for a large part of it. This has reduced energy consumption by 26% annually. Further reductions have come from full containment of the racks, so that cooled supply air cannot mix with warmer room or exhaust air, and from maximum use of variable speed drives on most pumps and fans. At early 2008 prices, the measures created annual savings of £453 […]


6.3 Reducing Ancillary Energy

Using remote keyboard/video/mouse (KVM) units can reduce the amount of electricity used in these applications, especially monitors (GoodCleanTech 2008). Inefficient lighting also raises the temperature in the server room, making the cooling systems work harder to compensate. Using energy-efficient lights, or motion-sensitive lights that won't come on until needed, can cut down power consumption and costs (Hengst 2008).

6.4 Better Monitoring and Control

One of the consequences of rising equipment densities has been increased diversity within the data centre. Rack densities are rarely uniform across a facility, and this can create cooling inefficiencies if monitoring and optimisation are not implemented. Room cooling units on one side of a facility may be humidifying the environment based on local conditions while units on the opposite side of the facility are dehumidifying. Rack level monitoring and control systems can track – and respond locally to – spot overheating or humidity issues, rather than providing additional cooling to the entire data centre (Worrall 2008).
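The control idea described above can be sketched as follows (an illustrative Python sketch, not taken from any product mentioned in the report; the rack readings, setpoint, deadband and suggested actions are all assumed):

    # Illustrative sketch: respond to hot spots rack by rack instead of
    # over-cooling the whole room. All values below are assumed examples.
    RACK_TEMPS_C = {"rack-01": 24.5, "rack-02": 31.0, "rack-03": 25.8}
    TARGET_C = 27.0      # assumed allowable inlet temperature
    DEADBAND_C = 1.0     # avoid constant switching around the setpoint

    def adjust_local_cooling(rack, temp_c):
        """Decide a local action for one rack rather than for the whole room."""
        if temp_c > TARGET_C + DEADBAND_C:
            return f"{rack}: increase local fan speed / open supply damper"
        if temp_c < TARGET_C - DEADBAND_C:
            return f"{rack}: reduce local cooling to save energy"
        return f"{rack}: no change"

    for rack, temp in sorted(RACK_TEMPS_C.items()):
        print(adjust_local_cooling(rack, temp))

A real system would of course also coordinate the room units so that one is not humidifying while another dehumidifies, which is precisely the waste described above.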

    -.5 e &ources of Energy Inputs

    here are several synergies !eteen data centres and renea!le or lo car!on

    energy sources. + considera!le proportion of data centre capital cost is concerned

    ith protection against grid failures. &ome of this e>penditure could !e avoided !y

    on)site generation. Indeed< !oth oogle and *icrosoft are said to !e see3ing 1""

    renea!le energy sourcing< and technical developments in a num!er of areas such

    as fuel cells< trigeneration 7hen an energy centre produces cooling< electricity andheat from the same fuel source9 and ground source heat pumps are ena!ling this

    7Denegri 2""9. Hopper and /ice 72""9 have also proposed a ne 3ind of data

    centre< co)located ith renea!le energy sources such as ind tur!ines< hich act

    as a :virtual !attery;. hey ould underta3e fle>i!le computing tas3s< hich could

    !e aligned ith energy production< increasing hen this as high and decreasing

    hen it as lo.

    Data centres also have affinities ith com!ined heat and poer 7CH$9< hich '

    although usually fossil fuelled< !y natural gas ' is loer car!on than conventional

    electricity and heat production. his is partly !ecause of the relia!ility effects of on)

    site generation< !ut also !ecause many CH$ plants discharge aste ater atsufficiently high temperatures to !e used in a!sorption chillers to provide cold ater

    for cooling. his :trigeneration; can replace conventional chillers< and therefore

    reduce cooling energy consumption considera!lypected to set !enchmar3s for

    the performance of a server across the entire server load 7%&E$+ 2""(!9. In

    parallel the &tandard $erformance Evaluation Corp. 7&$EC9< a nonprofit

    organisation< is developing its on !enchmar3s for server energy consumption

    7&$EC undated9. hese may form the !asis for a ier 2 Energy &tar 76u 2""9.

The Green Grid (2009) has also published several metrics, including the Power Usage Effectiveness (PUE) index. This divides the centre's total power consumption (i.e. including cooling and power supply losses) by the power consumed within ICT equipment. Measurements of 22 data centres by Lawrence Berkeley National Laboratory found PUE values of 1.3 to 3.0 (Greenberg, Mills, Tschudi, Rumsey and Myatt 2006). A recent study has argued that 1.2 or better now represents 'state of the art' (Accenture 2008). The new ADC facility near Sacramento – said to be the greenest data centre in the US, if not the world – achieved 1.12 (see Box 4).

The European Union (EU) has also developed a Code of Conduct for Energy […]

Box 4 – The World's Greenest Data Centre?

The Advanced Data Centers (ADC) facility, on an old air base near Sacramento, […] (equivalent to BREEAM Excellent in the UK). Key factors included reuse of a brownfield site, and high use of sustainable materials and recycled water. Computing energy consumption has been reduced by 'free cooling' (using ambient air to cool the ventilation air stream) […] using over 90% energy efficient uninterruptible power supplies […]


7. Networking Issues

As noted above, routers and other equipment connected with networks account for around 7-8% of ICT-related electricity consumption at the University of Sheffield. In addition, there will be further energy consumption related to Sheffield's use of the national JANET network. Generally speaking, network-related energy and environmental issues have received less attention than those relating to computing and printing, but it is clear that there is considerable scope for improvement (Baliga et al 2008; Ceuppens, Kharitonov and Sardella 2008). A new energy efficiency metric has also been launched for routers in the US (ECR 2008).

7.1 The Environmental Impacts of VoIP Telephony

One network-related issue of growing importance in universities and colleges is Internet Protocol (IP) telephony. Conventional telephony involves dedicated circuits. Its phones operate on low power (typically about 2W) and, whilst telephone exchange equipment consumes large amounts of energy, this has been reduced through decades of improvement. By contrast, IP telephony, which uses the Internet (and therefore a variety of different circuits) to transmit calls, can be more energy intensive when based on specialised phones. These have relatively high power ratings (often 12W or higher), largely because they contain microprocessors. It has been estimated that, on a simple per-phone basis, running IP telephony requires roughly 30% to 40% more power than conventional phones (Hickey 2008). In institutions, their energy is usually supplied by a special 'Power over Ethernet' (PoE) network, which operates at higher ratings than conventional networks and which therefore has greater energy losses through heating as a result of resistance. The current PoE standard provides roughly 15W per cable, and a proposed new standard could increase this to 45-50 watts (Hickey 2008). The volume of calls also increases data centre energy usage, both within the institution and at those of its IP telephony supplier, which – as discussed above – is relatively energy intensive. Overall, therefore, installing an IP telephone system as the main user of a PoE network in a university or college is likely to increase electricity consumption.
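The per-handset difference adds up quickly at campus scale. As a rough illustration (added here; the 1,000-handset estate is an assumed figure, and PoE losses and the extra data centre load mentioned above are excluded), using the ratings quoted in this section:

    1000 \times (12\,\mathrm{W} - 2\,\mathrm{W}) \times 8760\,\mathrm{h/yr}
    \approx 87{,}600\,\mathrm{kWh\ per\ year}

of additional electricity consumption from the handsets alone.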

As noted, the energy consumption of IP telephony can be reduced by making maximum use of 'softphones', i.e. simple, low power handheld devices which connect to a computer, which in turn undertakes the call processing activities. However, […]

[…] exemplary fashion without excessive rises in capital cost.


Bibliography

Bangeman, E., 2008. Gartner: Virtualization to rule server room by 2010. Ars Technica, May 2008. [Online] Available at: http://arstechnica.com/news.ars/post/20080508-gartner-virtualization-to-rule-server-room-by-2010.html [Accessed 2 July 2008].

Barroso, L. and Hölzle, U., 2007. The Case for Energy-Proportional Computing. IEEE Computer, December 2007. [Online] Available at: http://www.barroso.org/publications/ieee_computer07.pdf [Accessed 31 December 2008].

Bronstein, M., 2008. Tips for Buying Green. Processor, Vol. 30 Issue 3, 18 January 2008. [Online] Available at: http://www.processor.com/editorial/article.asp?article=articles%2Fp3003%2F22p03%2F22p03.asp [Accessed 1 October 2008].

Cabinet Office, 2008. Greening Government ICT. [Online] London. Available at: http://www.cabinetoffice.gov.uk/~/media/assets/www.cabinetoffice.gov.uk/publications/reports/greening_government/greening_government_ict%20pdf.ashx [Accessed 2 July 2008].

Cartledge, C., 2008a. Sheffield ICT Footprint Commentary. Report for SusteIT. [Online] Available at: http://www.susteit.org.uk (under tools) [Accessed 20 November 2008].

Cartledge, C., 2008b. Personal communication between Chris Cartledge, formerly University of Sheffield, and Peter James, 23 November 2008.

Ceuppens, L., Kharitonov, D., and Sardella, A., 2008. Power Saving Strategies and Technologies in Network Equipment: Opportunities and Challenges, Risk and Rewards. SAINT 2008, International Symposium on Applications and the Internet […] (IPAC 2001). Available at: http://www.hpl.hp.com/research/papers/power.html [Accessed 20 November 2008].

Cohen, S., […] and Maheras […]


James, P. and Hopkinson, L., 2009b. Energy and Environmental Impacts of Personal Computing. A Best Practice Review prepared for the Joint Information Services Committee (JISC). [Online] Available at: http://www.susteit.org.uk [Accessed 2 May 2008].

James, P. and Hopkinson, L., 2009c. Energy Efficient Printing and Imaging in Further and Higher Education. A Best Practice Review prepared for the Joint Information Services Committee (JISC). [Online] Available at: http://www.susteit.org.uk [Accessed 29 May 2009].

James, P. and Hopkinson, L., 2009c. Results of the 2008 SusteIT Survey. A Best Practice Review prepared for the Joint Information Services Committee (JISC), January 2009. [Online] Available at: http://www.susteit.org.uk [Accessed 22 May 2009].

Koomey, J. G., 2007. Estimating Total Power Consumption by Servers in the US and the World. February 2007. [Online] Available at: http://enterprise.amd.com/Downloads/svrpwrusecompletefinal.pdf [Accessed 23 June 2008].

Lawrence Berkeley Laboratories, undated. Data Center Energy Management Best Practices Checklist. [Online] Available at: http://hightech.lbl.gov/DCTraining/Best-Practices.html [Accessed 21 October 2008].

Modine, A., 2007. Researchers: AMD less power-hungry than Intel. The Register, 31 August 2007. [Online] Available at: http://www.theregister.co.uk/2007/08/31/neal_nelson_associates_claim_amd_beats_intel [Accessed 30 July 2008].

Neal Nelson and Associates, 2008. AMD Beats Intel in Quad Core Server Power Efficiency. Online white paper. [Online] Available at: http://www.worlds-fastest.com/fz96.html [Accessed 30 July 2008].

Newcombe, L., 2008. Data Centre Cooling. A report for SusteIT by Grid Computing Now!, October 2008. [Online] Available at: http://www.susteit.org.uk [Accessed 22 May 2009].

O'Donnell, S., 2008. The 21st Century Data Centre. Presentation at the seminar […]