CSC Modular Datacenter

Post on 21-Nov-2014


About CSC's air cooled modular datacenter project as presented at DCD Stockholm October 2012

Transcript of CSC Modular Datacenter

Kajaani Datacenter Case Study

Peter Jenkins, System Specialist & Project Manager

CSC – IT Centre for Science

Contents

! About CSC –  High Performance Computing

! Drivers –  Why a new datacenter? –  Why Kajaani?

! The project
! What we built
! Construction and commissioning
! Lessons learned

I’m from there

Who am I?

! Computer Science background
! 6 years at Sun Microsystems
! HPC consultant in Financial Services
! 3 years in Finland working for CSC
! I'm not a datacenter specialist!
–  It helps!

My role at CSC

IT
•  Rapid change
•  New hardware challenging datacenters

Facilities
•  Traditionally very stable
•  Disruptive innovations in last 5 years

•  Traditionally separate departments must now work together
•  I try to bridge the gap

About CSC

•  100% owned by Ministry of Education
–  Public sector, non-profit
•  Major service areas:
–  High Performance Computing (HPC)
–  Storage and archive
–  FUNET – academic network in Finland
–  Managed services and IaaS cloud
–  Server hosting and co-lo

CSC and High Performance Computing

   

CSC Computing Capacity 1989–2011 [chart; individual data labels not recoverable]

Users of CSC resources by discipline 2011 (total 1,386 users)

[Pie chart legend:] Biosciences, Physics, Nanoscience, Language research, Chemistry, Grid usage, Computational fluid dynamics, Structural analysis, Computational drug design, Earth sciences, Other

WHY A NEW DATACENTER?

CSC's Datacenter Energy Use 2005–2011 [chart: energy in GWh per year, split into DC 1 IT, DC 1 infra, DC 2 IT, DC 2 infra]

Elspot prices FI in EUR/MWh, 2000–2011 [chart]

New site drivers

•  Capacity limiting growth
–  Supercomputers
–  Managed hosting
•  Costs increasing
–  Tax
–  Energy
–  Carbon neutral
•  New site focused on efficiency and capacity


2011 Energy cost per MWh [chart: Espoo vs Kajaani, broken down into VAT, tax, transfer, and electricity]
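The chart's point can be sketched with simple arithmetic. All prices below are illustrative assumptions chosen for the example, not figures from the presentation:

```python
# Illustrative per-MWh cost components (EUR/MWh) for the two sites.
# These numbers are assumptions for demonstration only, not values
# from CSC's chart.
espoo   = {"electricity": 45, "transfer": 10, "tax": 17, "vat": 0}
kajaani = {"electricity": 45, "transfer": 5,  "tax": 7,  "vat": 0}

def cost_per_mwh(components):
    """Total delivered cost of one MWh, summed from its components."""
    return sum(components.values())

# A datacenter drawing ~1 MW continuously uses ~8,760 MWh per year,
# so even a modest per-MWh difference compounds into real money.
annual_mwh = 8760
saving = (cost_per_mwh(espoo) - cost_per_mwh(kajaani)) * annual_mwh
print(f"Illustrative annual saving: {saving:,} EUR")
```

The structure, not the numbers, is the takeaway: transfer and tax components differ by site, so the same electricity can cost noticeably less delivered to Kajaani.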

WHY KAJAANI?


Where is Kajaani?

[Map: Kajaani's location, with national grid and fibre connections]

Electricity reliability

[Diagrams: Finnish national grid; Renforsin Ranta substation]

Fingrid connections to Kajaani

Power capabilities

! Within perimeter fence:
–  National grid connection: access to 340 MW
–  110 kV / 10 kV main transformer capacity; current capacity 240 MW
–  Biopower on site
! Green power options
–  3 hydro power plants within 3 km, feeding directly to the site
! Diverse power supply = reliable power

Related  datacenter  sites  

Facebook: Luleå – 120 MW, free air cooling

CSC: Kajaani – 1.4 MW free air cooling, 0.9 MW water cooling, design PUE

Google: Hamina – ??? MW, sea water cooling

CSC: Espoo – 1.6 MW conventional, PUE 1.4 & 1.8

Government support

! Google Hamina = wake-up call
–  Unused assets ideal for a fast-growing industry
–  Jobs, skills, international competitiveness

! Government acted
–  Regional development money
–  Extra money for CSC to build a new site

! Site selection: long story short
–  Initial concept study 2010
–  Several former paper mills considered
–  Kajaani was successful in bidding

THE PROJECT

Approach

! Design goal: multi-MW facility, PUE < 1.2
! Leverage features of the site
–  Matched to business requirements
–  Avoid redundancy and backup
! Only 100 kW UPS from day one
–  < 5% of load
–  Core network, management, automation
–  Emphasis on monitoring and rapid recovery
! No generators day one
! Option to add 100% UPS and generators

Approach continues

! Free cooling year round
! Use modular build to right-size and scale quickly
! Green
–  CSC buys carbon-neutral energy certificates
–  100% Finnish hydro power

Timeline

[Timeline chart, 2010–2013: site selection, planning, 1st ITT, analysis, 2nd ITT, delivery; the site evolves from paper mill to warehouse to datacenter; conventional build and modular build; supercomputer due October]

WHAT DID WE BUILD?

CSC’s new Kajaani datacenter

! Renovated paper warehouse
! 12,000 m² of space
! ~1.7 × Old Trafford football ground

Warehouse has two DC tenants

Hub datacenter

SGI Ice Cube R80

TWO DATACENTERS IN ONE

3D WALKTHROUGH VIDEO

Specification

! 2.4 MW combined hybrid capacity
! 1.4 MW modular free-air-cooled datacenter
–  Upgradable in 700 kW factory-built modules
–  Order to acceptance in 5 months
–  35 kW per extra-tall rack (12 kW is common in the industry)
–  PUE forecast < 1.08 (pPUE L2,YC)
! 1 MW HPC datacenter
–  Optimised for the Cray supercomputer and T-Platforms prototype
–  90% water cooling

Hub building and DC

Hub datacenter

Hub building

! Redundant network rooms
! 100 kW UPS – upgradable to MWs
! 10 kV switchgear
! Fire suppression
! Storage rooms – warehouse is cold
! Hub Datacenter

Hub Datacenter

Chillers 2 × 450 kW

External dry coolers

Hub DC facts

! Due November
! 900 kW water cooling
–  Plus 100 kW air from hub
! Purpose-built for water-cooled HPC

IT summary

•  Cray "Cascade" supercomputer
–  10 M€ five-year contract
–  Fastest computer in Finland, due mid-November
–  Phase one: 385 kW of Intel processors in 2012
–  Very high density, large racks

•  T-Platforms prototype
–  Very high density, hot-water-cooled racks
–  Intel processors, Intel and NVIDIA accelerators
–  Theoretical 400 TFlops performance


SGI Ice Cube R80

! One head unit and two expansion modules
! More modules can be added
! Fully automated free-cooling system
–  Dozens of cooling fans, louvers and sensors
! Extremely energy efficient – pPUE 1.08
! Set point allowed to vary (10–27 °C for us)
! Adiabatic cooling on warm days
! Exhaust heat used to warm incoming air
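The control behaviour described above can be sketched as a simple decision over outdoor temperature. The thresholds mirror the 10–27 °C window from the slides, but the decision logic itself is a simplified assumption for illustration, not SGI's actual control algorithm:

```python
def cooling_mode(outdoor_c, setpoint_lo=10.0, setpoint_hi=27.0):
    """Pick a cooling mode for a free-air-cooled module.

    Simplified sketch: real controllers blend these modes continuously
    using fans, louvers and many sensors.
    """
    if outdoor_c < setpoint_lo:
        # Too cold: mix warm exhaust air back in to heat the intake.
        return "recirculate exhaust"
    if outdoor_c <= setpoint_hi:
        # In range: plain outside-air free cooling.
        return "direct free cooling"
    # Warm day: evaporate water into the intake to drop its temperature.
    return "adiabatic cooling"

print(cooling_mode(-15))  # recirculate exhaust (a Kajaani winter day)
print(cooling_mode(18))   # direct free cooling
print(cooling_mode(30))   # adiabatic cooling
```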

Factory visit August 2012

! "Super Cluster"
–  4.5 M€ five-year contract
–  1,152 Intel CPUs
–  190 TFlop/s
–  30 kW 47U racks

! HPC storage
–  3 PB of fast parallel storage
–  Supports Cray and HP systems

IT summary

CONSTRUCTION

Site in January 2012

MDC TIME LAPSE VIDEO

Heat load test

•  700 kW of load banks
–  20 racks
–  120 × 6 kW load banks
–  2 days to rack mount
–  Half the MDC's capacity
•  pPUE 1.05
•  Very useful
•  Silenced the critics
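Partial PUE is just total power over IT power, so the load-test figure fixes the facility overhead. A minimal sketch of the arithmetic (the 35 kW overhead is derived from the slide's numbers, not separately reported):

```python
def ppue(it_kw, facility_overhead_kw):
    """Partial PUE: (IT load + facility overhead) / IT load."""
    return (it_kw + facility_overhead_kw) / it_kw

# 700 kW of load banks at a measured pPUE of 1.05 implies roughly
# 35 kW spent on cooling and power distribution.
print(ppue(700, 35))  # 1.05
```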

Commissioning

! For the MDC it took longer than construction!
! MDC internal workings and drawings are sensitive
–  Test plan process challenging
–  Training required for the new free-cooled system type
! Where possible, plan in advance
! Allow plenty of time

LESSONS LEARNED

Lessons learned

! Simplify your procurement!
! Share every detail of your site with bidders
! A modular datacenter is still a datacenter
–  Even if the design is fixed, expect many decisions
! Pay special attention to integration
–  Fire, security, power, electricity, etc.
–  Any small issue can cause a major delay
! Get expert advice – don't assume!

Reference or define every 'standard' – don't assume

! Racks
–  Define depth, height, RU, posts
–  Strongly consider alternatives like Open Compute
! Door heights
! Make a list of required standards
–  Fire suppression – insurance, building permit
! Get a datacenter contract
–  Modular building contracts != IT contracts
–  If possible, multiple phases of acceptance

Things we got right

! Make the selection based on 10-year TCO
–  IT TCO was 5 years per phase
–  Highlights energy efficiency without defining 'green'
–  Could also use an artificially high energy price
! Require that IT vendors list:
–  PDU efficiency (use 80 PLUS)
–  Temperature and humidity ranges (ASHRAE TC 9.9 2011)
! All our suppliers support the A2 range (5–35 °C)
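The TCO-based scoring idea above can be sketched in a few lines. The structure follows the slide (score bids on long-term TCO so efficiency counts), but the formula and every price and power figure below are illustrative assumptions, not CSC's actual evaluation model:

```python
def tco_10yr(capex_eur, avg_power_kw, energy_price_eur_mwh, years=10):
    """Simple TCO sketch: purchase price plus energy at a fixed price.

    Illustrative only; a real evaluation would add maintenance,
    staffing, cooling overhead, and discounting.
    """
    hours = years * 8760
    energy_cost = avg_power_kw / 1000.0 * hours * energy_price_eur_mwh
    return capex_eur + energy_cost

# An artificially high energy price (as the slide suggests) widens the
# gap between an efficient bid and a cheap-but-power-hungry one.
efficient = tco_10yr(capex_eur=1_200_000, avg_power_kw=300,
                     energy_price_eur_mwh=100)
hungry    = tco_10yr(capex_eur=1_000_000, avg_power_kw=450,
                     energy_price_eur_mwh=100)
print(efficient < hungry)  # True: the pricier bid wins on 10-year TCO
```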

Datacenter operations team

QUESTIONS? Thanks

More information

! Contact me
–  peter.jenkins@csc.fi

! Blog:
–  http://www.csc.fi/blogs/gridthings