Transcript of "Oxford University Particle Physics Unix Overview" – Pete Gronbech, 14th October 2010 (25 slides).

Page 1:

Oxford University Particle Physics Unix Overview

Pete Gronbech

Senior Systems Manager and SouthGrid Technical Co-ordinator

Page 2:

Strategy
Local Cluster Overview
Connecting to it
Grid Cluster
Computer Rooms

Page 3:

Particle Physics Strategy – The Server / Desktop Divide

[Diagram: desktops (Windows XP PCs, Windows 7 PCs and Linux desktops) on one side; servers on the other: a general purpose Unix server, group DAQ systems, Linux worker nodes, a web server, Linux file servers, a virtual machine host, an NIS server and a torque server.]

Approx. 200 Windows XP desktop PCs, with Exceed, PuTTY or ssh used to access the central Linux systems.

Page 4:

Particle Physics Linux

Unix Team (Room 661):
Pete Gronbech - Senior Systems Manager and SouthGrid Technical Coordinator
Ewan MacMahon - Systems Administrator
Kashif Mohammad - Deputy Technical Coordinator

Aim to provide a general purpose Linux based system for code development and testing, and for other Linux based applications.

Interactive login servers and batch queues are provided.

Systems run Scientific Linux, which is a free Red Hat Enterprise based distribution.

Systems are currently running a mixture of SL4 and SL5.

The systems are currently being migrated to SL5, the same version as used on the Grid and at CERN. Students are encouraged to test pplxint5 and let us know of any problems.

Worker nodes form a PBS (aka torque) cluster accessed via batch queues; a sketch of a typical job submission follows below.
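As a rough illustration (not the exact local configuration – the job name, resource requests and script name here are assumptions), submitting work to a PBS/torque cluster usually looks like this:

    # myjob.sh – minimal PBS/torque job script (illustrative)
    #PBS -N testjob              # job name
    #PBS -l nodes=1:ppn=1        # one core on one worker node
    #PBS -l walltime=01:00:00    # one hour wall-clock limit
    cd $PBS_O_WORKDIR            # start in the directory qsub was run from
    ./my_analysis                # replace with your own program

    # submit from an interactive node, then check the queue
    qsub myjob.sh
    qstat -u $USER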

Page 5:

Current Clusters

Particle Physics Local Batch cluster

Oxford's Tier 2 Grid cluster

Page 6:

PP Linux Batch Farm – Scientific Linux 4, 88 active slots

[Diagram: interactive login nodes pplxint1 and pplxint2 (8 * Intel 5420 cores), plus pplxint3, with an alias pointing to pplxint2; worker nodes pplxwn01–pplxwn11, each with 8 * Intel 5420 cores; NFS file servers pplxfs2 (6TB), pplxfs3, pplxfs4 and pplxfs6 (4TB–19TB) serving the home and data areas; Lustre volumes of 19TB each holding CDF, ATLAS and LHCb data.]

Page 7:

Particle Physics Computing

[Diagram: Lustre OSS03, 44TB]

df -h /data/atlas
Filesystem                                  Size  Used Avail Use% Mounted on
pplxlustrenfs.physics.ox.ac.uk:/data/atlas   76T   46T   27T  64% /data/atlas

gronbech@pplxint2:~> df -h /data/lhcb
Filesystem                                  Size  Used Avail Use% Mounted on
pplxlustrenfs2.physics.ox.ac.uk:/data/lhcb   18T  8.5T  8.6T  50% /data/lhcb

Page 8:

PP Linux Batch Farm – Scientific Linux 5 migration plan

[Diagram: interactive login nodes pplxint5 and pplxint6; worker nodes pplxwn12–pplxwn16, each with 8 * Intel 5420 cores, and pplxwn18 and pplxwn19, each with 8 * Intel 5345 cores. Currently acting as NFS – Lustre gateways for the SL4 nodes.]

Page 9:

http://pplxconfig.physics.ox.ac.uk/ganglia

Page 10:

Strong Passwords etc

Use a strong password, not one open to dictionary attack!
fred123 – No good
Uaspnotda!09 – Much better

Better still, use ssh with a passphrase-protected key stored on your desktop.
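As a minimal sketch (using OpenSSH on a Linux or Mac desktop; on Windows the equivalent is PuTTYgen, shown on a later slide), a passphrase-protected key pair can be created with:

    # create an RSA key pair; pick a good passphrase when prompted
    ssh-keygen -t rsa -b 2048 -f ~/.ssh/id_rsa
    # the public half (~/.ssh/id_rsa.pub) is what goes on the server;
    # the private half never leaves your desktop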

Page 11:

Connecting with PuTTY

Demo
1. Plain ssh terminal connection
2. With key and Pageant
3. ssh with X windows tunnelled to passive Exceed
4. ssh, X windows tunnel, passive Exceed, KDE session

http://www.physics.ox.ac.uk/it/unix/particle/XTunnel%20via%20ssh.htm
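For comparison, steps 3 and 4 roughly correspond to ssh with X forwarding from a Unix-like command line; the full hostname below is an assumption based on the interactive node names used earlier:

    # log in with X11 forwarding so remote graphical programs display locally
    ssh -X [email protected]
    # once logged in, X applications started on pplxint2 appear on your desktop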

Page 12:

Page 13:

Puttygen to create an ssh key on Windows


Paste the public key into ~/.ssh/authorized_keys on pplxint.

If you are likely to then hop to other nodes, add:

ForwardAgent yes

to a file called config in the .ssh directory on pplxint.

Save the public and private parts of the key to a subdirectory of your h: drive.
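A rough sketch of those two steps on pplxint (the key text itself comes from the PuTTYgen window; the commands below are one way of doing it, not the only one):

    # make sure ~/.ssh exists and is private
    mkdir -p ~/.ssh && chmod 700 ~/.ssh
    # append the public key text copied from PuTTYgen, then press Ctrl-D
    cat >> ~/.ssh/authorized_keys
    chmod 600 ~/.ssh/authorized_keys
    # allow the agent to be forwarded when hopping on to other nodes
    echo "ForwardAgent yes" >> ~/.ssh/config
    chmod 600 ~/.ssh/config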

Page 14:

Pageant

Run Pageant once after login to load your Windows ssh key.


Page 15:

SouthGrid Member Institutions

Oxford
RAL PPD
Cambridge
Birmingham
Bristol
JET at Culham

Page 16:

Oxford Tier 2 Grid Upgrade 2008

13 systems, 26 servers, 52 CPUs, 208 cores. Intel 5420 Clovertown CPUs provide ~540 KSI2K.

3 servers each providing 20TB usable storage after RAID 6, ~60TB in total.

One rack, 2 PDUs, 2 UPSs, 3 3Com 5500G switches.

Page 17:

2010 Upgrade Due in November

Compute servers: twin-squared nodes
– Dual 8-core AMD Opteron 6128 CPUs provide 64 cores per unit.

Storage: 24 * 2TB disks per unit (~44TB usable after RAID6, i.e. 22 data disks * 2TB)
– Increase in LHCb capacity
– Allow migration off older servers

36 * 2TB disks per unit (~68TB usable after RAID6)
– Grid Cluster upgrade, ~200TB

Page 18:

Get a Grid Certificate

Must remember to use the same web browser to request and retrieve the Grid Certificate.

Once you have it in your browser you can export it to the Linux Cluster to run grid jobs.

Details of these steps and how to request membership of the SouthGrid VO (if you do not belong to an existing group such as ATLAS, LHCb) are here:

http://www.gridpp.ac.uk/southgrid/VO/instructions.html
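Once the certificate is exported from the browser (typically as a PKCS#12 .p12 file), installing it on the Linux cluster usually looks roughly like this; the filename mycert.p12 is an assumption, and the instructions page above is the authoritative reference:

    # convert the browser-exported certificate into the usual grid layout
    mkdir -p ~/.globus
    openssl pkcs12 -in mycert.p12 -clcerts -nokeys -out ~/.globus/usercert.pem
    openssl pkcs12 -in mycert.p12 -nocerts -out ~/.globus/userkey.pem
    chmod 444 ~/.globus/usercert.pem   # the certificate may be world-readable
    chmod 400 ~/.globus/userkey.pem    # the private key must be readable only by you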

Page 19:

Two New Computer Rooms provide excellent infrastructure for the future

The new computer room built at Begbroke Science Park, jointly for the Oxford Supercomputer and the Physics Department, provides space for 55 computer racks of 11kW each, 22 of which will be for Physics. Up to a third of these can be used for the Tier 2 centre. This £1.5M project is funded by SRIF and a contribution of ~£200K from Oxford Physics.

The room was ready in December 2007. The Oxford Tier 2 Grid cluster was moved there during spring 2008. All new Physics high performance clusters will be installed here.

Page 20:

Oxford Grid Cluster

Page 21:

Local Oxford DWB Physics Infrastructure Computer Room

Completely separate from the Begbroke Science Park, a computer room with 100kW of cooling and >200kW of power has been built, funded by ~£150K of Oxford Physics money.

This local Physics department infrastructure computer room was completed in September 2007.

It allowed local computer rooms to be refurbished as offices again, and racks that were in unsuitable locations to be re-housed.

Page 22:

The end for now…

Ewan will give more details of the use of the clusters next week.

Help Pages

http://www.physics.ox.ac.uk/it/unix/default.htm http://www.physics.ox.ac.uk/pp/computing/

Email [email protected]

Questions…

Network Topology

Page 23:

Network

Gigabit connection to campus operational since July 2005. Second gigabit connection installed Sept 2007. Dual 10 gigabit links installed August 2009.

Gigabit firewall installed for Physics. A commercial unit was purchased to minimise the manpower required for development and maintenance: a Juniper ISG 1000 running NetScreen.

The firewall also supports NAT and VPN services, which is allowing us to consolidate and simplify the network services.

Moving to the firewall NAT has solved a number of problems we were having previously, including unreliable videoconferencing connections.

Physics-wide wireless network installed in DWB public rooms, Martin Wood, AOPP and Theory. The new firewall provides routing and security for this network.

Page 24:

Network Access

[Diagram: the Campus Backbone Router connects to Super Janet 4 (2 * 10Gb/s with Super Janet 5), to the OUCS Firewall and, via Backbone Edge Routers, to the departments at 100Mb/s or 1Gb/s; the Physics Firewall and Physics Backbone Router connect with 1Gb/s and 10Gb/s links.]

Page 25:

Physics Backbone

[Diagram: the Physics Firewall and Physics Backbone Router connect the sub-department server switches (Particle Physics, Clarendon Lab, Astro, Theory, Atmos) at 1Gb/s; desktops connect at 100Mb/s; Linux and Win 2k servers connect to the server switches at 1Gb/s.]