2
Peregrine Environment
• Peregrine
o 2,592 nodes, peak performance of 2.26 Pflops, 142 TB memory, FDR IB network in scalable-unit topology
o 2.25 PB Lustre file systems
• Gyrfalcon
o Oracle StorageTek tape library
o SAM-QFS, mounted on Peregrine login nodes
3
Peregrine Architecture
[Architecture diagram, summarized:]
• File systems: PFS /scratch 1.5 PB, /projects 0.75 PB; HFS /home 10 TB, /nopt 4 TB; attached via 18 x 36-port FDR InfiniBand plus Ethernet
• Phase 1+ (with service nodes and TDS): 2 x 72-node air-cooled Sandy Bridge racks (16 cores, 32 GB) and 4 x 72-node Sandy Bridge with Phi racks (16 cores, 32 GB)
• Phase 2a/2b: 7 x 144-node Ivy Bridge racks (24 cores; 2 racks with 64 GB, 5 with 32 GB)
• Phase 3: 8 x 144-node Haswell racks (24 cores, 64 GB)
• Scalable units: SU1 144 nodes, SU2 144 nodes, SU3-SU6 288 nodes each, SU7 and SU8 576 nodes each
• Interconnect: per-SU Ethernet plus FDR InfiniBand (210-port, 5 x 324-port, 180-port, and 2 x 648-port switches), 8:1 over-subscribed
• 52 256-GB nodes, 3 DAV nodes, 4 login nodes
4
User Communities
• O(200) users in FY17, O(70) projects
o NREL staff
o External users (universities, industry, other national laboratories)
• User community and workload varies each year
• Data analyzed:
o FY16: 11/1/15 - 10/31/16
o FY17: 1/1/16 - 5/30/17
5
Peregrine usage reflects EERE mission and programs

Node-hours usage by program:

Program                      FY16   FY17
Adv Manufacturing              -     0%
Bioenergy                    18%    30%
Buildings                     1%     0%
Computational Science         1%     1%
Energy Systems Integration    3%     5%
Exascale Computing            1%     0%
Geothermal                     -     0%
GridMod                        -     0%
H2 and Fuel Cells             1%    13%
Solar                        44%    24%
Vehicles                      8%     7%
Wind                         22%    18%
Water                         1%     2%
Other                          -     0%
6
Applications used on Peregrine

FY16 (share of node-hours): VASP 53.6%, VFS-Wind 8.7%, CHARMM 7.4%, SOWFA with OpenFOAM 4.2%, WindSE 3.2%, Gaussian 2.9%, AMBER 2.2%, NAMD 2.2%, Python 2.1%, PLEXOS 2.1%, OpenFOAM 1.8%, S3D 1.6%, CONVERGE 1.4%, WRF 1.0%

FY17 (share of node-hours): VASP 56%, NAMD 8%, Python (homegrown) 6%, WRF 5%, Gaussian 4%, VFS-Wind 4%, OpenFOAM 3%, CONVERGE 3%, GROMACS 2%, STAR-CCM+ 1%, CHARMM 1%, LS-DYNA 1%, other 6%
7
Node Hours used by node count
[Bar chart: FY17 node-hours by job size (1, 2, 3-4, 5-8, 9-16, 17-32, 33-64, 65-128, 129-256, 257-512 nodes); y-axis node-hours, 0 to 2,500,000]
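The node-hours metric behind these charts is simply nodes x wallclock hours, accumulated into the job-size bins shown on the x-axis. A minimal sketch, assuming a hypothetical record format of `(nodes, wallclock_hours)` tuples extracted from scheduler accounting data:

```python
from collections import defaultdict

# Bin edges matching the chart's x-axis: 1, 2, 3-4, 5-8, ..., 257-512 nodes.
BINS = [(1, 1), (2, 2), (3, 4), (5, 8), (9, 16), (17, 32),
        (33, 64), (65, 128), (129, 256), (257, 512)]

def bin_label(nodes):
    """Return the chart bin a job of `nodes` nodes falls into."""
    for lo, hi in BINS:
        if lo <= nodes <= hi:
            return str(lo) if lo == hi else f"{lo}-{hi}"
    return ">512"

def node_hours_by_size(jobs):
    """Sum node-hours per job-size bin.

    `jobs` is an iterable of (nodes, wallclock_hours) tuples -- a
    hypothetical format, not the actual accounting-log schema.
    """
    totals = defaultdict(float)
    for nodes, hours in jobs:
        totals[bin_label(nodes)] += nodes * hours  # node-hours = nodes * hours
    return dict(totals)
```

For example, one 4-node job that ran 2 hours contributes 8 node-hours to the "3-4" bin.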
8
Job sizes of Top Peregrine Applications for FY17
[Stacked bar chart: node-hours consumed by job size (1, 2, 3-4, 5-8, 9-16, 17-32, 33-64, 65-128, 129-256, 257-512 nodes); y-axis node-hours consumed, 0 to 2,500,000; broken out by application: LS-DYNA, STAR-CCM+, GROMACS, CONVERGE, WRF, Gaussian, Python (homegrown), VFS-Wind, OpenFOAM, NAMD, VASP]
9
Job runtimes
Most usage is from jobs that run for 1-5 days
[Two bar charts by runtime bin (<1 min, <10 min, 10 min - 1 hr, 1-8 hrs, 8-24 hrs, 1-2 days, 2-5 days, 5-10 days): number of jobs (y-axis 0 to 200,000) and node-hours (y-axis 0 to 3,500,000)]
10
Lustre file systems
/projects: 767 TB, 477 TB used
• Biggest user has 62.9 TB of data in 738,726 files
• Space is allocated to projects by request.
/scratch: 1.5 PB, 921 TB used
• Biggest user has 240.7 TB of data in 802,420 files
• 28-day purge policy
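A user can anticipate the 28-day purge by looking for files whose access time is older than the cutoff. A minimal sketch, assuming the purge is atime-based and per-user data lives under /scratch/$USER (both are assumptions; the site documentation is authoritative):

```shell
# List files not accessed in more than 28 days -- likely purge candidates.
# The atime criterion and the /scratch/$USER layout are assumptions.
purge_candidates() {
  find "${1:-/scratch/$USER}" -type f -atime +28 -print 2>/dev/null
}
```

For example, `purge_candidates | wc -l` counts the files currently at risk.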
11
VASP job sizes
[Two bar charts, FY16 and FY17: VASP node-hours by job size (1, 2, 3-4, 5-8, 9-16, 17-32, 33-64, 65-128, 129-256, 257-512 nodes); y-axis node-hours, 0 to 2,500,000]
12
Node Hours used by Molecular Dynamics Codes
[Two stacked bar charts of node-hours by job size (1 to 129-256 nodes); y-axis node-hours, 0 to 700,000. FY16 shows CHARMM, AMBER, NAMD; FY17 shows LAMMPS, GROMACS, NAMD, AMBER, CHARMM]
• Top MD apps vary a lot from year to year
• Trend towards larger jobs