Topics in Research Computing:
XSEDE and Northwestern University
Campus Champions
Pradeep Sivakumar [email protected]
NUIT Tech Talk
• What is XSEDE?
  – Introduction
  – Who uses XSEDE?
  – Resources available
    • High-Performance Computing
    • Storage
    • Visualization
  – Allocations
• The Campus Champions Program
  – Overview
Contents
• XSEDE is a federated pool of scientific discovery infrastructure
  – The most advanced, powerful, and robust collection of integrated digital resources and services in the world
  – Funded by the NSF (led by NCSA)
  – An integrated, persistent shared computational resource
  – Combining leadership-class resources at 17 partner sites
Introduction
1 Cornell University Center for Advanced Computing
2 Indiana University
3 Jülich Supercomputing Centre
4 National Center for Atmospheric Research
5 National Center for Supercomputing Applications - University of Illinois at Urbana-Champaign
6 National Institute for Computational Sciences - University of Tennessee Knoxville/Oak Ridge National Laboratory
7 Ohio Supercomputer Center - The Ohio State University
8 Pittsburgh Supercomputing Center - Carnegie Mellon University/University of Pittsburgh
9 Purdue University
10 Rice University
11 San Diego Supercomputer Center - University of California San Diego
12 Shodor Education Foundation
13 Southeastern Universities Research Association
14 Texas Advanced Computing Center - The University of Texas at Austin
15 University of California Berkeley
16 University of Chicago
17 University of Virginia
XSEDE Partners
XSEDE Campus Resources [map: locations of key HPC sites]
• 2 billion CPU-hours allocated
• 1,400 allocations
• 350 institutions
• 32 research domains
• 600 research requests per year
• 800 other requests
• 3.5B SUs requested (3.2B are research requests)
• 1.8B SUs awarded (1.6B are research awards)
Who Uses XSEDE?
A sampling of a much larger set. Many of these projects are new to advanced digital services; they range from petascale runs to disjoint HTC, and many are data-driven. XSEDE will support thousands of such projects.
Who Uses XSEDE?
• Earthquake Science and Civil Engineering
• Molecular Dynamics
• Nanotechnology
• Plant Science
• Storm modeling
• Epidemiology
• Particle Physics
• Economic analysis of phone network patterns
• Brain science
• Analysis of large cosmological simulations
• DNA sequencing
• Computational Molecular Sciences
• Neutron Science
• International Collaboration in Cosmology and Plasma Physics
• Research projects that require large amounts of computing power are good candidates for XSEDE’s HPC resources
• Applications built with MPI, OpenMP, GPGPU, or hybrid programming paradigms (massively parallel codes) are designed to scale across many compute nodes
High-Performance Computing
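To illustrate the scaling idea above, here is a minimal sketch of domain decomposition, the pattern behind MPI-style massively parallel codes. It uses Python's multiprocessing as a stand-in for compute nodes; all names here are illustrative, and a real XSEDE application would typically use MPI (e.g. C/MPI or mpi4py).

```python
# Illustrative sketch (not an XSEDE code): split a domain across workers,
# compute partial results in parallel, then reduce - the same pattern an
# MPI program uses across compute nodes.
from multiprocessing import Pool

def partial_sum(bounds):
    """Each worker (like an MPI rank) sums squares over its own slice."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_squares(n, workers=4):
    # Decompose the domain 0..n into near-equal chunks, one per worker.
    step = n // workers
    chunks = [(w * step, (w + 1) * step if w < workers - 1 else n)
              for w in range(workers)]
    with Pool(workers) as pool:
        # "Reduce" step: combine each rank's partial result.
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    # Same answer as the serial sum, but the work is divided.
    print(parallel_sum_squares(1000))
```

The key design point is that workers share no state: each operates on its own slice and only the small partial results are communicated, which is what lets such codes scale across many nodes.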
• Stampede at the Texas Advanced Computing Center
• Blacklight at the Pittsburgh Supercomputing Center
• Gordon at the San Diego Supercomputer Center
• Keeneland at Georgia Tech
• Kraken at the National Institute for Computational Sciences, Oak Ridge National Laboratory
• Lonestar at the Texas Advanced Computing Center
• Steele at Purdue
• Trestles at the San Diego Supercomputer Center
Available HPC Resources
• XSEDE has access to three types of storage areas:
  – Archival storage: long-term storage for large amounts of data
  – Allocatable storage: additional storage on several stand-alone systems (available in the first quarter of 2013)
  – Resource file-system storage: on the compute and visualization resources
Data Storage
• SDSC Gordon and Trestles – 410 TB (fast SSD-based storage)
• NCSA Mass Storage System (MSS) – 10 PB
• NICS HPSS – 7 PB
• TACC Ranch – 50 PB
Data Storage Resources
• Visualization transforms data into plots, images, or animations to better understand the phenomena being modeled
• XSEDE offers a variety of resources for visualization:
  – Longhorn
  – Nautilus
  – Spur
Visualization
• Startup – development, testing, porting, benchmarking
• Education – classroom use, training
• Research – research program (usually funded)
Allocations Types and Nomenclature
PI – Principal Investigator
POPS – Partnerships Online Proposal System
XRAC – XSEDE Resource Allocations Committee
SU – Service Unit (1 SU = 1 core-hour)
*Commonly used XSEDE proposal acronyms
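Since 1 SU equals 1 core-hour, sizing an allocation request is simple arithmetic. A small illustrative helper (the function name and the example numbers are my own, not from any XSEDE tool):

```python
# Hypothetical helper for sizing an allocation request.
# 1 SU = 1 core-hour, so SUs = nodes x cores/node x wall-clock hours x runs.
def estimate_sus(nodes, cores_per_node, wallclock_hours, runs=1):
    """Estimate the Service Units consumed by a set of identical jobs."""
    return nodes * cores_per_node * wallclock_hours * runs

# e.g. 10 runs of a 4-node job with 16 cores per node, 12 hours each:
print(estimate_sus(nodes=4, cores_per_node=16, wallclock_hours=12, runs=10))
# -> 7680 SUs
```

Note that SUs are charged for wall-clock time on all allocated cores, whether or not the application keeps them busy, which is one reason scalable codes matter.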
• One per PI (generally)
• 1-year duration
• Unused SUs are forfeited at the end of an award period
• Progress report required for renewal requests
• Add users to a grant via XSEDE User Portal
Allocation Awarded
[Allocation cycle diagram: Submission, Review, Award, Advance, Time to renew]
Campus Champions Map
Campus Champion Institutions
January 2, 2013
Current Campus Champion Institutions (unclassified) – 71
Current Campus Champion Institutions (EPSCoR states) – 44
Current Campus Champion Institutions (Minority Serving Institutions) – 10
Current Campus Champion Institutions (both EPSCoR and MSI) – 8
Total number of Campus Champion Institutions – 133
• Role
  – Provide a local source of knowledge about XSEDE resources
  – Add researchers and educators to start-up allocations for testing and development
  – Provide information about the allocation process and aid in writing a successful XSEDE proposal
  – Conduct training workshops on the use of XSEDE resources
  – Act as a liaison for NU’s XSEDE users, capturing information on problems and challenges that need to be addressed
• NU’s representatives
  – Pradeep Sivakumar, Sr. HPC Specialist
  – Pascal Paschos, Sr. HPC Specialist
  – For information on resources, services, and offerings, contact [email protected]
Campus Champions
• XSEDE offers training to help users get started and maximize their productivity
• Classes are offered in a variety of areas, including supported systems and software, parallel programming languages (MPI, OpenMP, CUDA, Scientific Python), HPC, visualization, data management, etc.
• Classes are mostly web-based. Online virtual training is available at https://portal.xsede.org/web/xup/online-training
Training