
Tallahassee, 02/02/15 Research Computing at Florida A&M University

Timeline:

1. Meeting with President Dr. Elmira Mangum and Vice President of Research Dr. Timothy Moore at the FAMU President's Office, Jan. 12, 2015 (including a synopsis for research computing):

New Campus Computer Cluster: Resources:

• Historically:

1. Army/DoD Research Computer Cluster – remote access via the FH Science Research Center (Pharmacy, Environmental Science, Physics, Chemistry). Decommissioned in 2003/04.

2. CePAST Computer Cluster, funded by the Army/DoD as part of a collaboration with the University of Hawaii (terawatt laser remote sensing); PIs: C. Weatherford, L. Johnson (Physics). Mac Xserve cluster (128 nodes = 256 processors); in operation since 2005/06. Users: 3 faculty, 2-3 postdocs/research associates, 2-3 graduate students. Not a shared resource.

• New (Jan. 2015):

o Hewlett Packard cluster addition to CePAST cluster, sponsored by FAMU Office of the President ($115,000 value).

o New, shared campus resource. Hardware to be set up by spring break (mid-March); software to be installed and completed by the end of spring 2015 (April/May).

o Specifications: 2 Intel Xeon E5 head nodes and 10 compute nodes, each with 10 cores (about 120 cores total); 80 GB of memory per head node and 40 GB per compute node; standard 1 TB hard drives; the full Intel Parallel Studio XE suite with compilers, the Intel MKL library, and thread-profiling and analysis software.
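The node and core counts in the specification above can be cross-checked with a short back-of-envelope calculation (a sketch only; the node counts and memory figures are taken directly from these notes, not from a hardware inventory):

```python
# Illustrative totals for the new HP cluster addition, using the
# figures quoted in the meeting notes (2 head nodes, 10 compute nodes,
# 10 cores per node, 80 GB / 40 GB memory respectively).
head_nodes = 2
compute_nodes = 10
cores_per_node = 10

total_cores = (head_nodes + compute_nodes) * cores_per_node
total_memory_gb = head_nodes * 80 + compute_nodes * 40

print(total_cores)      # 120 cores, matching the quoted total
print(total_memory_gb)  # 560 GB aggregate memory across all nodes
```

This confirms that the "about 120 cores" figure only works out if the two head nodes are counted alongside the ten compute nodes.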

FAMU Campus Needs (w/o Engineering):

• Currently, 15-20 FAMU faculty on campus actively use high-performance/high-throughput computing for research and/or education, mostly in STEM: physics: 4-5; chem.: 1; biology: 2; math: 2; CIS: 5-6; ESI: 1; pharmacy: < 5; SBI: 1 (the FAMU-FSU Engineering School is supported via the FSU RCC).

• Typical simulation needs for research, code development, and training: Gaussian, Q-Chem, MD (chem., pharm., biol., ESI) – 32-64 cores per simulation; DFT, ab initio QM (phys., chem.) – 128-256 cores per simulation; other areas, e.g. distributed ('cloud') computing.

• Core estimate: 16 people running 128-core (mid-tier) jobs concurrently (not every faculty member will run simulations at the same time; some faculty have up to 2-3 students), i.e. 16 × 128 = 2048 cores at peak. => Promote the goal of increasing the cluster size to ≈1024-2048 cores via faculty grants.
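The target of ≈1024-2048 cores follows from the concurrency assumption above. A minimal sketch of that estimate (the user count and job size are the assumptions stated in the notes, not measured demand):

```python
# Back-of-envelope peak-demand estimate behind the 1024-2048-core target.
# Assumes 16 concurrent users each running one mid-tier 128-core job,
# as described in the meeting notes.
concurrent_users = 16
cores_per_job = 128  # mid-tier job size (the DFT / ab initio QM range)

peak_demand_cores = concurrent_users * cores_per_job
print(peak_demand_cores)  # 2048 cores, the upper end of the target
```

The lower end of the range (1024 cores) corresponds to halving either the concurrency or the job size, which is why the goal is stated as a range rather than a single figure.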

Affiliate Membership of FAMU in SSERCA: Approved by the FAMU VP of Research, Jan. 15, 2015.

New Search for VP of Information Technology (CIO) initiated by President Mangum:

• Search firm with experience filling many administrative positions in information technology in the Southeast.

• Candidate for CIO position with research computing background and vision to be selected by fall 2015.

• Interim IT staff position as a (part-time) coordinator with expertise in research computing (e.g. R. Seniors).


2. Planned Meetings and Workshops:

Florida LambdaRail (FLR) Meeting (Feb. 2015)

• Meeting with Florida LambdaRail leadership Joe Lazor and Veronica Sarjeant (Thursday Feb. 5 / Tuesday Feb. 10).

• VPR Tim Moore and Assoc. VPR Charles Weatherford as participants.

• Topics: FLR activities (e.g. the 100G roll-out), FAMU-FLR network connectivity, and FAMU's technical situation with its campus network.

• Additional participants: FAMU IT staff (e.g. R. Seniors). SSERCA contact: Dave Pokorney (UF).

• Goal: Coordination of an NSF Internet2 workshop on campus for the spring semester (April/May) or fall semester (September) (1½ days). Suggestion of a cyberinfrastructure plan to build FAMU's research computing network based on its needs and available resources, with suggested proposal funding mechanisms.

SSERCA-DDN Meeting at UCF (Feb. 11-12)

• Primary goal for FAMU: Open a discussion with DDN, as a new affiliate member of SSERCA, about FAMU's future data storage, data analysis, and data curation needs in its specific environment (e.g. historical data at its Black Archives; Pharmacy School; School of Nursing and Public Health; School of Business and Industry; School of Journalism and Multimedia Communications).

XSEDE Spring 2015

• XSEDE booth at Emerging Researchers Network Conference in D.C., Feb. 19-22, 2015 (M. Jack participant; host: Linda Akli/SURA).

• XSEDE User Advisory Committee (M. Jack, two-year term, starting Jan. 2015).

Meetings of the FAMU VP of Research with Departments on Campus (Spring 2015)

• Discuss the research needs of faculty. Create focus areas with center proposals. Ideas in computing: cybersecurity, data science and materials science, cloud computing (e.g. the FAU NSF proposal).

3. Computational Skills for Students, Faculty, and Staff – Computational Science Program

Goals: Build important computational skills for students, faculty, and staff across STEM and other areas; provide training in computational science as a new interdisciplinary offering; increase competitiveness and job opportunities for FAMU graduates; and create new collaborations among FAMU faculty and with off-campus research communities nationally and internationally.

History:

• Since fall 2013: First meetings in the College of Science and Technology (CoS&T) to discuss a new computational science program across 5 STEM departments. The computational science program is listed in the updated FAMU mission statement.

• Committee meetings with chairs and faculty representatives in Biology, Chemistry, Physics (M. Jack), Mathematics, and Computer & Information Sciences (H. Chi): Comparison of existing programs in the state and nationally. Minor, B.S., and M.S./Ph.D. programs. Last meeting: April 2014.

• Faculty workshops and support by NSF XSEDE Education, Training and Outreach, SURA, and SSERCA (Linda Akli, SURA/XSEDE):

− Spring 2012: 2-day parallel visualization workshop at FAMU CIS.


− Spring 2013: 2-day Research Data Management Implementations Workshop (RDMI) in Washington, D.C. sponsored by XSEDE and CASC (M. Jack).

− Fall 2013: 2-day SSERCA/XSEDE workshop at Engineering School – Introduction to high-performance computing resources in Florida (SSERCA) and nationally (NSF XSEDE).

− Fall 2014: 2-day SURA/XSEDE workshop on computational science curriculum development with about 20 faculty and administrators from 10 different MSIs (M. Jack, H. Chi).

• Spring/Fall 2015: Finalizing the curriculum plan for a computational science minor in the 5 STEM areas (math, physics, chemistry, biology, and CIS), including courses with course descriptions. Committee: M. Jack (physics), H. Chi (CIS), and faculty associated with each course offering.