XSEDE

The Extreme Science and Engineering Discovery Environment (XSEDE) is the most advanced, powerful, and robust collection of integrated digital resources and services in the world. It is a single virtual system that scientists can use to interactively share computing resources, data, and expertise.

The five-year, $121 million project is supported by the National Science Foundation. It replaces and expands on the National Science Foundation TeraGrid project. More than 10,000 scientists used the TeraGrid to complete thousands of research projects, at no cost to the scientists. XSEDE continues that same sort of work—with an expanded scope, generating more knowledge, and improving our world in an even broader range of fields.

XSEDE is led by the University of Illinois’s National Center for Supercomputing Applications. The partnership includes:

• Carnegie Mellon University/Pittsburgh Supercomputing Center - University of Pittsburgh
• Center for Advanced Computing - Cornell University
• Indiana University
• Jülich Supercomputing Centre
• National Center for Atmospheric Research
• Ohio Supercomputer Center - The Ohio State University
• Purdue University
• Rice University
• Shodor Education Foundation
• Southeastern Universities Research Association
• University of California Berkeley
• San Diego Supercomputer Center - University of California San Diego
• University of Chicago
• National Center for Supercomputing Applications - University of Illinois at Urbana-Champaign
• National Institute for Computational Sciences - University of Tennessee Knoxville/Oak Ridge National Laboratory
• Texas Advanced Computing Center - The University of Texas at Austin
• University of Virginia

xsede.org

What is XSEDE?

Table of Contents

On the cover: The Advanced Visualization Laboratory at the National Center for Supercomputing Applications generated this visualization of supernova data from Volker Bromm at The University of Texas at Austin. The image represents an initial data study, and the collaboration eventually yielded a final rendered scene for the feature film ‘The Tree of Life.’ Courtesy of the Advanced Visualization Laboratory at the National Center for Supercomputing Applications

Dawn of the XSEDE Era 2

John Towns, leader of the National Science Foundation’s new Extreme Science and Engineering Discovery Environment, talks about the vision for XSEDE and how it will build on the TeraGrid.

Science highlights

For the Birds 6
Supercomputers and citizen scientists converge to pinpoint avian populations

Malaria Mystery Solved 8
Humans likely source of malarial infections in great apes, not the other way around as previously thought

Ice, Ice, Baby 10
University of Washington researchers explore mysterious Antarctic sea ice

Improving Nature’s Top Recyclers 12
The National Renewable Energy Laboratory uses TeraGrid supercomputers to explore new enzymes for renewable fuels

Placing Landmarks on the Genome Map 14
Researchers show for the first time that differences in DNA between individuals can affect the binding of transcription factors

Turbulent Times 16
TeraGrid aids scientists in developing novel technique to reduce jet noise

Cold Dark Matter Lives 18
An international team led by University of Washington astrophysicists appears to have solved the problem of dwarf galaxies

No Charge Double Helix 20
Researchers derive the first accurate 3D structure of a synthetic double-helical molecule that holds promise for applications in biomedicine and nanotechnology

A Recipe for Science Success 22
Collaboration between Open Science Grid and TeraGrid aims to give researchers the right tools

Education, Outreach, and Training highlights

Building Skills that Count 26
A sampling of the education and outreach programs offered by TeraGrid partners

Champions Help Campuses Connect 28
Swarthmore exemplifies how dedicated champions broaden TeraGrid/XSEDE reach

Compelling, Ferocious Beauty 30
Cosmic simulations and visualization skill contribute to acclaimed feature film ‘The Tree of Life’

Taking Training on the Road 32
Collaboration with Southeastern Universities Research Association provides visualization workshops to minority-serving institutions

Being ‘Smart’ at Home 34
TeraGrid storage and visualization resources aid ‘smart grid’ research

John Towns, leader of the National Science Foundation’s new Extreme Science and Engineering Discovery Environment, talks about the vision for XSEDE and how it will build on the TeraGrid.

Dawn of the XSEDE era

Research no longer typically happens in the context of a single investigator on a single campus. Instead, today’s investigators are collaborating across institutional and geographic boundaries. To be successful, researchers need access to dispersed resources, including instruments, data stores, and high-performance computers and, critically, to the tools and services that enable coordinated use and sharing of those resources.

The intent with XSEDE is to create the integrated environment in which all of these resources and services are available. We aim to establish a cyberinfrastructure ecosystem that allows us to interoperate with other resources, with other infrastructure providers, and in which researchers and educators can be much more productive and can begin to develop new capabilities.

For example, we will lower the entry barrier for institutions and collaborations to connect to XSEDEnet by using National LambdaRail’s (NLR) FrameNet services. In Year 1, XSEDEnet will provide dedicated 10 Gbps connectivity to the core XD Service Providers (Indiana, NCSA, NICS, NCAR, PSC, Purdue, SDSC and TACC); then in Year 2 XSEDE will add a service to enable collaborators to create on-demand high-performance networks between XSEDE service providers and many other potential sites around the country.

XSEDE will also leverage GlobusOnline services to easily move data from campus servers, laptops, and desktops, allowing high-performance data movement to and from XSEDE resources.

Of course, along with new services and tools, we also want to continue providing the strong support that people relied on throughout the TeraGrid’s decade of operation; we want to make the transition from TeraGrid to XSEDE as non-disruptive as possible. Ralph Roskies and Nancy Wilkins-Diehr jointly lead XSEDE’s Extended Collaborative Support Services (ECSS), which encompasses Advanced Support for Research Teams (a continuation of TeraGrid’s Advanced Support for TeraGrid Applications), Advanced Support of Community Capabilities, and Advanced Support for Training, Education and Outreach.

XSEDE ECSS will also include support for Novel and Innovative Projects, an effort led by Sergiu Sanielevici at PSC. This effort will extend support to domain areas that have not typically tapped into high-performance computing, such as economics and computational linguistics, and to under-represented communities and institutions.

As XSEDE gets under way, what I find most exciting is the potential to include a lot more disciplinary areas and a lot more researchers who might not have had easy access to the resources in the past. New disciplines, new users, and being able to increase productivity in order to enable new science and engineering—that’s what makes the XSEDE project really exciting.

John Towns, XSEDE Project Director

Science highlights

Discussion of cyberinfrastructure projects such as TeraGrid and XSEDE often focuses on flops and bytes and data transfer rates, on hardware and software, on code and computers. But what these projects are really about is gaining new knowledge. Over the past decade, thousands of researchers used the resources, tools, and support provided by TeraGrid to better understand climate change, the flow of blood in our bodies, and the evolution of the universe. Thousands more investigators will use XSEDE to tackle these and many other challenges. The following pages offer just a small sampling of the research enabled by TeraGrid over the past year, and a hint of what may be to come with XSEDE.

Sometimes everything just comes together, creating a sum much greater than its parts. Such is the case with eBird, a bird-monitoring project by the Cornell Laboratory of Ornithology that is using a unique National Science Foundation (NSF) collaboration to revolutionize bird conservation and numerous areas of environmental science.

“Three distinct entities came together to make this possible: citizen scientists, a unique statistical algorithm, and the existence of large-scale (high-performance computing) facilities,” says John Cobb, a principal investigator for TeraGrid at Oak Ridge National Laboratory (ORNL) in Tennessee.

Birds are often the first to suffer when damage hits an ecosystem. For that reason they are a widely acknowledged environmental indicator. Thanks to the NSF’s Office of Cyberinfrastructure DataONE and TeraGrid initiatives, along with support from the Leon Levy Foundation, eBird was able to show, for the first time, how bird populations move week-by-week across America and identify the environmental conditions associated with these population movements.

Supercomputers and citizen scientists converge to pinpoint avian populations

For the Birds

Indigo Bunting (Passerina cyanea) distribution for June 28, 2008. The map shows the predicted occurrence corrected for variation in detectability associated with search effort. This estimate was derived from a model using data from eBird and data describing the local environment. Brighter areas indicate higher probability of occurrence. Courtesy: Daniel Fink, Information Science Department, Cornell Lab of Ornithology

Page 9: XSEDE · 2015-03-24 · XSEDE Table of Contents The Extreme Science and Engineering Discovery Environment (XSEDE) is the most advanced, powerful, and robust collection of integrated

“DataONE has brought together a lot of people and resources,” says Cornell Laboratory of Ornithology statistician Daniel Fink. In fact, eBird was DataONE’s pre-release science demonstration project, and the leading NSF digital data archive wanted an early science impact—which this project certainly delivered.

Essentially, the eBird project enlists the help of thousands of enthusiastic birdwatchers to record bird sightings along with the precise location and time. These observations are reviewed by a network of expert ornithologists and entered into eBird’s database, where environmental data describing the search locations are linked to the eBird records. This data is then fed to Texas Advanced Computing Center’s (TACC) Lonestar supercomputer, where a unique statistical algorithm is deployed to discover the dynamic associations between the environment and observed patterns of bird occurrence. Ecologists use these results to estimate bird occurrence week-by-week across the country, creating a dynamic bird census that provides invaluable information to conservationists and environmental scientists alike.
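The sketch below is a hypothetical, much-simplified illustration of this kind of occurrence modeling, not the project's actual algorithm or data: it fits a generic ensemble classifier to toy checklist records (environmental covariates plus week of year) and then predicts an occurrence probability surface for a single week, the same general shape of computation described above.

```python
# Hypothetical sketch of weekly occurrence modeling (not the eBird project's
# actual spatiotemporal algorithm or data). Requires numpy and scikit-learn.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Toy "checklists": columns are elevation, tree cover, temperature, week-of-year
# (all scaled to 0-1); the label says whether the species was detected.
X = rng.random((5000, 4))
detected = (0.6 * X[:, 1] + 0.4 * np.sin(2 * np.pi * X[:, 3])
            + rng.normal(0, 0.15, 5000)) > 0.5
y = detected.astype(int)

# Fit a generic ensemble model linking environment + time to detection.
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X, y)

# Predict an occurrence probability surface for one week (late June) over a
# grid of locations; mapping these probabilities gives a weekly distribution map.
grid = rng.random((1000, 4))
grid[:, 3] = 26 / 52.0
p_occurrence = model.predict_proba(grid)[:, 1]
print("mean predicted occurrence:", round(p_occurrence.mean(), 3))
```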

Among eBird’s more interesting findings: species with broad distributions actually adapt to distinct local niches in different areas of the continent at different times of the year, and conservationists want to know what and where those niches are.

The maps provide so much valuable information that they were recently featured in the annual State of the Birds 2011 Report on Public Land and Waters, the nation’s first assessment of the distribution of bird species on public lands as a measure of stewardship responsibility. “The State of the Birds report is a measurable indicator of how well we are fulfilling our shared role as stewards of our nation’s public lands and waters,” notes Secretary of the Interior Ken Salazar. Thanks to the TeraGrid, policymakers got their first fine-grained nationwide glimpse at bird species distributions, a feat that just a few years ago dwelled in the realm of the impossible.

“TeraGrid allows us to provide information for multi-species analysis which is very useful in the ecology and conservation world and will allow us to do year-by-year difference compar-isons to study responses from environmental change,” adds Fink. “TeraGrid was absolutely necessary for the State of the Birds Report, and there was no way we could have done it without Lonestar.”

Going forward, the eBird project aims to chart more than 200 species using more than six years’ worth of data from a 3 million-hour allocation. The data will be broken apart year by year to reveal changes over space and time, a great point of interest across environmental arenas.

“Without TeraGrid, we would be doing boutique analysis on our local cluster one species at a time,” says Fink.

For more information: www.ebird.org

Grant #: TG-deb110008

For centuries, malaria has mystified physicians and terrified patients, claiming more childhood lives than any other infectious disease across large sections of the world. Though much has been learned about the genetics of various Plasmodium parasites, which cause malaria across vertebrate species, key aspects of the evolutionary relationships of these parasites have been elusive.

With the aid of a portal linking them to TeraGrid expertise and computational resources, researchers led by the University of Maryland and the University of South Carolina have clarified the evolutionary history of Plasmodium by analyzing an unprecedented 45 distinct genes from genomes of eight recently sequenced Plasmodium species.

The results, published online in the journal Parasitology on Dec. 1, 2010, offer the first comprehensive dating of the divergence of these individual Plasmodium species and provide new insights into the complex relationship between Plasmodium species and their mammalian hosts.

Humans likely source of malarial infections in great apes, not the other way around as previously thought

Malaria Mystery Solved

“The results clarify the ancient association between malaria parasites and their primate hosts, including humans,” says James B. Munro, a researcher from the University of Maryland School of Medicine. “Indeed, even though the data is somewhat noisy due to issues related to nucleotide composition, the signal is still strong enough to obtain a clear answer.”

A major finding of the research is that humans likely serve as a reservoir for P. falciparum–that is, humans are likely to transmit this most virulent among all human-infecting Plasmodium species to great apes, not the other way around. This finding contradicts previous studies, which suggested that humans acquired P. falciparum from apes. The results obtained in this study argue that “if P. falciparum infections in great apes are derived from humans, [there may be a] need to establish refuges for the great apes that are safe from human intrusion.”

The research builds on the unveiling of the genome sequences of the two most widespread human malaria parasites–P. falciparum and P. vivax–and the monkey parasite P. knowlesi, together with the draft genomes of the chimpanzee parasite P. reichenowi; three rodent parasites, P. yoelii yoelii, P. berghei, and P. chabaudi chabaudi; and one avian parasite, P. gallinaceum. To examine the association between malaria parasites and their primate hosts, the researchers sought to compare genetic variations found in 45 highly conserved nuclear genes for which sequences are available from all eight Plasmodium species.

The evolutionary relationships were inferred using, among others, the software package MrBayes, which consumed about 200,000 CPU hours on Abe, a supercomputer at the National Center for Supercomputing Applications (NCSA). The researchers accessed this resource via the CIPRES Science Gateway, a browser interface developed at the San Diego Supercomputer Center (SDSC) that permits access to TeraGrid compute resources. “Without CIPRES, this work, and the other projects I am working on, would not go as quickly or as smoothly,” adds Munro. “CIPRES is a fantastic resource.”
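As a rough, hypothetical illustration of the kind of comparison involved (the team's actual analysis used Bayesian inference with MrBayes on 45 concatenated genes, not the toy distance method shown here), the snippet below builds a neighbor-joining tree from a tiny made-up alignment using Biopython.

```python
# Hypothetical, greatly simplified illustration of inferring relationships from
# aligned sequences. The study itself used Bayesian inference (MrBayes); this
# toy example uses invented sequences and a simple distance/neighbor-joining
# method instead. Requires Biopython.
from Bio.Seq import Seq
from Bio.SeqRecord import SeqRecord
from Bio.Align import MultipleSeqAlignment
from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor
from Bio import Phylo

# Toy alignment: four taxa, one short invented gene region (all the same length).
alignment = MultipleSeqAlignment([
    SeqRecord(Seq("ATGCTACGATCGATCGAT"), id="P_falciparum"),
    SeqRecord(Seq("ATGCTACGATCGATCGTT"), id="P_reichenowi"),
    SeqRecord(Seq("ATGTTACGATCAATCGAT"), id="P_vivax"),
    SeqRecord(Seq("ATGTTACGTTCAATCGAT"), id="P_knowlesi"),
])

# Pairwise identity distances, then a neighbor-joining tree.
dm = DistanceCalculator("identity").get_distance(alignment)
tree = DistanceTreeConstructor().nj(dm)
Phylo.draw_ascii(tree)
```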

Other phylogenetic or divergence time analyses were conducted on the Brazos computer cluster at Texas A&M University and on the cluster at the Institute for Genome Sciences at the University of Maryland School of Medicine. Further studies are expected to be run on Trestles, a new data-intensive high-performance computing resource at SDSC.

For more information: http://www.phylo.org/sub_sections/portal/

Grant #: GM43940, 5R01 GM070793-03

SCIENCE GATEWAYS: It sometimes takes a community

About five years ago, TeraGrid launched Science Gateways, an innovative program designed to provide a full range of specialized services and capabilities—computational analysis, visualization, workflows development, collaborative tools, and more—to a broad range of scientific communities. By the end of its first year, 2006, about 100 users signed up. Since then, the number of users has soared to nearly 1,200, accounting for roughly 36 percent of all TeraGrid users charging jobs.

Communities came together to address problems in astronomy, chemistry, earthquake mitigation, geophysics, global atmospheric research, biology and neuroscience, cognitive science, molecular biology, physics and seismology, among others. This year, the largest community—CIPRES—represented 890 users, or 25 percent of all TeraGrid users charging jobs.

“Science Gateways is an excellent example of how to democratize science,” said Nancy Wilkins-Diehr, former TeraGrid area director for the program and now the co-leader of XSEDE Extended Collaborative Support Services. “It provided anyone, even those at small institutions, with access to the largest-scale HPC resources and expertise, with projects spanning a wide variety of research domains.”

A few Science Gateways highlights:

• Ultrascan connects hundreds of users globally to an ultracentrifuge at the University of Texas (UT) Health Sciences Center in San Antonio; TeraGrid resources run the high-resolution analysis of the experimental data.

• As an example of multidisciplinary work enabled by gateways, the Community Climate System Model (CCSM) was used by a joint political science and earth and atmospheric sciences class at Purdue University to simulate the influence of public policy decisions on climate change.

• GISolve, a grid-based environment for computationally intense geographic analysis, made the cover of the Proceedings of the National Academy of Sciences (PNAS).

Ice, Ice, Baby
University of Washington researchers explore mysterious Antarctic sea ice

Surface temperature (in degrees Celsius) response to depleting ozone over the second half of the 20th century. The response at low resolution (left) is about twice as strong as at high resolution (right). Courtesy: Cecilia Bitz, University of Washington

Current speeds (in cm/s) and sea ice extent (15% concentration contour) are shown for a randomly chosen October in the high- and low-resolution runs for preindustrial ozone levels. Depleting ozone increases the westerly wind stress, and therefore strengthens the ocean currents. This research addresses whether the parameterized eddy response at low resolution (left) correctly reproduces the resolved eddy response at high resolution (right). Courtesy: Cecilia Bitz, University of Washington

The Southern hemisphere has a number of eccentricities familiar to those of us from North of the equator: Penguins instead of polar bears, toilets that supposedly flush backwards, and a night sky that features a different cast of stars. The poles are no exception.

Unlike the Arctic, which is notoriously shedding more and more ice every year, the amount of Antarctic sea ice—literally frozen, floating ocean water—is actually increasing. While scientists aren’t exactly sure why, they do have a few main suspects. The current culprit of choice is the hole in the ozone layer that hovers over the Antarctic continent, which drives a number of natural phenomena, not the least of which is increased wind circulation, lowering the surface temperature on the Antarctic continent and altering ocean heat transport.

Not so fast, says Cecilia Bitz of the University of Washington and the principal investigator of the most detailed simulations to date of Antarctic sea ice. She too was a believer, until her team, a branch of the National Science Foundation’s PetaApps program, ran 10km simulations of the Antarctic ice sheet using the Community Earth Systems Model (CESM) to determine if, as expected, the depletion of ozone at the bottom of our planet is indeed causing the sea ice to expand.

CESM is a fully-coupled, global climate model that provides state-of-the-art simulations of the Earth’s past, present, and future climate states. It is among the most sophisticated climate models available, a global model that provides consistent simulation of high-resolution effects. However, even CESM has its problems, as previous models of the Antarctic sea ice showed a decrease in area annually, in contrast with observations that show an overall expansion.

Bitz’s hypothesis was that the models were too coarse, and by ramping up the resolution ten-fold her team could get to the truth. For example, other factors, such as ocean eddies, could play a major role, but they have been largely parameterized to vary with wind strength. Bitz believed that the ocean’s response to increased wind strength in the model’s coarse resolution was incorrect, downplaying important factors such as ocean currents and eddies.

After months of preparation, Bitz’s team began the two-month-long process of conducting the simulations. Most of the runs used 6,000 of Kraken’s more than 112,000 cores at the National Institute for Computational Sciences (NICS). Kraken was a TeraGrid supercomputer and is now available through XSEDE. Her team consumed more than 11 million CPU hours, with each simulation generating approximately 50 terabytes of data. “Just to analyze and inter-pret that amount of data is intensive,” says Bitz.

The team’s analyses of the Antarctic sea ice simulations aren’t exactly what was expected, but it does seem that they are getting closer to nature. “A lot of the subtle behavior is different at fine resolution,” notes Bitz. However, it is clear to Bitz that the expansion is likely not solely due to ozone.

While the team’s simulations didn’t put an end to speculation regarding the expansion of the Antarctic sea ice, they were a milestone in climate modeling. Bitz is grateful to the NSF’s PetaApps program. “I could only do what I did with the help of that group,” she says, adding that NICS and Kraken “have both been very helpful to me.”

Bitz is also using two new XSEDE resources at NICS and the Texas Advanced Computing Center (TACC) to teach a climate modeling course and study geoengineering as a means to offset our increase in carbon dioxide production. Theirs is one of a handful of computational geoengineering studies that helps to determine how a drastic human-induced change might interrupt the Earth’s environmental systems. The work follows up on other atmospheric studies by Bitz, including a recent publication in Nature that suggested, based on TeraGrid simulations, that greenhouse gas mitigation can reduce sea-ice loss and increase polar bear survival.

For more information: http://www.atmos.washington.edu/~bitz/

Grant #s: NSF #OPP-0938204 and DOE #0013706

Cecilia Bitz, University of Washington

In color is the annually averaged surface temperature over the planet, overlain with the change in winds at 850 millibars due to increased carbon dioxide and a stratospheric sulfate layer. The magnitude of these atmospheric circulation changes, especially over the Southern Ocean, is similar to that induced by just an increase in carbon dioxide. Courtesy: Cecilia Bitz, University of Washington

Improving Nature’s Top Recyclers

The National Renewable Energy Laboratory uses TeraGrid supercomputers to explore new enzymes for renewable fuels

A coarse-grained model of the bacterial cellulosome system during the self-assembly process. The long scaffold (blue) contains binding sites for the free enzymes (red, yellow, and green) of different sizes. Courtesy: National Renewable Energy Laboratory

If a tree falls in the forest and there are no enzymes to digest it, does it decompose?

It’s a question that has important ramifications for the renewable energy industry. Scientists and engineers are studying ways to transform non-food-based plant material into transportation fuel—think alfalfa stalks and wood chips, as opposed to the edible corn grains used in the production of ethanol.

“Cellulose in the biosphere can last for years,” says Gregg Beckham, a scientist in the National Bioenergy Center at the U.S. Department of Energy’s National Renewable Energy Laboratory (NREL). “It’s really tough, and we want to know why (this happens) at the molecular scale.”

Despite the toughness of plant cell walls, fungi and bacteria have evolved enzymes to convert abundant cellulosic plant matter into sugars to use as energy to sustain life. Unfortunately, the most powerful enzymes don’t work fast enough to break down cellulose at a pace—and price—that is competitive with fossil fuels…yet.

In an effort to improve nature’s top recyclers, computational scientists at NREL are trying to create “designer” enzymes capable of speeding up biofuel production and thereby lowering the cost of biomass-derived fuel to serve the global population.

“It’s a Goldilocks problem,” notes Beckham. “The enzymes have to be ‘just right.’ We’re trying to find out what just right is, why, and how to make mutations to the enzymes to make them most efficient.”

NREL’s computational researchers used TeraGrid supercomputers to simulate processes in the world of enzymes. Using Ranger at the Texas Advanced Computing Center (TACC) and the Red Mesa system at NREL, they simulated enzyme behavior from the cellulose-devouring bacterium Clostridium thermocellum and the prodigiously plant-eating fungus Trichoderma reesei.

After creating a computational model of the molecules and setting them into motion in a virtual environment, the researchers learned how the bacterium forms scaffolds for its enzymes, which work together to break apart the plant.

Contrary to expectation, the larger, slower-moving enzymes lingered near the scaffold longer, allowing them to bind to the frame more frequently, while the smaller ones moved faster and more freely through the solution, but bound less often.

The results of this study were reported in the Journal of Biological Chemistry in February 2011, and insights from the simulations are being used to create designer enzymes to make biomass conversion faster, more efficient, and less expensive.

Using Ranger, the scientists also studied underexplored parts of the enzyme that the Trichoderma reesei fungus uses to break down cellulose. They found that the cellulose surface has energy wells set one nanometer apart—a perfect fit for the binding module. In addition, they found that the linker region, previously believed to contain both stiff and flexible regions, behaves more like a highly flexible tether. These findings were reported in the Biophysical Journal in December 2010.

“We’re using rational design to understand how the enzyme works, and then to predict the best place to change something and test it,” says Michael Crowley, a principal scientist at NREL.

“If we can help industry understand and improve these processes for renewable fuel production, we’ll be able to offset a significant fraction of fossil fuel use in the long term,” according to Beckham.

For more information: http://www.nrel.gov/biomass/staff_pages/gregg_beckham.html

Grant #: TG-MCB090159

Back row, left to right: Yannick Bomble, Michael Crowley, and Gregg Beckham. Front row, left to right: Antti-Pekka Hynninen, Mark Nimlos, Christy Payne, and Deanne Sammond. Not shown: Lintao Bu, James Matthews.

Researchers show for the first time that differences in DNA between individuals can affect the binding of transcription factors

Placing Landmarks on the Genome Map

Human chromosome 21 with a small region outlined in red. The main rectangle below is a close-up of the outlined region, showing the binding locations of three transcription factors along the chromosome. Courtesy: Vishy Iyer, The University of Texas at Austin

Representation of allele-specific and non–allele-specific SNPs across the CTCF binding motif. The y axis indicates the difference between the two as a percentage of normalized total SNPs. Higher bars indicate an increased representation of allele-specific SNPs relative to other positions, which tends to occur at conserved positions. Courtesy: Vishy Iyer, The University of Texas at Austin

We typically think of heredity—eye color, body type, or susceptibility to a disease—as rooted in our genes. And it is. But as biologists sequence more genomes and analyze their results, they’re finding that the non-coding regions of the genome play a role in our traits as well.

One example involves the role that transcription factor proteins play in gene regulation, which scientists are just beginning to explore on a genome-wide scale. These proteins bind to the genome and act as control dials for gene regulation—turning genes on or off, or determining the amount of gene activity in a cell.

“If you’re comparing normal cells to cancer cells, you want to know what happened in the cancer cell that makes it different,” says Vishy Iyer, an associate professor in the Institute for Cellular and Molecular Biology at The University of Texas at Austin. “The gene expression patterns change, and we want to know which genes are up or down regulated, and how that came about.”

About 2,000 of the transcription factor proteins have been identified, and some have been linked to breast and other cancers, Rett syndrome, and autoimmune diseases. However, little is known about how they work.

Iyer, along with colleagues at Duke, The University of North Carolina-Chapel Hill, and Hinxton, UK, is trying to change that. Their study—based on simulations performed on the Ranger supercomputer at the Texas Advanced Computing Center (TACC) and published in Science in 2010—was one of the first to use supercomputers and next-generation gene sequencing to explore the expression of genes related to a specific regulatory transcription factor, called CTCF.

“We showed for the first time that some of the differences in DNA between individuals can affect the binding of transcription factors and more importantly, that those differences could be inherited,” according to Iyer.

The group used a relatively new sequencing technology, called ChIP-Seq, to pull out only the regions of DNA to which the proteins of interest were bound. These base pairs were then sequenced to determine the order of nucleotides and to count how many molecules of the promoter were bound to the protein.

Sounds simple enough, until you try to sequence millions of these regions and locate their exact position among the approximately 3 billion base pairs in the human genome.

“The genome is a vast area with many features,” explains Iyer. “You can think of the proteins as landmarks that we’re trying to place on the genome map.”

Using several thousand processors simultaneously, Ranger was used to take the short sequence reads generated by ChIP-Seq and align them to the reference genome. The single base resolution offered by next-gen sequencing enabled the researchers to look at individual known differences in the DNA and to use those dissimilarities to examine how genes on each chromosome bind transcription factors.

“We were able to tell the difference in binding from the gene that you inherited from your father and mother—that was the big advance,” notes Iyer. “We’re now applying this technology to cases where you know that the gene from one of your parents has a mutation that predisposes you to some disease.”
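As a hedged, hypothetical sketch of the core counting step behind such an allele-specific analysis (not the authors' actual pipeline), the snippet below tallies the bases observed in aligned reads at a known heterozygous site inside a binding region and asks, with a simple binomial test, whether one parental allele is bound significantly more often than the other.

```python
# Hypothetical sketch of allele-specific binding detection at one heterozygous
# SNP (not the study's actual pipeline). Requires scipy >= 1.7 for binomtest.
from collections import Counter
from scipy.stats import binomtest

# Bases observed in ChIP-Seq reads overlapping a known heterozygous position,
# where the maternal allele is 'A' and the paternal allele is 'G' (invented data).
read_bases = "A" * 23 + "G" * 7

counts = Counter(read_bases)
maternal, paternal = counts["A"], counts["G"]

# Under no allelic bias we expect roughly a 50/50 split of informative reads.
result = binomtest(maternal, maternal + paternal, p=0.5)
print(f"maternal={maternal}, paternal={paternal}, p-value={result.pvalue:.4f}")
```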

The findings bring science one step closer to personalized medicine based on a detailed reading of an individual’s genome, including the non-coding regions. Despite the tremendous complexity of the genome, Iyer is optimistic that his group’s research will have an impact on human health.

“There are lots of diseases and for a subset, they’ve got to be affecting gene expression by impacting transcription factors,” he adds. “If we pick the diseases and the factors smartly, I think we’ll find them.”

For more information: http://microarray.icmb.utexas.edu/research.html

NIH Project #: 5R01CA095548-07

Vishy Iyer, The University of Texas at Austin

TeraGrid aids scientists in developing novel technique to reduce jet noise

Turbulent Times

Small, well-timed disturbances added to an uncontrolled Mach 1.3 turbulent jet (left) result in the quieter, controlled jet (right). Though only subtly different, the controlled jet is producing 30 percent less noise as visualized by the black-and-white contours of dilatation, a measure of air’s compression rate. The sound-generating turbulence, as indicated by the vorticity, is shown as color. Courtesy: Daniel Bodony, University of Illinois at Urbana-Champaign

Airlines and aircraft manufacturers are under increasing pressure to keep noise levels low for airport personnel and residents in surrounding neighborhoods. About every 10 years, the International Civil Aviation Organization reduces the maximum noise an airplane can produce before it can be certified and sold to commercial airlines.

Aircraft are barely able to meet the current noise restriction levels. When even tighter restrictions are mandated in a few years, no one has a ready-made solution.

Daniel Bodony, an assistant professor of aerospace engineering at the University of Illinois at Urbana-Champaign, is working to address this issue. Along with Jon Freund of Illinois, and Jeonglae Kim, a graduate student, Bodony is part of a NASA-funded effort to decrease jet engine noise by controlling the unsteady movement of air, known as turbulence.

Instead of working in a wind tunnel or laboratory, the team used a TeraGrid supercomputer at the Texas Advanced Computing Center (TACC) to simulate the evolution of turbulence-generated sound waves from jet engine exhaust. The simulations help explain how sound is generated on the most basic level, and how it can be controlled using a new device.

“We’re studying the controlled jet and the uncontrolled jet to understand what changes between them,” says Bodony. “That’s what experiments can’t currently do and what is missing from our understanding of the science.”

Bodony, Freund, and Kim use a numerical technique called “large eddy simulation” to simulate the motion of air around the jet. The simulations show the amount of turbulence flowing around the jet and, importantly, the amount of sound that this turbulence creates.

“Unfortunately, the noise is not generated where you can control it directly, so you have to add a control someplace else, like on the nozzle, and tickle the flow in such a way that the sound is reduced at a later spot in the jet,” Bodony explains.

After conducting four years of research and relying on TeraGrid resources, Bodony and his collaborators developed a novel technique to determine the optimal controller required to reduce jet noise. The controller is a plasma actuator—something like a giant spark plug—based on those developed by colleagues at Ohio State that alters the sound field by injecting heat.

“We can’t squash the turbulence,” Bodony notes. “Our controllers aren’t that strong and it may not even be possible or desirable. So, we add additional perturbations to reorganize the pre-existing disturbances such that the unsteady forces and stresses within the fluid are less.”

The simulations on TACC’s Ranger and Lonestar 4 determined the ideal timing and strength of the perturbations to reduce the engine’s radiated sound without significantly altering its thrust. The first round of improvements showed the potential to reduce jet noise by three decibels, or the equivalent of 30 percent, which equals the best that has been found experimentally by trial and error. Bodony is confident that with further refinements, his group will be able to reduce the noise level even further.
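A quick, hedged arithmetic check of that "3 decibels, or the equivalent of 30 percent" statement, under the assumption (ours, not the article's) that the percentage refers to a reduction in sound pressure amplitude:

```python
# Hedged arithmetic check: a 3 dB drop corresponds to roughly a 30 percent
# reduction if the percentage is read as sound pressure amplitude (assumption).
delta_db = 3.0
pressure_retained = 10 ** (-delta_db / 20)   # fraction of pressure amplitude left
power_retained = 10 ** (-delta_db / 10)      # fraction of acoustic power left
print(f"pressure amplitude reduction: {1 - pressure_retained:.0%}")  # ~29%
print(f"acoustic power reduction:     {1 - power_retained:.0%}")     # ~50%
```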

The design insights that Bodony uncovered are expected to reduce the sound levels on “N+3” generation aircraft, NASA’s shorthand for aircraft fielded three generations in the future. Bodony expects such a device, if successful, to come to market in 10 to 15 years.

If that sounds long, consider that the new Boeing 787, the first commercial airliner to be equipped with noise-controlling devices, called chevrons, contains elements designed 15 years ago.

“This work is computationally and intellectually demanding,” says Sanjiva Lele, a professor of mechanical engineering at Stanford University who is familiar with the research. “But if systematic methods to reduce noise can be found, the benefit to the aviation community would be tremendous.”

Results of the group’s theoretical and simulation work were published in the Journal of Sound and Vibration in February 2011.

For more information: http://www.ae.illinois.edu/people/faculty/bodony.html

Grant #: CTS090004

Daniel Bodony, University of Illinois at Urbana-Champaign

Extensibility

The advanced computing centers of XSEDE offer unparalleled power to researchers, but these systems are by no means the only computers that scientists use. In fact, the computing power distributed among the tens of thousands of campuses in the United States—in departmental clusters, IT centers, and university-wide data repositories—is far beyond the capacity provided by XSEDE.

To leverage these capabilities, XSEDE will allow users to connect other resources—local, regional, or international—to the XSEDE network easily and on an ad hoc basis. For a researcher like Bodony, this might mean linking local clusters in the aerospace engineering department at the University of Illinois to Ranger and Kraken in order to verify sample problems or compare scaling results.

This extensibility matches XSEDE’s new focus on addressing the computing needs of all researchers, not just those who use high-performance computers. By bridging the campus and national cyberinfrastructure, XSEDE will enable more researchers to take advantage of the national computing ecosystem to accomplish important work.

Cold Dark Matter Lives
An international team led by University of Washington astrophysicists appears to have solved the problem of dwarf galaxies

These two frames from Governato and colleagues’ simulation of a dwarf galaxy at high resolution show light distribution face-on (left) and edge-on. Courtesy: Chris Brook, The Jeremiah Horrocks Institute at the University of Central Lancashire, and Patrik Jonsson, Center for Astrophysics, Harvard

A galaxy from Governato and colleagues’ simulation (left) appears in all respects identical to a real galaxy (right) and background image from the Sloan Digital Sky Survey Collaboration. Courtesy: Chris Brook, The Jeremiah Horrocks Institute at the University of Central Lancashire, and Patrik Jonsson, Center for Astrophysics, Harvard

Although small in size, dwarf galaxies have posed a big problem for cosmological theory. Many of these galaxies, which have only about one percent of the number of stars in the Milky Way, orbit our galaxy, and astronomers who study them know they are “bulgeless”—with a disc-like star distribution that on edge looks like a Frisbee.

Yet astrophysicists who run computational simula-tions have been seeing bulges.

These simulations test the reigning “cold dark matter” (CDM) model of how galaxies form, and although a central bulge of stars is common in larger galaxies such as the Milky Way, when CDM simulations of dwarf galaxies show bulges, as they have for about 15 years, researchers have suggested there’s something wrong with the model. This discrepancy has been a serious stumbling block for CDM, since in nearly every other way CDM-based simulations have shown excellent agreement with observations.

Similarly, observations of dwarf galaxies in space show a shallow, unconcentrated distribution of dark matter, the invisible matter that comprises the largest part of the universe’s mass, while CDM simulations have shown dwarf galaxies with dark matter centrally concentrated. “Basically we have a model that’s really good at explaining a lot of what’s going on in the universe,” says University of Washington astrophysicist Fabio Governato, “but there’s been these two sore points: bulges and dense dark-matter halos in the dwarf galaxies.”

“This failure is potentially catastrophic for the CDM model,” Governato and his University of Washington colleague Tom Quinn and an international team of collaborators wrote in a paper published in Nature in January 2010, reporting simulations that convincingly resolve the dwarf galaxy problem. With improved accuracy made possible in part by access to more than a million hours of TeraGrid computing, mainly at the National Institute for Computational Sciences (NICS) and the Texas Advanced Computing Center (TACC), their simulations show dwarf galaxies without bulges, meaning the distribution of stars and dark matter agrees well with observed dwarf galaxies.

Quinn, Governato, and their colleagues this year extended their 2010 findings, including an April 2011 report in the Astrophysical Journal of “artificial observations.” For this data-intensive comparison between their simulated dwarf galaxies and observed dwarf galaxies (from THINGS, The HI Nearby Galaxy Survey), notes Quinn, “shared memory is really helpful,” and the Pittsburgh Supercomputing Center’s (PSC) Blacklight, one of the newest TeraGrid resources and the largest shared-memory system in the world, contributed significantly. Analyses show good agreement between the simulated and observed galaxies, and lend validation to the simulations.

Making the difference, along with more powerful computing, were improvements to GASOLINE, astrophysics simulation software developed over a 15-year period by Quinn, James Wadsley of McMaster University, and Joachim Stadel of the University of Zurich. By changing how GASOLINE represented the physics involved, and with higher resolution than had previously been possible, the researchers more realistically captured the processes of star formation and evolution, including the violent star deaths and spectacular gas outflow phenomena of supernovae. “It was a massive computational project,” says Governato. “This kind of research wasn’t possible just three years ago. We took advantage of the fact that computers are getting faster and faster.”

Overall, the findings from this ongoing work are a game changer for the CDM model and have gathered commentary in several science journals. “Realistic dwarf galaxies are thus shown to be a natural outcome of galaxy formation in the CDM scenario,” wrote the researchers in Nature. Or as Governato puts it, “CDM lives to fight another day.”

For more information: http://www.astro.washington.edu/users/fabio/

Grant #s: NSF-AST-0607819; TG-MCA94P018

(Left) Fabio Governato, University of Washington. (Right) Tom Quinn, University of Washington

Researchers derive the first accurate 3D structure of a synthetic double-helical molecule that holds promise for applications in biomedicine and nanotechnology

Superposition of 10 time-averaged PNA structures obtained by restrained MD. Courtesy: Carnegie Mellon University

Over the course of about 4 billion years, nature has evolved remarkably efficient ways to transfer electrons from atom to atom within living organisms to produce energy from food. Our energy for getting up in the morning, for work or for pleasure, all comes from these processes of “controlled burning” that depend on electron transfer.

“Just like we transfer electricity through power lines to heat and light our homes, we transfer electrons in our bodies to metabolize food,” says Catalina Achim, an associate professor of chemistry at Carnegie Mellon University. “But nature does it very efficiently, while we don’t yet know how to take oil and make our cars go without wasting a lot of energy. We know some of the basics of how electron transfer works, and many scientists study these processes so we can learn to apply them.”

To that end, Achim worked with a team, including TeraGrid scientist Marcela Madrid from the Pittsburgh Supercomputing Center (PSC), using TeraGrid resources to solve the structure of a fascinating “bio-mimetic” molecule called peptide nucleic acid, or PNA. Although PNA doesn’t exist in nature, it’s a close cousin to DNA but with a special advantage: It doesn’t have a charge.

No Charge Double Helix

Side (top) and axial (bottom) view of PNA simulated by Achim, Madrid and colleagues. Courtesy: Carnegie Mellon University

DNA’s helical strands have negative charges in the backbone, and when the four A-G-C-T bases pair up via hydrogen bonds to form the DNA double-stranded helix, there’s built-in electrostatic repulsion between the strands. The DNA structure is overall always negatively charged, according to Achim.

To circumvent this, Achim and her colleagues substituted peptide-like groups (small proteins) for the phosphate groups of the DNA backbone. The resulting neutral double helix helped further the study of electron transfer and offers useful applications, such as a molecular “scaffold” for metal ions to deliver electrons to cells or biological molecules.

The necessary first step for these applications, though, is having an accurate 3D structure of PNA in solution. A static structure from an x-ray crystallographic study was available, but in biological applications PNA isn’t static; it’s flexible and mobile. To attain its 3D structure in this state required a combination of NMR spectroscopy and molecular dynamics (MD) using supercomputing resources.

A “Collaborative Research in Chemistry” grant from the National Science Foundation (NSF) supported this project, and teamwork among Achim, Madrid, and Achim’s partners at Carnegie Mellon and Duke University produced results. First, Achim and her Carnegie Mellon colleague Danith Ly and their students synthesized PNAs with different chemical structure and flexibility in solution. A graduate student in Achim’s lab worked with Carnegie Mellon chemist Roberto Gil on the 2D NMR spectroscopy of the synthesized PNAs, which provided a matrix of distances between protons in the molecules.

From this data, and in order to derive an accurate 3D structure, PSC’s Madrid turned to MD, which simulates the movement of a biomolecule by tracking the forces between the atoms over time. In this case, Madrid—relying on PSC’s SGI Altix system called Pople—used “restrained” MD, a technique that made it possible, along with a package of molecular simulation programs called AMBER, to determine a family of PNA structures that fit with the NMR data.

The results, reported in 2010 in Molecular BioSystems, a journal of the Royal Society of Chemistry, provide for the first time the 3D structure of this PNA molecule, allowing Achim and her collaborators to embark on new, more detailed studies of electron transfer. Potential applications include attaching metal ions to the PNA scaffold to create metal-PNA complexes that can catalyze reactions.

Similarly, Achim foresees PNA being used to create “nanowires”—100,000 times finer than a human hair—for quantum circuitry, in which quantum characteristics can lead to electronics that are much faster than today’s integrated circuitry.

Achim expects that the new shared-memory Blacklight supercomputer at PSC will help to further advance her work. “I look forward to continuing the collaboration,” she says, “and to using this new resource.”

“Our collaboration with PSC was very beneficial,” says Achim, “not only for the research itself but also for educational purposes. Working with Marcela Madrid, my graduate students learned how to do molecular dynamics simulations.”

For more information: http://www.chem.cmu.edu/groups/achim/

Grant #s: NSF-CHE-0347140; TG-MCB0700070N

Catalina Achim, Carnegie Mellon University, with her laboratory group. Courtesy: Carnegie Mellon University

A Recipe for Science Success

Imagine cooking a gourmet meal from scratch using only a knife. With just that one tool, some steps, like mincing onions and slicing carrots, would be quick and easy because your tool was designed for those tasks. Other steps would be slow, producing sub-standard results, and some might not be possible at all—imagine trying to whip egg whites or taste the soup.

Computational research is no different. Some “recipes”—known as workflows in computing—involve few steps that require only one tool. Others involve multiple steps, each requiring different tools. For example, Open Science Grid (OSG), jointly funded by the Department of Energy and the National Science Foundation, is optimized to perform high-throughput computing (HTC), while systems available through TeraGrid were designed for high-performance computing (HPC).

“In the past, researchers used either TeraGrid or Open Science Grid to run their large workflows,” says Paul Avery, OSG Council co-chair and professor of physics at the University of Florida. “Now ExTENCI, a partnership between the two cyberinfrastructures, provides tools that help them to take advantage of both.”

An example of protein structure prediction; the experimental structure is shown in green and the prediction generated by Adhikari and his colleagues is shown in blue. Courtesy: Aashish Adhikari, University of Chicago

Collaboration between Open Science Grid and XSEDE aims to give researchers the right tools

ExTENCI, which stands for Extending Science Through Enhanced National Cyberinfrastructure, was launched in 2010 under the leadership of Avery and co-principal investigators Ralph Roskies, co-scientific director of the Pittsburgh Supercomputing Center, and Daniel S. Katz, senior fellow at the University of Chicago/Argonne National Laboratory. The project brings together 11 U.S. universities and national laboratories—including the University of Chicago, Clemson University, Louisiana State University, Purdue University, University of Wisconsin-Madison, Fermi National Accelerator Laboratory, Brookhaven National Laboratory, Florida State University, and Florida International University—to develop technology to enable researchers to more easily access resources through both OSG and TeraGrid/XSEDE.

“ExTENCI explored how to exploit the mutual capabilities of both TeraGrid and Open Science Grid,” says Roskies.

“Many TeraGrid users have a natural need for the high-throughput resources that the OSG provides. Similarly OSG users sometimes need access to high-performance computing resources such as those of TeraGrid,” says Michael Wilde, a fellow at the University of Chicago Computation Institute and software architect at Argonne National Laboratory.

“The ExTENCI project is working to make the use of both cyberinfrastructures more seamless, and easier for individual scientists and smaller collaborations to leverage concurrently.”

“We’ve begun to do this in a few concrete cases, with the aim of leveraging the investments of both NSF and DOE in cyberinfrastructure resources and thereby to improve the productivity of U.S. computational scientists,” says Roskies.

One of those concrete cases is the protein structure prediction project that operates the “Midway Folding Server,” a collaboration between the laboratories of Karl Freed and Tobin Sosnick of the University of Chicago and Jinbo Xu of the Toyota Technological Institute at Chicago.

The most widely used form of structure prediction uses the structures of known proteins as templates from which to compute the structure of similar unknown proteins. But that approach works only if a similar protein with a known structure exists, and it gives no insight into how proteins fold in nature. Predicting a protein’s structure based solely on its amino acid sequence is more difficult—and more computationally intensive.

“We’re trying to predict protein structures by mimicking how we think proteins fold in nature,” says Aashish Adhikari, a researcher at the Institute for Biophysical Dynamics at the University of Chicago. “Experiments suggest that proteins fold in a stepwise fashion, where subunits of structure we call ‘foldons’ form cooperatively and add on to the existing structure in a process called sequential stabilization. Our algorithm follows a similar principle.”

This works well for protein sequences less than 100 amino acids long, according to Adhikari. But “if you increase the number of amino acids, for every amino acid that you add the computation time increases exponentially. Our goal is to try to use our algorithm to fold increasingly bigger proteins.”
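A very rough sketch of that stepwise idea, in toy Python, is shown below: partial structures grow by one “foldon” at a time, and only the best-scoring candidates survive each step. The scoring function, foldon size, and beam width here are hypothetical placeholders for illustration only, not the group's actual algorithm.

    def fold_stepwise(sequence, foldon_size=10, beam_width=5, options=3):
        # Toy "conformation": each foldon gets one of `options` discrete
        # placements; a partial structure is the list of picks so far.
        def score(picks):
            # Placeholder energy function; a real method would evaluate
            # the physical plausibility of the partial structure.
            return sum((p * 31 + i * 7) % 97 for i, p in enumerate(picks))

        n_foldons = (len(sequence) + foldon_size - 1) // foldon_size
        candidates = [[]]  # start with nothing placed
        for _ in range(n_foldons):
            # Sequential stabilization, loosely: extend every candidate by
            # each possible placement of the next foldon...
            extended = [picks + [opt] for picks in candidates for opt in range(options)]
            # ...then keep only the best-scoring partial structures.
            candidates = sorted(extended, key=score)[:beam_width]
        return candidates[0]

    print(fold_stepwise("ACDEFGHIKLMNPQRSTVWY" * 3))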

Because this process can make excellent use of both HTC and HPC resources, Wilde identified the group as a good match for ExTENCI. Today, the protein structure prediction project is regularly using resources from both OSG and XSEDE, allowing the researchers to fold larger proteins than ever before.

For more information: http://sites.google.com/site/extenci/


Page 26

Education, Outreach, and Training


Page 27

Over the past decade, TeraGrid’s Education, Outreach, and Training activities reached tens of thousands of researchers, graduate, undergraduate, and K-12 students, educators, and citizens, helping them harness powerful tools, understand the value of high-performance computing, and pursue education and careers in math, science, engineering, and technology.

The next few pages spotlight some of these activities and give a preview of what is to come in XSEDE, which will continue—and expand—some of the TeraGrid’s most successful efforts, such as the Campus Champions Program.


Page 28


A sampling of the education and outreach programs offered by TeraGrid partners

San Diego Supercomputer Center (SDSC)
SDSC has a clear vision for strengthening the 21st-century workforce, one its outreach program has been promoting for almost 20 years. In that time, SDSC has seen tremendous growth in—and national recognition for—its teacher outreach program, TeacherTECH, which brings science, technology, engineering, and math (STEM) resources to teachers around the world.

Based on the success of TeacherTECH, SDSC decided to expand the concept to middle- and high-school students with its StudentTECH program.

“The program was started in 2006 with one workshop, ‘Introduction to Maya and 3D Modeling,’ and we had 14 students,” explains Ange Mason, education program manager for SDSC. “This summer we will have attracted more than 300 students, and through our year-round programs we attract almost 1,000 students annually.”

“In the future, computers will be everything,” says Angela Li, a participating eighth-grader at Bishop’s School in La Jolla, California. “In order to be successful in the future, we need to be able to use everything that the future provides to the greatest extent. StudentTECH teaches us how to use that and more.”

National Center for Supercomputing Applications (NCSA)
Improving high school students’ understanding of chemistry through computation and visualization is the central goal of the Institute for Chemistry Literacy through Computational Science (ICLCS) at NCSA. For nearly five years, ICLCS has provided an immersive environment for teachers to increase their chemistry knowledge. The program teaches teachers how to use computational science tools in the classroom and helps them develop their pedagogical and curriculum development skills.

The program provides intensive training over the course of three years, including two-week summer institutes and yearlong content courses delivered via an online professional learning environment.

Using a pool of about 120 teachers from more than 100 rural Illinois school districts, the ICLCS has found a statistically significant improvement in chemistry content knowledge by the teachers and their students over time, as measured by the standardized test of the American Chemical Society (ACS).

Building Skills that Count

Page 29

“Chemistry is a challenging subject to teach because it involves species—atoms and molecules—that fall outside the range of human perception,” according to NCSA director and ICLCS principal investigator Thom Dunning. “By emphasizing the use of computational tools that allow students to visualize and interact with these species, ICLCS enables teachers to convert exercises in abstract thinking into exercises involving concrete objects that can be manipulated and understood.”

Pittsburgh Supercomputing Center (PSC)
TeraGrid’s education, outreach, and training efforts have not only been geared toward workforce development. They also raise awareness of real-world concerns about 21st century digital literacy and safety. For instance, do Internet passwords protect personal information from unwanted intrusion? How can one be sure if someone online is who they say they are? Does anti-virus software really protect one’s hard drive?

To help parents, educators, students, and individuals with these questions and many others associated with using the Internet, PSC in 2010 introduced SAFE-Net, a program funded by a National Science Foundation grant for Cyber Safety Awareness. Through SAFE-Net, PSC presents workshops that train educators and provide materials for classroom learning. These materials address cyber threats, measures of protection, and questions of cyber ethics that arise as a result of social networking and other uses of the Internet.

“Many Internet users lack an understanding of common threats they may face online,” notes Cheryl Begandy, PSC’s director of outreach and education. “Among parents, many lack confidence that their child is safe when using the Internet.”

In 2010, PSC held two ‘Train-the-Teacher’ workshops introducing SAFE-Net to 18 Pittsburgh-area teachers. The SAFE-Net website also provides free information, including classroom and parent materials about cyber-security issues, with lessons geared to grade levels 1-3, 4-6, and 7-12.

“SAFE-Net provides a wonderful repository of resources for educators as they attempt to address the issues of Internet safety,” says Norton Gusky, coordinator of educational technology for Fox Chapel Area School District, north of Pittsburgh. “For my presentations, I rely on the concise definitions and examples provided by SAFE-Net. All educators, K-12, respond positively to these materials and activities.”

The University of Texas at Austin/Texas Advanced Computing Center (TACC)
Sometimes, encouraging engagement is simply a question of letting students and professors use TeraGrid resources to enhance existing computational education. For the past three years, the TeraGrid has offered an education allocation to allow teachers to teach and students to learn using some of the most powerful computing systems on the planet.

The Freshman Research Initiative (FRI) at The University of Texas at Austin is considered a national leader in engaging undergraduates in scientific research. Of the 20 research tracks in FRI, three focus on the concepts of scientific computing through the lens of chemistry, biology, and physics.

These research experiences allow students to use the high-performance computing (HPC) resources at TACC, including Ranger and Lonestar 4, two of the top 30 systems in the world, for their coursework. During the past three years, students at The University of Texas at Austin have used more than 1.3 million CPU hours on these clusters to do original research in nanotechnology, materials science, and new energy solutions.

“The FRI program helps young students experience what it is to do research,” says Graeme Henkelman, professor of chemistry at The University of Texas at Austin. “And if they like it, it gives them the resources they need to excel.”

National Institute for Computational Sciences (NICS)
In East Tennessee and the surrounding area, NICS is working to add computational thinking to the toolbox of area educators. Teachers from the fourth grade through the undergraduate level have attended NICS-sponsored workshops. Besides NICS personnel, experts from the Shodor Foundation and area Master Teachers were brought in to share techniques, experiences, and tools to use in the classroom.

“The educators who attend these computational thinking workshops are under stress to cover all items on government-mandated tests, and so need tools that both provoke interest and curiosity, as well as mesh well with things the teachers are already doing,” says Jim Ferguson, NICS director of Education, Outreach & Training.

Indiana University
The staff at the Advanced Visualization Lab at Indiana University also had the needs of educators in mind as they worked with partners across the United States to produce several videos about the benefits of computational science and computer simulation. The videos employ stereoscopic 3D storytelling and feature both computer-generated and live-action imagery. Targeted toward students in grades 5-12 as well as the general public, the short videos can be used in classrooms, museums, or even viewed at home on 3D TV.

The first two videos have been shown to thousands of viewers at numerous locations and events throughout the United States. A third video is scheduled to be released in late 2011.

By continuing these successful education, outreach, and training initiatives and adding new ones, XSEDE will inspire the scientists of tomorrow and help keep the pipeline of students and teachers flowing into the professions of the future.

For more information: https://www.xsede.org/education-and-outreach


Page 30

Champions Help Campuses Connect

Swarthmore exemplifies how dedicated champions broaden TeraGrid/XSEDE reach

For the past several years of the TeraGrid program and now as XSEDE gets under way, more than 100 Campus Champions from 43 states have helped researchers, educators, and students take advantage of the national cyberinfrastructure.

The volunteer Champions fill diverse roles on their home campuses: faculty, information technology administrators, project managers, instructional designers, and high-performance computing specialists. Thirty-three of the Champions are from EPSCoR (NSF’s Experimental Program to Stimulate Competitive Research) jurisdictions and seven from minority-serving institutions.

Champions are sources of information about high-performance computing generally and XSEDE resources and services specifically, not just at their own campuses but also regionally and nationally. Champions can also help researchers and educators on their campuses get start-up allocations so they can quickly begin using XSEDE resources.


Page 31

For example, Swarthmore College Champion Andrew Ruether created accounts for all students in professor Tia Newhall’s course on Distributed and Parallel Computing, enabling them to use TeraGrid supercomputers for class projects. Two of the students continued their project after the course concluded, developing and testing a novel parallelization technique for solving the K-Nearest Neighbor problem. The algorithm can be applied to tasks such as discovering medical images that contain tumors from a large medical database, recognizing fingerprints from a national fingerprint database, and finding certain types of astronomical objects in a large astronomical database. Having access to TeraGrid resources allowed the students to run the large-scale experiments necessary to demonstrate that their solution worked well for real-world problems. Their work was presented at the 2010 TeraGrid conference.
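The students' specific technique isn't described here, but the K-Nearest Neighbor problem does lend itself naturally to parallel treatment: split the reference data across workers, let each worker find its local k best matches, and merge the partial results. The Python sketch below illustrates only that generic pattern; the distance measure and data are hypothetical, and this is not the Swarthmore algorithm.

    import heapq
    from concurrent.futures import ProcessPoolExecutor

    def local_knn(args):
        query, chunk, k = args
        # Each worker scans only its slice of the reference data and
        # returns its k closest points (squared Euclidean distance).
        dists = [(sum((q - x) ** 2 for q, x in zip(query, p)), p) for p in chunk]
        return heapq.nsmallest(k, dists)

    def parallel_knn(query, data, k=3, workers=4):
        # Split the reference set into roughly equal chunks, one per worker.
        chunks = [data[i::workers] for i in range(workers)]
        with ProcessPoolExecutor(max_workers=workers) as pool:
            partial = pool.map(local_knn, [(query, c, k) for c in chunks])
        # Merge the per-worker results and keep the global k best.
        return heapq.nsmallest(k, (pair for result in partial for pair in result))

    if __name__ == "__main__":
        points = [(float(i % 37), float(i % 11)) for i in range(1000)]
        print(parallel_knn((3.2, 4.1), points))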

Ruether has also helped other Swarthmore researchers and students successfully use TeraGrid resources.

“I would not have been able to do this work without him,” says chemistry professor Paul Rablen, who used TeraGrid’s Cobalt system at the National Center for Supercomputing Applications to investigate rearrangements in highly reactive carbon molecules that are widely used as catalysts; this work was published by the Journal of Organic Chemistry in February 2011.

Michael Brown, physics, is working on ways to design fusion reactors and uses the TeraGrid to model the behavior of the plasma in a magnetic containment field. One of Brown’s students, Swarthmore senior Dan Dandurand, used TeraGrid to calculate the complex orbits of more than a billion energetic protons, calculations that helped shed light on magnetic confinement fusion. In another set of calculations, Dandurand determined the fraction of energetic protons collected by a simulated probe in the plasma. These calculations helped to calibrate and understand a probe used in experiments. Brown and Dandurand’s research was published in the Review of Scientific Instruments in 2011.

“Andrew has been a big help in the initial setup and then answering follow-up questions from students,” Brown says, adding that he and his students plan to continue their research with the help of XSEDE.

Beyond the research successes they facilitate, Champions are also a valuable source of feedback, helping the XSEDE leadership understand what resources are needed and what challenges need to be overcome at the campus level. “XSEDE’s vision is to enhance the productivity of scientists and engineers by providing them with new and innovative capabilities,” says XSEDE leader John Towns. “Campus Champions provide important local and regional feedback and are therefore an essential part of the development process.”

XSEDE will provide increased professional development opportunities to the Campus Champion community through a new Fellows Program. Fellows will work with XSEDE Extended Collaborative Support Services staff on real-world science and engineering projects. The expertise Fellows gain through these collaborative projects can then be shared with their peers, students, and others.

For more about the Campus Champions program, see https://www.xsede.org/campus-champions or contact Campus Champions coordinator Kay Hunt ([email protected]).

Campus Bridging

The XSEDE Campus Bridging effort will work with campuses to assist them in adopting and making effective use of the XSEDE system architecture, which will make many tasks much easier for researchers. XSEDE personnel will work with campus personnel to provide awareness, advice, training, and assistance with the installation of appropriate XSEDE architecture components to support their local research community.

As new XSEDE capabilities and tools are made available, the Campus Bridging effort will work with pilot sites to deploy, test, and refine the architecture to best benefit the community. Large-scale deployment will follow once the tools are proven to be effective and reliable.

The Campus Bridging effort will seek the cooperation of campuses to identify local Campus Champions who are committed to working with campus researchers to raise awareness of and use of the new tools and capabilities.


Page 32

Compelling, Ferocious Beauty

This early parameter study shows an exploration of Volker Bromm’s supernova data with added density and detail layers. After further visual development, final high-resolution layers were provided to the film’s digital effects team for further processing. Courtesy: Advanced Visualization Laboratory, National Center for Supercomputing Applications.


Cosmic simulations and visualization skill contribute to acclaimed film ‘The Tree of Life’

Page 33

Filmmaker Terrence Malick is often praised for the beauty of his films and their resonance with nature. Roger Ebert talks about his “painterly images.” For Janet Maslin, it’s his “visual genius” and his “intoxication with natural beauty.” In Malick’s new movie, “The Tree of Life,” some of that natural beauty came from an unlikely source—TeraGrid’s Ranger and Abe supercomputers and the visualization expertise of the National Center for Supercomputing Applications (NCSA).

NCSA’s Advanced Visualization Laboratory (AVL) has enlivened documentary television and IMAX movies for years. But “The Tree of Life” marks the center’s first work on a feature film. The movie won the Palme d’Or at the Cannes International Film Festival in May 2011.

“Cosmic events are powerful visual metaphors for the human condition, and we wanted to combine accurate science with artistic sensitivity,” says Donna Cox, who leads NCSA’s Advanced Visualization Laboratory.

For “The Tree of Life,” the AVL team collaborated with the filmmakers to create two animated visualizations that are based on scientific data, some of which was created by the University of Texas’ Volker Bromm using Ranger at the Texas Advanced Computing Center.

The work brings “heart and soul to the scientific visualizations. We collaborated closely over many months to design the shots in question, deeply respecting the underlying science while shaping it into emotional imagery,” says Dan Glass, visual effects supervisor on “The Tree of Life.”

One visualization shows an awe-inspiring flight through a highly detailed galaxy model created at NCSA. The other highlights Bromm’s work, showing how the very first stars appeared, illuminating the previously dark universe.

An early version of the NCSA/University of Illinois Milky Way galaxy model. NCSA worked directly with the film’s digital effects team to customize the flight and visual settings for the shot in “The Tree of Life.” Courtesy: Advanced Visualization Laboratory, National Center for Supercomputing Applications.

Volker Bromm, The University of Texas at Austin

“The emergence of the first stars rapidly transformed the universe from a featureless, cold, and barren place to one teeming with complexity,” explains Bromm. “At the end of their brief life, the stars exploded as hyper-energetic supernovae. These primordial supernova explosions seeded the universe with the first heavy chemical elements, such as carbon, oxygen, silicon, and iron, thus setting the path toward planets, and, ultimately, beings like us.”

Bromm simulated these events on the Ranger supercomputer over the course of 42 days. The calculation would have taken 114 years on a laptop. The results of the simulation were not only featured in a Palme d’Or-winning film, they were also published in the Astrophysical Journal in 2010.
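Taken at face value, those two figures imply a speedup of roughly three orders of magnitude (treating a year as 365 days):

\[
\frac{114\ \text{years}}{42\ \text{days}} \approx \frac{114 \times 365}{42} \approx \frac{41{,}610}{42} \approx 990
\]

In other words, Ranger finished in six weeks what the cited laptop estimate puts at about a thousand times longer.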

NCSA took that scientific data and turned it into visualizations for “The Tree of Life.” Stuart Levy processed the simulation results and extracted features that they chose to make visually prominent. Alex Betts developed custom capabilities for using scientific data to visualize cosmic gas and dust with great realism. Bob Patterson orchestrated the camera move and the design of the visualization.

They worked on an in-house cluster with about 200 processors. When they needed more power, they used NCSA’s Abe supercomputer.

“Scientific visualization helps to infuse ‘The Tree of Life’ with an authenticity that goes beyond any other movie,” says NCSA’s Cox. “We’ve employed the most advanced supercomputing, networking, and visualization technologies to bring some of the compelling, ferocious beauty of the universe to the big screen.”

For more information: avl.ncsa.illinois.edu

Grant #: NSF AST-0708795, NSF AST-1009928, NASA NNX 08-AL43G, NASA NNX 09-AJ33G


Page 34

Collaboration with Southeastern Universities Research Association provides visualization workshops to minority-serving institutions


Taking Training on the Road

Through a series of workshops at SURA institutions, more than 200 faculty researchers and students learned the basics of scientific visualization. Courtesy: Texas Advanced Computing Center

Page 35

Not every college and university in the United States has its own computational cluster. This, in part, was the impetus for TeraGrid: to offer advanced computing resources to researchers all across the country, no matter how large or small their institution.

The annual TeraGrid conference, the Campus Champions program, and the education, outreach, and training programs developed by TeraGrid resource providers are just a few ways that TeraGrid brought scientists and students into the fold.

However, sometimes these programs are not enough to get researchers involved. Sometimes, one has to bring resources—human and technological—to researchers on their home turf.

This was the idea behind a unique collaboration between TeraGrid and the Southeastern Universities Research Association (SURA), a consortium of more than 60 universities working to advance and exploit the transformative nature of information technology on the regional, national, and international fronts.

The Southeast region is home to nearly all of the nation’s historically black colleges and universities (HBCUs) and a large percentage of its minority-serving institutions (MSIs). The individuals who work and study at these institutions are sometimes underserved by the national science community, so the Texas Advanced Computing Center (TACC) and TeraGrid reached out to these schools to share knowledge and recruit new users.

Over the course of the last year, as part of an Extreme Digital (XD) Visualization grant from the National Science Foundation (NSF), TACC and SURA presented training sessions at SURA member institutions, including Norfolk State University (an HBCU), the University of Central Florida, the University of Miami, and the SURA offices in Washington, D.C. In addition to attendees from the host institutions, faculty-student research teams from Florida A&M University, Howard University, and Morgan State University attended the workshops.

“The bulk of the people who come to the workshops have had zero or very nominal exposure to TeraGrid,” notes Linda Akli, program manager of IT Initiatives for SURA, and the organizer of the workshops.

Through these workshops, and others at The University of Texas, TACC staff members taught more than 200 faculty researchers and students the basics of scientific visualization, a process of transforming data into images and animations that can be interpreted to derive insights.

“Visualization is a very important tool for science and engineering, but a lot of scientists and engineers have no visualization background, or are not even aware of what is possible with visualization,” says Dan Majchrzak, director of research computing at the University of South Florida. “The trainings informed the community at large about what’s available in TeraGrid and how it’s used.”

Participants learned how to use some of the most common scientific visualization software (including ParaView and VisIt), how to get an allocation on TeraGrid, and how to access the Longhorn Visualization Portal, a website that lets scientists visualize massive datasets from the comfort of their offices.

“We’re teaching them to use the tools and apply them to their research,” adds Greg P. Johnson, a visualization expert at TACC and one of the workshop trainers. “Attendees were able to get visual results and interact with their data to form new insights.”

The workshops also dovetail with the emerging ability to perform visualization remotely, an increasingly important capability supported by the NSF.

“For classes, for teaching, and for giving talks, you want to go to a place where you have the 3D, tiled display, and high-end visualization hardware,” Majchrzak says. “But researchers also want to be able to dump in their data, get the visualization out, and analyze it at their desk.”

In-person training paired with one-on-one consulting and remote access to powerful resources made it easy for new users to begin to take advantage of the computing tools available through TeraGrid. The leadership behind XSEDE expects this access to be even easier in the new program.

“If we’re not bringing in more folks from diverse and underrepresented communities, we’re not going to have much of a scientific and technological workforce,” says SURA’s Akli. “There’s an untapped pool of talent and it’s critical to get them engaged if we’re going to continue to have leadership in innovation and technology.”


Page 36


Being ‘Smart’ at Home
TeraGrid storage and visualization resources aid ‘smart grid’ research

Power generation accounts for 40 percent of the U.S. carbon footprint. As global energy prices continue to rise, energy efficiency is an increasingly important priority. Now, more than ever, there is significant momentum from both the general public and government to develop “smart grids”—an ever-widening array of utility applications that enhance and automate the monitoring and control of electrical distribution for greater efficiency.

Two images of the Mueller project whole-house energy usage data, aggregated by census block. The shade and height of the blocks correspond to the energy usage of the houses in each census block. The two images show the dramatic increase in energy use between the morning and afternoon, primarily from A/C use during the heat of the day. These tools allow the researchers at Pecan Street Project (PSP) to compare aggregate use across the homes in the project, which will become even more useful in Phase 2, when the project incorporates 1,000 homes across Austin. Visualizations such as these will allow PSP to easily correlate energy use with locations across the city. Courtesy: Adam Kubach and Paul A. Navrátil, Texas Advanced Computing Center, The University of Texas at Austin

Page 37

According to Michael Webber, associate director of the Center for International Energy and Environmental Policy at The University of Texas at Austin, utilities and energy companies are expected to spend $1 trillion to $2 trillion over the next few decades to build, update, and upgrade their grids nationwide. At the same time, energy consumers are expected to spend tens of billions of dollars on energy-related appliances in the home.

“Before smart grid advocates and companies ask customers to invest in new products and services, we all need a better understanding of what they want, what they’ll use, and what they’ll get excited about,” says Brewster McCracken, executive director of Pecan Street Project, an organization focused on developing and testing new technologies and business models for advanced energy management systems.

Hence, the creation of the Mueller Smart Grid Demonstration Project, a comprehensive energy consumer research study in Austin, Texas.

The Mueller smart grid project is generating complex and large datasets that require powerful supercomputers to capture, integrate, and verify the information, and to make sure that it is properly synchronized and analyzed.

Enter the Texas Advanced Computing Center (TACC), a TeraGrid resource provider in Austin.

“TACC has some of the world’s fastest computers, so we’re confident they can do any kind of crunching, rendering, or data manipulation,” notes Bert Haskell, technology director for the Mueller smart grid project. “They have the technical expertise to look at different database structures and know how to organize the data so it’s more efficiently managed. We’re very excited to work with TACC to come up with new paradigms on how to intuitively portray what’s going on with the grid and energy systems.”

With sensor installations in place at 100 homes, new data is generated every 15 seconds showing precisely how much energy individual circuits are using. In response, TACC developed a special data transfer format to pull all of the data into a database on the Corral storage system. To date, the database contains approximately 600 million individual power readings and continues to grow.

“We’re trying to create very rich resources for people to use in analyzing patterns of energy usage,” says Chris Jordan, a member of TACC’s Advanced Computing Systems group. “Over time, as the resources grow and become more varied, we expect whole new forms of research to be conducted. We’re really interested to see what people can do with it, such as how the data stream can transfer itself into a decision-making device for city planners and individual consumers.”
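As a rough illustration of the kind of roll-up behind the census-block images above, the sketch below loads a few hypothetical 15-second circuit readings into a local SQLite table and aggregates whole-house usage by census block. The schema, table name, and values are invented for illustration; they are not TACC's actual Corral database or transfer format.

    import sqlite3

    # Hypothetical schema: one row per 15-second reading from one circuit.
    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE readings (
            home_id      INTEGER,
            census_block TEXT,
            circuit      TEXT,
            ts           TEXT,   -- ISO-8601 timestamp of the sample
            watts        REAL
        )
    """)
    conn.executemany(
        "INSERT INTO readings VALUES (?, ?, ?, ?, ?)",
        [
            (1, "4530012", "HVAC",    "2011-08-01T15:00:00", 3200.0),
            (1, "4530012", "kitchen", "2011-08-01T15:00:00",  450.0),
            (2, "4530013", "HVAC",    "2011-08-01T15:00:00", 2800.0),
        ],
    )

    # Aggregate whole-house usage by census block for an afternoon window,
    # the kind of roll-up shown in the Mueller visualizations.
    rows = conn.execute("""
        SELECT census_block,
               SUM(watts) / COUNT(DISTINCT home_id) AS avg_watts_per_home
        FROM readings
        WHERE ts BETWEEN '2011-08-01T15:00:00' AND '2011-08-01T16:00:00'
        GROUP BY census_block
    """).fetchall()
    print(rows)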

Members of the Mueller Smart Grid project (left to right), and Paul Navratil (far right) from TACC’s Data and Information Analysis team.

One of the weaknesses in smart grid systems is the way they visualize data, which is often not intuitive. Since TACC is a leader in providing visualization resources and services to the national science community, they were a perfect partner to remedy this problem.

Paul Navratil, research associate and manager of TACC’s Visualization Software group, says: “We’ve used our visualization expertise to translate the immense volume of data into images that convey insights to the researchers and their industry partners.”

Navratil notes that it is a massive data mining problem, but this is something the experts at TACC and all supercomputer centers work with on a daily basis. Overall, the Mueller smart grid demonstration project is trying to understand how energy management systems can be integrated into our lifestyle. Observes Haskell: “That’s what we want to figure out—how that future automated home environment will interface to the smart grid to provide the peak energy demand characteristics that the utility needs to run their network without creating a burden on the customer.”

For more information: www.pecanstreetproject.org


Page 38

XSEDE Leadership

John Towns, project director
NATIONAL CENTER FOR SUPERCOMPUTING APPLICATIONS
University of Illinois at Urbana-Champaign

Tim Cockerill, associate project director
NATIONAL CENTER FOR SUPERCOMPUTING APPLICATIONS
University of Illinois at Urbana-Champaign

John Boisseau, director of user services
TEXAS ADVANCED COMPUTING CENTER
The University of Texas at Austin

Chris Hempel, deputy director, user services
TEXAS ADVANCED COMPUTING CENTER
The University of Texas at Austin

Nancy Wilkins-Diehr, director of Extended Collaborative Support Services-communities
SAN DIEGO SUPERCOMPUTER CENTER
University of California, San Diego

Ralph Roskies, director of Extended Collaborative Support Services-projects
PITTSBURGH SUPERCOMPUTING CENTER
Carnegie Mellon University/University of Pittsburgh

Sergiu Sanielevici, deputy director, Extended Collaborative Support Services-projects
PITTSBURGH SUPERCOMPUTING CENTER
Carnegie Mellon University/University of Pittsburgh

Victor Hazlewood, interim director of operations
NATIONAL INSTITUTE FOR COMPUTATIONAL SCIENCES
University of Tennessee Knoxville/Oak Ridge National Laboratory

Kathlyn Boudwin, manager, project management and reporting
Oak Ridge National Laboratory

Ian Foster, architect, architecture and design
ARGONNE NATIONAL LABORATORY

Andrew Grimshaw, architect, architecture and design
UNIVERSITY OF VIRGINIA

Kurt Wallnau, manager, software development and integration
SOFTWARE ENGINEERING INSTITUTE
Carnegie Mellon University

JP Navarro, deputy manager, software development and integration
ARGONNE NATIONAL LABORATORY

Janet Brown, manager, systems and software engineering
PITTSBURGH SUPERCOMPUTING CENTER
Carnegie Mellon University/University of Pittsburgh

Scott Lathrop, director, education and outreach
NATIONAL CENTER FOR SUPERCOMPUTING APPLICATIONS
University of Illinois at Urbana-Champaign

Steven Gordon, manager, education
OHIO SUPERCOMPUTER CENTER
The Ohio State University

Laura McGinnis, manager, outreach
PITTSBURGH SUPERCOMPUTING CENTER
Carnegie Mellon University/University of Pittsburgh

Dan Stanzione, manager, training
TEXAS ADVANCED COMPUTING CENTER
The University of Texas at Austin

Bill Bell, manager, external relations
NATIONAL CENTER FOR SUPERCOMPUTING APPLICATIONS
University of Illinois at Urbana-Champaign

Susan McKenna, communications coordinator, external relations
NATIONAL CENTER FOR SUPERCOMPUTING APPLICATIONS
University of Illinois at Urbana-Champaign

Carlton Bruett Design

Page 39

XSEDE Leadership

Page 40

TeraGrid XSEDE

xsede.org