Computationally Speaking
On the edge


TeraGrid’s Transition to XD (Message from John Towns)



QCD Stuff -- Paving the way to the LHC. Multiple projects.

Highlight: Morningstar? deGrand? Bartschat?

The story could be our relationship with OSG and projects that are paving the way to the LHC. One prominent story and two sidebars (OSG & second LHC story). QCD Ramping to Petascale. Highlight deGrand, and sidebar what our friends in Kentucky are doing --

Keh-Fei Liu, QCD research/EPSCoR story (good science, unique methodology).
Meifeng Lin, Yale collaboration with Japan related to QCD. Deemed promising by reviewers. (?)
Huey-Wen Lin (University of Washington), maybe a sidebar with others.
Thomas deGrand, University of Colorado, Boulder. New theories beyond the Standard Model to assist LHC detection; wrote the book (literally) on QCD with deTar.
Morningstar?

Testing Technicolor Physics (NICS/TACC. Submitted by Dubrow)

Researchers at the University of Colorado and Tel Aviv University are studying Technicolor particle theories beyond the Standard Model, in which the Higgs boson is not necessarily a fundamental particle. Using TeraGrid supercomputers, DeGrand and his colleagues have simulated new particles made up of quarks with two colors and three colors, respectively, to examine alternative systems of matter.

Phenomenologists have been thinking about Technicolor models for the last 30 years, but it was only five years ago that scientists realized that many of the techniques invented for quantum chromodynamics simulations could be applied to these new theories as well. The simulations revealed that the simplest Technicolor models, with two colors, have properties that are very different from a conventional particle system (described as "unparticles") and behave much like a liquid-gas mixture at its critical point. The three-color system was even more mysterious. The results of the researchers' alternative mapping could have important implications for the search for the Higgs boson at the Large Hadron Collider.

256. Thomas DeGrand, University of Colorado
Physics
Running coupling and anomalous dimension of SU(4) gauge theory with decuplet fermions
TACC-RANGER 4,800,000 2011-01-01 to 2011-12-31
NICS-KRAKEN 5,000,000


Klaus Bartschat, Drake University, & colleagues (PSC-Schneider)

The interaction of atoms & molecules with short, intense laser fields, with charged particles (scattering), and with neutrons. This work has led to several publications, including a 2010 article in Science. Includes computing at NICS & TACC. Broad collaboration. Multinational, multidisciplinary.

The fundamental scientific driver of this collaboration is the study of the interaction of atoms and molecules with short, intense laser fields and with charged particles as well as neutrons (scattering processes). As noted in the cover article for a recent special issue of Journal of Physics B, "We are [currently] witnessing a revolution in photon science, driven by the vision to time resolve ultra-fast electronic motion in atoms, molecules, and solids..." Indeed, the ongoing development of intense short-pulse light sources based on high-harmonic generation and free-electron lasers is providing new ways to generate optical pulses that are capable of probing dynamical processes on attosecond (10^{-18} s) time scales. These capabilities promise a revolution in our microscopic knowledge and understanding of matter. The unifying aspect of all the work is the application of the most sophisticated computational tools available to solve the time-dependent Schroedinger equation for the interaction of electromagnetic fields with atoms and molecules, as well as the calculation of atomic continuum processes (excitation and ionization by photon, electron, and heavy-particle impact) in complex atoms. We are striving for high accuracy in order to eliminate any issues associated with the size of the basis sets used and/or propagation times. Most of these studies are carried out in full dimensionality (in the absence of symmetry, three physical dimensions for each particle, plus time) and require large spatial grids and long propagation times to remove uncertainties associated with the extraction of physically relevant quantities. Numerical results for quantities such as energy and angular distributions in double ionization processes are often so sensitive that there are even qualitative differences in the predictions from converged and unconverged calculations. Access to the computational instruments provided by the TeraGrid is essential to achieving these goals.
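For reference (a standard textbook statement, not text from the proposal), the central equation these codes integrate numerically is the time-dependent Schroedinger equation,

\[
i\hbar\,\frac{\partial}{\partial t}\,\Psi(\mathbf{r}_1,\ldots,\mathbf{r}_N,t) = \hat{H}(t)\,\Psi(\mathbf{r}_1,\ldots,\mathbf{r}_N,t),
\]

where the time dependence of the Hamiltonian carries the coupling to the laser field; the "full dimensionality" mentioned above refers to the three spatial coordinates of each of the N electrons plus time.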

70. Klaus Bartschat, Drake University
Physics
Computational Studies of the Interaction of Time-Dependent Electromagnetic Fields and Charged Particles with Atoms and Molecules
TACC-RANGER 2,900,000 2011-01-01 to 2011-12-31
NICS-KRAKEN 10,700,000
TACC-LONESTAR-WESTMERE 2,500,000

The spectrum of excited states in QCD using the Monte Carlo method on anisotropic lattices

Colin Morningstar. Petascale ramp-up clearly for Blue Waters. Should not give full Kraken allocation; physics analysis for existing lattices; good physics, but 32M Ranger, 0 on Kraken. And GPU stuff.

Co-PI: Robert G. Edwards
Co-PI: Balint Joo
Co-PI: Keisuke Juge
Co-PI: John Bulava
Co-PI: Justin Foley

PI Institution: Carnegie Mellon University

Request Number: MCA07S017

Request Type: Renewal

Abstract of Submission:
In this renewal, we request an allocation of 32 million service units on Ranger and 10 million service units on Kraken to carry out computations of the low-lying baryon and meson resonance spectrum using group-theoretically constructed operators and pion masses down to about 220 MeV with ab initio Markov-chain Monte Carlo path integrations on anisotropic space-time lattices. The majority of the excited states to be studied have never before been investigated in lattice QCD. These calculations are important to many experimental efforts, but especially to those at Jefferson Laboratory. Exotic hybrid mesons, a hypothesized new form of excited matter, will be studied, and the photocouplings between some of these meson resonances will eventually be determined. The first computations of the hadron mass spectrum to include multi-hadron operators will be carried out using a new method of stochastically estimating quark propagation. We request resources to compute the quark sinks needed for configurations previously generated on a 32^3 x 256 lattice, to complete the hadron sources/sinks for previously computed quark sinks on 24^3 x 128 lattices, to start the hadron sources/sinks on the larger 32^3 x 256 lattice, and to begin a new Markov chain on a 40^3 x 256 lattice.
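For a rough sense of scale of the lattices named above (a back-of-the-envelope count of space-time sites, not a figure from the proposal):

\[
24^3 \times 128 \approx 1.8\times10^{6},\qquad
32^3 \times 256 \approx 8.4\times10^{6},\qquad
40^3 \times 256 \approx 1.6\times10^{7},
\]

so the new 40^3 x 256 Markov chain involves roughly twice as many sites per configuration as the 32^3 x 256 ensemble.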


Requested Resources:

NICS Cray XT5 (Kraken): 10,000,000
TACC Sun Constellation Cluster (Ranger): 32,000,000


GPUs versus CPUs: Apples and Oranges?

(Zverina for TeraGrid)

The news late last year that China's GPU-rich Tianhe-1A supercomputer was ranked the fastest system in the world focused attention, and a lot of discussion, within the HPC community on the advantages of graphics processing units (GPUs) versus the central processing units (CPUs) used in many systems.

While GPUs have for several years primarily been used as fast video game engines to process 3D functions, simulate movement, and handle other mathematically intensive operations that might otherwise strain some CPUs, the newest GPUs are capable of more than making whiz-bang images or movies. Peter Varhol, a contributing editor for the online magazine Desktop Engineering (DE), says GPUs are now capable of performing high-end computations such as those used in engineering applications, in some instances as much as 20 times faster.

We might want to highlight this interesting project by U of I PI Bernstein. Ask Kent how the external review went (pending after TRAC).

Daniel J. Bernstein

PI Institution: University of Illinois, Chicago

Request Number: DMS100011

Request Type: Renewal

Request Title: Cryptanalysis on GPUs

Abstract of Submission:

The "Conficker/Downadup" worm has infected several million PCs around the Internet and linked them into a centrally controlled "botnet." Can these PCs break the cryptographic systems commonly used to protect Internet communication, such as 1024-bit RSA? In particular, these PCs contain many GPUs; how much of a threat is posed by the combined computational power of these GPUs?

Requested Resources:

NCSA Dell with NVIDIA Tesla S1070 GPU Cluster (Lincoln): 8,000,000
TACC Dell/NVIDIA Visualization and Data Analysis Cluster (Longhorn): 8,000,000


Andrei Hutanu, Louisiana State University (NCSA-Bell, creative use, multinational distributed collaboration, EPSCoR institution, extreme scalability)

Scientific research is increasingly dependent on simulation and analysis requiring high-performance computers, distributed large-scale data, and high-speed networks. Andrei Hutanu of LSU leads a team addressing fundamental issues in distributed visualization design and implementation, where network services represent a first-class resource. They built a distributed visualization system, eaviv, and a distributed visualization application optimized for large data, high-speed networks, and interaction. eaviv connects Spider at LSU's Center for Computation & Technology (CCT), NCSA's Lincoln and accelerator clusters, and a cluster at the Laboratory of Advanced Networking Technologies (SITOLA) at Masaryk University in the Czech Republic. The Internet2 interoperable On-demand Network (ION) provides wide-area connectivity to support dynamic network circuit services, and NCSA's mass storage system provides permanent data storage. Applications request and reserve point-to-point circuits between sites as needed using automated control software.

eaviv supports distributed collaboration, letting multiple users in physically distributed locations interact with the same visualization to communicate their ideas and explore the datasets cooperatively. The team implemented a ray-casting parallel volume renderer called Pcaster as the rendering component to demonstrate the distributed pipeline's workflow. Compared to parallel volume renderers in existing software, Pcaster is a purely GPU-based volume renderer and image compositor supporting high-resolution rendering. Pcaster asynchronously couples with parallel data servers for network-streamed data input. The team tested Pcaster with data sets of up to 64 gigabytes per timestep and achieved interactive frame rates of five to 10 frames per second on Lincoln, producing rendered images at 1,024 x 1,024 resolution. The eaviv system's configurable architecture also allows it to run with a local data server and a single renderer, making it a desktop tool for small-scale local data.

This research was published in 2010 in Scalable Computing: Practice and Experience and Computing in Science and Engineering, as well as presented at the TeraGrid '10 conference.

514. Andrei Hutanu, Louisiana State University
Computer and Computation Research
eaviv - Distributed visualization
ABE-QUEENBEE-STEELE 1,000 2010-10-16 to 2011-06-30
NCSA-LINCOLN 29,500 2010-07-01 to 2011-06-30
NCSA-TAPE-STORAGE 5
TACC-LONGHORN 10,000
NICS-NAUTILUS 10,000


TeraGrid's Common User Environment (PSC Schneider/Leake)

TeraGrid is known for its diverse suite of computational resources that serve a geographically distributed, multidisciplinary community of users, each with unique research application requirements, methods, and workflows. Navigating multiple systems with disparate environments used to be a challenge, but not anymore. TeraGrid's Common User Environment (CUE) simplifies access by identifying commonalities across systems and eliminating many of the differences.

Data-Intensive Research

TeraGrid is for the Birds

(Data-intensive, multidisciplinary collaboration, leverages multiple projects. Sidebar about Trestles? DataONE? Citizen Science?)

(NICS--Jones)

PI: Daniel Fink, Cornell Lab of Ornithology

Statistical Modeling of Avian Distributional Dynamics on the TeraGrid: Support for The Phenological Atlas of North American Birds

More than ever, research to identify the environmental drivers that shape species distributions is needed to understand the impact humans have on Earth's natural systems and to develop science-based management policies. However, obtaining this knowledge can be one of the most difficult tasks in ecology for the following reasons: 1) species distributions vary dramatically through time and space; 2) ecologically relevant data are either not collected across sufficiently large spatial or temporal scales, or are heterogeneous and widely scattered; and 3) conventional analytical methods have not been effective for facilitating spatiotemporal pattern discovery with such sparse, noisy data and highly variable ecological signals.

The goal of our research program is to advance data-intensive ecology to meet these challenges and improve our understanding of the dynamics of continent-scale bird migrations.

We have assembled a data warehouse that associates the bird observation data collected by the citizen science project eBird (http://www.ebird.org) with local-scale environmental covariates such as climate, habitat, and vegetation phenology. eBird is unique among broad-scale bird monitoring projects because it collects data year-round. By associating environmental inputs with observed patterns of bird occurrence, predictive models provide a convenient statistical framework to harness available data for predicting species distributions and making inferences about predictor effects. For the eBird data we employed a novel multiscale statistical model designed to automatically adapt to both large-scale patterns of movement and local-scale habitat associations. As part of the scientific Exploration, Visualization, and Analysis (EVA) working group of the Data Observation Network for Earth (DataONE) (https://dataone.org/), an NSF DataNet project, we have begun work scaling our analysis as an exemplar project of data-intensive analysis and visualization.




Bioinformatics and the data challenge

Building the Gene Sequencing Pipeline

(TACC-Dubrow)

Scientists continue to make advances in gene sequencing, moving us closer to the possibility of the $1,000 genome and the potential for genome-wide analyses to understand disease. But the dirty little secret is that gene sequencing produces copious amounts of data that can scarcely be interpreted. Many leading scientists are integrating supercomputing into their workflows to perform rapid data analysis, processing, and visualization. With these 21st-century tools at their disposal, scientists are making breakthroughs in the field and developing the pipeline for the widespread analysis of bioinformatics data.

One such discovery was made by a team led by Vishy Iyer, a molecular biologist at The University of Texas, using the TeraGrid systems at the Texas Advanced Computing Center. His group paired next-generation sequencers with Ranger to study transcription proteins, which bind to DNA and act as dials for protein production. Their proof-of-principle study, published in Science in April 2010, determined that distinctions in transcription binding can be studied using their method, and that transcription factor binding appears to be a heritable trait.


Breakthroughs in Astronomical Science

Toward Petascale Cosmology with GADGET

(NICS-Ferguson. Female PI)

PI(s): Tiziana Di Matteo, Carnegie Mellon University

RP(s): NICS, PSC

Discipline: Physics

This project, led by Tiziana Di Matteo of Carnegie Mellon University, aims to carry out a detailed numerical investigation of the coupled formation and evolution of black holes and galaxies using state-of-the-art cosmological hydrodynamic simulations of structure formation in the Lambda-Cold Dark Matter model. The team has developed a new petascale-optimized version of the gravity/hydrodynamics code p-GADGET. With Kraken the team plans to use p-GADGET to study how the vast range of quasars and galaxies formed. The team intends to run the first hydrodynamical simulation that can answer this, using well-tested algorithms for modeling star formation and black hole formation, accretion, and associated feedback processes.

272. Tiziana Di Matteo, Carnegie Mellon University
Astronomical Sciences
Petascale Cosmology with Gadget - Request for Storage Space on PSC's Lustre WAN
PSC-ALBEDO 50 2011-03-15 to 2011-06-30
PSC-BLACKLIGHT 215,032 2011-03-03 to 2011-06-30
NICS-NAUTILUS 10,000 2010-07-01 to 2011-06-30
NICS-KRAKEN 23,796,400

Preparing to Swim in Blue Waters: Cosmic Re-Ionization at Full Scale on NICS Kraken

(NICS-Ferguson)

PI(s): Robert Harkness, University of California at San Diego

RP(s): NICS, PSC, NCSA, TACC

Discipline: Astronomical Sciences

From 150 million to 1 billion years after the Big Bang, reionization of neutral elements occurred because of light from newly formed stars. Reionization changed the state of all the matter in the universe, making the opaque cosmos clear and observable. Using the hydrodynamic cosmology code Enzo, a team led by Robert Harkness of the University of California at San Diego is simulating the formation of early galaxies with a grid of 6,400^3 cells and 6,400^3 dark matter particles. Such large grids and particle counts are required both to resolve individual galaxies and to survey a representative volume of the universe. This simulation will be the first to replicate a large enough volume of the universe to form galaxies across a sufficiently wide range of masses and luminosities to give a proper description of the reionization process. Studying the reionization process can elucidate a great deal about structure formation in the universe.
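For scale (simple arithmetic on the grid size quoted above, not an additional figure from the project):

\[
6{,}400^3 \approx 2.6\times10^{11}\ \text{cells, with an equal number of dark matter particles.}
\]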

460. Robert Harkness, University of California-San Diego
Astronomical Sciences
Cosmic Re-Ionization
NICS-KRAKEN 3,000,000 2010-12-21 to 2011-12-21

In Search of Bulgeless Dwarfs (PSC-Schneider)

(Leake's Note: Hey Mike -- Quinn is listed as PI on an allocation request for an even broader range of resources (follows) for a second project, "Understanding Giant Planets/Oort Clouds," with Governato as co-PI. The Giant/Dwarf comparison might be interesting.)

Astrophysicist Fabio Governato of the University of Washington, in collaboration with Tom Quinn (U. Washington), James Wadsley (McMaster U.), and Joachim Stadel (University of Zurich), solved a major problem with the Cold Dark Matter model of how galaxies form using TeraGrid computing at PSC, NICS, and TACC (more than a million hours on various systems). Their paper in Nature (Jan. 2010) reported simulations (using GASOLINE) that agree with the observed structure of "dwarf galaxies" -- a result not achieved with prior simulations and one that overcomes a major problem of the CDM model. Dwarf galaxies from the CDM model for the first time agree with both the dark matter density properties and the luminous matter (stars) of observed dwarf galaxies. Subsequent work further confirms this important finding.

Thomas R. Quinn

Understanding Giant Planets/Oort Clouds

Co-PI: Fabio Governato

PI Institution: University of Washington

Request Number: MCA94P018

Request Type: Renewal

Request Title: N-body simulations: Planets to Cosmology

Abstract of Submission:

Our group continues to address several problems that encompass an enormous range of scale, and yet are joined by a key commonality: the physics is dominated by the mutual gravitational interaction of a large number of bodies. At one end, we explore the clustering of galaxies on the largest scales we can observe; at the other, we investigate the dynamics of the small bodies in our own Solar System. The commonality of the physics between these distinct problems allows us to take advantage of a significant overlap in the effort required to solve them. This year we will 1) perform cosmological simulations of galaxy formation to understand the role that the formation and destruction of molecular gas has on star formation and hence the observed structure of galactic disks, 2) perform isolated simulations of disk growth to understand the detailed evolution of Milky Way type disks, 3) determine the overall star formation history of the Universe and the galaxy luminosity function by expanding our work on high-resolution galaxy formation to a uniform sample of the Universe, 4) model the inner regions of disks to match high-resolution kinematic data on Seyfert galaxies, and 5) perform simulations to gain a deeper understanding of the role of Giant Planets in the formation of the Oort cloud and the subsequent flux of comets through the inner and outer Solar System.


Requested Resources:

NCAR IBM Blue Gene (Frost): 100,000
NCSA Dell with NVIDIA Tesla S1070 GPU Cluster (Lincoln): 100,000
NCSA/LONI/Purdue Dell PowerEdge Linux Clusters (Abe/Queen Bee/Steele): 240,000
NICS Cray XT5 (Kraken): 16,000,000
NICS SGI/NVIDIA Visualization and Data Analysis System (Nautilus): 30,000
PSC SGI Altix (Pople): 50,000
Purdue Condor Pool: 750,000
TACC Sun Constellation Cluster (Ranger): 100,000


The Formation of Active Regions on the Sun

(NICS-Ferguson)

PI(s): Matthias Rempel, National Center for Atmospheric Research

RP(s): NICS, TACC

Discipline: Astronomical Sciences

A team led by Matthias Rempel of the National Center for Atmospheric Research is investigating the formation and evolution of sunspots. Sunspots are temporary phenomena on the surface of the Sun caused by intense magnetic activity in a region, appearing dark relative to other regions of the Sun. Magnetic fields emanating from the Sun can affect space weather and telecommunications on Earth. Though the exact process by which sunspots are formed is still debated, it is thought that sunspots are the visible counterparts of magnetic flux tubes (cylindrical regions in the Sun containing a magnetic field) in the Sun's convective zone that become coiled due to differential rotation. If the stress on the tubes reaches a certain limit, they curl up like a spring and puncture the Sun's surface, creating the visible sunspots. The project focuses on simulations of active region formation (including large domains and long time scales) and sunspot fine structure (high resolution for shorter time periods). Identifying the minimum resolution required to properly resolve the physics of sunspots is essential for future simulation of the Sun and can lead to a better understanding of how the Sun's magnetic fields affect life on Earth.

1010. Matthias Rempel, National Center for Atmospheric Research
Astronomical Sciences
The Formation of Active Regions on the Sun
NICS-NAUTILUS 13,000 2010-11-30 to 2011-06-30
NICS-KRAKEN 7,990,000 2010-01-01 to 2011-06-30

Simulating Solar Magnetic Fields and Their Effect on Communication Satellites

(NICS-Ferguson)

PI: Juri Toomre

Systems: NICS, TACC, PSC, NCSA, SDSC

Field: Astronomical Sciences

Understanding the Sun's inherent magnetism and its evolution will not only help us to understand the heartbeat and evolution of our solar system, but could also prove useful in the very future of mankind: the Sun's magnetic fields are thought to be largely responsible for space weather, the magnetic forces that wreak havoc on communications satellites.

Fittingly, a team led by Juri Toomre of the University of Colorado recently used Kraken to simulate a Sun-like star in the hopes of better understanding our own Sun. The team's findings revealed much about our most familiar stellar companion: for instance, that the Sun's magnetic fields originate in the convective zone rather than the previously theorized tachocline, and that simulated stars rotating more rapidly display stronger differential rotation, or different rates of rotation at the poles and equator, which produces global-scale magnetic fields, as are observed in the Sun. For the researchers this means that their model so far is in line with observation. Other salient features of these models include magnetic fields in the convection zone that persisted for an extended period of time, much like the Sun's fields, and the finding that these fields can undergo regular global-scale reversals.

1216. Juri Toomre, University of Colorado
Astronomical Sciences
Coupling of Turbulent Compressible Solar and Stellar Convection with Rotation, Shear and Magnetic Fields
TACC-LONGHORN 500 2010-07-01 to 2011-06-30
TACC-RANGER 6,782,500
TACC-SPUR 10,000
NICS-KRAKEN 7,434,700

The Magnificent Magnetosphere

(NICS-Ferguson)

PI: Homayoun Karimabadi

Systems: NICS, TACC

Field: Physics

Understanding the interaction of the solar wind, i.e., space weather, with the Earth's magnetosphere is critical not only from an intellectual standpoint but also to protect Earth's satellite infrastructure. Despite years of research, mankind's understanding of space weather is currently at a primitive level.

This interaction between the solar wind and the magnetosphere has been studied for more than fifty years, but much of the simulation work has relied on single-fluid magnetohydrodynamics (MHD), a useful tool for predicting global events and other features. However, Earth's magnetosphere is dominated by ion kinetic effects, which are effectively ignored in MHD simulations, and many other aspects of the magnetosphere, such as the transport and structure of boundaries, call out for more sophisticated models. A team led by the University of California, San Diego's Homayoun Karimabadi has begun this process on Kraken, using the H3DM and VPIC codes to reveal a never-before-seen picture of our foremost radiation shield's anatomy. The team has begun coupling the magnetosphere with the ionosphere; the successful development of a 3-D global, hybrid simulation of the magnetosphere with realistic coupling to the ionosphere has the potential to be transformative in our ability to model and predict how the solar wind couples to Earth's magnetosphere and ionosphere, furthering our understanding of space weather's impact on our planet.

588. Homayoun Karimabadi, University of California-San Diego
Astronomical Sciences
Kinetic Simulations of the Magnetosphere
NICS-KRAKEN 6,000,000 2010-12-21 to 2011-12-21
Enabling Breakthrough Kinetic Simulations of the Magnetosphere
ASTA 7 2010-12-24 to 2011-06-30
NICS-KRAKEN 24,961,600 2010-07-01 to 2011-06-30
TACC-LONGHORN 220,000
NICS-NAUTILUS 220,000


Intermittency in Interstellar Turbulence

(NICS-Ferguson)

PI(s): Alexei Kritsuk, University of California at San Diego

RP(s): NICS, TACC

Discipline: Astronomical Sciences

Star formation occurs due to the interaction of turbulence, gravity, and magnetism within the extremely dense, cold, gaseous environments of molecular clouds (MCs). However, the roles these physical processes play in initiating star formation are poorly understood. A team led by Alexei Kritsuk of the University of California at San Diego has taken this subject to task using Kraken. The team is interested in the physical processes that determine molecular cloud structure and the formation rate of stars. Using over 5 million CPU hours on Kraken since 2008, the team has conducted suites of simulations that elucidate how physical processes, both inside and outside MCs, interact to produce the wide array of stellar masses found in the cosmos.

What's with the air up there?

Quieting Plane Engines

Commercial airplanes generate a lot of noise, inside and out, from the flow of air over the body of the plane and through the engines. In an effort to reduce the noise created by planes, researchers from the University of Illinois at Urbana-Champaign performed large-scale simulations on Ranger to learn how a plasma-based actuator could be used to control the jet exhaust and limit noise. Their findings showed that careful tailoring of the actuator to control the airflow could reduce the amount of noise by up to 30 percent. Other leading researchers, from Florida State University, Georgia Tech, and Purdue, are exploring other approaches to jet engine noise reduction, from chevrons on the nozzle to atomized fuel, using TACC and other TeraGrid resources.

Regional Climate Change in a Geoengineered World

(TACC-NICS)

The implementation of certain geoengineering schemes is being discussed to offset global warming from anthropogenic greenhouse gases in the atmosphere. Reducing the amount of sunlight that reaches Earth's surface through the use of volcanic aerosols is a commonly discussed approach; however, very little modeling has been done. Early research suggests that it is possible to cancel the globally averaged warming, but the response of the hydrologic cycle, which is arguably just as important, has received much less attention.

Cecilia Bitz, at the University of Washington, is using TeraGrid systems at TACC and ORNL to perform a series of fully coupled, century-long climate change simulations. These simulations are among the first to model geoengineering solutions, and also the first to use ensemble forecasts to explore interannual variability.

(TIE-IN to pitch below: Bitz is a leading user of TeraGrid's education allocations for her graduate and undergraduate climate modeling courses.)

99. Cecilia Bitz, University of Washington
Atmospheric Sciences
Climate Modeling Course
TACC-RANGER 200,000 2010-03-31 to 2011-03-31
Regional Climate Change in a Geoengineered World
TACC-RANGER 1,138,000 2009-10-01 to 2011-03-30
TACC-SPUR 500
Climate System Noise in Climate Variability
NICS-NAUTILUS 23,000 2010-10-22 to 2011-08-11
NICS-KRAKEN 11,990,000 2010-08-11 to 2011-08-11

Climate System Noise in Climate Variability

(NICS-Ferguson. Female PI)

PI(s): Cecilia Bitz, University of Washington

RP(s): NICS, TACC

Discipline: Atmospheric Sciences

Cecilia Bitz of the University of Washington is leading a team to study the effect that increased resolution of climate models has on the simulation of noise (fluctuations on short spatial and temporal scales) in the climate system. The team's main motivation is to study how sea ice is affected by changes in the ozone. Thus far, understanding of the climate system has been derived from models run at coarse resolution, around 100 km. The evolution of petascale computers offers opportunities to run and validate dramatically higher resolutions that resolve mesoscale features of ocean and ice dynamics. By incorporating noise statistics from high-resolution models into coupled climate models, the research team could test the hypothesis that low-frequency fluctuations of unpredictable climate noise in one component (the atmosphere) can be "reddened" by the longer-time-scale fluctuations in another component (such as oceans with higher heat capacity). The team's initial goal is to run high-resolution, century-scale simulations of the Earth System in order to test the importance of noise at unresolved scales.



Climate Noise

James Kinter (Bitz is co-PI. This project might be better/leveraged; huge allocation at NICS.)

Co-PI: Benjamin Kirtman
Co-PI: Mariana Vertenstein
Co-PI: Richard Loft
Co-PI: Cecilia Bitz
Co-PI: John Dennis
Co-PI: Cristiana Stan

PI Institution: Center for Ocean-Land-Atmosphere Studies

Request Number: ATM090041

Request Type: Renewal

Request Title: The Role of Climate System Noise in Climate Variability II

Abstract of Submission:

Simulations supporting the scientific consensus that human activity is changing the Earth's climate have been derived from models run at coarse, O(100 km) resolutions. The impact of unresolved scales on these predictions is not precisely known: indeed, it has been hypothesized that high-frequency noise in the climate system could be reddened, thereby influencing the low-frequency components of the climate signal. Incorrect simulation of the noise statistics (or stochastic forcing) due to inadequate resolution or errors in the physical parameterizations can feed back onto the mean climate. If this hypothesis is true, the impact on future climate simulations could be enormous. It means that modeling improvements, such as better physical parameterization of unresolved scales, perhaps combined with higher resolution, are necessary to model climate variability correctly. That conclusion could increase the computational cost of future climate studies by many orders of magnitude. If the hypothesis is proven false, i.e., if increased resolution does not change climate variability significantly, then we can proceed with much of the current low-resolution research program intact. An important tool in understanding the role of noise in climate feedbacks is the interactive ensemble (IE) method, which allows us to control the characteristics of noise in a specific climate forcing through an ensemble-average approach. In our previous TeraGrid allocation, our team produced a century-scale, high-resolution baseline control run and began the low- and high-resolution atmospheric IE simulations necessary to address these fundamental questions about the role of atmospheric noise (weather) in the climate system. Now, in this new TeraGrid Research Request, we propose to first complete the high-resolution atmospheric IE runs, then extend our IE experimental methodology to study sea ice and ocean climate feedbacks. Analysis of the resource implications of these science objectives leads us to a TeraGrid resource allocation (TRAC) request for 48.5 million SUs on Kraken, the Cray XT5 system at the National Institute for Computational Sciences (NICS). In addition, we request 105K SUs on the Nautilus data analysis and visualization resource. Nautilus will be used to make a diagnostic first pass on the data sets generated. As described in our supplemental Special Requirements document, we will require 203.6 TB of HPSS archive storage for this project at NICS. A subset of derived data products (27.1 TB) will be served for further scientific study via a GrADS server at NCAR and archived on its HPSS system.

Requested Resources:

NICS Cray XT5 (Kraken): 48,500,000
NICS SGI/NVIDIA Visualization and Data Analysis System (Nautilus): 105,000

One step ahead of disaster

Earthquake Modeling (PSC--Schneider)

This involves modeling through the SCEC group in California and their collaborators at Carnegie Mellon in Pittsburgh. It includes the TeraShake and CyberShake simulations (2006), the Analytics Challenge award (2006), and the ShakeOut quake scenario modeling (2008). This work has included computing at PSC, SDSC, and others (I think).

Storm Forecasting (PSC Schneider)

Storm forecasting, a collaboration between TeraGrid resources and Oklahoma's CAPS (Center for Analysis & Prediction of Storms) over the course of TeraGrid's existence, has led to significant improvements in storm modeling and forecasting technology. Milestones include several Spring forecast experiments, the most realistic tornado simulation to date (2004), the first successful forecast of thunderstorm details 24 hours in advance (2005), the first large-scale experiment in ensemble forecasting (2007), collaboration with the LEAD project to test "on demand" forecasts (2007), and simulations of the Oklahoma City F5 tornado (2009). Current work involves a PetaApps grant for Ensemble-based Data Assimilation for Numerical Analysis and Prediction of High-Impact Weather. This work has included computing at PSC, NCSA, TACC, and NICS.

Understanding why the oil rig failed: deep sea structures

Arif Masud, University of Illinois at Urbana-Champaign

(NCSA-Bell)

Development of computational methods for modeling turbulent flows is considered a formidable challenge due to the plethora of associated spatial and temporal scales. Professor Arif Masud and graduate student Ramon Calderer from the Department of Civil and Environmental Engineering at the University of Illinois at Urbana-Champaign are using NCSA's Abe to develop residual-based methods for modeling turbulence in complex fluid flows and fluid-structure interactions. Their new codes are mathematically consistent and robust and provide high-fidelity solutions at reduced computational costs compared to the more traditional Direct Numerical Simulations (DNS).

In an effort to model flow-induced vibrations in off-shore oil platforms, the team used their new method to investigate fluid-structure interaction and turbulence around rigid and oscillating cylinders. Underwater currents trigger periodic vortices around risers, the oil pipelines that extend from the floating platforms to the seafloor. Vortex shedding causes energy transfer from fluid to structure and leads to high-amplitude vibrations that can trigger fatigue failure of the structural systems. The new methods and codes are being used to analyze the structural integrity of deep sea risers following extreme storm events, and to model the cause of failure of the deep sea oil pipelines in the Gulf of Mexico.

To visualize the 3D nature of turbulent flows, the team turned to NCSA's Advanced Applications Support visualization team. Visualization programmer Mark Van Moer used ParaView, a parallel renderer, with custom VTK scripts to create visualizations for the team. "These visualizations have tremendously benefited us in the method development phase," says Masud.

This work will be presented at the 16th International Conference on Finite Elements in Flow Problems, Munich, Germany, in March 2011.

This research is funded by the National Science Foundation (#0800208).

781. Arif Masud, University of Illinois at Urbana-Champaign
Mathematical Sciences
Residual Based Turbulence Models for Large Eddy Simulation on Unstructured Tetrahedral Meshes
SDSC-TRESTLES 500,000 2011-01-01 to 2011-12-31
NICS-KRAKEN 500,000
ASTA 2

Combustion Hazard Analysis: scalability, release, and use of the Uintah software (NICS Jones)

PI: Martin Berzins

PI Institution: University of Utah

Request Number: MCA08X004

Request Type: Renewal

Request Title: Scalability, Release and Use of the Uintah Software

This could tie into petascale scalability, multiagency collaboration, and disaster mitigation. EPSCoR institution.

Abstract of Submission:

This request is for TeraGrid resources for three NSF-funded projects. The NSF SDCI program, through award OCI-0721659 (PI Berzins); the NSF OCI PetaApps program, through award OCI-0905068 (PI Berzins, co-PIs Wight and Harman, from 09/2009); and NSF CBET (CBET-0933574, Microscale Heat Transfer, PI Ameel, co-PI Harman, from 08/2009) have funded work on the scalability and use of the Uintah software that resulted from the decade-long work of the University of Utah Center for the Simulation of Accidental Fires and Explosions (C-SAFE), a DOE-funded academic alliance project.

This proposal is a renewal request for TeraGrid resources for the final year of the SDCI project and for year 2 of the other two projects. The SDCI project is aimed at ensuring the scalability of Uintah and that the software can solve a broad class of science and engineering problems. The PetaApps project is aimed at demonstrating that Uintah may be used to solve a very large-scale combustion problem from hazard analysis. The Microscale Heat Transfer project is using Uintah to solve a challenging modeling problem related to improving the cooling of CPUs.

Requested Resources:

NICS Cray XT5 (Kraken): 4,000,000
TACC Dell/NVIDIA Visualization and Data Analysis Cluster (Longhorn): 300,000
TACC Sun Constellation Cluster (Ranger): 5,000,000



Weather Prediction Model, Hurricane Katrina

(TACC
-
Dubrow)

Wei Wang, NCAR (Female PI)

The Advanced Visualization Laboratory at

NCSA has visualized data from
one of the largest simulations of
2005’s

devastating Hurricane Katrina
ever computed. Wei Wang, a research scientist at NCAR's

Earth System
Laboratory
, computed the evolutio
n of the storm using a complex
numerical weather prediction model t
hat
closely matches the actual
event. NCSA then visualized the 36
-
hour period when the storm is gaining energy
over the warm ocean.

Currently the Katrina visualizatio
n is

being shown at dome festivals
around the world; ultimately it will

be
part of a dome show called
“Dynamic Earth”

that is slated to debut this summer.

TeraGrid computers at
NCAR, high
-
speed networks, and expertise at RP sites were all important to this

research.

Kraken Tackles Tornado Prediction (Kraken or Nautilus or both. Female PI/US National)

(NICS-Ferguson)

PI: Amy McGovern

Systems: NICS, PSC

Field: Atmospheric Sciences

Understanding and predicting tornadoes is a risky, and difficult, business. Conventional tornado chasers might witness a handful a year, and radar systems only measure certain variables, such as wind velocity and intensity of precipitation. For proper understanding and prediction, researchers need hundreds of storms and mountains of data, hence the benefit of simulation.

Despite their limitations, these on-the-ground observations have provided a research team from the University of Oklahoma with enough data to begin simulating hundreds of storms, providing science with a revolutionary look at these awesome storms and new tools with which to predict them. Currently the team, led by Amy McGovern, is generating 150 possible tornado-precursor storms at 75-meter resolution, with each simulation consuming 30 hours on 3,000 cores and generating roughly a terabyte of data. These simulations delve into the most complex players in tornadic storms, such as updrafts, downdrafts, tilt, and the various relationships between these factors. If a storm does in fact generate a tornado, the team begins the process of "relational" data mining, in which they examine individual factors and their relationships to tornado formation. Furthermore, the data mining algorithms could potentially be used in other fields of science as well, such as studies of atmospheric turbulence across the U.S. Overall, the team hopes their work will significantly reduce the false alarm rate for tornado warnings (currently about 75%) and increase the warning lead time (currently around 12-14 minutes).
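Aggregating the per-simulation figures quoted above gives a rough sense of the campaign's footprint (a back-of-the-envelope estimate, not a number reported by the team):

\[
150 \times 30\ \text{h} \times 3{,}000\ \text{cores} \approx 1.35\times10^{7}\ \text{core-hours},\qquad
150 \times 1\ \text{TB} \approx 150\ \text{TB of output.}
\]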

iPlant Collaborative (gene sequencing/food source. Data-intensive. Combating world hunger)

PI: Dan Stanzione

The iPlant Collaborative is a $50M virtual organization created by NSF to build the cyberinfrastructure to address the grand challenges in plant biology. This allocation is being created to support iPlant users, both through direct login and through the iPlant web gateways. The initial users will be in support of the two active iPlant grand challenges: the iPlant Tree of Life, which is building the phylogenetic tree showing evolutionary relationships for all green plant species on Earth, and the Next Gen Sequencing working group of the iPlant Genotype-to-Phenotype project, mapping genome and environmental information to expressed characteristics in plants (primarily in food crops).

1168. Dan Stanzione, University of Texas at Austin
Environmental Biology
iPlant Collaborative
PSC-POPLE 1,107 2010-11-03 to 2011-04-02
TACC-RANGER 299,000 2010-02-02 to 2011-04-02
TACC-SPUR 500


Hyperlocal Forecasting

(TACC-Dubrow)

The rise of powerful high performance computers has led to transformative advances in atmospheric forecasting. And yet, these forecasts are still painted with a fairly large brush. The global models upon which all official predictions are based have a resolution on the order of 100 kilometers (km) per grid point. At that level, storms appear as undifferentiated blobs, and towns in the mountains and the valley seem to experience identical weather.

Masao Kanamitsu, a veteran atmospheric researcher at Scripps Institution of Oceanography, uses a process called downscaling to improve regional predictions. The technique takes output from the global climate model and adds information at smaller scales to improve the forecasts dramatically without including observation data.

Using TeraGrid resources at the Texas Advanced Computing Center, Kanamitsu accurately forecast the California region at 10 km resolution by adding topography, vegetation, and river flow data to global climate models. Researchers have already begun applying these models to fish population studies, river flow changes, and wind energy applications.
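A rough scaling argument (assuming a uniform grid and a time step that shrinks in proportion to the grid spacing; actual model costs differ) shows why 10 km downscaling needs HPC resources. Refining from roughly 100 km to 10 km multiplies the work per simulated period over the same region by about

\[
\left(\frac{100\ \text{km}}{10\ \text{km}}\right)^{2} \times 10 \approx 1{,}000.
\]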

587. Masao Kanamitsu, Scripps Institution of Oceanography
Atmospheric Sciences
Regionalization of Anthropogenic Climate Change Simulations - Atmosphere-ocean coupled downscaling
TACC-RANGER 2,808,000 2010-10-01 to 2011-09-30

Better life through Materials Science (Chemistry? Physics? Help)

Alenka Luzar, Virginia Commonwealth University (Bell. Female PI.)

It is now possible to engineer surfaces to show a range of properties related to, but not confined to, the traditional concepts of hydrophobicity (a molecule's resistance to wetting or hydration). Atomic-scale patterning offers a powerful way to tune hydrophobicity without altering the chemistry of surface atoms. The potential applications of such surface engineering are numerous, including minerals, coal and petroleum products, many applications involving paints, coatings, and adhesives, and also non-stick surfaces.

Alenka Luzar and her team at Virginia Commonwealth University are exploring superhydrophobic surfaces, those that are extremely difficult to wet. Surface roughness is regarded as the key reason why. In nature, superhydrophobicity typically relies on two-scale corrugations, occurring on the micron and nanoscale levels. Using NCSA's now-retired Mercury and Purdue's Steele, Luzar and her colleagues conducted molecular dynamics simulations to examine the wetting behavior of nanodroplets, which cannot detect any roughness above the nanoscale. By examining the effect of nanoscale roughness on the spreading and surface mobility of water nanodroplets, they arrived at two new findings pertinent to nanoscale roughness.

First, they demonstrated that roughness on this length scale renders hydrophilic surfaces more hydrophobic, a trend opposite to observations on macroscopically rough surfaces (the Wenzel relation). Second, they showed that a superhydrophobic surface with a contact angle of 180 degrees can also be achieved when surface roughness is limited to the nanoscale alone. They also observed an interesting dynamic effect of surface hydrophilicity. For smooth surfaces, a counterintuitive increase in droplet mobility is observed as surface properties are modulated to render the surface more hydrophilic. This behavior is, however, reversed in the presence of even moderate surface corrugations. Sub-nanoscale surface patterning is therefore sufficient to produce drastic changes in surface physics, affecting both equilibrium (contact angle and surface tensions) and dynamic (droplet diffusion and mobility) properties of newly synthesized materials. The work was published in Faraday Discussions.
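For reference (a standard textbook relation, not part of the team's result), the macroscopic Wenzel relation referred to above predicts that roughness amplifies a surface's intrinsic wetting behavior:

\[
\cos\theta_{\mathrm{rough}} = r\,\cos\theta_{\mathrm{flat}},\qquad r \ge 1,
\]

where r is the ratio of actual to projected surface area. For a hydrophilic surface (positive cosine) this makes the apparent contact angle smaller, i.e., the surface more wettable, which is exactly the trend the nanoscale simulations reversed.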

This work is funded by the National Science Foundation (#0718724).

Thom Dunning and David Woon, University of Illinois (NCSA-Bell)

How silica behaves under extreme heat helps scientists predict earthquakes and volcanic eruptions. It could also help us make better computer chips and optic cables.

Kevin Driver and John Wilkins, Ohio State University

(NCSA-Bell)

Using NCSA's Abe and recently retired Cobalt supercomputers, along with LONI's Queen Bee, TACC's Ranger and Lonestar, and machines at the Ohio Supercomputer Center, NCAR, NERSC, and the Computational Center for Nanotechnology Innovations, an international team of physicists led by Ohio State University has been able to simulate the behavior of silica in a high-temperature, high-pressure form that is particularly difficult to study firsthand in the lab.

The resulting discovery, reported in the online edition of the Proceedings of the National Academy of Sciences (PNAS) in 2010, could eventually benefit science and industry alike. Silica makes up two-thirds of the Earth's crust, and we use it to form products ranging from glass and ceramics to computer chips and fiber optic cables.

Silica is all around us, explains Ohio State doctoral student Kevin Driver, who led this project for his doctoral thesis. But despite much research, the detailed structure and composition of the deepest parts of the mantle remain unclear. These details are important for geodynamical modeling, which may one day predict complex geological processes such as earthquakes and volcanic eruptions. Even the role that the simplest silicate, silica, plays in Earth's mantle is not well understood.

Silica takes many different forms at different temperatures and pressures, not all of which are easy to study. The team used the equivalent of more than six million CPU hours to model four different states of silica.

In PNAS, Driver, his advisor John Wilkins, and their coauthors describe how they used a quantum Monte Carlo method to show that the method could be applied to studying minerals in the planet's deep interior. They found that the behavior of the dense, alpha-lead oxide form of silica did not match up with any global seismic signal detected in the lower mantle. This result indicates that the lower mantle is relatively devoid of silica, except perhaps in localized areas where oceanic plates have subducted.

This research was funded by the National Science Foundation (Award #1025392) and the Department of Energy.


Clean, renewable energy

Improving Nature's Top Recyclers

(NICS-TACC)

Many scientists believe the future of fuel lies in biomass, waste material that can be broken down to make sugars and energy for heating or transportation. Scientists from the National Renewable Energy Laboratory are exploring this transformation through numerical simulations of the enzymes that perform this function most efficiently in nature: the fungus T. reesei and bacterial cellulosomes. Using the Ranger supercomputer, they have made a number of discoveries about the nature of plant walls and the process of breaking them down. These insights will help create "designer" enzymes capable of speeding up the rate of biofuel production.

77. Gregg Beckham, National Renewable Energy Laboratory
Molecular Biosciences
Understanding cellulose and cellulose-degrading enzymes for biofuels applications
NICS-ATHENA 11,838,320 2010-10-06 to 2011-09-30
NICS-KRAKEN 2,916 2010-10-01 to 2011-09-30
TACC-RANGER 5,400,000

Clean Energy -- First Principles Based Modeling of Proton Diffusion in Decorated Carbon Nanotubes as Structurally Defined Systems for Proton Exchange Membranes

PI(s): Stephen Paddison, University of Tennessee at Knoxville

RP(s): NICS

Discipline: Chemistry

A team led by Stephen Paddison of the University of Tennessee at Knoxville is using Kraken to study fundamental properties of the electrolyte of proton exchange membrane (PEM) fuel cells, which are currently being tested as clean energy conversion devices for automobiles. The project focuses on materials, specifically the polymer electrolyte membrane that acts as a central component of this type of fuel cell. Paddison's team is looking at the way in which protons are transported through the central component of a PEM, studying the functionality of the material at the molecular level.


In their study of hypervalent molecules, Dunning and Woon have found a new type of chemical bond, which they have labeled a "recoupled pair bond" because it requires the recoupling of electron pairs. They used TeraGrid resources at NCSA. They have found this bond in the excited states of "normal valent" molecules as well, and they have found evidence that recoupled pair bonds can affect the overall chemistry of second-row elements. For example, PF3, unlike NF3, inverts through a configuration that has a recoupled pair bond. So this phenomenon has major implications for all of chemistry. There are several publications in 2009 and 2010, with two pending for 2011.

300. Thom Dunning, University of Illinois at Urbana-Champaign
Chemistry
Recoupled Pair Bonding and Its Impact on Molecular States, Structure, Energetics, and Reactivity
NCSA-EMBER 100,000 2011-01-01 to 2011-12-31
NCSA-TAPE-STORAGE 5
ABE-QUEENBEE-STEELE 200,000


The origin of life

Supercomputer Peers Into the Origin of Life

(NICS-Ferguson)

PI: Jeremy Smith

Systems: NICS, NCSA

Field:

A research team led by Jeremy Smith of the University of Tennessee used the Kraken supercomputer to run molecular dynamics simulations in an effort to probe an organic chemical reaction that may have been important in the evolution of ribonucleic acids, or RNA, into early life forms. Certain types of RNA called ribozymes are capable of both storing genetic information and catalyzing chemical reactions, two necessary features in the formation of life. The research team looked at a lab-grown ribozyme that catalyzes the Diels-Alder reaction, which has broad applications in organic chemistry.

They found a theoretical explanation for why the Diels-Alder ribozyme needs magnesium to function. Computational models of the ribozyme's internal motions allowed the researchers to capture and understand the finer details of the fast-paced reaction. The static nature of conventional experimental techniques such as chemical probing and X-ray analysis had not been able to reveal the dynamics of the system. It turns out that the concentration of magnesium ions directly impacts the ribozyme's movements. The research was published as "Magnesium-Dependent Active-Site Conformational Selection in the Diels-Alderase Ribozyme" in the Journal of the American Chemical Society.

More effective drug design

Catalina Achim, Carnegie Mellon U (PSC-Schneider)

Combining experiment with molecular dynamics modeling, she has solved the first double-helical peptide nucleic acid (PNA) structure. These molecules have applications as scaffolds for electron transfer and, in biomedicine, as vehicles for drug delivery. They have the DNA double-helix structure but with peptides (like proteins) instead of phosphates for the backbone, which means they don't have electrical charge and therefore can pass through the cell wall in drug delivery applications. This work used PSC's Pople. Achim published the final solution structure in 2010.

Understanding cellular membranes/gating and transport properties (Purdue

Hunt)

PI(s): Wonpil Im, University of Kansas

RP(s) represented: Purdue, NCSA

Discipline(s) enabled: Biology, bio
physics
, medicine

TG resources/expertise used: Im’s group has used such TeraGrid resources as Purdue’s Steele cluster and
Condor pool and NCSA’s Abe,
Cobalt and Tungsten clusters.
Purdue TeraGrid

staff members assisted Im and
colleagues in getting their codes running on the TeraGrid systems and in managing large amounts of data.

The integral cellular machinery of humans, plants and other forms of life is protected by membranes, but it is essential to biological functioning that these barriers, while keeping bad things out, have the ability to open channels for allowing good things in, such as nutrients and antibiotics. This is accomplished by ion channels, pore-forming proteins like the voltage-dependent anion channel, or VDAC (also known as the mitochondrial porin), in the outer membrane of mitochondria. The mitochondria are the power plants of cells, and also play roles in such functions as cellular signaling and cell differentiation.

VDAC is a permeation pathway for metabolites, sugars for instance, and mobile ions, in part to fuel these cellular power plants. Mitochondria and VDACs also are involved in cell death by interacting with apoptotic proteins, molecules that regulate apoptosis, a process of programmed cell death normally beneficial but related to a variety of diseases, cancer among them, when the process goes wrong. Ion channels in general are often targets in the search for new drugs. The gating and transport properties of VDAC depend on tiny variations in the protein's electrical charge. Professor Wonpil Im at the University of Kansas and colleagues are using the TeraGrid to understand the structure, dynamics, and mechanisms underlying VDAC's electrophysiological properties by performing molecular dynamics simulations of human VDAC. They also have performed grand canonical Monte Carlo/Brownian dynamics simulations to explore ion transport properties of human VDAC. Results so far are in good agreement with experimental measurements and offer avenues for expanding our knowledge about how these important proteins work. The researchers' calculations require enormous computational resources, and support from the TeraGrid is indispensable. Results from the research were recently published in the Biophysical Journal, "Molecular Dynamics Studies of Ion Permeation in VDAC" and "Brownian Dynamics Simulations of Ion Transport through VDAC, Voltage Dependent Anion Channel," both Feb. 2, 2011.
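The Brownian dynamics portion of such studies treats each permeating ion as an overdamped particle driven by systematic forces plus random thermal kicks. The sketch below is a minimal, hypothetical illustration of that update rule, not the researchers' production code: the channel is reduced to a one-dimensional potential profile, and the diffusion constant, barrier height, and time step are placeholder values.

import numpy as np

# Minimal Brownian (overdamped Langevin) dynamics sketch for an ion crossing
# a one-dimensional potential of mean force; all parameters are illustrative.
kT = 2.49          # thermal energy, kJ/mol at ~300 K
D = 0.19           # diffusion constant, nm^2/ns (placeholder)
dt = 1e-4          # time step, ns
steps = 200_000

def pmf_force(z):
    """Force from a toy Gaussian barrier centered in the channel (kJ/mol/nm)."""
    barrier, width = 8.0, 0.5
    return barrier * z / width**2 * np.exp(-z**2 / (2 * width**2))

rng = np.random.default_rng(0)
z = -2.0           # start the ion at one channel mouth (nm)
crossings = 0
for _ in range(steps):
    drift = D / kT * pmf_force(z) * dt
    noise = np.sqrt(2 * D * dt) * rng.standard_normal()
    z += drift + noise
    if z > 2.0:    # ion reached the other mouth: count a crossing and re-insert
        crossings += 1
        z = -2.0

print(f"crossings per simulated ns: {crossings / (steps * dt):.1f}")

In a production calculation the one-dimensional toy potential would be replaced by forces from the full protein structure, and many ions would be tracked at once under an applied voltage.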

522. Wonpil Im, University of Kansas
Molecular Biosciences
Computational Studies of Protein/Peptide Interactions In Biological Membranes
ABE-QUEENBEE-STEELE 3,000,000 2010-10-01 to 2011-09-30
IU-HPSS 20
NCSA-TAPE-STORAGE 5
ABE-QUEENBEE-STEELE 5,000,000 2011-02-17 to 2012-02-17

Matters of the Heart

Mathematical Modeling of Heart Rhythm Disorders

PI(s): Xiaopeng Zhao, University of Tennessee at Knoxville

RP(s): NICS, PSC

Discipline: Biological and Critical Systems

Sudden cardiac arrest (SCA) refers to an unexpected cessation of the cardiac cycle, quite different from a heart attack, where a blood vessel is blocked but the heart continues to beat. A patient suffering from SCA loses consciousness in a few seconds, and death can occur within minutes without treatment. Though people who have had heart disease are at increased risk for SCA, about half of all cases of SCA appear in people with no previous history of heart disease. Led by Xiaopeng Zhao of the University of Tennessee at Knoxville, this project aims to study mechanisms of cardiac arrhythmias by developing mathematical models that accurately portray the interactions between the mechanical, electrical and chemical functions of the heart. Of particular interest to the team is ventricular fibrillation, a condition in which there is uncoordinated contraction of the cardiac muscle of the ventricles in the heart, making them quiver rather than contract properly. Knowledge from this project may advance the development of novel strategies to diagnose and treat cardiac arrhythmias.
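To give a flavor of the electrical side of such models, the sketch below integrates a generic two-variable excitable-cell model (FitzHugh-Nagumo form) with forward Euler. It is only a hedged stand-in for the kind of dynamics these cardiac models couple to mechanics and chemistry; the parameters and pacing protocol are illustrative, not the team's actual model.

import numpy as np

# Minimal excitable-cell sketch: FitzHugh-Nagumo equations under periodic pacing.
a, b, eps = 0.7, 0.8, 0.08
dt, steps = 0.01, 60_000

v, w = -1.2, -0.6                 # membrane potential (dimensionless) and recovery variable
stim = lambda t: 0.5 if (t % 300) < 1.0 else 0.0   # brief periodic pacing current

trace = []
for i in range(steps):
    t = i * dt
    dv = v - v**3 / 3 - w + stim(t)   # fast voltage dynamics
    dw = eps * (v + a - b * w)        # slow recovery dynamics
    v += dt * dv
    w += dt * dw
    trace.append(v)

print("peak potential over the run:", max(trace))

Whole-heart arrhythmia models extend this idea to millions of coupled cells with far more detailed ionic currents, which is what drives the need for supercomputing resources.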

Improved prosthetic heart valves (Submitted by Purdue-Hunt).

PI(s): Cyrus Aidun, Georgia Institute of Technology

RP(s) represented: Purdue, NCSA, LONI and TACC

Discipline(s) enabled: fluid mechanics, biophysics, bioengineering

TG resources/expertise used: Purdue TeraGrid staff helped Aidun and his students install and troubleshoot various libraries and the researchers' specialized codes to get them running on TeraGrid resources, and also expedited a high-priority job in time to make a publication deadline.

Problems in which fluid and solid interact dynamically occur in various engineering and biological applications. For example, blood is comprised of elements such as red blood cells and platelets suspended in plasma. Most computational modeling of blood flow has simplified blood as pure fluid flow without the presence of suspended solid particles. Understanding the physics of blood flow requires accurate simulation of the suspended solid particles to capture the suspension structure and the blood elements on a microscale. This kind of understanding could be important in, among other things, treating pathological blood clots, a common cause of heart attacks and strokes, which strike more than 2 million Americans annually. Clot formation and detachment also play into such medical issues as the failure of popular bileaflet mechanical heart valves. In such problems, the solid and fluid phases are tightly coupled, requiring the simultaneous solution of both phases. In addition, the high spatial and temporal resolution required tends to make traditional numerical methods computationally expensive. Georgia Tech professor Cyrus Aidun and his colleagues use the lattice-Boltzmann (LB) method and finite element (FE) analysis for modeling the fluid and solid phases, respectively. The technique provides a number of unique features making the solution of larger problems feasible. Still, the calculations require extensive data processing and storage, and Aidun's group uses the TeraGrid to help fill these needs. The researchers have examined exceedingly complex noncolloidal suspensions, work lending itself to varied applications in biology, including blood flow, paper manufacturing (wood fiber and coating flows), the mining and petroleum industries (for waste tailings), and home products (paint and cosmetics). Aidun's group published results from such simulations more than once in 2010. Their paper "Numerical Investigation of the Effects of Channel Geometry on Platelet Activation and Blood Damage," accepted by Annals of Biomedical Engineering in October 2010, reports results that may help optimize the design of prosthetic replacement heart valves.
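As a rough illustration of the fluid side of this coupling, the sketch below runs a bare-bones D2Q9 lattice-Boltzmann relaxation (BGK collision) on a small periodic grid driven by a weak body force. It is a hedged, self-contained toy, not Aidun's coupled LB/FE code; there are no suspended particles, and the grid size, relaxation time, and forcing are arbitrary example values.

import numpy as np

# Bare-bones D2Q9 lattice-Boltzmann (BGK) solver on a periodic box.
nx, ny, tau, steps = 64, 32, 0.8, 500
cx = np.array([0, 1, 0, -1, 0, 1, -1, -1, 1])
cy = np.array([0, 0, 1, 0, -1, 1, 1, -1, -1])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)
force = np.array([1e-5, 0.0])        # small body force driving the flow

def equilibrium(rho, ux, uy):
    """Standard second-order equilibrium distribution for D2Q9."""
    cu = cx[:, None, None]*ux + cy[:, None, None]*uy
    usq = ux**2 + uy**2
    return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

rho = np.ones((nx, ny))
ux = np.zeros((nx, ny)); uy = np.zeros((nx, ny))
f = equilibrium(rho, ux, uy)

for _ in range(steps):
    # Macroscopic moments from the distribution functions
    rho = f.sum(axis=0)
    ux = (cx[:, None, None] * f).sum(axis=0) / rho
    uy = (cy[:, None, None] * f).sum(axis=0) / rho
    # BGK collision toward local equilibrium, plus a simple first-order forcing term
    feq = equilibrium(rho, ux, uy)
    Fi = 3 * w[:, None, None] * (cx[:, None, None]*force[0] + cy[:, None, None]*force[1])
    f += -(f - feq) / tau + Fi
    # Streaming: shift each population along its lattice velocity (periodic box)
    for i in range(9):
        f[i] = np.roll(np.roll(f[i], cx[i], axis=0), cy[i], axis=1)

print("mean x-velocity after", steps, "steps:", ux.mean())

In the actual research, solid boundaries and moving particles are handled through bounce-back and coupling rules tied to the finite element solid solver, which is where most of the computational cost and complexity lies.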


Simulations of arteries (Argonne-Insley)

A group of Brown University researchers is attempting to create its own three-dimensional innovation in navigation, but instead of mapping interstates and city blocks, the team is charting the vascular highway of the human body: the arterial tree.

Specifically, the team, led by mathematician George Karniadakis, aims to develop comprehensive three-dimensional models of the arterial tree of the brain and other organs, such as the coronary tree in the heart, to improve predictive capabilities related to the progression of vascular and hematological pathologies. The National Institutes of Health and National Science Foundation are funding the researchers' multiscale modeling of the vascular system at various scales.

Karniadakis, NICS/ANL; Petascale research; http://www.nics.tennessee.edu/Simulations_of_arteries

595. George Karniadakis, Brown University
Chemical, Thermal Systems
Ultrascale simulations of blood flow in the intracranial arterial tree
NICS-KRAKEN 5,000,000 2009-11-11 to 2011-05-11
High Performance Computational Fluid Dynamics with GPU
NCSA-TAPE-STORAGE 1 2010-09-04 to 2011-09-04
NCSA-LINCOLN 50,000
Parallel Simulations of Blood Flow in the Human Cranial Tree
NICS-KRAKEN 2,232,625 2009-10-01 to 2011-09-30
TACC-RANGER 1,438,100
TACC-SPUR 0

Understanding Disease

Simulations of neuraminidase in support of development of new anti-viral flu drugs for both group-1 and group-2 strains (NICS-Ferguson)

PI(s): Ross Walker, San Diego Supercomputer Center/University of California at San Diego

RP: NICS, SDSC

Discipline: Chemistry

A research team led by Ross Walker of the San Diego Supercomputer Center is simulating the molecule neuraminidase, which covers the surface of the influenza virus. After the influenza virus leaves an infected cell, neuraminidase ensures that the virus doesn't get stuck to the cell surface, allowing the virus to continue replication. The group has completed the most comprehensive simulations of group-1 and group-2 neuraminidase enzymes to date, discovering new structural insights on both group-1 and group-2 neuraminidase that may help in developing new antiviral drugs less prone to resistance.


1266. Ross Walker, University of California-San Diego
Chemistry
Simulations of neuraminidase in support of development of new anti-viral flu drugs for both group-1 and group-2 strains
NICS-ATHENA 5,000,000 2010-07-23 to 2011-07-23


Evolutionary History of Malaria Parasites Clarified with Help of TeraGrid Resources (SDSC-Froelich)

Humans likely source of malarial infections in great apes, not the other way around as previously thought


For centuries, malaria has mystified physicians and terrified patients, claiming more childhood lives than any other infectious disease across large sections of the world. Though much has been learned about the genetics of various Plasmodium parasites, which cause malaria across vertebrate species, key aspects of the evolutionary relationships of these parasites have been elusive.

Now, with the aid of a portal linking them to TeraGrid expertise and computational resources, researchers led by the University of Maryland and the University of South Carolina have clarified the evolutionary history of Plasmodium by analyzing an unprecedented 45 distinct genes from the genomes of eight recently sequenced Plasmodium species.

The results, published online in the journal Parasitology on December 1, 2010, offer the first comprehensive dating of the divergence of these individual Plasmodium species and provide new insights into the complex relationship between Plasmodium species and their mammalian hosts.

"The results clarify the ancient association between malaria parasites and their primate hosts, including humans," said James B. Munro, a researcher from the University of Maryland School of Medicine, Baltimore, Md. "Indeed, even though the data is somewhat noisy due to issues related to nucleotide composition, the signal is still strong enough to obtain a clear answer."

A major finding of the research is that humans likely serve as a reservoir for P. falciparum; that is, humans are likely to transmit this most prevalent and most dangerous of all the parasitic infections to great apes, and not the other way around. This finding contradicts previous studies, which suggested that humans can be infected with P. falciparum derived from apes. The results obtained in this study argue that "if P. falciparum infections in great apes are derived from humans, [there may be a] need to establish refuges for the great apes that are safe from human intrusion."

The research builds on the unveiling of the genome sequences of the two most widespread human malaria parasites, P. falciparum and P. vivax, and the monkey parasite P. knowlesi, together with the draft genomes of the chimpanzee parasite P. reichenowi; three rodent parasites, P. yoelii yoelii, P. berghei and P. chabaudi chabaudi; and one avian parasite, P. gallinaceum. To examine the association between malaria parasites and their primate hosts, the researchers sought to compare genetic variations found in 45 highly conserved nuclear genes for which sequences are available from all eight Plasmodium species.

The evolutionary relationships were inferred using, among others, the software package MrBayes, which consumed about 200,000 CPU hours on Abe, a supercomputer at NCSA. The researchers accessed this resource via the CIPRES Science Gateway, a browser interface developed at the San Diego Supercomputer Center (SDSC) that permits access to TeraGrid compute resources. "Without CIPRES, this work, and the other projects I am working on, would not go as quickly or as smoothly," said Munro. "CIPRES is a fantastic resource."

Other phylogenetic or divergence time analyses were conducted on the Brazos HPC cluster at Texas A&M University, and on the cluster at the Institute for Genome Sciences at the University of Maryland School of Medicine. Further studies are expected to be run on Trestles, a new TG data-intensive HPC resource housed at SDSC.

836. Mark Miller, San Diego Supercomputer Center
Environmental Biology
The CIPRES Portal Science Gateway
TACC-LONESTAR-WESTMERE 25,000 2011-02-01 to 2011-06-30
ASTA 7 2010-12-24 to 2011-06-30
SDSC-TRESTLES 3,250,000 2010-12-23 to 2011-06-30
NCSA-LINCOLN 100,000 2010-07-01 to 2011-06-30
ABE-QUEENBEE-STEELE 3,456,100
NCSA-TAPE-STORAGE 5
Improving the Efficiency of Eukaryotic Genome Sequence Assembly Using DASH
TG-GPFS-WAN 4 2010-05-15 to 2011-04-28
SDSC-DASH 20,000 2010-04-28 to 2011-04-28

Economics


Project Abstract

Research on the Role of Secondary Mortgages, Housing Market, and Consumption Smoothing

PI: Sheng Guo

We build a quantitative life-cycle model in which households can select from a set of possible first and second mortgage contracts. Households finance their home purchases by obtaining a long-term first mortgage. Homeowners can borrow against their home equity through a short-term second mortgage. Households can choose to foreclose on their homes given realizations of income and housing price shocks. The proceeds of the foreclosure go to the first mortgage lien before the second mortgage lien. When housing prices are on the rise, the availability of second mortgages allows homeowners to smooth their consumption by borrowing against their home equity. Both in the data and in the model, households with higher debt-to-income ratios are more likely to take out second mortgages. However, when housing prices decline, given the positive correlation between income and housing prices, homeowners with second mortgages are more likely to foreclose on their houses because they cannot make mortgage payments from income or home equity. Therefore, fluctuations in housing prices disproportionately affect the ability of homeowners with second mortgages to smooth consumption. In this paper, we quantify how much the availability of second mortgages can improve households' consumption smoothing. Requested resources on Blacklight (27,000). Florida International University, MSI.

EPSCoR.
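For readers unfamiliar with this class of models, the sketch below solves a drastically simplified life-cycle consumption problem by backward induction: deterministic income, a single savings asset, no housing, and no mortgage contracts. It only illustrates the solution technique such models build on; it is not Guo's model, and every number in it is a made-up example value.

import numpy as np

# Toy life-cycle consumption-smoothing problem solved by backward induction.
T, beta, r, gamma = 40, 0.96, 0.03, 2.0
income = np.concatenate([np.full(30, 1.0), np.full(10, 0.4)])   # working years, then retirement
grid = np.linspace(0.0, 10.0, 401)                              # asset grid (no borrowing in this toy)

def u(c):
    """CRRA utility; infeasible (non-positive) consumption gets a large penalty."""
    c_safe = np.maximum(c, 1e-9)
    return np.where(c > 1e-9, c_safe**(1 - gamma) / (1 - gamma), -1e12)

V = np.zeros((T + 1, grid.size))       # value after the last period is zero
policy = np.zeros((T, grid.size))      # chosen next-period assets
for t in reversed(range(T)):
    cash = (1 + r) * grid + income[t]  # cash on hand at each asset level
    for i, x in enumerate(cash):
        c = x - grid                   # consumption implied by each savings choice
        vals = u(c) + beta * V[t + 1]
        j = int(np.argmax(vals))
        V[t, i] = vals[j]
        policy[t, i] = grid[j]

# Simulate one household that starts with zero assets
a, path = 0.0, []
for t in range(T):
    i = int(np.argmin(np.abs(grid - a)))
    a_next = policy[t, i]
    path.append((1 + r) * grid[i] + income[t] - a_next)
    a = a_next
print("consumption in first and last periods:", round(path[0], 3), round(path[-1], 3))

The full research model adds housing, price and income shocks, and the first/second mortgage and foreclosure choices described above, which multiplies the state space and is why resources such as Blacklight are requested.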


Outreach: Focus on Champions and engaging under-represented communities

Puerto Rican Outreach

CI-Train?

Jeff Pummill

Campus Champions/Axel Koehlmeyer and others in EPSCoR states? Lathrop to write/to come.

Desktop to TeraGrid. Clemson, NICS, others. (NICS-Jones)

Clemson used spare cycles on a non-TeraGrid system and dedicated them to helping EPSCoR PIs (with a limited scope/research focus) ramp up to TeraGrid. The project has grown to encompass additional states, research arenas, and PIs. It's a cool model, but may not be something we want to highlight since it wasn't originated by TG-Central. That's a question for John Towns since this may or may not be folded into XD.

From Jacek Jakowski's (NICS/UT) materials science allocation request in March:

We request computing time on TG resources, specifically on Kraken, for molecular dynamics modeling of nanoscale carbon materials. We will use a combination of approaches ranging from classical molecular dynamics with the REBO potential to direct molecular dynamics with time-dependent quantum mechanical treatment for electrons and/or some nuclei. For the electrons we focus on density-functional tight binding approaches. We request total CPU time in the amount of 3.685 million hours for the simulations and development of new MD approaches. The proposed simulations include (a) quantum-chemical studies of a lithium-graphite-hydrogen cluster, (b) formation of new carbon nano-structures from graphenes, nano-tubes and fullerenes. The new development focuses on a combined quantum trajectory description for nuclei and direct molecular dynamics.
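Whatever the force model, whether classical REBO or quantum-derived DFTB forces, the classical-nuclei part of such molecular dynamics advances atoms with a time-stepping integrator. The sketch below shows a generic velocity-Verlet loop applied to a simple Lennard-Jones dimer; it is a hedged illustration of the integration scheme only, not the REBO or DFTB machinery described in the request, and all values are illustrative.

import numpy as np

# Generic velocity-Verlet molecular dynamics loop, illustrated with a
# two-atom Lennard-Jones system; a REBO/DFTB force routine would replace
# lj_forces() while the integrator itself stays the same.
eps, sigma, mass, dt, steps = 1.0, 1.0, 1.0, 0.002, 5000

def lj_forces(pos):
    """Pairwise Lennard-Jones forces for a two-atom system."""
    r_vec = pos[1] - pos[0]
    r = np.linalg.norm(r_vec)
    mag = 24 * eps * (2 * (sigma / r)**12 - (sigma / r)**6) / r
    f0 = -mag * r_vec / r          # force on atom 0; atom 1 feels the opposite
    return np.array([f0, -f0])

pos = np.array([[0.0, 0.0, 0.0], [1.3, 0.0, 0.0]])
vel = np.zeros_like(pos)
forces = lj_forces(pos)

for _ in range(steps):
    # half-kick, drift, recompute forces, half-kick
    vel += 0.5 * dt * forces / mass
    pos += dt * vel
    forces = lj_forces(pos)
    vel += 0.5 * dt * forces / mass

bond = np.linalg.norm(pos[1] - pos[0])
print(f"final bond length: {bond:.3f} (LJ minimum at 2^(1/6)*sigma ~ 1.122)")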

The core educational focus of this proposal is to exchange experience with different approaches for materials science modeling, which is a step towards creating a community of practice among SC and TN EPSCoR institutions and also to strengthen the connections between different levels of the Branscomb pyramid. This educational goal will be achieved through the open character of collaboration (as explained in the text below) between South Carolina and Tennessee NSF EPSCoR research institutions.

Requested Resources:
NICS Cray XT5 (Kraken) 3,685,000
TeraGrid Advanced Support (Collaboration with TeraGrid Staff) 12

https://www.teragrid.org/c/document_library/get_file?uuid=36872499-a679-48b1-b330-a569f6bbf2c9&groupId=24026



Why Mona Lisa Smiled

The Art of Science


The Art of Astronomy

Volker Bromm, University of Texas at Austin (NCSA-Bell)

Filmmaker Terrence Malick is often praised for the beauty of his films and their resonance with nature. Roger Ebert talks about his "painterly images." For Janet Maslin, it's his "visual genius" and his "intoxication with natural beauty." In Malick's next movie, "The Tree of Life," some of that natural beauty will come from an unlikely source: the TeraGrid.

Volker Bromm, an astronomy professor at the University of Texas at Austin, provided a supercomputer simulation of how the very first stars appeared, illuminating the previously dark universe. The simulation was completed on TACC's Ranger and was published in the Astrophysical Journal in 2010.

The results (more than 4.4 terabytes) were transferred to NCSA via the TeraGrid network. A team of five people at NCSA then took that data and turned it into the animation you'll see in "The Tree of Life." They collaborated intensely over the course of months with Bromm's team and the "Tree of Life" team. Parts of the rendering were completed on NCSA's Abe.


Art in the Expanded Field (TACC)

A growing awareness of the applicability of visualization to non-traditional computing fields has led the Texas Advanced Computing Center and other centers in the TeraGrid to open up their resources to artists, humanities researchers, archivists, architects, and others. These individuals are using advanced computing to broaden their practice and also to experiment with the emerging technologies that will one day become commonplace in their fields, among others: 3D TV, augmented reality, touch interfaces, and interactive visualization driven by parallel processing supercomputers.

TACC has partnered with leading Austin art institutions to present several digital exhibitions in the past year that focus on the intersection of art and technology. These shows featured works by emerging and internationally renowned visual artists who created site-specific art for TACC's Visualization Laboratory, home to the highest-resolution display in the world, and other cutting-edge technologies. Beyond these exhibitions, the Vislab has hosted humanities conferences, sessions on visualization at the South by Southwest Interactive festival, and generally broadened the public's access to, and knowledge about, the TeraGrid and high performance computing.


Building a workforce for the future

Story about TeraGrid's student program (Lathrop)

To come



SDSC's TeacherTECH (Mason)

The San Diego Supercomputer Center (SDSC), on the campus of the University of California, San Diego, has a vision for strengthening the 21st-century workforce. Its outreach program spans almost twenty years, and in that time the center has seen tremendous growth in, and national recognition for, its teacher outreach program, TeacherTECH. Wondering whether there would be a positive reaction to student outreach programs focused on the same topics as its teacher outreach programs, SDSC decided to test the waters and dove in with its StudentTECH program. Launched in 2006 with a single summer workshop focused on 3D modeling, the program has grown in size and scope. Now offering year-round programs for middle school, high school and undergraduate students, StudentTECH reaches almost 1,000 students a year with workshops focused on the sciences, engineering, math and computing, topics on the cutting edge that prepare students for the ever-changing, technology-driven workforce of tomorrow.

Early Engagement with HPC (TACC-Dubrow)

The Freshman Research Initiative (FRI) at The University of Texas at Austin is considered a national leader in engaging undergraduates in scientific research. Of the 20 research tracks in FRI, three focus on the concepts of scientific computing through the fields of chemistry, biology and physics. These research experiences enable students to use the HPC resources of the TeraGrid, and over the course of the last three years, students at UT have used more than 2 million CPU hours on Ranger. This is just one example of classroom instruction that utilizes HPC to train the computational leaders of tomorrow. This article will discuss the impact of HPC access for students and professors at institutions from Texas to Washington to Puerto Rico.

TeraGrid in the classroom (Purdue-Hunt)

PI(s): Andrew Ruether, Swarthmore College

RP(s) represented: Purdue, NCSA, LONI and TACC

Discipline(s) enabled: Computer science, chemistry and physics

TG resources/expertise used: Swarthmore has benefited greatly from the TeraGrid Campus Champions program and used both Purdue's high-throughput Condor pool TeraGrid resource and high-performance computing resources at NCSA and other TeraGrid sites. Purdue TeraGrid staff helped with code compiling and execution.

Swarthmore College uses the TeraGrid for both student coursework and faculty research. Swarthmore academic technologist Andrew Ruether, the college's TeraGrid Campus Champion, created accounts for all students in Swarthmore's Distributed and Parallel Computing course, and they used TeraGrid systems for class projects. Two of computer science Professor Tia Newhall's students extended the work done in the class to a summer project and presented their results at the 2010 TeraGrid conference. The students developed and tested a novel parallelization technique for solving the K-Nearest Neighbor problem. The algorithm is used to classify objects from a large set. A few examples of its use include discovering medical images that contain tumors from a large medical database, recognizing fingerprints from a national fingerprint database and finding certain types of astronomical objects in a large astronomical database. The TeraGrid allowed the students to run large-scale experiments necessary to demonstrate that their solution worked well for real-world problems.
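The students' parallel technique itself is not reproduced in this report, but the serial idea they parallelized is easy to sketch: for each query object, find the k closest labeled objects and take a majority vote. Below is a minimal, hedged illustration of that classification rule on synthetic data; the feature vectors, labels, and value of k are made up for the example.

import numpy as np
from collections import Counter

# Minimal serial k-Nearest Neighbor classifier: the per-query step that the
# students' project parallelized across many queries on TeraGrid resources.
def knn_classify(train_X, train_y, query, k=5):
    """Label a query point by majority vote among its k nearest training points."""
    dists = np.linalg.norm(train_X - query, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = Counter(train_y[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Synthetic two-class data standing in for, say, image feature vectors
rng = np.random.default_rng(1)
class0 = rng.normal(loc=0.0, scale=1.0, size=(200, 8))
class1 = rng.normal(loc=2.0, scale=1.0, size=(200, 8))
train_X = np.vstack([class0, class1])
train_y = np.array([0] * 200 + [1] * 200)

query = rng.normal(loc=2.0, scale=1.0, size=8)   # drawn near class 1
print("predicted class:", knn_classify(train_X, train_y, query, k=7))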
Because Swarthmore doesn't have a high-performance computing system, the TeraGrid has been a key resource for faculty researchers who have reached the limits of desktop computing. For example, chemistry Professor Paul Rablen has been using the TeraGrid for an investigation of rearrangements in certain carbenes, highly reactive carbon molecules widely used as efficient catalysts, among other places in the pharmaceutical industry. Michael Brown, professor of physics, is working on ways to design fusion reactors and uses the TeraGrid to model the behavior of the plasma in a magnetic containment field. Swarthmore senior Dan Dandurand, one of Brown's students, calculated the complex orbits of more than a billion energetic protons, calculations that help shed light on magnetic confinement fusion (a minimal orbit-integration sketch follows this paragraph). In another set of calculations, Dandurand determined the fraction of energetic protons collected by a simulated probe in the plasma. These calculations helped to calibrate and understand an actual probe used in experiments. A paper on Rablen's research, "Experimental and Theoretical Study of the 2-Alkoxyethylidene Rearrangement," was published on the Web by the Journal of Organic Chemistry Feb. 22, 2011, and includes an acknowledgment to the TeraGrid and NCSA. A paper by Brown and Dandurand, "Calibrated Cylindrical Mach Probe in a Plasma," using some TeraGrid-produced results is to appear in the Review of Scientific Instruments in 2011.
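The proton-orbit calculations mentioned above boil down to integrating the Lorentz-force equation of motion for a very large number of particles. As a hedged, minimal illustration (not Dandurand's actual code), the sketch below advances a single proton in a uniform magnetic field with the standard Boris algorithm; the field strength, initial velocity, and time step are arbitrary example values.

import numpy as np

# Boris-algorithm sketch: one proton gyrating in a uniform magnetic field.
q_over_m = 9.58e7                 # proton charge-to-mass ratio, C/kg
B = np.array([0.0, 0.0, 0.1])     # tesla, uniform field along z
dt = 1e-9                         # seconds
v = np.array([1e5, 0.0, 2e4])     # m/s: perpendicular gyration plus drift along z
x = np.zeros(3)

for _ in range(10_000):
    # Boris rotation (no electric field in this toy): preserves speed exactly
    t = q_over_m * B * dt / 2
    s = 2 * t / (1 + np.dot(t, t))
    v_prime = v + np.cross(v, t)
    v = v + np.cross(v_prime, s)
    x = x + v * dt

print("position after 10 microseconds of gyration:", x)

Production orbit codes repeat a step like this for billions of particles in the measured, non-uniform field of the device, which is what makes the problem a natural fit for TeraGrid-scale resources.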