Applications Accomplishments (1997 CRPC Annual Report):
Director: Geoffrey Fox
Because essentially all application work is funded outside the base CRPC
NSF grant, it is organized differently from other CRPC research
activities. Applications borrow from, and provide a proving ground for,
nearly every CRPC technology. Included in this work are the fifteen
[note: this count may change to fourteen]
Grand Challenge activities briefly described in Table 1.1. The CRPC's role is
typical of the computer science component of this major national activity:
to ensure that the best algorithms and software technologies are available
to application scientists and to monitor and evaluate these technologies.
This style of application work will continue to be a key aspect of CRPC
activities, although it will change in detail. For instance, during 1996-98
several of the original Grand Challenge projects will phase out. However, the
CRPC has strong involvement with the two NSF PACI centers, and this will lead
to natural collaboration with some of the dozen major application activities
associated with those programs. DOE has started a new round of Grand
Challenges in which Argonne, Texas [check with Mary], and to a lesser extent
Syracuse will participate. Further, Argonne, Illinois, and Caltech have major
application programs as part of the new ASCI Centers of Excellence program.
[Kathy: please check the DOE Grand Challenge and ASCI details.] The CRPC also
has a major role in the DoD modernization program, where the PET (Programming
Environments and Training) effort is strongly driven by application
requirements that tend to be broadly based rather than focused on major
"Grand Challenge"-like projects. Finally, there are several significant
applications outside the national programs, such as the Syracuse work on
financial modeling with industry and on earthquake prediction with Los Alamos
and an international group of geoscientists. Note that the CRPC's application
work is actually increasing, as the national programs move into major
deployment mode.
We have also successfully expanded our PCE-TECH activities, which have
turned computer science research projects into usable prototypes and
educational outreach programs, reaching a broader class of high-performance
computing users than the scientists involved in Grand Challenge projects.
Visit http://www.crpc.rice.edu/CRPC/research/applications.html for more
details on CRPC applications activities. The work on the National High
Performance Software Exchange has important links to our application work.
Table 1.1: CRPC Involvement in Grand Challenge Projects
1. High-Performance Computational Chemistry: (DOE---Argonne)
Using Fortran M to develop highly parallel modeling systems appropriate for
the chemistry of hydrocarbons, the chemistry of clay materials, and the
rational redesign of biodegradative enzymes.
2. Convective Turbulence and Mixing in Astrophysics: (NASA---Argonne)
Developing a numerical code for massively parallel MIMD architectures that
combines a higher-order Godunov method with a nonlinear multigrid elliptic
solver to advance the equations of compressible hydrodynamics with
temperature-dependent thermal conduction.
3. Advanced Computing Technology Applications to SAR Interferometry and
Imaging Science: (NASA---Caltech)
Metacomputing for Earth Science applications; producing portable, scalable
code for processing Synthetic Aperture Radar signals and for analyzing the
resultant data.
4. High-Performance Computational Biology: (DOE---Caltech)
Using CC++ to develop a tree-search program to compute three-dimensional
protein structures, with initial implementations on networks of workstations
and on the Intel Paragon.
5. Parallel I/O Methodologies for Grand Challenges: (NSF---Caltech)
Using Pablo 5.0, which includes software for capture and analysis of
input/output data, as the basis for the input/output data characterization
effort at Caltech.
6. Grand Challenge Computational Cosmology: (NSF---Indiana)
Using parallel computation to explore the origin of large-scale structure
in the universe and how galaxies form. The Pablo Group is using svPablo, part of Pablo 5.0, to study the behavior of HPF on the SGI Origin 2000.
7. Numerical Tokamak: (DOE---Los Alamos, Caltech (JPL))
Parallel algorithms for MIMD and SIMD plasma particle-in-cell codes (JPL,
LANL); visualization; C++ (Caltech, LANL).
8. HPC for Land Cover Dynamics: (NSF---Maryland)
Tools, C++ class libraries, and compiler run-time support for distributed-
memory and disk-based hierarchical data structures, with emphasis on
efficient support for those used in geographic information systems.
9. Computational Aerosciences: (NASA---Rice)
Solving computationally intensive engineering design problems by using
cheaper, less accurate simulations with strategic reality checks that
require the full simulation (with Boeing and IBM).
10. Texas Center for Advanced Molecular Computation: (NIH---Rice, Houston)
Simulation of a 130,000-atom biomolecular system, acetylcholinesterase in
solvent, using a scalable version of the widely used code GROMOS; comparison
of the explicit parallel code with the Fortran D version.
11. Numerical General Relativity: (NSF---Syracuse)
Development of a toolkit and interoperable modules with common data
structures; Fortran 90 integrated with DAGH to create a highly effective
visualization package, SciVis, which physicists have used extensively.
12. Material Sciences: (DOE---Tennessee)
Implementing algorithms and software for the parallel solution of core
problems; these are being used in a number of computationally intensive
applications such as materials science.
13. Parallel Simulation for Modeling Cleanup of Groundwater: (DOE---Texas)
Formulating and implementing on distributed platforms accurate, efficient,
parallel algorithms for solving multi-phase flow and transport with
biological and chemical kinetics in heterogeneous porous media.
14. Computational Chemistry of Nuclear Waste Characterization and Processing: (DOE---Syracuse, Argonne)
Relativistic quantum chemistry of actinides. Researchers will implement
relativistic quantum chemical methods on massively parallel computers and,
for the first time, provide capabilities for modeling heavy-element
compounds similar to those currently available for light-element compounds.
[Note: Neither of the following is a Grand Challenge in the federal sense.
Both are technically very similar to Grand Challenges but are funded
differently. Should they be removed from this table and added to the
applications page?]
15. General Earthquake Modelling (GEM): (DOE---Syracuse, Los Alamos)
Geoscientists and physicists are developing an international project to
model earthquakes as the dynamics of interacting fault segments. This is
part of an emerging field that uses variants of the fast multipole
algorithms developed by Caltech and Los Alamos for astrophysics.
[As above, this is NOT a Grand Challenge; it is funded by industry.]
[GCF: Is the following a Grand Challenge, or is it an application that
might be added to our applications web site? If it is a Grand Challenge,
is it sponsored by DOE or by someone else? Are other CRPC sites involved?]
16. Financial Modelling: (Syracuse)
Through the NPAC InfoMall technology transfer program, researchers are
working with small businesses to develop novel high-performance software
systems (involving Monte Carlo methods and neural networks) to predict the
behavior of complex financial instruments.
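
To illustrate the Monte Carlo component of this last activity, the sketch
below shows the basic structure of such a calculation: estimating the value
of a simple European call option by averaging simulated payoffs under an
assumed geometric Brownian motion model. This is only an illustrative
example written in modern C++ for brevity, not NPAC's software; all inputs
are invented. Because each simulated path is independent, calculations of
this kind parallelize naturally, which is what makes them attractive on
high-performance systems.

// Minimal sketch (not NPAC's code): Monte Carlo pricing of a European call
// option under an assumed geometric Brownian motion model. All inputs are
// illustrative.
#include <algorithm>  // std::max
#include <cmath>      // std::exp, std::sqrt
#include <cstdio>     // std::printf
#include <random>     // std::mt19937_64, std::normal_distribution

double monte_carlo_call_price(double spot, double strike, double rate,
                              double volatility, double maturity,
                              long num_paths) {
    std::mt19937_64 rng(12345);  // fixed seed so the estimate is reproducible
    std::normal_distribution<double> gauss(0.0, 1.0);

    double payoff_sum = 0.0;
    for (long i = 0; i < num_paths; ++i) {
        // One-step simulation of the terminal asset price (exact for GBM).
        double z = gauss(rng);
        double terminal = spot * std::exp(
            (rate - 0.5 * volatility * volatility) * maturity
            + volatility * std::sqrt(maturity) * z);
        payoff_sum += std::max(terminal - strike, 0.0);  // call option payoff
    }
    // Discount the average payoff back to the present.
    return std::exp(-rate * maturity) * payoff_sum
           / static_cast<double>(num_paths);
}

int main() {
    // Illustrative inputs: spot 100, strike 105, 5% rate, 20% vol, 1 year.
    double price = monte_carlo_call_price(100.0, 105.0, 0.05, 0.20, 1.0,
                                          1000000);
    std::printf("Estimated call option value: %f\n", price);
    return 0;
}

In a parallel setting the path loop would simply be partitioned across
processors, with a final reduction of the partial payoff sums; the real
systems described above couple such kernels with neural-network models and
production data feeds.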