Given by Geoffrey Fox during a Trip to China, July 12-28, 1996. Foils prepared July 6, 1996
We describe HPCC Applications, starting with the many successes of the Federal Grand Challenge Program in Government and Academic areas |
As a survey discovered, this does not translate into acceptance by industry |
We describe the trend to the more broadly based National Challenges |
Industry has neither adopted the use of HPCC in its business operations nor has a viable software and systems industry (at the high end) been created |
The resolution of the "dilemma" of Industry vs. National need in government and academia will underlie future programs |
Table of Contents
Abstract
http://www.npac.syr.edu/users/gcf/hpcc96appls/index.html |
Presented during Trip to China July 12-28,1996 |
Geoffrey Fox |
NPAC |
Syracuse University |
111 College Place |
Syracuse NY 13244-4100 |
We describe HPCC Applications, starting with the many successes of the Federal Grand Challenge Program in Government and Academic areas |
As a survey discovered, this does not translate into acceptance by industry |
We describe the trend to the more broadly based National Challenges |
Industry has neither adopted the use of HPCC in its business operations nor has a viable software and systems industry (at the high end) been created |
The resolution of the "dilemma" of Industry vs. National need in government and academia will underlie future programs |
High Energy Physics |
Semiconductor Industry, VLSI Design |
Graphics and Virtual Reality |
Weather and Ocean Modeling |
Visualization |
Oil Industry |
Automobile Industry |
Chemicals and Pharmaceuticals Industry |
Financial Applications |
Business Applications |
Airline Industry |
High Performance Computing Act of 1991 |
Computational performance of one trillion operations per second on a wide range of important applications |
Development of associated system software, tools, and improved algorithms |
A national research network capable of one billion bits per second |
Sufficient production of PhDs in computational science and engineering |
Executive Summary |
I. Introduction |
II. Program Accomplishments and Plan |
1. High Performance Communications |
2. High Performance Computing Systems |
3. Advanced Software Technologies |
4. Technologies for the Information Infrastructure |
5. High Performance Computing Research Facilities |
6. Grand Challenge Applications |
7. National Challenge Applications - Digital Libraries |
8. Basic Research and Human Resources |
III. HPCC Program Organization |
IV. HPCC Program Summary |
V. References |
VI. Glossary |
VII. Contacts |
NSF Supercomputing Centers |
NSF Science and Technology Centers |
NASA Testbeds |
DOE Laboratories |
NIH Systems |
NOAA Laboratories |
EPA Systems |
Applied Fluid Dynamics |
Meso- to Macro-Scale Environmental Modeling |
Ecosystem Simulations |
Biomedical Imaging and Biomechanics |
Molecular Biology |
Molecular Design and Process Optimization |
Cognition |
Fundamental Computational Sciences |
Grand-Challenge-Scale Applications |
Computational Aeroscience |
Coupled Field Problems and GAFD (Geophysical and Astrophysical Fluid Dynamics) Turbulence |
Combustion Modeling: Adaptive Grid Methods |
Oil Reservoir Modeling: Parallel Algorithms for Modeling Flow in Permeable Media |
Numerical Tokamak Project (NTP) |
An image from a video illustrating the flutter analysis of a FALCON jet under a sequence of transonic speed maneuvers. Areas of high stress are red; areas of low stress are blue. |
Particle trajectories and electrostatic potentials from a three-dimensional implicit tokamak plasma simulation employing adaptive mesh techniques. The boundary is aligned with the magnetic field that shears around the torus. The strip in the torus is aligned with the local magnetic field and is color mapped with the local electrostatic potential. The yellow trajectory is the gyrating orbit of a single ion. |
Massively Parallel Atmospheric Modeling Projects |
Parallel Ocean Modeling |
Mathematical Modeling of Air Pollution Dynamics |
A Distributed Computational System for Large Scale Environmental Modeling |
Cross-Media (Air and Water) Linkage |
Adaptive Coordination of Predictive Models with Experimental Data |
Global Climate Modeling |
Four-Dimensional Data Assimilation for Massive Earth System Data Analysis |
Ozone concentrations for the California South Coast Air Basin predicted by the Caltech research model show a large region in which the national ozone standard of 120 parts per billion (ppb) is exceeded. Measurement data corroborate these predictions. Scientific studies have shown that human exposure to ozone concentrations at or above the standard can impair lung functions in people with respiratory problems and can cause chest pain and shortness of breath even in the healthy population. This problem raises concern since more than 30 urban areas across the country still do not meet the national standard. |
The colored plane floating above the block represents the simulated atmospheric temperature change at the earth's surface, assuming a steady one percent per year increase in atmospheric carbon dioxide to the time of doubled carbon dioxide. The surfaces in the ocean show the depths of the 1.0 and 0.2 degree (Celsius) temperature changes. The Southern Hemisphere shows much less surface warming than the Northern Hemisphere. This is caused primarily by the cooling effects of deep vertical mixing in the oceans south of 45 degrees South latitude. Coupled ocean-atmosphere climate models such as this one from NOAA/GFDL help improve scientific understanding of potential climate change. |
A scientist uses NASA's virtual reality modeling resources to explore the Earth's atmosphere as part of the Earth and Space Science Grand Challenge. |
Environmental Chemistry |
Groundwater Transport and Remediation |
Earthquake Ground Motion Modeling in Large Basins: The Quake Project |
High Performance Computing for Land Cover Dynamics |
Massively Parallel Simulations of Large-Scale, High-Resolution Ecosystem Models |
Visible Human Project |
Reconstruction of Positron Emission Tomography (PET) Images |
Image Processing of Electron Micrographs |
Understanding Human Joint Mechanisms |
Protein and Nucleic Sequence Analysis |
Protein Folding Prediction |
Ribonucleic Acid (RNA) Structure Prediction |
Biological Applications of Quantum Chemistry |
Biomolecular Design |
Biomolecular Modeling and Structure Determination |
Computational Structural Biology |
Biological Methods for Enzyme Catalysis |
A portion of the Glucocorticoid Receptor bound to DNA; the receptor helps to regulate expression of the genetic code. |
Quantum Chromodynamics |
High Capacity Atomic-Level Simulations for the Design of Materials |
First Principles Simulation of Materials Properties |
Black Hole Binaries: Coalescence and Gravitational Radiation |
Scalable Hierarchical Particle Algorithms for Galaxy Formation and Accretion Astrophysics |
Radio Synthesis Imaging |
Large Scale Structure and Galaxy Formation |
The Alliance will produce an accurate, efficient description of the coalescence of black holes, and the gravitational radiation emitted, by computationally solving Einstein's equations for gravitational fields, with direct application to the gravity-wave detection systems LIGO and VIRGO under construction in the USA and Europe. |
The Alliance (Austin, Chapel Hill, Cornell, NCSA, Northwestern, Penn State, Pittsburgh, NPAC) has Formal Goals |
To develop a problem solving environment for the Nonlinear Einstein's equations describing General Relativity, including a dynamical adaptive multilevel parallel infrastructure |
To provide controllable convergent algorithms to compute gravitational waveforms which arise from Black Hole encounters, which are relevant to astrophysical events, and which may be used to predict signals for detection by future ground- and space-based detectors. |
To provide representative examples of computational waveforms. |
http://www.npac.syr.edu/projects/bbh/bbh.html |
Problem size: Analysis with Uniform Grid |
Solution: Adaptive Mesh Refinement |
Einstein's equations can be represented as a coupled system of hyperbolic and elliptic PDEs with non-trivial boundary conditions to be solved using adaptive multilevel methods |
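The uniform-grid versus adaptive-mesh trade-off above can be illustrated with a toy 1-D refinement loop (a hypothetical Python sketch, not the project's actual infrastructure): cells where the sampled field changes rapidly are split, concentrating resolution where a uniform grid would be wasteful.

```python
# Toy 1-D adaptive mesh refinement: refine cells where the sampled
# field changes rapidly. Hypothetical sketch, not the BBH project's code.

def f(x):
    # Sample field with a sharp feature near x = 0.5
    return 1.0 / (0.01 + (x - 0.5) ** 2)

def refine(cells, threshold):
    """Split any cell (a, b) whose endpoint values differ by more
    than `threshold`; return the refined cell list."""
    out = []
    for a, b in cells:
        if abs(f(b) - f(a)) > threshold:
            mid = 0.5 * (a + b)
            out.extend([(a, mid), (mid, b)])
        else:
            out.append((a, b))
    return out

# Start from a coarse uniform grid and refine a few times.
cells = [(i / 8, (i + 1) / 8) for i in range(8)]
for _ in range(4):
    cells = refine(cells, threshold=5.0)

# The refined grid clusters small cells around the feature at x = 0.5,
# while cells far from it keep their coarse width.
print(len(cells))
```

A real multilevel solver would refine on an error estimate for the hyperbolic/elliptic system rather than on a raw gradient, but the pattern of locally deepening the mesh hierarchy is the same.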
We are building a PSE (Problem Solving Environment) that will support: |
To implement the system we use technologies developed by CRPC, in particular MPI and HPF, combined with emerging new Web technologies: Java and VRML 2.0. |
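The data-parallel style that MPI and HPF express can be suggested with a plain-Python sketch (hypothetical illustration only; the "ranks" here run sequentially in one process, whereas real MPI or HPF code distributes them across processors):

```python
# Sketch of block decomposition of a grid across "processors", the
# pattern MPI/HPF codes use; here the ranks run sequentially in one
# process, purely for illustration.

def decompose(n, nprocs):
    """Split indices 0..n-1 into nprocs contiguous blocks,
    spreading any remainder over the first ranks."""
    base, extra = divmod(n, nprocs)
    blocks, start = [], 0
    for rank in range(nprocs):
        size = base + (1 if rank < extra else 0)
        blocks.append(range(start, start + size))
        start += size
    return blocks

# Each "rank" sums its own block; combining the partial sums plays
# the role a reduction (e.g. MPI_Allreduce) would play.
data = [x * x for x in range(1000)]
partials = [sum(data[i] for i in block) for block in decompose(len(data), 4)]
total = sum(partials)
assert total == sum(data)
```

In HPF the same decomposition would be declared with a distribution directive and the compiler would generate the communication; in MPI the programmer writes the partial sums and the reduction explicitly.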
Simulation of gravitational clustering of dark matter. This detail shows one sixth of the volume computed in a cosmological simulation involving 16 million highly clustered particles that required load balancing on a massively parallel computing system. Many particles are required to resolve the formation of individual galaxy halos seen here as red/white spots. |
Simulation of Chorismate Mutase |
Simulation of Antibody-Antigen Association |
A Realistic Ocean Model |
Drag Control |
The Impact of Turbulence on Weather/Climate Prediction |
Shoemaker-Levy 9 Collision with Jupiter |
Vortex structure and Dynamics in Superconductors |
Molecular Dynamics Modeling |
Crash Simulation |
Advanced Simulation of Chemically Reacting Flows |
Simulation of circulation in the North Atlantic. Color shows temperature, red corresponding to high temperature. In most prior modeling, the Gulf Stream turns left past Cape Hatteras, clinging to the continental shoreline. In this simulation, however, the Gulf Stream veers off from Cape Hatteras on a northeast course into the open Atlantic, following essentially the correct course. |
Impact of the comet fragment. Image height corresponds to 1,000 kilometers. Color represents temperature, ranging from tens of thousands of degrees Kelvin (red), several times the temperature of the sun, to hundreds of degrees Kelvin (blue). |
Illustrative of the computing power at the Center for Computational Science is the 50 percent offset crash of two Ford Taurus cars moving at 35 mph shown here. The Taurus model is detailed; the results are useful in understanding crash dynamics and their consequences. These results were obtained using parallel DYNA-3D software developed at Oak Ridge. Run times of less than one hour on the most powerful machine are expected. |
Digital Libraries |
Public Access to Government Information |
Electronic Commerce |
Civil Infrastructure |
Education and Lifelong Learning |
Energy Management |
Environmental Monitoring |
Health Care |
Manufacturing Processes and Products |
Define information generally to include both CNN headline news and the insights on QCD obtained from lattice gauge theories |
Information Production, e.g. Simulation |
Information Analysis, e.g. extraction of the location of oil from seismic data, extraction of customer preferences from purchase data |
Information Access and Dissemination - InfoVision, e.g. Transaction Processing, Video-On-Demand |
Information Integration |
1:Computational Fluid Dynamics |
2:Structural Dynamics |
3:Electromagnetic Simulation |
4:Scheduling |
5:Environmental Modelling (with PDE's) |
6:Environmental Phenomenology |
7:Basic Chemistry |
8:Molecular Dynamics |
9:Economic Modelling |
10:Network Simulations |
11:Particle Transport Problems |
12: Graphics |
13:Integrated Complex Systems Simulations |
14:Seismic and Environmental Data Analysis |
15:Image Processing |
16:Statistical Analysis |
17:Healthcare Fraud |
18:Market Segmentation |
Growing Area of Importance and a reasonable near-term MPP opportunity in decision support combined with parallel (relational) databases |
19:Transaction Processing |
20:Collaboration Support |
21:Text on Demand |
22:Video on Demand |
23:Imagery on Demand |
24:Simulation on Demand (education, financial modelling, etc.) -- simulation is a "media"! |
MPP's as High Performance Multimedia (database) servers -- WebServers |
Excellent Medium term Opportunity for MPP enabled by National Information Infrastructure |
25:Military and Civilian Command and Control(Crisis Management) |
26:Decision Support for Society (Community Servers) |
27:Business Decision Support |
28:Public Administration and Political Decision(Judgement) Support |
29:Real-Time Control Systems |
30:Electronic Banking |
31:Electronic Shopping |
32:(Agile) Manufacturing including Multidisciplinary Design/Concurrent Engineering |
33:Education at K-12, University and Continuing levels |
Largest Application of any Computer and Dominant HPCC Opportunity |
In spite of the large and very successful national activity, simulation will not be a large "real world" sales opportunity for MPP's |
However some areas of national endeavor will be customers for MPP's used for simulation |
Some areas which may adopt HPCC for simulation in the relatively near future |
The role of HPCC in Manufacturing is quite clear and will be critical to |
On the other hand for |
Return on Investment Unclear: |
The Industry is in a very competitive situation and focused on short-term needs |
At the March 1994 ARPA Meeting in Washington, Boeing (Neves) endorsed parallel databases and not parallel simulation |
Aerospace Engineers are just like University Faculty |
There is perhaps some general decline of the Supercomputer Industry |
MAD (Multidisciplinary Analysis and Design) links: |
(Includes MDO -- Multidisciplinary Optimization) |
Link Simulation and CAD Processes |
This is a really important application of HPCC as it addresses "Amdahl's Law": we use HPCC to support the full manufacturing cycle -- not just one part! Thus large improvements in manufacturers' time to market and product quality are possible. |
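The "Amdahl's Law" argument can be made concrete with a quick calculation (the fractions and speedups below are illustrative assumptions, not figures from the source): accelerating only the simulation step bounds the overall cycle improvement by the steps left untouched.

```python
# Amdahl's-Law-style bound on an end-to-end design cycle.
# Illustrative numbers: simulation is 40% of the cycle, and HPCC
# speeds up only that fraction by 100x.

def overall_speedup(accelerated_fraction, speedup):
    """Speedup of the whole cycle when only `accelerated_fraction`
    of the work is accelerated by `speedup`."""
    return 1.0 / ((1.0 - accelerated_fraction) + accelerated_fraction / speedup)

only_simulation = overall_speedup(0.40, 100)   # roughly 1.66x overall
whole_cycle = overall_speedup(0.95, 100)       # roughly 16.8x overall
print(round(only_simulation, 2), round(whole_cycle, 2))
```

The contrast is the point of the foil: covering most of the manufacturing cycle (the 0.95 case) yields an order of magnitude more benefit than accelerating simulation alone.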
BUT one must change and, even harder, integrate: |
The limited near-term industrial use of HPCC implies that it is critical for Government and DoD to support and promote it |
DoD Simulation: Dual-Use Philosophy implies |
Manufacturing Support can lead to future US Industry leadership in advanced HPCC-based manufacturing environments 10-20 years from now |
An HPCC Software Industry is essential if the HPCC field is to become commercially successful |
The HPCC Simulation market is small |
This market is not used to paying the true cost for software |
There is a lot of excellent public domain software available (funded by the federal government) |
Small Businesses are a natural implementation of an HPCC Software Industry |
Two InfoMall Success Stories |
Anecdotes from Thinking Machines (TMC), April 94, before the fall |
Anecdote from Digital, September 94: |
These can be defined simply as those HPCC applications which have a sufficient market to sustain a true balanced HPCC computing Industry with viable hardware and software companies |
Alternatively one can define National Challenges by the HPCC technologies exploited |
Partial Differential Equations |
Particle Dynamics and Multidisciplinary Integration |
Image Processing |
Some: |
Visualization |
Artificial Intelligence |
Not Much: |
Network Simulation |
Economic (and other complex system) modeling |
Scheduling |
Manufacturing |
Education |
Entertainment |
Information Processing |
BMC3IS (Command & Control in military war) |
Decision Support in global economic war |
1992: Grand Challenges |
1993: Grand Challenges |
1994: Toward a National Information Infrastructure |
1995: Technology for the National Information Infrastructure |
1996: Foundation for America's Information Future |
The National Challenges have been correctly identified as the major HPCC opportunity and there is a reasonable list of targeted Government areas BUT |
The Entertainment and Consumer Information Industry will set the standards and drive the technology |
One must set up collaborations with: |
Health Care and Electronic Commerce may be large enough areas to sustain their own enterprises, but some, such as Military Command and Control and Education, are not |
However the GII (Global Information Infrastructure) will force common standards and one canNOT go it alone in any area! |
So Dual-use or Multi-use development of modular HPCC technologies, services and applications essential |
Chair: Geoffrey Fox |
Co-Chair: Andy White |
Secretary: Ken Hawick |
January 10-12, 1995, Pasadena |
1) need for better debuggers, profilers, and performance monitoring tools |
2) need for more stable operating systems |
3) need for tools to aid code migration to parallel systems, whether in the form of libraries or other software engineering tools |
4) need to reduce the latencies due to system software |
5) need to look at exciting and innovative application areas (to help the HPCC industry by stimulating new demand). This might involve very data-intensive applications (in contradistinction to compute-intensive ones) but also harder and more complex problems: irregular data structures and less obviously load-balanceable problems. |
1) Viable base model: Build HPCC software on an internally viable base such as distributed computing or the WWW. |
2) Internally consistent model: areas where business case for HPCC is internally viable (eg decision support) |
3) Partnership model: Government supported teams collaborating with industry teams (eg oil and gas) |
4) Pulse/Seed support model: IGA teams to develop applications (eg Europort, IBM, TMC,...) |
5) Ongoing support model: of areas of national importance, but without identified commercial markets (NSA, Weapons, QCD) |
6) Dual benefit model: Government market bootstraps viable commercial market or vice versa. |
Categories are not rigid but (six) approximately defined regions in a complex multidimensional space (they could be merged or overlapped). Different application areas have different investment strategies |
Different applications have a different mix of metrics such as: |
Need to involve a larger group of non-HPCC communities |
For instance, most of the messages on networks are |
But MPI standards were set internally to HPCC and did not explicitly involve the ATM/Internet community and its standards processes |
HPF focuses on regular multidimensional arrays in an excellent standards forum that ignores |
Need HPVRML and a broader community |
Currently the tail is wagging the dog - the BIG dog? |
What is the market area that is big enough upon which to base viable HPCC standards (eg SMP, distributed systems or WWW)? |
What are the top three standards? |