
Basic foilset: Computational Science and HPCC Issues for GEM

Given by Geoffrey C. Fox at the GEM Working Group Meeting on December 5, 1998. Foils prepared December 6, 1998.
Summary of Material


We describe the approach proposed in the failed 1998 KDI proposal.
This divides the computational effort into seven areas, which are briefly discussed.
We describe possible crisis-management projects and their relation to GEM.

Table of Contents for Computational Science and HPCC Issues for GEM


1 Computational Science and HPCC Issues for GEM (General Earthquake Simulation Project) -- GEM Meeting, December 5, 1998, San Francisco
2 Abstract of Computational Science and HPCC Issues for GEM
3 Mark II Hypercube built by JPL (1985); Cosmic Cube (1983) built by Caltech (Chuck Seitz)
4 Components of GEMCI: GEM Computational Infrastructure
5 Details of GEMCI - I
6 GRAPE Special Purpose Machines
7 Details of GEMCI - II
8 Some Performance Results of Interest from Salmon and Warren
9 Clusters of PC's 1998 (NASA)
10 Details of GEMCI - III
11 Details of GEMCI - IV
12 7: Overall Integration of GEMCI into a PSE (Problem Solving Environment)
13 Multi-Server Gateway Tier
14 Pacific Disaster Opportunity
15 Tango and HPCC Gateways
16 Crisis Management
17 Shared Simulations -- Fluid Flow and Planetary Motion





Foil 1 Computational Science and HPCC Issues for GEM (General Earthquake Simulation Project) -- GEM Meeting, December 5, 1998, San Francisco

Geoffrey Fox
Syracuse University
NPAC
111 College Place, Syracuse, NY 13244-4100
315-443-2163


Foil 2 Abstract of Computational Science and HPCC Issues for GEM

We describe the approach proposed in the failed 1998 KDI proposal.
This divides the computational effort into seven areas, which are briefly discussed.
We describe possible crisis-management projects and their relation to GEM.


Foil 3 Mark II Hypercube built by JPL (1985); Cosmic Cube (1983) built by Caltech (Chuck Seitz)

Hypercube Topology for 8 machines
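
As a quick illustration (not part of the original foils), the wiring rule for a hypercube can be stated in a few lines of Python; the 8-machine topology on this foil is the three-dimensional case.

    def hypercube_neighbors(node, dim=3):
        # In a dim-dimensional hypercube, node i is wired to every node
        # whose binary label differs from i in exactly one bit (i XOR 2**k).
        return [node ^ (1 << k) for k in range(dim)]

    # The 8-machine topology shown on this foil is the dim = 3 case:
    for i in range(8):
        print(i, hypercube_neighbors(i))   # e.g. node 0 -> [1, 2, 4]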


Foil 4 Components of GEMCI: GEM Computational Infrastructure

1: User Interface
2: Fault-scale (Non-local) Equation Solver (Green's Function)
3: Modules specifying local Physics and friction
4: Evaluation, Data analysis and Visualization
5: Data Storage, indexing and access for experimental and computational information
6: Complex Systems and Pattern Dynamics Interactive Rapid Prototyping Environment (RPE) for developing new phenomenological models -- RPE includes analysis and visualization aspects
7: Overall Integration of GEMCI into a PSE (Problem Solving Environment)


Foil 5 Details of GEMCI - I

The computational infrastructure involves linking geographically distributed observations and computations:
  • seismic sensors, SAR, etc.
Special-purpose computers (such as GRAPE in Japan) for O(N²) particle dynamics could be used in the Green's function approach to equation solvers; a minimal sketch of this O(N²) pattern follows below.
  • These have 100 times the performance of "conventional" parallel machines, but there may be much larger algorithmic improvements to be gained which require classic parallel computers.
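
To make the computational pattern concrete, here is a minimal sketch of a direct O(N²) Green's-function sweep; the 1/r³ kernel, the function names, and the problem size are illustrative placeholders, not the actual GEM elastic Green's function.

    import numpy as np

    def stress_from_slip(slip, positions):
        # Direct O(N^2) Green's-function sweep: the stress on each fault
        # element is a weighted sum over the slip on every other element.
        # This is the same computational pattern as O(N^2) particle
        # dynamics on GRAPE.  The 1/r^3 kernel is a placeholder.
        n = len(slip)
        stress = np.zeros(n)
        for i in range(n):
            for j in range(n):
                if i != j:
                    r = np.linalg.norm(positions[i] - positions[j])
                    stress[i] += slip[j] / r**3
        return stress

    # Tiny example: 100 randomly placed elements with unit slip.
    rng = np.random.default_rng(0)
    print(stress_from_slip(np.ones(100), rng.random((100, 3)))[:3])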


Foil 6 GRAPE Special Purpose Machines

GRAPE-4: 1.08 Teraflops
GRAPE-6: 200 Teraflops (projected)


Foil 7 Details of GEMCI - II

Some Observations:
  • a) GEM is an HPCC-class problem, expected to need, for initial Green's function models, some 10⁷ elements and compute power of some 1-100 teraflops. The estimate comes from comparing the GEM multipole approach with related astrophysics problems and the measured scaling of those codes; a back-of-envelope version is sketched after this list.
  • b) GEM is a relatively young field and is not as obliged as other fields to worry about legacy codes. It should be able to aggressively take advantage of the emerging distributed object web technologies.
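
A back-of-envelope version of the estimate in (a), with an assumed per-pair flop count that is purely illustrative:

    # Why 10^7 elements puts GEM in HPCC territory: a direct O(N^2) sweep
    # over N = 1e7 elements is 1e14 pair interactions per step.  The
    # 50 flops/pair below is an illustrative guess, not a measured cost.
    N = 1e7
    flops_per_pair = 50
    sweep_flops = flops_per_pair * N * N
    print(f"{sweep_flops:.0e} flops per sweep")                  # 5e+15
    print(f"{sweep_flops / 1e12:.0f} s per sweep at 1 Tflop/s")  # ~5000 s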


Foil 8 Some Performance Results of Interest from Salmon and Warren

16 Pentium Pros in a cluster (cost: $30K today) reach about 1 gigaflop on a tree code with 10⁷ particles.
6800 Pentium Pros (the ASCI Sandia machine) with around 300 million particles:
  • 600 gigaflops with the simple O(N²) code
  • 400 gigaflops with the tree code
But the tree code is 10⁵ times as efficient as the O(N²) code on this problem:
  • i.e., if Execution Time(naive code) is c·N²,
  • then Time(tree code) is 3000·c·N,
  • and the tree code is more efficient than the O(N²) code for more than 3000 particles. (The arithmetic is checked in the sketch below.)
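
A quick check of the arithmetic quoted above:

    # T_naive = c * N**2 and T_tree = 3000 * c * N cross at N = 3000, and
    # the speedup T_naive / T_tree = N / 3000 reaches 1e5 at N = 3e8
    # particles -- consistent with the 10^5 factor quoted for the Sandia run.
    def speedup(n):
        return n / 3000.0

    print(speedup(3000))   # 1.0  (crossover point)
    print(speedup(3e8))    # 100000.0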


Foil 9 Clusters of PC's 1998 (NASA)

Naegling at Caltech with Tom Sterling and John Salmon, 1998 -- 120 Pentium Pro processors
Beowulf at Goddard Space Flight Center


Foil 10 Details of GEMCI - III

1: User Interface -- Design and build Java (bean) applets to control execution of the computational modules. Take advantage of the ongoing national DATORR (Desktop Access TO Remote Resources) activity, which will lead to standard interfaces between clients, middleware, and back-end machines and data repositories.
2: Large Scale Equation Solver -- Exploit the collaboration with Caltech and Los Alamos, which has developed a highly efficient multipole solver for large-scale parallel machines (including PC clusters). This is a major HPCC application. Cellular automata and other simulation approaches are also needed; all of these methods are expected to parallelize well.
3: Local Physics and Friction Modules -- Develop common interfaces to allow easy experimentation with different approaches; a sketch of such an interface follows this list.
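
As a hedged sketch of what such a common friction-module interface might look like (the class names and the slip-weakening law below are illustrative, not part of the proposal):

    from abc import ABC, abstractmethod

    class FrictionLaw(ABC):
        """Common interface: every friction module exposes the same call,
        so the equation solver can swap laws without code changes."""
        @abstractmethod
        def strength(self, slip, slip_rate, normal_stress):
            ...

    class SlipWeakening(FrictionLaw):
        """Textbook slip-weakening law, included only as an example module."""
        def __init__(self, mu_s=0.6, mu_d=0.4, d_c=0.05):
            self.mu_s, self.mu_d, self.d_c = mu_s, mu_d, d_c

        def strength(self, slip, slip_rate, normal_stress):
            # Friction falls linearly from mu_s to mu_d over slip distance d_c.
            mu = max(self.mu_d,
                     self.mu_s - (self.mu_s - self.mu_d) * slip / self.d_c)
            return mu * normal_stress

    law = SlipWeakening()
    print(law.strength(slip=0.01, slip_rate=1.0, normal_stress=100e6))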


Foil 11 Details of GEMCI - IV

4: Evaluation, Data Analysis and Visualization -- Take advantage of partner expertise: for instance, Boston University, NCSA, and NPACI for visualization of large-scale computations; NPAC's TangoInteractive or NCSA's Habanero for distributed collaboration.
5: Data Storage, Indexing and Access for experimental and computational information -- Here new distributed object approaches seem powerful, and both NPACI and DoE have particularly strong programs that we can leverage.
6: Complex Systems and Pattern Dynamics Interactive Rapid Prototyping Environment (RPE) for developing new phenomenological models, including analysis and visualization aspects -- Rather different from item 2, as interactivity matters more than performance. It could involve a suite of Java applets and/or server-side simulations. This is a distinctive feature of GEM and could be very important; even if client-side, it needs to integrate with the other components, including visualization and data handling.


Foil 12 7: Overall Integration of GEMCI into a PSE (Problem Solving Environment)

Our suggested strategy is very compatible with the basic distributed object web approach, which is growing in popularity.
  • For instance, NPAC has successfully applied this in a few NCSA and DoD "Problem Solving Environments". Such a commodity software system naturally tracks the rapid evolution of technologies and preserves rich functionality (UNIX and Windows compatibility, easy access to databases, etc.), leading to a more maintainable system.
As the object web is still under development, we recommend a modest effort compatible with its general principles; details will emerge.
Include in the PSE fault simulation, information systems, and ground motion simulations.


Foil 13 Multi-Server Gateway Tier

[Diagram: the multi-server gateway tier. Gateway Control provides agent-based choice of compute engine and multidisciplinary control (WebFlow). Behind it sit back-end proxies (Parallel DB Proxy, Origin 2000 Proxy, IBM SP2 Proxy) and services (NEOS Control Optimization, NetSolve Linear Algebra Server, Data Analysis Server) fronting databases, matrix solvers, optimization services, and MPPs.]
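
A toy sketch of the dispatch idea in the diagram; every name and endpoint string below is invented for illustration:

    # Gateway Control with "agent-based choice of compute engine": requests
    # name a service, and the gateway picks the registered back-end proxy.
    PROXIES = {
        "linear_algebra": "netsolve-linear-algebra-server",
        "optimization":   "neos-control-optimization",
        "database":       "parallel-db-proxy",
        "analysis":       "data-analysis-server",
    }

    def route(service):
        if service not in PROXIES:
            raise ValueError(f"no back end registered for {service!r}")
        return PROXIES[service]

    print(route("linear_algebra"))   # -> netsolve-linear-algebra-server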


Foil 14 Pacific Disaster Opportunity

Build an interactive decision support system that enables real-time data and simulations to be shared in an environment supporting:
  • audio/video conferencing, whiteboards, shared databases, web pages, etc.
We propose to use TangoInteractive, which was originally developed for DoD as a "web-based" command and control system and has since been explored mainly in "synchronous" distance education.
  • It allows sharing of data (cf. shared GIS) and of visualizations of both GEM and other simulations (including existing probabilistic estimates).
  • In principle, any object web entity can be shared.


Foil 15 Tango and HPCC Gateways

[Diagram: geographically distributed supercomputer resources, connected through a Gateway System hosting seamless access, model composition, TangoInteractive, visualization, and other services, to geographically distributed users and consultants.]


Foil 16 Crisis Management



Foil 17 Shared Simulations -- Fluid Flow and Planetary Motion


© Northeast Parallel Architectures Center, Syracuse University, npac@npac.syr.edu


Page produced by wwwfoil on Sun Dec 6 1998