Full HTML for Scripted Foilset: Overview of HPCC Applications at NPAC

Given by Geoffrey Fox at CRPC Annual Meeting on May 14-17 1996. Foils prepared May 12 1996
Outside Index Summary of Material


We describe several HPCC Large Scale Simulations in which NPAC is involved and comment on implications for HPF!
Work on porting the ARPS weather code to the Syracuse region and integrating it with VRML visualization
Work from InfoMall industry outreach on financial modelling with a Monte Carlo SP2 code linked to the Web for "Pricing on Demand"
Problem Solving Environment and Adaptive Meshes for the NSF Grand Challenge on Binary Black Hole Collisions
NASA Grand Challenge on 4D Data Assimilation
A set of activities (mainly with PNL) on Computational Chemistry -- Relation of HPF and Global Arrays

Table of Contents for full HTML of Overview of HPCC Applications at NPAC

1 Overview of HPCC Applications at NPAC
2 Abstract for Overview of HPCC Applications at NPAC
3 Real-Time Interactive Distributed Weather Information System
4 Capabilities of the ARPS code
5 Collaboration with CAPS
6 VRML integration with the terrain data set
7 Aspects of Financial World Motivating HPCC
8 Financial Application areas for which High-performance computing technologies are becoming indispensable
9 Path Integral Approach to Derivative Valuation
10 Parallel Maximum Entropy and optimization
11 Web-based System Integration -- Initial Server Implementation
12 Web-based System Integration -- Futures
13 Option Pricing
14 Option Pricing
15 Option Pricing
16 Option Pricing
17 Binary Black Holes Simulation
18 The Binary Black Hole Grand Challenge Alliance
19 BBH: Computational Challenge
20 Adaptive Multilevel Parallel Infrastructure
21 HPF and DAGH Implementation Strategies
22 Implementation of AMR in BBH
23 Fortran90 Module Structure
24 Data Structures in F90 BBH Code
25 Parallelization of AMR (in HPF) -- I
26 Syracuse Contributions to Black Hole GC
27 HPF implementation of T2
28 HPF Application Experience at NPAC
29 HPF: Some Problems we found
30 Special Problems with PGI HPF Compiler
31 Problems with Adaptive Mesh Refinement in HPF
32 Main Window for Java Interface to Distributed Computing Environment
33 Screens Opened for Java Interface to Distributed Computing Environment
34 Data Window Opened for Java Interface to Distributed Computing Environment
35 Specification Screen Opened for Java Interface to Distributed Computing Environment
36 Subroutine Specification Screen Opened for Java Interface to Distributed Computing Environment
37 Computational Chemistry at NPAC
38 Computational Chemistry at NPAC -- MOPAC
39 Computational Chemistry at NPAC -- NWChem
40 Global Array Toolkit (PNNL)
41 Computational Chemistry at NPAC -- Related Projects
42 AskNPAC about Chemistry -- NHSE Discipline Specific Resource
43 AskNPAC about Chemistry -- NHSE



HTML version of Scripted Foils prepared May 12 1996

Foil 1 Overview of HPCC Applications at NPAC

From Overview of HPCC Applications at NPAC CRPC Annual Meeting -- May 14-17 1996. *
Full HTML Index
CRPC Annual Meeting
May 15 1996
Geoffrey Fox
NPAC
Syracuse University
111 College Place
Syracuse NY 13244-4100

Foil 2 Abstract for Overview of HPCC Applications at NPAC
We describe several HPCC Large Scale Simulations in which NPAC is involved and comment on implications for HPF!
Work on porting the ARPS weather code to the Syracuse region and integrating it with VRML visualization
Work from InfoMall industry outreach on financial modelling with a Monte Carlo SP2 code linked to the Web for "Pricing on Demand"
Problem Solving Environment and Adaptive Meshes for the NSF Grand Challenge on Binary Black Hole Collisions
NASA Grand Challenge on 4D Data Assimilation
A set of activities (mainly with PNL) on Computational Chemistry -- Relation of HPF and Global Arrays

Foil 3 Real-Time Interactive Distributed Weather Information System
We briefly review three topics:
Capabilities of the Oklahoma Advanced Regional Prediction System (ARPS) code.
Collaboration with the Center for Analysis and Prediction of Storms (CAPS).
VRML integration with the terrain data set.

Foil 4 Capabilities of the ARPS code
Non-hydrostatic, compressible dynamics in a terrain-following vertical coordinate.
Six-class water microphysics (water vapor, cloud water, rain water, cloud ice, snow, and hail/graupel).
Supports MPPs (T3D, SP2, clusters, ...)
Other current numerical prediction models lack the spatial resolution required to capture small-scale, short-duration events such as snow bands.
  • Can predict down to microscale phenomena:
  • Timescales of 0 to 1 hours.
  • Location of events to within 1 km and timing to within 5 min.
  • ΔV ±2 m/s, ΔT ±2 K, precipitation rate ±2 mm/hr.
Note that Syracuse-area weather is very position sensitive.

Foil 5 Collaboration with CAPS
We met with CAPS during the last week of April, and the outcome is the following:
  • They will supply us with an initial data set over the Syracuse region.
  • This data set will be taken from the end of April, during severe thunderstorms in the region.
  • By mid-summer they will try to help us run the code to evolve the initial data.
    • The first attempts should result in stable runs, with inaccurate predictions.
  • Accurate simulations are estimated to require roughly half a person-year of further effort.
    • This timeline would allow for the accurate prediction of lake-effect snow.
    • Lake-effect snow is caused mostly by cold wind blowing over a relatively warm lake.

Foil 6 VRML integration with the terrain data set
We will use the current output from ARPS in HDF format.
We will visualize the moisture variables.
  • We run an isosurface routine on the raw data, which generates a representation of cloud formation.
  • We are developing a Java applet which will read in 3D data and determine isosurfaces.
    • The user will input the isosurface value from a WWW page running the applet.
    • The output of this applet will be VRML data that is rendered instantly.
    • It can also be stored in our Illustra object database.
To integrate the weather data with the existing 3D VRML terrain data set, we will use a 2D mask function which determines where snow has fallen to a certain level.
VRML data will be stored in a database, allowing the user to interactively turn the weather information over the terrain on and off.
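The 2D mask function described above can be sketched as follows. This is an illustrative reconstruction, not the NPAC code; the grid values and threshold are assumptions.

```python
# Illustrative sketch (not the original NPAC code): build a 2D mask that
# marks grid cells where accumulated snowfall exceeds a display threshold,
# so the VRML weather layer is drawn only over those terrain cells.

def snow_mask(snowfall, threshold_mm):
    """snowfall: 2D list (rows x cols) of accumulated snowfall in mm."""
    return [[1 if cell >= threshold_mm else 0 for cell in row]
            for row in snowfall]

if __name__ == "__main__":
    # Hypothetical 3x3 accumulation field (mm) over the terrain grid.
    field = [[0.0, 5.0, 12.0],
             [3.0, 9.0, 20.0],
             [0.0, 0.0, 7.0]]
    mask = snow_mask(field, threshold_mm=6.0)
    print(mask)  # cells with >= 6 mm are flagged for the VRML overlay
```

Storing only the mask, rather than the full moisture field, keeps the database record small and makes toggling the weather layer over the terrain cheap.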

Foil 7 Aspects of Financial World Motivating HPCC
Cooperative distributed (and parallel) computing will become mainstream in financial engineering due to a convergence of the following factors:
Increased volatility due to globalization of financial markets
Global distribution of data sources
Increase in complexity of derivatives and risk management vehicles
Increased demand for real-time asset allocation decision support
Increased volume of raw data and need to process large databases
Increased volume on the retail side of the spectrum in part due to on-line technologies (Internet and WWW)

Foil 8 Financial Application areas for which High-performance computing technologies are becoming indispensable
HPCC is becoming indispensable in application domains such as:
  • Derivative Valuation -- particularly over-the-counter products and exotics
  • Portfolio optimization, valuation and asset allocation
  • Hedging of large portfolios in real time
  • Arbitrage trading
  • Risk analysis simulations
  • Pattern recognition
  • Detection of fraud
  • Credit risk analysis
  • Market segmentation
NPAC is engaged in development of new tools for quantitative financial modeling that take advantage of scalable computer architectures
The ultimate goal is to integrate various quantitative analyses transparently, using Web technologies, into a seamless cooperative computing environment capable of supporting all aspects of enterprise-wide risk management.

Foil 9 Path Integral Approach to Derivative Valuation
We have developed new algorithms for risk-neutral valuation of derivative financial instruments
Theoretical prices of derivative instruments are obtained by discounting their expected payoffs under the equivalent martingale measure using the money-market interest rate.
The core algorithm is Path Integral Monte Carlo, which is used to generate arbitrary distributions of underlying risk factors (stocks, bonds, short interest rates, commodities, indices, etc.)
The advantage of the new algorithm is that sensitivities of derivative prices with respect to changes in all model parameters are computed in a single simulation.
  • This is crucial for effective hedging.
The parallel version of the algorithm is written in C and MPI and relies on task parallelism and functional decomposition (could also use HPF)
Monte Carlo samples are generated on multiple processors in an embarrassingly parallel fashion
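The pricing scheme on this foil — discounted expected payoff plus sensitivities from the same simulation — can be illustrated with a toy example. This is a hedged sketch, not the NPAC algorithm: the production code is a Path Integral Monte Carlo in C and MPI, whereas here a plain geometric Brownian motion model stands in for it, and all parameter values are assumptions.

```python
# Toy risk-neutral Monte Carlo: discount the expected payoff, and compute
# the delta sensitivity in the SAME simulation via the pathwise method.
import math
import random

def price_and_delta(s0, strike, rate, vol, maturity, n_paths, seed=42):
    """European call price and delta under geometric Brownian motion."""
    rng = random.Random(seed)
    disc = math.exp(-rate * maturity)
    drift = (rate - 0.5 * vol * vol) * maturity
    sdev = vol * math.sqrt(maturity)
    sum_pay = sum_delta = 0.0
    for _ in range(n_paths):
        s_t = s0 * math.exp(drift + sdev * rng.gauss(0.0, 1.0))
        if s_t > strike:
            sum_pay += s_t - strike   # payoff; discounting applied below
            sum_delta += s_t / s0     # pathwise derivative of the payoff
    return disc * sum_pay / n_paths, disc * sum_delta / n_paths

if __name__ == "__main__":
    price, delta = price_and_delta(100.0, 100.0, 0.05, 0.2, 1.0, 200_000)
    print(round(price, 2), round(delta, 2))  # near Black-Scholes 10.45 / 0.64
```

Each of the `n_paths` samples is independent, so distributing them across processors (as the SP2 code does with MPI) requires no communication until the final averaging step.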

Foil 10 Parallel Maximum Entropy and optimization
Pricing modules can either run in lock-step with the Monte Carlo module, which generates histories of risk factors, or asynchronously perform valuation functions on the histories, which are broadcast as they are generated by the Monte Carlo module
We are linking this flexible algorithm with a novel scheme based on the Maximum Entropy method, which generates implied probability distributions from reported option prices.
The implied distributions can be used within the Path Integral Monte Carlo module to price exotic contracts consistently with exchange-traded contracts, and they can also be used to search for arbitrage opportunities
Estimation of implied distributions requires large-scale global optimizers.
We are developing two parallel stochastic optimizers based on the mean-field approximation (Laplace formula) and the Langevin equation
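A minimal illustration of the Maximum Entropy idea (not the NPAC optimizer, which handles full option-price constraints with large-scale stochastic optimizers): among discrete distributions on a price grid whose mean matches a target, the entropy-maximizing one has an exponential form, and its single Lagrange multiplier can be found by bisection. The grid and target below are assumptions.

```python
# Maximum-entropy distribution on a discrete price grid subject to a mean
# constraint: the solution has the form p_i ~ exp(lam * x_i), and the
# multiplier lam is found by bisection (the mean is increasing in lam).
import math

def maxent_dist(prices, target_mean, lo=-1.0, hi=1.0, tol=1e-12):
    def mean_for(lam):
        w = [math.exp(lam * x) for x in prices]
        z = sum(w)
        return sum(x * wi for x, wi in zip(prices, w)) / z

    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * x) for x in prices]
    z = sum(w)
    return [wi / z for wi in w]

if __name__ == "__main__":
    grid = [80.0, 90.0, 100.0, 110.0, 120.0]   # hypothetical price grid
    p = maxent_dist(grid, target_mean=103.0)
    print([round(pi, 3) for pi in p])
```

With several constraints (one per quoted option price) the same exponential-family structure holds with one multiplier per constraint, which is where large-scale optimizers become necessary.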

Foil 11 Web-based System Integration -- Initial Server Implementation
Derivative valuation functions are integrated using Web technologies into a service which can be accessed from any platform that supports a graphical browser
Using a combination of HTML forms or a Java front end, the CGI mechanism, Perl scripts, and modules written in C and MPI, which are executed on multiple NPAC RS/6000 and Sun workstations and the SP-2, the user can:
  • retrieve historical data from flat files
  • perform statistical analysis
  • display charts and histograms of historical data
  • estimate parameters of the underlying stochastic processes
  • enter their own estimates of model parameters
  • perform simulations
  • display charts and plots of option prices and their sensitivities as functions of time, underlying stock price, or option contract exercise (strike) price
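The request flow above can be sketched as a dispatch table. This is a hypothetical miniature in Python; the actual system used HTML forms, CGI, and Perl scripts driving C/MPI modules, and the function and field names here are invented for illustration.

```python
# CGI-style dispatch sketch: parse decoded form fields and route to the
# requested analysis function. Handlers here are trivial stand-ins for the
# real statistical-analysis and simulation modules.

def simulate(params):
    return f"simulating with {params['n_paths']} paths"

def charts(params):
    return f"charting {params['symbol']}"

HANDLERS = {"simulate": simulate, "charts": charts}

def handle_request(form):
    """form: dict of decoded CGI form fields, e.g. {'action': 'simulate', ...}"""
    action = form.get("action")
    if action not in HANDLERS:
        return "error: unknown action"
    return HANDLERS[action](form)

if __name__ == "__main__":
    print(handle_request({"action": "simulate", "n_paths": "10000"}))
```

The point of the table is that adding a new analysis (say, parameter estimation) only means registering one more handler, without touching the front end.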

Foil 12 Web-based System Integration -- Futures
In the next stage, flat files will be replaced with a parallel Oracle server
Ultimately, the graphical user interface will be supplemented with an agent-based middleware layer, implemented in Java, in which derivative pricing and risk management services will be requested, dispatched to the parallel Monte Carlo engine, and returned to the client using an EDI-like protocol encapsulated within a KQML envelope.
This will be a prototype of the new service economy that will flourish on the Web.

Foil 13 Option Pricing

Foil 14 Option Pricing

Foil 15 Option Pricing

Foil 16 Option Pricing

Foil 17 Binary Black Holes Simulation
The Alliance will produce an accurate, efficient description of the coalescence of black holes, and of the gravitational radiation emitted, by computationally solving Einstein's equations for gravitational fields, with direct application to the LIGO and VIRGO gravity-wave detection systems under construction in the USA and Europe.

Foil 18 The Binary Black Hole Grand Challenge Alliance
The Alliance (Austin, Chapel Hill, Cornell, NCSA, Northwestern, Penn State, Pittsburgh, and NPAC) has formal goals:
To develop a problem solving environment for the nonlinear Einstein equations describing General Relativity, including a dynamical adaptive multilevel parallel infrastructure
To provide controllable, convergent algorithms to compute gravitational waveforms which arise from black hole encounters, which are relevant to astrophysical events, and which may be used to predict signals for detection by future ground- and space-based detectors.
  • This code will be made available to researchers in Computational Relativity (by publication and via the World Wide Web).
To provide representative examples of computational waveforms.
http://www.npac.syr.edu/projects/bbh/bbh.html

Foil 19 BBH: Computational Challenge
Problem size: analysis with a uniform grid
  • Requested spatial resolution: 50 mesh points per black hole (radius of event horizon, R)
  • To extract gravitational waves a space region of ~100 R is necessary
  • Number of mesh points: (50 x 100)^3 => ~10^11
  • Time evolution: 50,000 steps (corresponds to distance ~1000 R with dt = dx)
  • Total number of events: ~10^16
  • Floating point operations per event: ~10^4
  • Total FLOP count: 10^20 => 30 years of a Teraflop machine!
Solution: Adaptive Mesh Refinement
  • one week of a Teraflop machine
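The foil's estimates can be checked with straightforward arithmetic. Note that 10^20 floating point operations is roughly two years at a fully sustained teraflop, so the quoted 30 years presumably assumes a sustained rate well below peak.

```python
# Back-of-the-envelope check of the uniform-grid cost estimate above.
mesh_points = (50 * 100) ** 3           # 1.25e11, i.e. ~10^11
events = mesh_points * 50_000           # 6.25e15, i.e. ~10^16
flops = events * 10_000                 # 6.25e19, i.e. ~10^20
years_at_peak = flops / 1e12 / 3.15e7   # ~2 years at a sustained teraflop
print(f"{mesh_points:.2e} {events:.2e} {flops:.2e} {years_at_peak:.1f} yr")
```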

Foil 20 Adaptive Multilevel Parallel Infrastructure
Einstein's equations can be represented as a coupled system of hyperbolic and elliptic PDEs with non-trivial boundary conditions to be solved using adaptive multilevel methods
We are building a PSE that will support:
  • composition of stable, convergent AMR and MG solvers
  • software integration (initial value problem, apparent horizon finders, ...)
  • automatic conversion of sequential unigrid codes into parallel, multigrid versions
  • collaborative visualization environment
To implement the system we use technologies developed by CRPC, in particular MPI and HPF, combined with emerging new Web technologies: JAVA and VRML 2.0.

Foil 21 HPF and DAGH Implementation Strategies
The main approach is DAGH: Distributed Adaptive Grid Hierarchy (J. Browne, M. Parashar at Texas)
  • a set of programming abstractions in which computations on dynamic hierarchical grid structures are directly implementable (C++).
  • a set of distributed dynamic data-structures providing transparent scalable distribution of the grid hierarchy across processors (MPI).
  • a set of computational modules and AMR/MG support (shared with HPF when HPF supports this adaptive data structure)
  • (http://godel.ph.utexas.edu/Members/parashar/DAGH/dagh.html)
HPF (Syracuse) relegated to reserve status
  • data parallel implementation of unigrid PDE solvers as HPF does not support necessary distributions
  • Fortran90/HPF dynamic memory management
  • (http://www.npac.syr.edu/projects/bbh/more.html)
DAGH is a similar model to HPF, with much less general capability but excellent support for AMR data structures and their distribution

Foil 22 Implementation of AMR in BBH
Architecture of AMR for both HPF and other implementation

Foil 23 Fortran90 Module Structure
We use object oriented as well as parallel features of HPF

Foil 24 Data Structures in F90 BBH Code
See previous foil for definition of module types

Foil 25 Parallelization of AMR (in HPF) -- I
HPF currently doesn't support the general irregular distributions necessary for good AMR parallelization

Foil 26 Syracuse Contributions to Black Hole GC
The Alliance Software Library
Coordination of the development of the base ADM (one of two major algorithms) evolution code (T2)
  • DAGH implementation of T2 (with Austin)
  • HPF implementation of T2 on SP-2, T3D and DEC Alphas
  • Library Components of the AMR/MG systems
Design and implementation of the PSE
  • Standardization of the module interfaces
  • AMR drivers (DAGH, Fortran 90, HPF)
  • JAVA based GUI
Visualization and Collaborative tools
Development of CS course module (CPS713) on Numerical Relativity and CFD

Foil 27 HPF implementation of T2
Scalable, portable performance of the unigrid code
http://www.npac.syr.edu/users/haupt/bbh/HPF/index.html

Foil 28 HPF Application Experience at NPAC
6 "full" size applications
  • BBH: ADM Evolution Code (integrates Einstein's equations)
  • BBH: model for waveform extraction (for scalar waves)
  • Electromagnetic Simulations
  • Financial Modelling
  • 4D Data Assimilation (Study -- Incomplete)
  • 1D PIC Plasma Simulations (using extrinsics)
HPFA application kernels
HPF taught in CPS615 and CPS713 courses
  • Many student projects, including quite complex CFD codes from Aerospace Engineering -- students are (currently) confused by the HPF environment

Foil 29 HPF: Some Problems we found
We find that our best results with HPF codes come from those designed as parallel codes.
Traditional Model of "dusty deck + HPF directives" often does not work well.
HPF is not as intuitive as expected, and students have problems getting performance. Reasons:
  • Poor knowledge of Fortran 90.
  • Many "details" are not handled well without some code massaging. HPF's actions are often highly non-intuitive for inexperienced users.
  • Poor and cryptic compiler diagnostics and feedback.
  • Profilers (both PGI and DEC) help to identify sources of inefficiencies, though.

Foil 30 Special Problems with PGI HPF Compiler
The PGI compiler is typically very successful, but:
PGHPF is a translator, not a "real" compiler.
We encountered situations in which the Fortran 77 code generated by PGHPF cannot be compiled by the native IBM xlf compiler at a decent optimization level (-O3), which leads to poor node performance (despite linear parallel speedup).
We suspect that the problem lies in the array index expressions, which are always one-dimensional(?).

Foil 31 Problems with Adaptive Mesh Refinement in HPF
Lack of support for pointers to distributed arrays (vendors promise this soon, however).
HPF-1 does not allow for distribution of components of derived types (tensor notation, tree structure of grids in AMR). Wait for HPF-2.
HPF-1 is too restrictive wrt data and computation distribution (irregular block, subset of processors). Wait for HPF-2.

Foil 32 Main Window for Java Interface to Distributed Computing Environment
From Gregor von Laszewski

Foil 33 Screens Opened for Java Interface to Distributed Computing Environment
From Gregor von Laszewski

Foil 34 Data Window Opened for Java Interface to Distributed Computing Environment
From Gregor von Laszewski

Foil 35 Specification Screen Opened for Java Interface to Distributed Computing Environment
From Gregor von Laszewski

Foil 36 Subroutine Specification Screen Opened for Java Interface to Distributed Computing Environment
From Gregor von Laszewski

Foil 37 Computational Chemistry at NPAC
http://www.npac.syr.edu/users/bernhold/comp_chem
Use of modeling in chemistry has exploded in recent years, driving the push for larger and more accurate calculations to simulate "real world" chemical phenomena.
Chemistry applications range in cost from N^2 to N^4, N^6, and higher (N proportional to the size of the molecule). They can be both CPU- and memory-intensive.
Interested both in legacy and "HPCC-designed" applications
  • Existing codes are often quite large (100,000+ lines) and embody perhaps decades of effort -- not rewritten lightly!
  • Many legacy codes can be retrofitted with simple parallel algorithms that allow reasonable efficiency for small-scale parallelism on local resources (including NOWs).
  • Large-scale calculations require parallel computing using a distributed-data model. Naive parallel algorithms are generally insufficient. Requires codes constructed from scratch.

Foil 38 Computational Chemistry at NPAC -- MOPAC
http://www.npac.syr.edu/users/thlin/Mopac.html
Widely used nearly-free legacy application, 30,000 lines of Fortran77
Solves the Hartree-Fock/self-consistent field (SCF) problem using "semiempirical" (approximate) representations of the electronic interactions.
  • Applicable to large molecules (including biomolecules).
The majority of computation is concentrated in:
  • construction of the Coulson electron density matrix (embarrassingly parallel; implemented in MPI)
  • diagonalization (the original MOPAC routines replaced by the PeIGS library developed at PNNL)
The parallel implementation is now being tested on the Cornell SP-2.

Foil 39 Computational Chemistry at NPAC -- NWChem
http://www.emsl.pnl.gov:2080/docs/nwchem/nwchem.html
New (begun 1993) computational chemistry package designed specifically for large-scale calculations on MPPs.
NPAC collaborating with Pacific Northwest National Laboratory (PNNL) , which leads the development.
Includes many comp. chem. methods: molecular dynamics, ab initio self-consistent field (SCF) and correlated methods.
Implemented in Fortran77 & C using a distributed-data approach. All data larger than O(N) is distributed.
Based on Global Array Toolkit -- provides programmer with one-sided shared-memory programming model regardless of underlying platform
  • Portable: Implementations for distributed memory, shared memory, distributed clusters of SMP nodes, NOWs, I-WAY
  • Exposes NUMA nature common to all platforms to programmer -- efficient portable algorithms consider or use NUMA
  • Designed for straightforward migration to HPF

Foil 40 Global Array Toolkit (PNNL)
http://www.emsl.pnl.gov:2080/docs/global/ga.html
Note Matrix Formation and Algebra underlies much Chemistry
Provides programmer with one-sided shared-memory programming model regardless of underlying platform
Interfaces with parallel linear algebra libraries: PeIGS, ScaLAPACK, ISDA, etc.
Exposes NUMA nature common to all platforms to programmer -- efficient portable algorithms consider or use NUMA
Portable -- implementations available for
  • Distributed memory (interrupt-driven messages)
  • Shared memory (using SysV shared memory features)
  • Clusters of SMP nodes, NOWs, etc. (shared-memory within cluster, data server process for inter-cluster comms via simple message passing)
  • I-WAY (data replicated on distant MPPs)
Designed for straightforward migration to HPF
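The one-sided model the toolkit provides can be illustrated with a toy sketch. This is not the PNNL API: the real toolkit performs get/put/accumulate on distributed array sections via interrupt-driven messages or shared memory, whereas here "processes" are simulated in a single address space just to show the ownership and access pattern.

```python
# Toy model of the Global Arrays idea: each "process" owns a block of a
# logically shared array, and any process can read, write, or accumulate
# into any element without the owner's cooperation (one-sided access).

class GlobalArray:
    def __init__(self, n, nprocs):
        self.n, self.nprocs = n, nprocs
        self.block = (n + nprocs - 1) // nprocs
        # each process owns one contiguous block of the array
        self.chunks = [[0.0] * self.block for _ in range(nprocs)]

    def _owner(self, i):
        """Map a global index to (owning process, local offset)."""
        return i // self.block, i % self.block

    def put(self, i, value):          # one-sided write
        p, off = self._owner(i)
        self.chunks[p][off] = value

    def get(self, i):                 # one-sided read
        p, off = self._owner(i)
        return self.chunks[p][off]

    def accumulate(self, i, value):   # one-sided add
        p, off = self._owner(i)
        self.chunks[p][off] += value

if __name__ == "__main__":
    ga = GlobalArray(n=8, nprocs=4)
    for i in range(8):
        ga.put(i, float(i))
    ga.accumulate(5, 10.0)
    print(ga.get(5))  # element 5 lives on "process" 2
```

The owner mapping is what exposes the NUMA structure the foil mentions: a `get` on a locally owned block is cheap, while any other `get` would be a remote operation, so efficient algorithms arrange their work around block ownership.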

Foil 41 Computational Chemistry at NPAC -- Related Projects
Web-based Global Arrays (similar to WebHPF), being developed by Kivanc Dincer (NPAC)
Parallel I/O requirements of NWChem algorithms with Alok Choudhary (SU) and Dan Reed (UIUC)
Global Arrays on top of Active Messages with Nikos Chrisochoides (Cornell)
Development of new theoretical methods and new algorithms for large-scale correlated calculations
Chemical applications in collaboration with Syracuse and PNNL chemists
Future Plans: Model computational chemistry applications in HPF (port from Global Array-based algorithms)

Foil 42 AskNPAC about Chemistry -- NHSE Discipline Specific Resource
Problem: Knowledge and discussions of interest to chemists are scattered all over the Internet -- hard to find and use!
  • Nearly 90 mailing lists/newsgroups identified on the first pass
  • Many lists are not archived, or offer only e-mail retrieval (tedious)
  • Search capability very limited or nonexistent
  • Info is too widely distributed
Solution: Use the AskNPAC news Web linked database technology to provide single point of contact, archiving, and search capability via WWW
  • AskNPAC already supports archives (primarily newsgroups) in Computers & Software, Education & Kids, Politics, New York State & Health, Jobs
  • Use with largely e-mail-based discussions in a particular discipline puts a different "spin" on the technology

Foil 43 AskNPAC about Chemistry -- NHSE
Provides "one stop shopping":
  • Archiving (persistence)
  • Structured searching (headers vs. body, URLs, phrases, etc.)
  • Hypermail-like browser
Future plans:
  • Public roll-out when archive large enough to make searches worthwhile
  • Use search capabilities to extract announcements of software, web resources, etc. for further cataloging (i.e. NHSE)
Contact: David Bernholdt or Gang Cheng ({bernhold,gcheng}@npac.syr.edu)

© Northeast Parallel Architectures Center, Syracuse University, npac@npac.syr.edu


Page produced by wwwfoil on Sun Dec 14 1997