Given by Geoffrey Fox at CRPC Annual Meeting on May 14-17 1996. Foils prepared May 12 1996
Summary of Material
We describe several HPCC Large Scale Simulations in which NPAC is involved and comment on implications for HPF! |
Work in Porting ARPS Weather code to Syracuse region and Integration with VRML Visualization |
Work from InfoMall Industry Outreach on Financial modelling with Monte Carlo SP2 code linked to Web for "Pricing on Demand" |
Problem Solving Environment and Adaptive Meshes for NSF Grand Challenge on Binary Black Hole Collisions |
NASA Grand Challenge on 4D Data Assimilation |
A set of activities (mainly with PNL) on Computational Chemistry -- Relation of HPF and Global Arrays |
CRPC Annual Meeting |
May 15 1996 |
Geoffrey Fox |
NPAC |
Syracuse University |
111 College Place |
Syracuse NY 13244-4100 |
We briefly review 3 topics |
Capabilities of the Oklahoma Advanced Regional Prediction System (ARPS) code. |
Collaboration with the Center for Analysis and Prediction of Storms (CAPS). |
VRML integration with the Terrain Data Set. |
Non-hydrostatic, compressible dynamics in a terrain-following vertical coordinate. |
Six-phase water microphysics (water vapor, cloud water, rain water, cloud ice, snow, and hail/graupel). |
Supports MPPs (T3D, SP2, clusters, ...) |
Other current numerical prediction models lack the spatial resolution required to capture small-scale, short-duration events such as snow bands.
|
Note Syracuse Area weather very position sensitive |
We had a meeting with CAPS during the last week of April, and the outcome is the following:
|
We will use the current output from ARPS in HDF format. |
We will visualize the moisture variables.
|
For the integration of the weather data to the existing 3D VRML terrain data set, we will use a 2D mask function which will determine where snow has fallen to a certain level. |
VRML data will be stored in a database, allowing the user to interactively turn on and off the weather information over the terrain. |
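The 2D mask step described above can be sketched in a few lines; the snow-depth grid and the threshold below are invented for illustration and are not taken from the actual ARPS output:

```python
import numpy as np

# Hypothetical snow-depth field on the same 2D grid as the terrain
# (values and threshold are illustrative, not real ARPS output)
snow_depth = np.array([[0.0, 0.2, 0.5],
                       [0.1, 0.6, 0.8],
                       [0.0, 0.0, 0.3]])

THRESHOLD = 0.25  # assumed accumulation level (arbitrary units)

# 2D mask: True where snow has fallen past the threshold;
# the VRML layer would be drawn only over the True cells
mask = snow_depth >= THRESHOLD

print(mask)
```

The mask can then be stored alongside the terrain in the database, so the browser only toggles visibility rather than recomputing anything.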
Cooperative distributed (and parallel) computing will become mainstream in financial engineering due to a convergence of the following factors: |
Increased volatility due to globalization of financial markets |
Global distribution of data sources |
Increase in complexity of derivatives and risk management vehicles |
Increased demand for real-time asset allocation decision support |
Increased volume of raw data and need to process large databases |
Increased volume on the retail side of the spectrum in part due to on-line technologies (Internet and WWW) |
HPCC is becoming indispensable in application domains such as:
|
NPAC is engaged in development of new tools for quantitative financial modeling which take advantage of scalable computer architectures |
The ultimate goal is to integrate various quantitative analyses transparently, using Web technologies, into a seamless cooperative computing environment capable of supporting all aspects of enterprise-wide risk management. |
We developed new algorithms for risk neutral valuation of derivative financial instruments |
Theoretical prices of derivative instruments are obtained by discounting their expected payoffs under the equivalent martingale measure using the money-market interest rate. |
The core algorithm is Path Integral Monte Carlo, which is used to generate arbitrary distributions of underlying risk factors (stocks, bonds, short interest rates, commodities, indices, etc.) |
The advantage of the new algorithm is that sensitivities of derivative prices with respect to changes in all model parameters are computed in a single simulation.
|
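As a minimal sketch of the valuation principle above (discount the expected payoff under the martingale measure) and of getting a sensitivity from the same simulation, here is a Monte Carlo pricer for a European call under geometric Brownian motion with a pathwise delta. All parameters are illustrative; the actual NPAC code is C/MPI and handles far more general path-dependent contracts:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative GBM risk factor and contract (not from the talk)
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
n_paths = 200_000

# Sample terminal prices under the risk-neutral (martingale) measure
Z = rng.standard_normal(n_paths)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)

# Theoretical price = discounted expected payoff
payoff = np.maximum(ST - K, 0.0)
price = np.exp(-r * T) * payoff.mean()

# Pathwise sensitivity (delta) from the very same paths:
# d(payoff)/dS0 = 1{ST > K} * ST / S0 for this contract
delta = np.exp(-r * T) * ((ST > K) * ST / S0).mean()

print(round(price, 2), round(delta, 3))
```

The point of the pathwise trick is exactly the claim above: price and sensitivity come out of a single simulation, with no re-run under bumped parameters.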
The parallel version of the algorithm is written in C and MPI and relies on task parallelism and functional decomposition (it could also use HPF) |
Monte Carlo samples are generated on multiple processors in embarrassingly parallel fashion |
Pricing modules can either run in lock-step with the Monte Carlo module which generates histories of risk factors or asynchronously perform valuation functions on the histories which are broadcast as they are generated by the Monte Carlo module |
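The embarrassingly parallel structure can be sketched as follows: independent workers each generate a block of Monte Carlo histories from their own seed, and the partial estimates are combined at the end. This is a toy stand-in for the C/MPI implementation; the thread workers, parameters, and block sizes are all illustrative:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def mc_block(seed, n=50_000):
    """One worker's block of Monte Carlo histories.

    Illustrative parameters; the real code distributes these
    blocks over MPI processes rather than threads.
    """
    rng = np.random.default_rng(seed)
    S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
    Z = rng.standard_normal(n)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
    return np.exp(-r * T) * np.maximum(ST - K, 0.0).mean()

# Four independent "processors"; no communication is needed
# until the partial results are averaged at the end
with ThreadPoolExecutor(max_workers=4) as ex:
    partials = list(ex.map(mc_block, range(4)))

estimate = sum(partials) / len(partials)
print(round(estimate, 2))
```

Because the workers never interact, speedup is essentially linear in the number of processors, which is why this decomposition suits both lock-step and asynchronous pricing modules.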
We are linking this flexible algorithm with a novel scheme based on Maximum Entropy method which generates implied probability distributions from reported option prices. |
The implied distributions can be used within the Path Integral Monte Carlo module to price exotic contracts consistently with exchange-traded contracts and they can also be used to search for arbitrage opportunities |
Estimation of implied distributions requires large scale global optimizers. |
We are developing two parallel stochastic optimizers based on mean field approximation (Laplace formula) and Langevin equation |
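A minimal sketch of a Langevin-equation optimizer, assuming its standard form (gradient descent plus annealed noise). The double-well objective below merely stands in for the maximum-entropy objective and is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def grad(x):
    # Gradient of a toy double-well objective f(x) = (x^2 - 1)^2,
    # standing in for the maximum-entropy objective (illustrative)
    return 4.0 * x * (x**2 - 1.0)

x = 3.0          # start far from both minima at x = +/-1
lr = 1e-3        # step size
temp = 1.0       # initial "temperature" of the Langevin noise

for step in range(20_000):
    # Langevin update: drift down the gradient, diffuse with noise
    noise = np.sqrt(2.0 * lr * temp) * rng.standard_normal()
    x = x - lr * grad(x) + noise
    temp *= 0.9995  # anneal the noise so the walk settles in a basin

print(round(float(x), 2))
```

Early on, the noise lets the iterate hop between basins (the "global" part of global optimization); as the temperature decays, the dynamics reduce to plain gradient descent within the basin it last occupied.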
Derivative valuation functions are integrated using Web technologies into a service which can be accessed from any platform which supports a graphical browser |
Using a combination of an HTML-forms or Java front end, the CGI mechanism, Perl scripts, and modules written in C and MPI (executed on multiple NPAC RS/6000 and Sun workstations and the SP-2), the user can:
|
In the next stage, flat files will be replaced with a parallel Oracle server |
Ultimately, the graphical user interface will be supplemented with an agent-based middleware layer, implemented in Java, where derivative pricing and risk management services will be requested and dispatched to the parallel Monte Carlo engine and returned to the client using an EDI-like protocol encapsulated within the KQML envelope. |
This will be a prototype of the new service economy that will flourish on the Web. |
The Alliance will produce an accurate, efficient description of the coalescence of black holes and the gravitational radiation they emit, by computationally solving Einstein's equations for gravitational fields, with direct application to the gravity-wave detection systems LIGO and VIRGO under construction in the USA and Europe. |
Austin- Chapel Hill- Cornell- NCSA- Northwestern- Penn State- Pittsburgh- NPAC has Formal Goals |
To develop a problem solving environment for the nonlinear Einstein equations describing General Relativity, including a dynamical adaptive multilevel parallel infrastructure |
To provide controllable convergent algorithms to compute gravitational waveforms which arise from Black Hole encounters, which are relevant to astrophysical events, and which may be used to predict signals for detection by future ground- and space-based detectors.
|
To provide representative examples of computational waveforms. |
http://www.npac.syr.edu/projects/bbh/bbh.html |
Problem size: Analysis with Uniform Grid
|
Solution: Adaptive Mesh Refinement
|
Einstein's equations can be represented as a coupled system of hyperbolic and elliptic PDEs with non-trivial boundary conditions to be solved using adaptive multilevel methods |
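The refinement idea can be sketched in one dimension: flag cells where the solution varies rapidly and insert finer points only there. This is a toy criterion; DAGH's actual grid hierarchy, clustering, and multilevel time stepping are far more sophisticated:

```python
import numpy as np

# Coarse 1D grid with a sharp feature, standing in for fields
# near a black hole (all values illustrative)
x = np.linspace(-1.0, 1.0, 41)
u = np.exp(-50.0 * x**2)

# Flag cells where the local jump exceeds a threshold --
# a common (and here simplified) AMR refinement criterion
du = np.abs(np.diff(u))
flag = du > 0.05

# Refine flagged cells by inserting midpoints (one extra level)
fine_x = []
for i, flagged in enumerate(flag):
    fine_x.append(x[i])
    if flagged:
        fine_x.append(0.5 * (x[i] + x[i + 1]))
fine_x.append(x[-1])

print(len(x), len(fine_x))
```

The payoff is exactly the one claimed above: resolution is spent only where the fields demand it, instead of refining the whole uniform grid.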
We are building a PSE that will support:
|
To implement the system we use technologies developed by CRPC, in particular MPI and HPF, combined with emerging new Web technologies: JAVA and VRML 2.0. |
Main Approach is DAGH: Distributed Adaptive Grid Hierarchy (J. Browne, M. Parashar at Texas)
|
HPF (Syracuse) relegated to reserve status
|
DAGH is a similar model to HPF, with much less general capability but excellent support for AMR data structures and distribution |
Architecture of AMR for both HPF and other implementation |
We use object oriented as well as parallel features of HPF |
See previous foil for definition of module types |
HPF currently doesn't support the general irregular distributions necessary for good AMR parallelization |
The Alliance Software Library |
Coordination of the development of the base ADM (one of two major algorithms) evolution code (T2)
|
Design and implementation of the PSE
|
Visualization and Collaborative tools |
Development of CS course module (CPS713) on Numerical Relativity and CFD |
Scalable, portable performance of the unigrid code |
http://www.npac.syr.edu/users/haupt/bbh/HPF/index.html |
6 "full" size applications
|
HPFA application kernels |
HPF taught in CPS615 and CPS713 courses
|
We find that our best results with HPF codes come from those designed as parallel codes. |
Traditional Model of "dusty deck + HPF directives" often does not work well. |
HPF is not as intuitive as expected; students have problems getting good performance. Reasons:
|
PGI Compiler is typically very successful but ... |
PGHPF is a translator, not a "real" compiler. |
We encountered situations where the Fortran 77 code generated by PGHPF cannot be compiled by the native IBM xlf compiler at a decent optimization level (O3), which leads to poor node performance (with linear parallel speedup). |
We suspect that the problem lies in array index expressions, which are always one-dimensional(?). |
Lack of support for pointers to distributed arrays (vendors promise this soon, however). |
HPF-1 does not allow for distribution of components of derived types (tensor notation, tree structure of grids in AMR). Wait for HPF-2. |
HPF-1 is too restrictive wrt data and computation distribution (irregular block, subset of processors). Wait for HPF-2. |
From Gregor von Laszewski |
http://www.npac.syr.edu/users/bernhold/comp_chem |
Use of modeling in chemistry has exploded in recent years, driving the push for larger and more accurate calculations to simulate "real world" chemical phenomena. |
Chemistry applications range in cost from N^2 to N^4, N^6, and higher (N proportional to the size of the molecule). They can be both CPU- and memory-intensive. |
Interested both in legacy and "HPCC-designed" applications
|
http://www.npac.syr.edu/users/thlin/Mopac.html |
Widely used, nearly free legacy application; 30,000 lines of Fortran 77 |
Solves the Hartree-Fock/self-consistent field (SCF) problem using "semiempirical" (approximate) representations of the electronic interactions.
|
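The self-consistent field structure of the computation can be sketched with a toy two-level model: the effective Hamiltonian depends on the density, so one iterates diagonalization and density updates until a fixed point is reached. All matrices and numbers below are invented for illustration; MOPAC's semiempirical Fock matrix is of course far larger:

```python
import numpy as np

def effective_h(n):
    # Toy "effective Hamiltonian": a 2x2 matrix whose coupling
    # depends on the current density n (illustrative only; a real
    # SCF builds the Fock matrix from the full electron density)
    return np.array([[-1.0,     -0.5 * n],
                     [-0.5 * n,  1.0    ]])

n = 1.0        # initial density guess
mixing = 0.5   # damp updates to stabilize the iteration

for it in range(100):
    # Diagonalize and occupy the lowest state
    evals, evecs = np.linalg.eigh(effective_h(n))
    psi = evecs[:, 0]
    new_n = 2.0 * psi[0]**2          # density from the occupied orbital
    if abs(new_n - n) < 1e-10:       # self-consistency reached
        break
    n = (1.0 - mixing) * n + mixing * new_n

print(round(float(n), 4))
```

The damping ("mixing") step mirrors standard SCF practice: feeding the raw output density straight back in can oscillate or diverge, so successive densities are blended.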
The majority of computation is concentrated in |
|
The parallel implementation is being tested on Cornell SP-2 now |
http://www.emsl.pnl.gov:2080/docs/nwchem/nwchem.html |
New (begun 1993) computational chemistry package designed specifically for large-scale calculations on MPPs. |
NPAC is collaborating with Pacific Northwest National Laboratory (PNNL), which leads the development. |
Includes many comp. chem. methods: molecular dynamics, ab initio self-consistent field (SCF) and correlated methods. |
Implemented in Fortran77 & C using a distributed-data approach. All data larger than O(N) is distributed. |
Based on Global Array Toolkit -- provides programmer with one-sided shared-memory programming model regardless of underlying platform
|
http://www.emsl.pnl.gov:2080/docs/global/ga.html |
Note Matrix Formation and Algebra underlies much Chemistry |
Provides the programmer with a one-sided shared-memory programming model regardless of underlying platform |
Interfaces with parallel linear algebra libraries: PeIGS, ScaLAPACK, ISDA, etc. |
Exposes the NUMA nature common to all platforms to the programmer -- efficient portable algorithms must consider or exploit NUMA |
Portable -- implementations available for
|
Designed for straightforward migration to HPF |
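A toy sketch of the one-sided put/get/accumulate model that Global Arrays provides. Here a single numpy array stands in for the distributed storage, and the method names mirror, but are not, the real GA API:

```python
import numpy as np

class ToyGlobalArray:
    """Minimal sketch of the one-sided put/get/accumulate model.

    The real Global Array Toolkit distributes blocks of the array
    across processors and implements these operations with one-sided
    communication; a single numpy array stands in for that storage
    here (illustrative only).
    """

    def __init__(self, shape):
        self._data = np.zeros(shape)

    def put(self, lo, hi, block):
        # Write a patch without cooperation from any "owner" process
        self._data[lo[0]:hi[0], lo[1]:hi[1]] = block

    def get(self, lo, hi):
        # Read a patch, again one-sided
        return self._data[lo[0]:hi[0], lo[1]:hi[1]].copy()

    def acc(self, lo, hi, block):
        # Accumulate (add) into a patch -- a key primitive for
        # assembling Fock matrices in parallel SCF codes
        self._data[lo[0]:hi[0], lo[1]:hi[1]] += block

ga = ToyGlobalArray((4, 4))
ga.put((0, 0), (2, 2), np.ones((2, 2)))
ga.acc((1, 1), (3, 3), np.full((2, 2), 2.0))
print(ga.get((1, 1), (2, 2)))
```

The essential property, as stated above, is that callers address the global index space directly; no matching receive or barrier on the "owning" processor is required.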
Web-based Global Arrays (similar to WebHPF) being developed by Kivanc Dincer (NPAC) |
Parallel I/O requirements of NWChem algorithms with Alok Choudhary (SU) and Dan Reed (UIUC) |
Global Arrays on top of Active Messages with Nikos Chrisochoides (Cornell) |
Development of new theoretical methods and new algorithms for large-scale correlated calculations |
Chemical applications in collaboration with Syracuse and PNNL chemists |
Future Plans: Model computational chemistry applications in HPF (port from Global Array-based algorithms) |
Problem: Knowledge and discussions of interest to chemists are scattered all over the Internet -- hard to find and use!
|
Solution: Use the AskNPAC news Web linked database technology to provide single point of contact, archiving, and search capability via WWW
|
Provides "one stop shopping":
|
Future plans:
|
Contact: David Bernholdt, |
or Gang Cheng {bernhold,gcheng}@npac.syr.edu |