Reply-to: gcf@npac.syr.edu
To: hawick
Subject: Glossary
Date: Thu, 23 Mar 95 07:23:41 EST
From: gcf

BELOW *** I is Instance *** G is general
**** make the set of CRPC applications and grand challenges specific instances
**** CRPC applications are currently written rather generically; leave this in the general part, but more information needs to be added for the instance version when I split an entry into I and G; G is at the start and I at the end. The Instance should be called "X at Y" where Y is the site and X the field.

HPCC Applications and Technology Glossary

Version 1.01


A

*** I AHPCRC (n.) Researchers from the US Army High Performance Computing Research Center (AHPCRC) at the University of Minnesota are engaged in research into computational fluid dynamics, finding efficient algorithms for modern parallel computers, and means of visualising the results of CFD simulations. In particular, they have targeted the problem of simulating time-dependent problems over different Mach regimes, using state-of-the-art parallel preconditioned conjugate gradient solvers.

*** G Asynchronous Problems (n.) These are hard-to-parallelize problems with no natural algorithmic synchronization between the evolution of linked irregular data points. Further information and examples can be found in Parallel Computing Works, chapters I and II.

*** G atmospheric research (n.) Studies of the behaviour of the atmosphere involve both large scale numerical simulation and the analysis of significant quantities of data.

B

*** G back substitution (n.) See LU decomposition.

*** G battlefield simulations (n.) The complexity and expense of modern warfare have created the need for accurate computer simulations, allowing commanders to explore tactical and strategic issues at minimum cost. *** I The US Army Research Laboratory (ARL)'s Simulation Technology Division conducts research into this area.

**** this area goes to battlefield simulation

*** G biomolecular simulations (n.) The field of biotechnology has seen spectacular advances in recent years. The understanding of some of these processes at the molecular level has allowed computationally demanding models to be made. In particular, two areas which have received attention are automated DNA sequencing and the modeling of protein folding.

BLAS (n.) Basic Linear Algebra Subprograms: a suite of very basic linear algebra routines, out of which almost any other matrix calculation can be built. See the National Software Exchange for the BLAS source and documentation.

C

*** G C2I (n.) Command, Control and Intelligence. See C3I.

*** G C3I (n.) Command, Control, Communications and Intelligence (C3I) is one of the most important concepts in modern US warfare doctrine. The underlying idea is as old as war itself, but the prevalence of modern technology has allowed new avenues to be explored in the task of providing the vital eyes, ears, and voices for military commanders. *** I The US Air Force's Rome Laboratory has a dedicated C3I Technology programme and has developed nonmilitary spin-off contributions in a wide area of activities, from helping the United Nations' aid effort in the former Yugoslavia, to demonstration of the New York Network high speed network (with NYNEX, Inc.), as well as industrial technology transfer programmes.

*** G C4I (n.) Command, Control, Communications, Computers and Intelligence. See C3I.

*** G CAD (n.) Computer-Aided Design; a term which can encompass all facets of the use of computers in manufacturing, although the term CAM is also in use.

CAE (n.) Computer-Aided Engineering; like CAD, but usually applied to the use of computers in fields such as civil and nautical engineering.

CAM (n.) Computer-Aided Manufacturing.

*** G cardiac fluid dynamics (n.) The availability of supercomputers has expanded the models medical researchers are able to make. *** I Of particular note in the anatomical field of cardiac fluid dynamics is recent work by researchers at New York University, who have used supercomputer resources at the Pittsburgh Supercomputing Center to produce a full three dimensional View of the Heart. After 15 years of development, beginning with two dimensions and for the last five to six years working on the much more difficult three-dimensional version, the researchers finally achieved their goal of producing a fully functioning 3-D computational model of the heart, its valves and the flow of blood through the nearby major vessels. For this accomplishment, the researchers C. Peskin and D. McQueen received the 1994 Computerworld Smithsonian Award for Breakthrough Computational Science.

*** G CEM (n.) computational electromagnetics; the simulation or prediction of electromagnetic fields using computers. Models used include field and particle models and a number of techniques including: finite differences; finite elements; method of moments; and discrete methods. Further information is available as an article in the Encyclopaedia Technologica. See also CFD.

*** G CFD (n.) Computational fluid dynamics; the simulation or prediction of fluid flow using computers, a field which has generally required twice the computing power available at any given time.

*** G chemical processing plant simulations (n.) The cost of building modern chemical plants on an industrial scale is often orders of magnitude higher than the cost of a high-end supercomputer. Hence it is obviously cost effective to simulate the processes within the prospective plant as closely as possible, for example the behaviour inside distillation columns, both to prove designs and to optimise existing plants. *** I Bayer AG is one of the larger users of high performance computing for processing plant simulations.

*** G chemistry modeling (n.) an application category that includes simulations of: chemical potentials; elemental reactions and chemical dynamics. The problems are often formulated in terms of matrix elements; matrix eigenvalues; matrix multiplication and matrix inversion.

*** G classical dynamics (n.) See molecular dynamics.

*** G cleanup of groundwater (n.) This topic combines aspects of CFD, multiphase fluids in porous media, etc., into groundwater flow simulation for the production of environmental modeling codes. For further information see Computational Applications of the CRPC.

*** G climate modeling (n.) The development of sophisticated numerical models on high performance computers is currently the only realistic means Man has for answering questions regarding Earth's future climate. These models are more demanding than those used for short term, comparatively local, forecasting because the entire planet's atmospheric and ocean circulation needs to be considered, and for substantially longer time periods. The Climate and Global Dynamics Division (CGD) of NCAR studies the physical causes of present and past climates, with particular emphasis on global change. **** I Call Community Climate Model and other general climate applications *** repeat last sentence of climate entry The division also examines the large-scale dynamics of the atmosphere and oceans, developing complex supercomputer models, including NCAR's Community Climate Model. In addition, researchers at the Lawrence Livermore National Laboratory have demonstrated an Atmospheric-Oceanic model on a variety of high-performance platforms.

*** G climate studies (n.) The numerical simulation of the long term development of the earth's climate is an area of growing concern. Numerical models bring together chemistry modeling and fluid flow. Models are integrated over long time periods and numerical errors are a major concern; advanced methods are adopted in an attempt to reduce these. See also weather forecasting.

*** G combined applications (n.) Here several currently isolated computations are tightly coupled together to perform an advanced computational simulation of a complex problem. One example would be an engineering problem such as a fire simulation. Current codes make many simplifying assumptions, but an advanced code could have coupled high resolution CFD, structural dynamics, chemistry modeling, environmental modeling, graphics rendering and particle flux transport modeling computations within one code. See also complex systems.

*** G combustion process simulation (n.) Combustion processes dominate current transportation, as well as playing a major part in many industrial processes. Both the requirements of energy efficiency and emission reduction have led industry to increased use of computer simulation to design combustion devices. *** I *** call Specific Combustion Projects Researchers from the Los Alamos National Laboratory have worked with those from the Lawrence Livermore National Laboratory, the Courant Institute of Mathematical Sciences, and the University of California, Berkeley, to develop a new generation of computational tools for simulating combustion processes on complex three-dimensional geometries, using state-of-the-art numerical algorithms. This project makes use of adaptive meshes, and both data-parallel and domain decomposition algorithms are being developed for the parallel implementation. Support is from the Department of Energy's HPCC program.

*** G complex systems (n.) an applications category where many relatively simple modeling components are combined into a complex whole-system model. Examples might include defense simulations; education simulations; multimedia and virtual reality in entertainment situations; multiuser virtual worlds; and chemical and nuclear plant operation. Applications are often event driven and time stepped. Powerful computational engines are often needed, as well as very large database backends. These systems are often interactive. The MADIC project concerns the integration of large complex systems for the aerospace industry. See also CFD, CEM, structural dynamics.

*** G computational biology (n.) The simulation of biological molecules and processes. This combines many application areas including computational chemistry, genetic sequencing and molecular dynamics. Parallel technologies can assist in the simulation of the processes involved, as well as the processing and visualisation of the large amount of data produced. For further information see Computational Biology applications of the CRPC.

*** G computational biomolecular design (n.) Here aspects of biomolecular simulations and computational biology are combined with expertise in chemistry, chemical engineering, computer science and mathematics to produce models for life processes. For further information see Computational Applications of the CRPC.

*** G computational chemistry (n.) Numerical simulation of chemical reactions. Associated areas are molecular modeling, classical and quantum molecular dynamics involving both classical mechanics for particle-in-cell methods, and Monte Carlo methods for quantum systems. Manufacturing processes can involve study of applications such as CFD and multiphase fluids in porous medium flows. For further information see Computational Chemistry applications of the CRPC.

*** G Computational Fluid Dynamics (n.) See CFD.

*** G conjugate gradient method (n.) A technique for solving systems of linear algebraic equations, which proceeds by minimizing a quadratic residual error function. The method is iterative but quite powerful: in the absence of roundoff error, it will converge exactly in M steps, where M is the order of the system in question.
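
A minimal Python sketch of the algorithm follows (illustrative only; the function name and interface are not from any standard package), solving A x = b for a symmetric positive-definite matrix A:

    import numpy as np

    def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
        """Solve A x = b for symmetric positive-definite A."""
        n = len(b)
        max_iter = max_iter or n
        x = np.zeros(n)
        r = b - A @ x          # residual
        p = r.copy()           # search direction
        rs_old = r @ r
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rs_old / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs_old) * p
            rs_old = rs_new
        return x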

D

*** G data assimilation (n.) Here the collection and processing of realtime data from sources such as satellite or environmental studies are coupled with models of environmental change, such as weather forecasting, to improve accuracy. The process involves using temporal data along with spatial data as part of the input boundary conditions. For further information see Computational Applications of the CRPC.

*** G direct method (n.) Any technique for solving a system of equations that relies on linear algebra directly. LU decomposition with back substitution is an example of a direct method. See also indirect method.

E

*** G Embarrassingly Parallel Applications (n.) Such applications can employ complex algorithms but can be parallelized because the evolution of different points is largely independent. More information and examples can be found in Parallel Computing Works, chapter I.

*** G energy management (n.) Here the goal is to provide support for the design, implementation and management of energy efficient processes. It includes a wide range of computing technologies, from the design cycle and CAD/CAE to the management of large databases and realtime active control systems, which may include neural networks.

*** G environmental modeling (n.) an application category that can include: weather forecasting; climate simulation; oil reservoir simulation; waste repository simulation. Most work in this field is done using partial differential equation solvers either using finite difference methods or finite element methods. An important complication of models used in this field is their sensitivity to input data in the form of initial conditions and feedback.

F

*** G fast Fourier transform (n.) See FFT.

*** G fast multipole (n.) A hierarchical method, due to Greengard and Rokhlin, for evaluating the long range interactions between N particles in approximately O(N) operations by expanding distant groups of particles in multipole series. See also N-body simulation.

*** G FFT (n.) The fast Fourier transform is a technique for the rapid calculation of discrete Fourier transform of a function specified discretely at regular intervals. The technique makes use of a butterfly data structure.
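
A minimal Python sketch of the radix-2 Cooley-Tukey algorithm follows (illustrative only; the input length is assumed to be a power of two):

    import cmath

    def fft(x):
        """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
        n = len(x)
        if n == 1:
            return list(x)
        even = fft(x[0::2])
        odd = fft(x[1::2])
        out = [0] * n
        for k in range(n // 2):
            twiddle = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
            out[k] = even[k] + twiddle           # butterfly: upper half
            out[k + n // 2] = even[k] - twiddle  # butterfly: lower half
        return out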

*** G financial modeling (n.) an application category which is mostly concerned with decision support and optimisation. Techniques such as Monte Carlo methods and simulated annealing are often used. Financial simulation models are often linked to very large database systems to access pricing and historical data.

*** G finite difference method (n.) A method for the approximate solution of partial differential equations on a discrete grid, obtained by approximating derivatives of the unknown quantities on the grid by linear differences. See also finite element method.
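
As an illustration, a minimal Python sketch of one explicit finite difference time step for the one-dimensional heat equation (the function name and parameters are illustrative assumptions):

    import numpy as np

    def heat_step(u, alpha, dx, dt):
        """One explicit finite-difference step of du/dt = alpha * d2u/dx2."""
        u_new = u.copy()
        # central difference approximation of the second spatial derivative
        u_new[1:-1] = u[1:-1] + alpha * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
        return u_new  # boundary values u[0], u[-1] are held fixed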

*** G finite element method (n.) An approximate method for solving partial differential equations by replacing continuous functions by piecewise approximations defined on polygons, which are referred to as elements. Usually polynomial approximations are used. The finite element method reduces the problem of finding the solution at the vertices of the polygons to that of solving a set of linear equations. This task may then be accomplished by a number of methods, including Gaussian elimination, the conjugate gradient method and the multigrid method. See also finite difference method.

*** G forward reduction (n.) See LU decomposition.

*** G full matrix (n.) A full matrix is one in which the number of zero elements is small (in some sense) compared to the total number of elements. See also sparse matrix.

G

*** G *** Individual mentioned sites should be listed in instance glossary with some description *** galaxy formation (n.) Study of the formation and evolution of galaxies, using large-scale N-body simulation. Further information is available from the Grand Challenge Cosmology Consortium, the University of Washington HPCC Earth and Space Science Project, and the Los Alamos Theoretical Astrophysics Group. See also Grand Challenges, N-body simulation.

*** G Gauss-Seidel method (n.) An iterative method for solving partial differential equations on a grid. When updating a grid point the new value depends on the current values at the neighbouring grid points, some of which are from the previous iteration and some, which have already been updated, are from the current iteration. So updates are performed by sweeping through the grid in some user-chosen fashion. The key feature of this algorithm is that it makes use of new information (updated grid point values) as soon as they become available.
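
A minimal Python sketch of one Gauss-Seidel sweep for Laplace's equation on a two-dimensional grid (illustrative only):

    def gauss_seidel_sweep(u):
        """One in-place Gauss-Seidel sweep for Laplace's equation on a 2-D grid.
        Updated neighbour values are used as soon as they become available."""
        ny = len(u)
        nx = len(u[0])
        for i in range(1, ny - 1):
            for j in range(1, nx - 1):
                u[i][j] = 0.25 * (u[i-1][j] + u[i+1][j] + u[i][j-1] + u[i][j+1])
        return u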

*** G Gaussian elimination (n.) A method for solving sets of simultaneous linear equations by eliminating variables from the successive equations. The original equation in the form Ax=b (A is a matrix, b the vector of known values, and x the unknown solution vector) is reduced to Ux=c, where U is an upper triangular matrix. The solution vector x can then be found by back substitution. This method is usually formulated as LU decomposition.
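
A minimal Python sketch of Gaussian elimination with partial pivoting and back substitution (illustrative only; NumPy arrays are assumed as input):

    import numpy as np

    def gaussian_elimination(A, b):
        """Solve A x = b by reduction to upper-triangular form, then back substitution."""
        A = A.astype(float).copy()
        b = b.astype(float).copy()
        n = len(b)
        for k in range(n - 1):
            # partial pivoting: bring the row with the largest pivot to position k
            p = k + np.argmax(abs(A[k:, k]))
            A[[k, p]] = A[[p, k]]
            b[[k, p]] = b[[p, k]]
            for i in range(k + 1, n):
                m = A[i, k] / A[k, k]
                A[i, k:] -= m * A[k, k:]
                b[i] -= m * b[k]
        # back substitution on the upper-triangular system U x = c
        x = np.zeros(n)
        for i in range(n - 1, -1, -1):
            x[i] = (b[i] - A[i, i+1:] @ x[i+1:]) / A[i, i]
        return x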

*** G geoscience (n.) The study and simulation of the Earth's geology. This encompasses groundwater flow simulation, oil reservoir simulation, seismic wave simulation, and pollution modeling. Codes often involve several application areas including CFD, computational chemistry and data assimilation. For further information see the Geophysical Parallel Computation Project.

*** G Grand Challenges (n.) Grand Challenges are defined by the National Science Foundation as "fundamental problems in science and engineering with broad economic and scientific impact, whose solutions require the application of high-performance computing". For further information, see a list of Grand Challenges.

*** G graphics rendering (n.) An applications category covering computer graphics, computer animation and virtual reality. Common rendering algorithms include ray tracing, radiosity, and polygon-based rendering.

*** G groundwater flow simulation (n.) The problem of nuclear waste, from highly radioactive spent reactor fuel rods to rubber gloves from medical uses, is becoming a source of concern in many countries as available above-ground storage sites are filled. One promising solution is to place the material in vast underground repositories, to be sealed over geological time scales. For those examining potential sites, the major concern is whether the radionuclides will escape in unacceptable quantities via transport in groundwater. Computer simulations have been developed to answer these questions, based on the best available geological data. **** I The Edinburgh Parallel Computing Centre has collaborated with the United Kingdom Atomic Energy Authority to develop a suite of their

H

********** all H are G Hawick (n.) a Scots word meaning a "town surrounded by a hedge"; also an actual town on the border between Scotland and England; also my surname. This is not relevant to HPCC except that this is a useful way of ensuring my email address (hawick@npac.syr.edu) does not get lost from this file, so you can always seek out the latest version of this glossary.

health modeling (n.) an application category that includes biological modeling with examples such as simulations of the effects of pollutants such as lead in human blood. This field typically makes use of empirical models, Monte Carlo methods and histograms and statistical modeling.

High Performance Fortran (n.) See HPF.

high speed networks (n.) The Federal High Performance Computing and Communications (HPCC) programme has always stressed the need for high speed network technology, as well as high performance computing machines. Researchers in the US Navy's Naval Research Laboratory have worked on the High Speed Optical Networking (HSON) project which sponsors commercial development of Asynchronous Transfer Mode (ATM) networking technology. The intent is to develop an early capability, including supercomputing and multimedia interfaces, that will be compatible with future ATM service offerings from public networks.

HPCC (n.) an acronym for High Performance Computing and Communications, which is the field of information addressed by this glossary. A USA National Coordination Office for HPCC also exists, and other information on HPCC can be found from the Northeast Parallel Architectures Center, the Center for Research on Parallel Computation, the National Software Exchange or the Edinburgh Parallel Computing Centre, depending upon your geography.

HPF (n.) High Performance Fortran, a data parallel programming language definition developed by the HPF Forum led by CRPC. HPF extends Fortran 90 by adding various directives and other parallel constructs. For further information see the HPF Applications collection at the Northeast Parallel Architectures Center.

*********** all I are G ***** except Icase which is an instance

I

ICASE (n.) The Institute for Computer Applications in Science and Engineering (ICASE), is a center of research in Computational Fluid Dynamics (CFD), providing mechanisms for interactions among NASA scientists and engineers, the ICASE staff, universities and related industries.

indirect method (n.) Any technique for solving a system of equations that does not rely on linear algebra directly. Successive over-relaxation is an example of an indirect method. See also direct method.

InfoMall (n.) a virtual organisation for the development of HPCC software. For further information see the InfoMall server at the Northeast Parallel Architectures Center.

inner product method (n.) a method of matrix multiplication in which one element of the resultant matrix is computed at a time. See also middle product method.

ISO (n.) International Standards Organization, which, among other things, sets standards for programming languages.

J

*** G Jacobi method (n.) A stationary, iterative method for solving a partial differential equation on a discrete grid. The update of each grid point depends only on the values at neighbouring grid points from the previous iteration. See also Gauss-Seidel method.
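
A minimal Python sketch of one Jacobi sweep for Laplace's equation on a two-dimensional grid (illustrative only; note that, unlike Gauss-Seidel, all updates read only the previous iterate):

    import numpy as np

    def jacobi_sweep(u):
        """One Jacobi iteration for Laplace's equation on a 2-D NumPy grid."""
        u_new = u.copy()
        u_new[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                                    u[1:-1, :-2] + u[1:-1, 2:])
        return u_new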

K

L

*** G land cover dynamics (n.) The analysis of large data volumes from remote sensing sources such as high resolution satellite images and environmental monitoring programs. Large databases need to be searched and analysed to detect any changes, which are then fed into biogeochemical cycling, hydrological modeling and ecosystem response modeling codes to study the possible impacts. For further information see Computational Applications of the CRPC.

*** I LAPACK (n.) A linear algebra software package, which has been mounted on a wide range of platforms. It evolved from the older LINPACK package from Netlib. See also ScaLAPACK.

*** I LINPACK (n.) A linear algebra software package, which has been mounted on a wide range of platforms. It has now been superseded by LAPACK. LINPACK is also the name of a set of widely quoted performance benchmarks based on linear algebra, available from the National Software Exchange.

*** G Loosely Synchronous Applications (n.) This class of applications is iterative or time-stepped but, unlike the synchronous case, employs different evolution (update) procedures which synchronize macroscopically. More information and examples can be found in Parallel Computing Works, chapters I, II, III and IV.

*** G ***** add cross reference to gaussian elimination LU decomposition (n.) a technique where a matrix A is represented as the product of a lower triangular matrix, L, and an upper triangular matrix, U. This decomposition can be made unique either by stipulating that the diagonal elements of L be unity, or that the corresponding diagonal elements of L and U be identical. See also Gaussian elimination.
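
A minimal Python sketch of the Doolittle variant (unit diagonal in L, no pivoting; illustrative only):

    import numpy as np

    def lu_decompose(A):
        """Doolittle LU decomposition: A = L U with unit diagonal in L."""
        n = len(A)
        L = np.eye(n)
        U = np.zeros((n, n))
        for i in range(n):
            for j in range(i, n):                      # row i of U
                U[i, j] = A[i, j] - L[i, :i] @ U[:i, j]
            for j in range(i + 1, n):                  # column i of L
                L[j, i] = (A[j, i] - L[j, :i] @ U[:i, i]) / U[i, i]
        return L, U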

M

*** I MADIC (n.) Multidisciplinary Analysis and Design Industrial Consortium - an industrial consortium of several aerospace industry partners, developing engineering codes for cross optimizing design of CFD, CEM and structural engineering simulation and modeling. For further information see MADIC at NPAC.

*** I **** should have a general entry for manufacturing processes saying: The Manufacturing Process is receiving growing interest computationally and is included as part of multidisciplinary analysis, which links all aspects of a product in a single simulation (metaproblem), from conception, design, testing, manufacturing, marketing and maintenance. But below could be Instance: manufacturing processes (n.) Researchers at the North Carolina Supercomputing Center have simulated the plastics/injection molding process, combined with high-performance computing resources and visualization techniques.

*** G materials simulation (n.) Simulation of materials properties, in order to understand and predict the structural, magnetic, optic, electrical, and thermal properties of materials, with the ultimate goal of being able to design and synthesize materials with specific properties. This requires large-scale simulations using techniques such as classical potentials, tight-binding models, and ab initio methods. *** here reference our grand challenge entry for this project Further information is available from the First-Principles Simulation of Materials Properties Grand Challenge Project.

*** G medical simulations (n.) The use of radiotherapy in the treatment of tumours is amenable to the numerical techniques familiar to physicists. In particular, Monte Carlo methods can be used to model the transport of neutrons, photons, electrons, or protons through a three dimensional model of a patient derived from computer tomography. The aim is to maximise the radiation dosage to the target, but to minimise damage to the surrounding healthy tissues. Moreover, the use of Monte Carlo techniques is particularly suited to parallel architectures. *** I file under peregrine The Lawrence Livermore National Laboratory has a particle transport code `PEREGRINE', for use in calculating the dose deposition in patients receiving radiation therapy.

*** G message passing (n.) The most generally adopted, portable and high performance parallel programming paradigm so far. An early explosion of message passing extensions and libraries has come together into the MPI standard.
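
As an illustration, a minimal sketch using mpi4py, a Python binding to MPI (an assumption chosen for illustration; the glossary itself does not prescribe any particular binding):

    # Requires an MPI installation and the mpi4py Python bindings.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    if rank == 0:
        # process 0 sends a message to process 1
        comm.send({"step": 1, "payload": [1.0, 2.0, 3.0]}, dest=1, tag=11)
    elif rank == 1:
        data = comm.recv(source=0, tag=11)
        print("process 1 received:", data)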

*** G Metaproblems (n.) Such problems are hybrid integrations of several subproblems drawn from the other basic application classes: Synchronous Applications, Loosely Synchronous Applications, Embarrassingly Parallel Applications, Asynchronous Problems, etc. More information and examples can be found in Parallel Computing Works, chapter I.

*** G middle product method (n.) a method of matrix multiplication in which entire columns of the result are computed concurrently. See also inner product method.
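
The two orderings can be contrasted in a minimal Python sketch (illustrative only; NumPy arrays are assumed):

    import numpy as np

    def matmul_inner(A, B):
        """Inner product method: each element of C computed one at a time."""
        n, m = A.shape[0], B.shape[1]
        C = np.zeros((n, m))
        for i in range(n):
            for j in range(m):
                C[i, j] = A[i, :] @ B[:, j]   # one dot product per element
        return C

    def matmul_middle(A, B):
        """Middle product method: each column of C computed as a whole."""
        n, m = A.shape[0], B.shape[1]
        C = np.zeros((n, m))
        for j in range(m):
            C[:, j] = A @ B[:, j]             # one matrix-vector product per column
        return C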

*** G *** please define as multiple instruction single data and xref mimd and simd MISD (n.) Multiple Instruction, Single Data; the problem with Flynn's taxonomy - no MISD machines exist - at least we have never found one! See also MIMD, SIMD.

*** G molecular dynamics (n.) an application category that includes the study of atomic and molecular motion and dynamics at the level of classical dynamics as well as quantum mechanics. The dynamics may be formulated in terms of particle methods with long range forces, with or without a cutoff applied, or in terms of short range forces with neighbour interaction lists. Fast multipole methods allow the simulation of very large systems, as do particle-in-cell methods, which employ a combination of particle methods and PDE solver methods.

*** G Monte Carlo (adj.) Making use of randomness. A simulation in which many independent trials are run to gather statistics is a Monte Carlo simulation. A search algorithm that uses randomness to try to speed up convergence is a Monte Carlo algorithm. See also random number generator. Further information is available in an Introduction to Monte Carlo Methods from the Computational Science Education Project.
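
A minimal Python sketch of a Monte Carlo calculation, estimating pi from random points in the unit square (illustrative only):

    import random

    def monte_carlo_pi(trials=1000000):
        """Estimate pi from the fraction of random points inside the unit circle."""
        hits = 0
        for _ in range(trials):
            x, y = random.random(), random.random()
            if x * x + y * y <= 1.0:
                hits += 1
        return 4.0 * hits / trials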

*** G multigrid method (n.) A method for solving partial differential equations in which an approximate solution on a coarse resolution grid is used to obtain an improved solution on a finer resolution grid. The method reduces long wavelength components of the error or residual by iterating between a hierarchy of coarse and fine resolution grids.

*** G multiphase fluids in porous media (n.) Understanding the behaviour of water and oil in porous rocks (i.e., multiphase fluids) is essential for improving the accuracy of numerical reservoir simulations. The physics here is the relationship between the microscopic detail of the porous medium, and the resultant macroscopic response of the fluids. It is this need for microscopic detail which drives the use of high performance computing resources. *** I Researchers from the Los Alamos National Laboratory are engaged in simulating multiphase fluid behaviour in a porous medium, using a lattice Boltzmann approach for their code.

N

*** G **** add fast multipole as G entry referencing Greengard's work *** add pointers to grand challenges using this N-body simulation (n.) Simulation of the motion of interacting particles, such as the gravitational interactions of stars in galaxy formation. Standard algorithms for N bodies are O(N^2); however, hierarchical tree-code or multipole methods such as the Barnes-Hut algorithm have been developed which are O(N log N).
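
A minimal Python sketch of the direct O(N^2) force calculation (illustrative only; a small softening length is assumed to avoid singularities, and positions are float NumPy arrays):

    import numpy as np

    def accelerations(pos, mass, G=1.0, eps=1e-3):
        """Direct O(N^2) gravitational accelerations with softening length eps."""
        n = len(pos)
        acc = np.zeros_like(pos)
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                d = pos[j] - pos[i]
                r2 = d @ d + eps * eps
                acc[i] += G * mass[j] * d / r2**1.5
        return acc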

*** G nanotechnology (n.) Molecular systems engineering. An interdisciplinary field where devices are constructed at, and work with objects on, the molecular scale. This technology should allow the construction of very compact and high performance computing devices and other services.

*** G **** the separate entries mentioned below should each be I ** could initially have instance tagged to us navy c c and o s *** keep full text in G entry but replicate navy center naval warfare (n.) Researchers in the US Navy use high performance computing technology in many of the Navy's activities, making use of the increased computing power to execute novel algorithms. The US Navy's Naval Command, Control and Ocean Surveillance Center has a number of such projects within its Research and Development Division. These include the demonstration of the feasibility of using embedded scalable high performance digital and optical processing for submarine detection and classification in shallow water, the automatic recognition and targeting of hostile aircraft and ground vehicles, interferometric synthetic aperture radar (SAR) processing of terrain, and simulation of the suppression of acoustic signals.

*** G network simulations (n.) an applications category covering optimisation problems for power distribution networks; telecommunications providers; and other distributors. The problems are frequently formulated in terms of a large sparse matrix where the zero structure is defined by the network.

*** G neural network (n.) artificial devices using interconnects and processing capabilities suggested by models of the cortex are termed neural networks. These systems are widely used for optimization problems including content addressable memories and pattern recognition.

*** G numerical general relativity (n.) studies are providing a significant insight into the processes occurring within remote galactic bodies and hope to predict features which, if observed, will provide verification of theories regarding gravitational waves. For further information see Computational Applications of the CRPC.

*** G numerical tokamak (n.) numerical computations to study realistic plasma and geometry parameters for large tokamak experiments. 3D CFD models are used with additional electromagnetic field equations, resulting in a complex system to be modelled. For further information see Computational Applications of the CRPC.

O

*** G ***** there should be an I which is in grand challenge section ocean modeling (n.) The modeling of the world's ocean is analogous to that for the atmosphere. The fluid being modelled obviously has different characteristics, and the availability of measurements for boundary and initial value conditions is on a far smaller scale. Nonetheless many research groups, in particular those specifically concerned with understanding the world's ocean system, are engaged in this activity. Researchers at the Los Alamos National Laboratory have developed a Global Ocean Model to reflect the new generation of massively parallel computers, under the Department of Energy's Computer Hardware, Advanced Mathematics and Model Physics and the Federal High Performance Computing and Communications programs. This high resolution global circulation model simulates the evolution in time of the ocean currents and distributions of temperature and salinity. Its domain is the three-dimensional global ocean, including realistic bottom topography and coastal boundaries of continents and islands. Ocean modeling efforts by United States Navy research laboratories include: the Center for Computational Sciences at the Naval Research Laboratory, who have a project to produce massively parallel versions of operational ocean prediction and weather forecast models; the High-Performance Computing at Stennis Space Center at the Naval Oceanographic Office, who also provide oceanographic support to the Department of Defense (DoD) through a wide range of oceanographic modeling, prediction, and data collection techniques; and the Naval Postgraduate School Scientific Visualisation Laboratory in collaboration with scientists at the Los Alamos National Laboratory.

*** G OEM (n.) Original Equipment Manufacturer; a company which adds components to someone else's computers and sells the result as a complete product.

*** G oil reservoir simulation (n.) The numerical simulation of oil reservoirs is used to optimise the recovery of oil from the porous rock, in particular when water is pumped in at injection points to maintain the pressure of the outcoming oil. Given a description of the reservoir from geological data, such as the permeability of the constituent rocks, the numerical model attempts to predict the productivity of the reservoir as fluid is pumped in, and oil removed, over a typical period of years. Although the petroleum industry was comparatively swift to make use of numerical simulations, the advent of distributed memory computing resources -- from workstation clusters to massively parallel distributed memory machines with gigabytes of core memory -- promises the ability to simulate large and/or complex reservoirs such as those found in the Arabian Peninsula. *** G The Edinburgh Parallel Computing Centre has had a long running project with Intera Information to port the latter's widely used Eclipse-100 black oil reservoir simulation code.

*** G OLTP (n.) On-line transaction processing; handling transactions (such as deposits and withdrawals) as they occur. An application area of great importance to banks and insurance companies.

*** G OSF (n.) Open Software Foundation; an organization established by a number of the major computer manufacturers to set software standards.

P

*** G Parallel Computing Works! (n.) Book describing work done at the Caltech Concurrent Computation Program, Pasadena, California. This project ended in 1990 but the work has been updated in key areas until early 1994. The book also contains links to some current projects. A WWW version of the book exists which contains links to additional and expanded sources of information.

*** G parallel I/O (n.) Input/output operations on parallel computers can be a major problem in obtaining both performance and portability. Studies are currently investigating the behavior of specific application programs to define application-level methodologies for efficient parallel I/O. For further information see Parallel I/O applications of the CRPC.

*** G particle dynamics (n.) See molecular dynamics.

*** G particle flux transport modeling (n.) an application category relevant to the nuclear and weapons industries. Monte Carlo methods are often used for the simulation of models for neutron transport in a reactor or in an explosion, for example.

*** G PDE (n.) partial differential equation. For further information see Some High Performance Computing Issues in Partial Differential Equations from the Computational Science Education Project.

*** G **** these are WRONG references as I said in PCW comments *** petaflops book is 1994 *** petaflops (n.) A performance level currently being associated with typical applications for early next century. Several workshops have been held to discuss the challenges such performance levels will present. Issues addressed included Applications and Algorithms, Device Technology, Architectures and Systems, and Software Technology. Further information is available in the proceedings of the 1992 Workshop and the 1995 Workshop on System Software and Tools for HPCC.

*** I *** need a more general G entry pollution modeling (n.) Atmospheric pollution, especially from internal combustion engines, is an acknowledged problem in many cities of the world. Mathematical models to improve the understanding of the interaction of pollutants with the atmosphere are currently being developed by the United States Environmental Protection Agency (EPA), using high performance resources from the North Carolina Supercomputing Center. The aim of this work is to evaluate the effectiveness of proposed emissions control legislation in reducing air pollution and acidic deposition. Environmental models developed include the Urban Airshed Model for Clean Air Act compliance.

Q

*** G QCD (n.) Quantum Chromodynamics; a model of the behaviour of matter on sub-nuclear scales, the simulation of which is very hungry for computing power. See also Grand Challenges and Monte Carlo.

Quantum Chromodynamics (n.) See QCD.

R

*** G random number generator (n.) A program or algorithm that produces a series of numbers which can be used as random variables, for applications such as Monte Carlo simulation.
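
A minimal Python sketch of a linear congruential generator using the Park-Miller constants (illustrative only; production codes use better-tested generators):

    class LCG:
        """Minimal linear congruential generator (Park-Miller constants)."""
        def __init__(self, seed=1):
            self.state = seed

        def next(self):
            # x_{n+1} = (16807 * x_n) mod (2^31 - 1), returned scaled to [0, 1)
            self.state = (16807 * self.state) % 2147483647
            return self.state / 2147483647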

*** G RDBMS (n.) Relational Database Management System; software to manage a database in which data are stored by attribute.

*** G relaxation method (n.) A type of indirect method in which the values making up a trial solution to an equation are repeatedly updated according to some rule until convergence criteria are met. See also direct method.

*** G rendering (n.) See graphics rendering.

S

*** G SAXPY (n.) elemental BLAS operation involving "constant (a) times vector (x) plus vector (y)". The S refers to Fortran Single precision. SAXPY and related BLAS operations are often implemented by the hardware manufacturer as part of the system software, and the execution time for such operations has been used as a primitive benchmark of a high performance computing system.
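
A minimal Python sketch of the operation (illustrative only; real BLAS implementations update y in place in optimised Fortran or assembler):

    import numpy as np

    def saxpy(a, x, y):
        """Return a*x + y for scalar a and single-precision vectors x, y."""
        x = np.asarray(x, dtype=np.float32)
        y = np.asarray(y, dtype=np.float32)
        return a * x + y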

*** I ScaLAPACK (n.) A linear algebra software package, which has been mounted on a wide range of platforms. This is a version of LAPACK suitable for distributed memory computer systems. The software is available from the National HPCC Software Exchange. See also LAPACK, LINPACK.

*** G scheduling (n.) an application category requiring expert systems; neural networks; simulated annealing; linear programming. Typical examples include: manufacturing (and just-in-time situations); transportation, ranging from dairy delivery to military deployment; timetabling of university classes; and airline crew and aircraft scheduling.

*** G seismic wave simulation (n.) The use of seismic waves has allowed the oil exploration industry to make detailed predictions of subterranean geology, and is a key tool in the search for potential oil reserves. However, inverting the raw data from the reflected acoustic waves into the three dimensional spatial domain is a computationally demanding problem. *** I Researchers from the Lawrence Livermore National Laboratory have ported a 3-D seismic wave propagation code from the French Institut Francais du Petrole onto a massively parallel computer (with sponsorship from the United States Department of Energy's Gas and Oil National Information Infrastructure project). This code simulates the propagation in three dimensions of acoustic waves through a region of the order of kilometres along each dimension. From this, reflections from characteristic features such as salt domes can be simulated with unprecedented time resolution and spatial size.

*** G seismology (n.) an applications category that includes oil and gas exploration and recovery as well as geological prediction studies. Typical applications tend to require very large quantities of data and often result in relatively simple calculations on the measured data.

*** I semiconductor design (n.) Researchers at the Los Alamos National Laboratory have worked on theoretical design tools for advanced semiconductor devices. The application is the modeling of future semiconductor devices which will be an order of magnitude smaller and faster than current devices. The code makes use of state of the art Monte Carlo (MC) simulations for building better theoretical models, for one and two-dimensional cases. The use of advanced computing platforms will allow full three-dimensional MC simulations to be made.

*** G simulated annealing (n.) Optimization technique introduced by Kirkpatrick in 1983 which applies statistical physics methods to find an approximate optimal solution to a problem. Typically a thermodynamic analogy is used for the model system under study and the task of finding an optimal solution is mapped to that of finding the ground state of the thermodynamic system.
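
A minimal Python sketch of simulated annealing for minimising a one-dimensional function (illustrative only; the parameter names and cooling schedule are assumptions):

    import math
    import random

    def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.995, iters=10000):
        """Minimise f by accepting uphill moves with probability exp(-delta/T)."""
        x, fx, t = x0, f(x0), t0
        best_x, best_f = x, fx
        for _ in range(iters):
            x_new = x + random.uniform(-step, step)
            f_new = f(x_new)
            delta = f_new - fx
            if delta < 0 or random.random() < math.exp(-delta / t):
                x, fx = x_new, f_new
                if fx < best_f:
                    best_x, best_f = x, fx
            t *= cooling   # geometric cooling schedule
        return best_x, best_f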

*** G smart memory (n.) refers to integrated processors and memories; content addressable memories; and various application specific memories. Examples of applications which would benefit from such technologies are financial and economic modeling, where large scale databases are accessed, and graphics rendering or direct solution techniques such as Gaussian elimination, where high memory bandwidth is required.

*** G sparse matrix (n.) A matrix with the majority of its elements equal to zero. See also full matrix.

*** G SQL (n.) Structured Query Language; a standard for adding data to, or recovering data from, relational databases.

*** I separate for dyna3d and southampton work **** need a general entry as well -- please invent structural analysis (n.) Modern vehicles and vessels have to meet stringent safety requirements, through testing of their structural integrity. The standard numerical technique for meeting these is the use of structural analysis codes. Examples here are the PAFEC-FE finite element package and the DYNA3D nonlinear response code. Their uses range from modeling vehicle collisions, through earthquake and highway safety models, to damage assessment of shipping containers. Researchers at the Southampton Parallel Applications Center have ported the PAFEC-FE finite element crash simulation code onto a parallel platform, by parallelising the core frontal solver. The DYNA3D program, developed by researchers at the Lawrence Livermore National Laboratory, is an engineering design tool for modeling the nonlinear, transient response of complex structures. A developmental version of DYNA3D has been installed on Meiko CS-2 and Cray Research Incorporated T3D parallel supercomputers.

*** G *** I invent instance entry for nastran structural dynamics (n.) the use of computational models for simulation in structural and civil engineering. The best known commercial code in this field is NASTRAN. Finite element methods are the most widely used methods in this field.

*** G Synchronous Applications (n.) A simple class of applications which tend to be regular and are characterised by algorithms employing simultaneous identical updates to a set of points. More information and examples can be found in Parallel Computing Works, chapters I, II and III.

T

*** G telemedicine (n.) The availability of reliable, high bandwidth nationwide networks promises to remove the constraints of geographical location in many professions. One example which has been investigated is in the area of medicine, whereby the use of such networks allows medical information and expertise to be shared across the country. ***** I Los Alamos National Laboratory and the National Jewish Center for Immunology and Respiratory Medicine in Denver, CO have collaborated on such a scheme. Their project, `TeleMed', allows the Center's expertise in pulmonary diseases and radiology, and in the treatment of tuberculosis and other lung diseases, to be made available throughout the nation. A national radiographic repository is located at Los Alamos National Laboratory, such that participating doctors can view radiographic data via a sophisticated multimedia interface without leaving their offices. With the new system, a doctor can match a patient's radiographic information with the data in the repository, review treatment history and success, and then determine the best treatment. Moreover, the availability of high performance computers at LANL allows sophisticated image processing to be done on the Computer Tomography data.

*** G *** add Instances for Dow and dupont thermochemical calculations (n.) The ab initio calculation of the thermodynamic properties of molecules requires supercomputer performance and large amounts of machine memory and disk storage. The Du Pont chemical company uses molecular modeling packages such as `Gaussian92' and `DGauss' in the search for alternatives to chlorofluorocarbon (CFC) products; in particular in determining their thermodynamic properties. In a similar vein, the Dow Chemical Company uses similar techniques to routinely calculate thermodynamic properties of complex organic molecules.

*** G traffic simulation (n.) The popularity of private transport and the limited public tolerance for major road projects pose a dilemma for many industrialised cities, especially in Europe. High performance computers have recently been introduced to aid the urban planning process, whereby realistic models of traffic behaviour allow the planner to decide on the best means of controlling traffic flow. *** G The Edinburgh Parallel Computing Centre has collaborated with SIAS Ltd to develop a parallel version of the latter's Paramics microscopic traffic simulation program. The use of a Thinking Machines CM-200 allows individual vehicles to be simulated around the roads of Edinburgh, and future effort is directed at porting the code onto the distributed memory environment of a Cray T3D.

U

*** G unindexed text search (n.) The majority of databases use indexing to improve performance. If datasets are highly dynamic or transient, indexing is not feasible.

*** G unstructured grids (n.) In unstructured grids each node can be in a different topological location with respect to neighbouring nodes. Such meshes typically use linked lists and involve indirect addressing, both features which can inhibit high performance on vector, cache-based or distributed memory systems without some programming effort.

V

*** G Video on Demand (n.) See VOD.

*** G visualization (n.) In large applications where the data set is very large, the problem of presenting the data can also be a numerically intensive task. Moreover, some of the well known visualization techniques, such as ray-tracing and volume rendering, are naturally amenable to modern parallel architectures. *** I Researchers at the Scientific Visualization Center in the US Army Corps of Engineers Waterways Experiment Station are developing Parallel Visualization algorithms for scientific and engineering applications. *** G Also, the US Army Research Laboratory (ARL) has a Scientific Visualization team engaged in exploiting high performance computing technology to meet visualization demands from within the ARL. Some of the applications include computational fluid dynamics, penetration mechanics, battlefield troop movement and artificial terrain generation, and explosive effects simulations.

*** G VOD (n.) Video on Demand, a new service which will allow the high costs of the information superhighway infrastructure to be recovered by charging on a pay per view basis for services which would otherwise be freely distributed.

W

*** G waste minimization (n.) Current environmental concerns are leading to legal requirements for companies, and in many cases households, to reduce waste, both in terms of material management through recycling and of energy through programs of energy management. Here CAD/CAE can be used to produce minimum waste products and processes. However, care must be taken to analyse fully the impact of any changes, and this will require large databases and complex financial and economic modeling.

*** G **** should have european (guy) and us specific instance entries weather forecasting (n.) Short and medium term weather forecasting is perhaps the most publicly accessible application of supercomputers. Essentially the problem is that of solving the Navier-Stokes equation for fluid flow, subject to boundary and initial value conditions from physical measurements. Most of the industrialised countries in the world have their own centre for developing and running the models most pertinent to them. In the United States, this responsibility lies with the National Weather Service, a division of the National Oceanic and Atmospheric Administration (NOAA).

X

Y

Z


Glossary Index and Credits
Ken Hawick, Geoffrey Fox, and the RoadMap team.
Geoffrey Fox, gcf@npac.syr.edu, http://www.npac.syr.edu
Phone 315 443 2163 (NPAC central 315 443 1723), Fax 315 443 4741