Table of Contents
Designing and Building Parallel Programs
Outline
Message Passing Interface (MPI)
What is MPI?
Compiling and Linking (in MPICH)
Running MPI Programs (in MPICH)
Sending/Receiving Messages: Issues
What Gets Sent: The Buffer
Generalizing the Buffer in MPI
Advantages of Datatypes
To Whom It Gets Sent: Process Identifiers
Generalizing the Process Identifier in MPI
How It Is Identified: Message Tags
Sample Program using Library
Correct Execution
Incorrect Execution
What Happened?
Solution to the Tag Problem
MPI Basic Send/Receive
Six-Function MPI
Simple Fortran Example
Simple Fortran Example (2)
Simple Fortran Example (3)
Advanced Features in MPI
Collective Communication
Synchronization
Data Movement (1)
Data Movement (2)
Collective Computation Patterns
List of Collective Routines
Example: Performing a Sum
Buffering Issues
Avoiding Buffering Costs
Combining Blocking and Send Modes
Connecting Programs Together
Connecting Programs via Intercommunicators
Regular (Cartesian) Grids
Regular Grid Example: Getting the Decomposition
Regular Grid Example: Conclusion
Designing MPI Programs
Jacobi Iteration: The Problem
Jacobi Iteration: MPI Program Design
Jacobi Iteration: MPI Program Design (2)
Jacobi Iteration: MPI Program
Jacobi Iteration: MPI Program (2)