Basic IMAGE version of Foils prepared 19 September 98

Foil 12 Generalizing the Buffer in MPI

From MPI Message Passing Interface, Computational Science for Simulations -- Fall Semester 1998, by Geoffrey C. Fox and Nancy McCracken

© Northeast Parallel Architectures Center, Syracuse University, npac@npac.syr.edu

If you have any comments about this server, send e-mail to webmaster@npac.syr.edu.

Page produced by wwwfoil on Sun Apr 11 1999

Table of Contents for MPI Message Passing Interface


1 CPS615 Introduction to Computational Science The Message Passing Interface MPI
2 Abstract of MPI Presentation
3 MPI Overview -- Comparison with HPF -- I
4 MPI Overview -- Comparison with HPF -- II
5 Some Key Features of MPI
6 What is MPI?
7 History of MPI
8 Who Designed MPI?
9 Some Difficulties with MPI
10 Sending/Receiving Messages: Issues
11 What Gets Sent: The Buffer
12 Generalizing the Buffer in MPI
13 Advantages of Datatypes
14 To Whom It Gets Sent: Process Identifiers
15 Generalizing the Process Identifier in MPI
16 Why use Process Groups?
17 How It Is Identified: Message Tags
18 Sample Program using Library
19 Correct Library Execution
20 Incorrect Library Execution
21 What Happened?
22 Solution to the Tag Problem
23 MPI Conventions
24 Standard Constants in MPI
25 The Six Fundamental MPI routines
26 MPI_Init -- Environment Management
27 MPI_Comm_rank -- Environment Inquiry
28 MPI_Comm_size -- Environment Inquiry
29 MPI_Finalize -- Environment Management
30 Hello World in C plus MPI
31 Comments on Parallel Input/Output - I
32 Comments on Parallel Input/Output - II
33 Blocking Send: MPI_Send(C) or MPI_SEND(Fortran)
34 Example MPI_SEND in Fortran
35 Blocking Receive: MPI_RECV(Fortran)
36 Blocking Receive: MPI_Recv(C)
37 Fortran example: Receive
38 Hello World: C Example of Send and Receive
39 Hello World, continued
40 Interpretation of Returned Message Status
41 Collective Communication
42 Some Collective Communication Operations
43 Hello World: C Example of Broadcast
44 Collective Computation
45 Examples of Collective Communication/Computation
46 Collective Computation Patterns
47 More Examples of Collective Communication/Computation
48 Data Movement (1)
49 Examples of MPI_ALLTOALL
50 Data Movement (2)
51 List of Collective Routines
52 Example Fortran: Performing a Sum
53 Example C: Computing Pi
54 Pi Example continued
55 Buffering Issues
56 Avoiding Buffering Costs
57 Combining Blocking and Send Modes
58 Cartesian Topologies
59 Defining a Cartesian Topology
60 MPI_Cart_coords or Who am I?
61 Who are my neighbors?
62 Periodic meshes
63 Motivation for Derived Datatypes in MPI
64 Derived Datatype Basics
65 Simple Example of Derived Datatype
66 Derived Datatypes: Vectors
67 Example of Vector type
68 Why is this interesting?
69 Use of Derived Types in Jacobi Iteration
70 Derived Datatypes: Indexed
71 Designing MPI Programs
72 Jacobi Iteration: The Problem
73 Jacobi Iteration: MPI Program Design
74 Jacobi Iteration: MPI Program Design
75 Jacobi Iteration: Fortran MPI Program
76 Jacobi Iteration: create topology
77 Jacobi iteration: data structures
78 Jacobi Iteration: send guard values
79 Jacobi Iteration: update and error
80 The MPI Timer
81 MPI-2
82 I/O included in MPI-2