A parallel algorithm is a collection of tasks and a partial ordering between them.
- This is just one view of parallel programs (and algorithms).
- Note: the tasks mentioned here may not be the tasks you are used to.
- Here, "task" means "a sequence of operations that is scheduled atomically and can run to completion once started".
- That is, a task is the computation between synchronization points.
- For example, an MPI process is a collection of tasks (see the sketch below).
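To make this notion of "task" concrete, here is a minimal MPI sketch (a hypothetical example, not taken from these notes): each process runs stretches of local computation, and the tasks are exactly the computations between the synchronization points (here, a single MPI_Allreduce).

```c
/* Minimal sketch: each MPI process runs two "tasks" in the sense used
 * above -- stretches of local computation separated by a synchronization
 * point. Hypothetical example; compile with mpicc, run with mpirun. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Task 1: purely local computation; once started it runs to completion. */
    double local = 0.0;
    for (int i = rank; i < 1000; i += size)
        local += (double)i;

    /* Synchronization point: the ordering between tasks on different
     * processes is enforced here, not inside the tasks themselves. */
    double global = 0.0;
    MPI_Allreduce(&local, &global, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);

    /* Task 2: another local task, now using the globally combined result. */
    if (rank == 0)
        printf("sum = %f\n", global);

    MPI_Finalize();
    return 0;
}
```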
Design goals:
- Match tasks to the available processors (exploit parallelism).
- Minimize ordering (avoid unnecessary synchronizations); see the sketch below.
- Utilize the remaining ordering (exploit locality).
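As one illustration of the "minimize ordering" goal, the following sketch (a hypothetical OpenMP example; the notes do not prescribe OpenMP) drops the implicit barrier between two independent loops with a nowait clause, since the ordering that barrier would impose is unnecessary.

```c
/* Hypothetical sketch of "minimize ordering": the two loops write to
 * disjoint arrays, so the implicit barrier after the first loop is an
 * unnecessary synchronization and can be removed with nowait.
 * Compile with -fopenmp. */
#include <stdio.h>

#define N 100000

int main(void)
{
    static double a[N], b[N];

    #pragma omp parallel
    {
        /* nowait removes the implicit barrier: threads that finish this
         * loop early may start on the next loop immediately. */
        #pragma omp for nowait
        for (int i = 0; i < N; i++)
            a[i] = 1.0 * i;

        #pragma omp for
        for (int i = 0; i < N; i++)
            b[i] = 2.0 * i;
    }

    printf("%f %f\n", a[N - 1], b[N - 1]);
    return 0;
}
```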
Sources of parallelism:
- Data parallelism: updating array elements simultaneously (see the sketch below).
- Functional parallelism: conceptually different tasks which combine to solve the problem.
- Speculative parallelism: temporarily ignore the partial ordering and repair later if this causes problems.
- This list is not exhaustive.
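As a concrete illustration of data parallelism, the sketch below (again a hypothetical OpenMP example, not part of the original notes) updates array elements simultaneously: each iteration is an independent task with no ordering between iterations.

```c
/* Hypothetical sketch of data parallelism: every array element is updated
 * independently, so the iterations can run simultaneously across the
 * available processors. Compile with -fopenmp. */
#include <stdio.h>

#define N 1000000

int main(void)
{
    static double a[N];

    /* Each iteration is an independent task; OpenMP distributes the
     * iterations over the available threads. */
    #pragma omp parallel for
    for (int i = 0; i < N; i++)
        a[i] = 2.0 * i;

    printf("a[N-1] = %f\n", a[N - 1]);
    return 0;
}
```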