
Parallel Computing Patterns

15 flashcards

Data Parallelism


A parallel computing pattern where the same operation is applied concurrently to different elements of a distributed data structure. Use when large data sets can be processed element by element, independently and in parallel.
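For instance, a minimal sketch using Python's multiprocessing.Pool, where the square function and the input range are illustrative:

from multiprocessing import Pool

def square(x):
    # The same operation, applied independently to each element.
    return x * x

if __name__ == "__main__":
    with Pool() as pool:
        # map() partitions the data across the worker processes.
        results = pool.map(square, range(10))
    print(results)  # [0, 1, 4, ..., 81]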


Task Parallelism


A pattern where different computational tasks are performed in parallel. Use when the work can be divided into tasks that run largely independently and rarely need to synchronize.
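As a sketch, two unrelated tasks submitted to a thread pool via concurrent.futures; the task bodies are placeholders:

from concurrent.futures import ThreadPoolExecutor

def parse_logs():
    return "logs parsed"      # independent task A (placeholder work)

def build_index():
    return "index built"      # independent task B (placeholder work)

with ThreadPoolExecutor() as ex:
    # Different tasks run concurrently; neither waits on the other.
    a = ex.submit(parse_logs)
    b = ex.submit(build_index)
    print(a.result(), "|", b.result())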


Pipeline Parallelism


A pattern where a sequence of stages processes a stream of input data in assembly-line fashion. Use when a task can be broken into sequential stages with data flowing between them.
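A two-stage sketch using threads and queues; the stage operations are illustrative:

import queue
import threading

DONE = object()  # sentinel marking the end of the stream

def stage1(inp, out):
    # First stage: transform items as they arrive.
    for item in iter(inp.get, DONE):
        out.put(item * 2)
    out.put(DONE)

def stage2(inp):
    # Second stage: consume while stage1 is still producing.
    for item in iter(inp.get, DONE):
        print("result:", item)

q1, q2 = queue.Queue(), queue.Queue()
threading.Thread(target=stage1, args=(q1, q2)).start()
threading.Thread(target=stage2, args=(q2,)).start()
for x in range(5):
    q1.put(x)    # stream data into the pipeline
q1.put(DONE)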


Geometric Decomposition


A pattern that divides a large geometric problem space into smaller pieces that can be solved in parallel. Use when the problem maps onto a geometric (often spatial) domain that can be subdivided into largely independent regions.
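A sketch that splits a 2-D grid into row blocks and processes them in parallel; the block size and per-cell operation are arbitrary:

from concurrent.futures import ProcessPoolExecutor

def process_block(block):
    # Work on one contiguous region of the grid.
    return [[cell + 1 for cell in row] for row in block]

if __name__ == "__main__":
    grid = [[r * 8 + c for c in range(8)] for r in range(8)]
    blocks = [grid[i:i + 2] for i in range(0, 8, 2)]  # four row blocks
    with ProcessPoolExecutor() as ex:
        done = list(ex.map(process_block, blocks))
    result = [row for block in done for row in block]  # reassemble
    print(result[0])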


Recursive Parallelism


A pattern where a problem is solved by recursively dividing it into smaller sub-problems that are solved in parallel. Use when problems exhibit a natural hierarchical structure and can be divided recursively.
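A depth-limited recursive sum as a sketch; the cutoff depth is an arbitrary tuning choice:

import threading

def psum(data, depth=2):
    # Below the cutoff, fall back to a plain sequential sum.
    if depth == 0 or len(data) < 2:
        return sum(data)
    mid = len(data) // 2
    left = []
    t = threading.Thread(
        target=lambda: left.append(psum(data[:mid], depth - 1)))
    t.start()                            # solve the left half in parallel...
    right = psum(data[mid:], depth - 1)  # ...while this thread takes the right
    t.join()
    return left[0] + right

print(psum(list(range(1000))))  # 499500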


MapReduce


A programming model for processing large data sets with a distributed algorithm on a cluster. Use when an operation can be mapped over the data in parallel and the partial results reduced to a final answer.
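A toy word count in the MapReduce style; the input lines are made up:

from collections import Counter
from functools import reduce
from multiprocessing import Pool

def map_phase(line):
    # Map: emit per-line word counts.
    return Counter(line.split())

if __name__ == "__main__":
    lines = ["the quick fox", "the lazy dog", "the fox"]
    with Pool() as pool:
        partials = pool.map(map_phase, lines)
    # Reduce: merge the partial counts into a single result.
    totals = reduce(lambda a, b: a + b, partials, Counter())
    print(totals)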


Speculative Parallelism


A pattern where multiple speculative branches of a computation are executed in parallel before it is known which one is needed; the losing branches are discarded. Use when there is uncertainty about which direction a computation will take.
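A sketch that starts both branches before the predicate resolves, then keeps one; the branch bodies and predicate are illustrative, and cancel() is best-effort for an already-running task:

from concurrent.futures import ThreadPoolExecutor

def branch_true():
    return "took the true path"

def branch_false():
    return "took the false path"

def slow_predicate():
    # Stand-in for an expensive check whose outcome is unknown up front.
    return True

with ThreadPoolExecutor() as ex:
    # Speculatively start both branches in parallel.
    ft, ff = ex.submit(branch_true), ex.submit(branch_false)
    if slow_predicate():
        ff.cancel()         # discard the losing branch (best effort)
        print(ft.result())
    else:
        ft.cancel()
        print(ff.result())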


Load Balancing


A pattern that aims to distribute work evenly across parallel compute resources to avoid idle time and maximize throughput. Use when the workload is dynamic or tasks vary greatly in size.
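A dynamic sketch where worker threads pull variable-sized tasks from a shared queue, so no worker idles while work remains; the task sizes are made up:

import queue
import threading
import time

tasks = queue.Queue()
for size in [5, 1, 1, 3, 1, 2, 4, 1]:  # deliberately uneven workloads
    tasks.put(size)

def worker(wid):
    while True:
        try:
            size = tasks.get_nowait()   # grab the next available task
        except queue.Empty:
            return
        time.sleep(size * 0.01)         # simulate size-proportional work
        print(f"worker {wid} finished a task of size {size}")

threads = [threading.Thread(target=worker, args=(i,)) for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()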


Master-Worker


A pattern where a master process distributes work to multiple worker processes and collects results. Use when tasks can be easily partitioned and distributed to workers.
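A sketch with multiprocessing queues; the task payloads are illustrative, and None serves as a shutdown sentinel:

from multiprocessing import Process, Queue

def worker(task_q, result_q):
    # Workers loop: take a task, compute, report back.
    for task in iter(task_q.get, None):
        result_q.put(task * task)

if __name__ == "__main__":
    task_q, result_q = Queue(), Queue()
    workers = [Process(target=worker, args=(task_q, result_q))
               for _ in range(4)]
    for w in workers:
        w.start()
    for task in range(10):   # the master distributes the work...
        task_q.put(task)
    for _ in workers:
        task_q.put(None)     # ...then signals shutdown
    results = [result_q.get() for _ in range(10)]  # and collects results
    for w in workers:
        w.join()
    print(sorted(results))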


Shared Memory


A parallel computing pattern where multiple processors operate on a common address space. Use when high-speed data exchange between processors is required.
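A sketch of processes updating one counter in shared memory, with a lock guarding each update; the loop count is arbitrary:

from multiprocessing import Process, Value

def increment(counter):
    for _ in range(10_000):
        with counter.get_lock():   # guard the shared read-modify-write
            counter.value += 1

if __name__ == "__main__":
    counter = Value("i", 0)        # lives in memory visible to all processes
    procs = [Process(target=increment, args=(counter,)) for _ in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print(counter.value)  # 40000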


Distributed Memory


A computing pattern where each processor has its own private memory and processors communicate by passing messages. Use when the system consists of a network of interconnected processors.
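Production code typically uses MPI for this; a stdlib-only sketch of the idea gives each process private data and a message channel:

from multiprocessing import Pipe, Process

def node(conn, local_data):
    # Each node computes on its own private memory...
    local_sum = sum(local_data)
    conn.send(local_sum)   # ...and shares state only by passing messages
    other = conn.recv()
    print("global sum seen by this node:", local_sum + other)

if __name__ == "__main__":
    a, b = Pipe()   # message channel between two nodes
    p1 = Process(target=node, args=(a, [1, 2, 3]))
    p2 = Process(target=node, args=(b, [4, 5, 6]))
    p1.start(); p2.start()
    p1.join(); p2.join()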


Dataflow


A pattern where the execution is driven by the availability of data operands rather than by a statically determined sequence. Use when you can represent computations as a directed graph of operations.
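A minimal sketch where each node fires as soon as its operands are available; the graph and operations are made up:

from concurrent.futures import ThreadPoolExecutor

with ThreadPoolExecutor() as ex:
    # Source nodes have no inputs, so they fire immediately.
    a = ex.submit(lambda: 2)
    b = ex.submit(lambda: 3)
    # This node blocks until both of its operands are available.
    c = ex.submit(lambda: a.result() + b.result())
    d = ex.submit(lambda: c.result() * 10)  # fires once c produces data
    print(d.result())  # 50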


Event-Driven Parallelism


A pattern where computation takes place in response to external events. Use when the system needs to react to asynchronous events or actions.
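An asyncio sketch where a handler task is spawned for each event pulled off a queue; the events and handler body are illustrative:

import asyncio

async def handle(event):
    # React to one event; handlers for different events overlap.
    await asyncio.sleep(0.1)
    print("handled:", event)

async def main():
    events = asyncio.Queue()
    for e in ["click", "keypress", "timer"]:
        events.put_nowait(e)
    # Dispatch: one handler task per event as it becomes available.
    handlers = []
    while not events.empty():
        handlers.append(asyncio.create_task(handle(await events.get())))
    await asyncio.gather(*handlers)

asyncio.run(main())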


Fork-Join


A pattern where a task is divided into subtasks that are executed in parallel, and then joined upon completion. Use when you can decompose a task into independent subtasks that can run concurrently.
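A sketch with concurrent.futures: fork the subtasks, then join by collecting their results; the chunked sum is a placeholder workload:

from concurrent.futures import ProcessPoolExecutor

def subtask(chunk):
    return sum(chunk)   # placeholder independent work

if __name__ == "__main__":
    data = list(range(100))
    chunks = [data[i:i + 25] for i in range(0, 100, 25)]
    with ProcessPoolExecutor() as ex:
        futures = [ex.submit(subtask, c) for c in chunks]  # fork
        total = sum(f.result() for f in futures)           # join
    print(total)  # 4950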


Bulk Synchronous Parallel


A parallel computing pattern that organizes computation into a sequence of supersteps, each combining local computation with a closing barrier synchronization. Use when you can structure your computation into phases separated by barrier synchronizations.
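A sketch with threads where every superstep ends at a barrier that no thread may pass until all arrive; the per-step work is a placeholder:

import threading

N_WORKERS, N_SUPERSTEPS = 4, 3
barrier = threading.Barrier(N_WORKERS)

def worker(wid):
    for step in range(N_SUPERSTEPS):
        # Superstep: compute on local data (placeholder print).
        print(f"worker {wid} computing in superstep {step}")
        barrier.wait()   # synchronize before anyone starts the next step

threads = [threading.Thread(target=worker, args=(i,))
           for i in range(N_WORKERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()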
