Saturday 20 April 2013

Concurrency, Parallelism, Synchronization and Reactive Systems: how to distinguish them?

Parallel and Concurrent

Parallel

Parallel computing is a form of computation in which many calculations are carried out simultaneously, operating on the principle that large problems can often be divided into smaller ones, which are then solved concurrently ("in parallel"). (from Wikipedia)

Key points of Parallel:

  • Usually aims to save time by speeding up the computation
  • Decomposes a big problem into smaller ones
  • The smaller problems run on different processors at the same time
  • Reference: https://computing.llnl.gov/tutorials/parallel_comp/#Whatis

Three levels of parallelism:

  • Bit-level (4-bit, 8-bit, 16-bit, 32-bit and 64-bit microprocessors)
  • Instruction-level (pipelining (CISC, RISC), superscalar)
  • Task-level (multi-processors, tasks running on different processors simultaneously)

Task Parallelism and Data Parallelism:

  • Task Parallelism (processes, or threads of execution)

"entirely different calculations can be performed on either the same or different sets of data"

  • Data Parallelism

"the same calculation is performed on the same or different sets of data"

Concurrent

Concurrency is a property of systems in which several computations are executing simultaneously, and potentially interacting with each other. (from Wikipedia)

A program is typically factored into multiple threads of control with distinct responsibilities and purposes, which may run simultaneously or take turns. This is often about coping with nondeterminism in the world, or about reducing latency.

Key points of concurrent programming:

  • It may run simultaneously on multiple processors, or as multiple threads on a single processor
  • The number of possible execution paths in the system can be extremely large, and the resulting outcome can be nondeterministic
  • Concurrent access to shared resources, combined with nondeterminism, may lead to issues such as deadlock and starvation
  • What most differentiates a concurrent system from a sequential one is that its subprocesses communicate with each other
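The nondeterminism of shared-resource access can be made concrete with a classic data race, sketched here with Python's `threading` module. Two threads increment a shared counter; without the lock, read-modify-write steps interleave unpredictably and updates can be lost, while the mutex makes the result deterministic.

```python
import threading

counter = 0
lock = threading.Lock()

def add_many(n):
    global counter
    for _ in range(n):
        with lock:        # remove this lock and the final value
            counter += 1  # may nondeterministically fall short

threads = [threading.Thread(target=add_many, args=(100_000,))
           for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 200000 with the lock; unpredictable without it
```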

The difference between parallel and concurrent programming

In everyday language, "parallel" and "concurrent" are near-synonyms, but in programming they denote different concepts.

Example (Parallel and Concurrent programming in Haskell)

A parallel program

  • utilizes more capable hardware (e.g. multiple processors or cores) to finish the computation faster and more efficiently
  • A big task is split into several smaller tasks, each executed on a different processor or core at the same time (in parallel), so the total delivery time is shorter than performing the task sequentially

A concurrent program

  • Concurrency is a program-structuring technique in which there are multiple threads of control.
  • From the user's point of view, the threads execute "at the same time" because their effects are interleaved. Whether they actually run simultaneously is not a design concern, just an implementation detail.
  • Concurrency can be implemented with multiple threads on a single processor or across multiple processors.
  • Concurrent programming also concerns structuring a program to interact with multiple independent external agents (for example, one thread responsible for the user, one for the database server, and another for external clients)
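The "one thread per external agent" structuring idea can be sketched as follows. Each thread has a distinct responsibility and they cooperate through queues rather than shared mutable state; the "user" and "database" roles here are hypothetical stand-ins for real external agents.

```python
import queue
import threading

requests = queue.Queue()
replies = queue.Queue()

def user_thread():
    # Responsible for the user: issues requests.
    for name in ["alice", "bob"]:
        requests.put(name)
    requests.put(None)  # sentinel: no more requests

def database_thread():
    # Responsible for the (hypothetical) database: serves lookups.
    fake_db = {"alice": 1, "bob": 2}
    while True:
        name = requests.get()
        if name is None:
            replies.put(None)
            break
        replies.put((name, fake_db[name]))

threading.Thread(target=user_thread).start()
threading.Thread(target=database_thread).start()

results = []
while (item := replies.get()) is not None:
    results.append(item)
print(results)  # [('alice', 1), ('bob', 2)]
```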

Distributed, Reactive and Concurrent System

Distributed System

A distributed system consists of multiple computers that communicate through a computer network and interact with each other in order to achieve a common goal.

For example,

  • a lithography machine has several subsystems, each running on its own embedded system; together with a central workstation they achieve the overall function of the machine

Reactive System

A system that responds or reacts to external events.

For example,

  • a motor control driver: it takes input from the caller or user, then moves the motor forward or backward to the specified position, and stops
  • an interactive system is a typical kind of reactive system: it reacts to its users and operators and provides feedback
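The motor-driver example can be sketched as a component that does nothing on its own and only acts in response to events fed to it. The `MotorDriver` class and its commands are hypothetical placeholders for real hardware.

```python
class MotorDriver:
    """Reactive component: acts only in response to external events."""

    def __init__(self):
        self.position = 0

    def on_event(self, command, steps=0):
        # React to an external command event from the caller/user.
        if command == "forward":
            self.position += steps
        elif command == "backward":
            self.position -= steps
        elif command == "stop":
            pass  # hold the current position
        return self.position

driver = MotorDriver()
driver.on_event("forward", 10)
driver.on_event("backward", 3)
final = driver.on_event("stop")
print(final)  # 7
```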

Concurrent System

Several computations run simultaneously and may interact with each other while they are executing.

Architectural Aspect

  • Each computer in a distributed system has its own processor and memory
  • A parallel system runs its tasks on different processors with shared memory, such as an SMP or AMP system
  • A concurrent system may run multiple threads on a single processor, or on multiple processors with shared memory

Synchronization

Synchronization refers to one of two distinct but related concepts: synchronization of processes, and synchronization of data. Process synchronization refers to the idea that multiple processes are to join up or handshake at a certain point, in order to reach an agreement or commit to a certain sequence of action. Data synchronization refers to the idea of keeping multiple copies of a dataset in coherence with one another, or to maintain data integrity. (from Wikipedia)
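The process-synchronization idea of "joining up at a certain point" maps directly onto a barrier. In this sketch with Python's `threading.Barrier`, no worker proceeds past the handshake point until all of them have arrived, whatever the thread interleaving was.

```python
import threading

N = 3
barrier = threading.Barrier(N)
log = []
log_lock = threading.Lock()

def worker(i):
    with log_lock:
        log.append(("before", i))
    barrier.wait()  # handshake point: block until all N arrive
    with log_lock:
        log.append(("after", i))

threads = [threading.Thread(target=worker, args=(i,)) for i in range(N)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Every "before" entry precedes every "after" entry.
print(all(phase == "before" for phase, _ in log[:N]))  # True
```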

CSP (Communicating Sequential Processes)

CSP is a formal language for modelling and reasoning about patterns of interaction in concurrent systems.

  • interaction by synchronous message passing

Parallel

  • Synchronous Parallel: P || Q
  • Alphabetised Parallel: P [ Ap || Aq ] Q
  • Generalised Parallel: P [| A |] Q, for example P [| {| in, out |} |] Q
  • Interleaving: P ||| Q
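CSP's synchronous message passing means a send does not complete until the matching receive happens: the two processes rendezvous on the channel. Python's standard library has no such channel, so this sketch builds an illustrative `SyncChannel` from a one-slot queue and a semaphore; it is a construction for this post, not a library facility.

```python
import queue
import threading

class SyncChannel:
    """Rendezvous channel: send() blocks until recv() takes the value."""

    def __init__(self):
        self._slot = queue.Queue(maxsize=1)
        self._taken = threading.Semaphore(0)

    def send(self, value):
        self._slot.put(value)
        self._taken.acquire()   # block until recv() has consumed it

    def recv(self):
        value = self._slot.get()
        self._taken.release()   # release the blocked sender
        return value

chan = SyncChannel()

def producer():
    for i in range(3):
        chan.send(i)  # each send waits for the matching recv

threading.Thread(target=producer).start()
received = [chan.recv() for _ in range(3)]
print(received)  # [0, 1, 2]
```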
