15

I often read that parallelism and concurrency are different things. Very often the answerers/commenters go as far as writing that they're two entirely different things. Yet in my view they're related, and I'd like some clarification on that.

For example, if I'm on a multi-core CPU and manage to divide the computation into x smaller computations (say using fork/join), each running in its own thread, I'll have a program that is both doing parallel computation (because supposedly at any point in time several threads are going to run on several cores) and concurrent, right? A minimal sketch of the kind of fork/join decomposition I mean follows (the `SumTask` class, its threshold, and the array-summing problem are just illustrative):
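
```java
import java.util.Arrays;
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

// Splits the sum of an array into halves until each piece is small
// enough to compute sequentially; the pieces run on the pool's threads.
class SumTask extends RecursiveTask<Long> {
    private static final int THRESHOLD = 1_000;
    private final long[] data;
    private final int from, to; // half-open range [from, to)

    SumTask(long[] data, int from, int to) {
        this.data = data;
        this.from = from;
        this.to = to;
    }

    @Override
    protected Long compute() {
        if (to - from <= THRESHOLD) {
            long sum = 0;
            for (int i = from; i < to; i++) sum += data[i];
            return sum;
        }
        int mid = (from + to) >>> 1;
        SumTask left = new SumTask(data, from, mid);
        SumTask right = new SumTask(data, mid, to);
        left.fork();                     // schedule the left half asynchronously
        long rightSum = right.compute(); // compute the right half in this thread
        return left.join() + rightSum;   // wait for the left half, then combine
    }
}

public class ForkJoinDemo {
    public static void main(String[] args) {
        long[] data = new long[1_000_000];
        Arrays.fill(data, 1L);
        long total = ForkJoinPool.commonPool().invoke(new SumTask(data, 0, data.length));
        System.out.println(total); // prints 1000000
    }
}
```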

Whereas if I'm simply using, say, Java and dealing with UI events and repaints on the Event Dispatch Thread, plus running the one thread I created myself, I'll have a program that is concurrent (EDT + GC thread + my main thread, etc.) but not parallel. A minimal Swing sketch of that situation (class and message names are made up): the UI work is queued onto the EDT while a separately created thread runs concurrently with it.
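
```java
import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.SwingUtilities;

public class EdtDemo {
    public static void main(String[] args) {
        // UI creation and repaints are queued onto the Event Dispatch Thread...
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("EDT demo");
            frame.add(new JLabel("Repaints happen on the EDT"));
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.pack();
            frame.setVisible(true);
        });

        // ...while this separately created thread runs concurrently with the
        // EDT; on a single-core machine the two merely interleave.
        new Thread(() -> System.out.println(
                "working on " + Thread.currentThread().getName())).start();
    }
}
```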

I'd like to know if I'm getting this right, and whether parallelism (on a single but multi-core system) always implies concurrency or not.

Also, are multi-threaded programs running on a multi-core CPU, but where the different threads are doing totally different computations, considered to be using "parallelism"?

Cedric Martin
  • See also: [Concurrency vs Parallelism - What is the difference?](http://stackoverflow.com/questions/1050222/concurrency-vs-parallelism-what-is-the-difference) – Theraot Sep 14 '16 at 05:40

3 Answers

15

According to Wikipedia:

Parallel computing is a form of computation in which many calculations are carried out simultaneously, operating on the principle that large problems can often be divided into smaller ones, which are then solved concurrently ("in parallel").

That is, parallelism always implies concurrency.

Also, are multi-threaded programs running on a multi-core CPU, but where the different threads are doing totally different computations, considered to be using "parallelism"?

No. The essence of parallelism is that a large problem is divided into smaller ones so that the smaller pieces can be solved concurrently. The pieces are mutually independent (to some degree at least), but they're still part of the larger problem, which is now being solved in parallel.
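
As a concrete illustration of that division (the answer itself gives no code; this sketch uses Java's parallel streams as one possible example): a single large problem, summing a range of numbers, is split into chunks that are solved concurrently and then combined.

```java
import java.util.stream.LongStream;

public class ParallelSum {
    public static void main(String[] args) {
        // One large problem (summing a range) is divided into chunks that
        // the common fork/join pool solves concurrently, then combines.
        long sum = LongStream.rangeClosed(1, 10_000_000).parallel().sum();
        System.out.println(sum); // prints 50000005000000
    }
}
```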

The essence of concurrency is that a number of threads (or processes, or computers) are doing something simultaneously, possibly (but not necessarily) interacting in some ways. Wikipedia again:

Concurrency is a property of systems in which several computations are executing simultaneously, and potentially interacting with each other.

Joonas Pulakka
  • 4
    Great post. Parallelism is a subset of Concurrency. –  Jul 01 '12 at 12:31
  • 5
    Sorry, but this answer is incorrect. You can definitely have parallelism without concurrency (e.g. bit-level parallelism) and in fact, the two are distinct concepts. Concurrency is about composing independent units of executions whereas parallelism is about simultaneous execution of potentially associated computations. – Kai Sellgren Oct 02 '15 at 13:45
  • @KaiSellgren: Please cite some source to support your statement. – Joonas Pulakka Oct 05 '15 at 07:11
  • The first wikiquote is simply outright wrong. Luckily it was fixed some time ago and now it states correctly that parallelism does not rely on concurrency. – Kai Sellgren Oct 06 '15 at 16:52
  • Indeed, that's what the wiki entry currently says (_it is possible to have parallelism without concurrency (such as bit-level parallelism)_). But I don't get that point at all; isn't bit-level parallelism the most concurrent thing imaginable - more operations performed with fewer instructions? And aren't some of those operations performed within the same instruction, i.e. simultaneously, i.e. concurrently? – Joonas Pulakka Oct 07 '15 at 13:43
  • Concurrency is a property of a system or a program, i.e. a way to compose and build something. Parallelism, however, is just about executing independent actions at the same instant. If you remove parallelism from bit-level parallelism, you end up with serial execution. Therefore, it's easy to conclude there is no concurrency in it. – Kai Sellgren Oct 09 '15 at 17:51
  • Quantitative costs involved with concurrency are throughput and latency. Concurrency is also often used to improve responsiveness. Parallelism is all about throughput, however. I agree that at first sight they seem similar, but they are completely distinct concepts and can co-exist, or exist without one another. – Kai Sellgren Oct 09 '15 at 17:52
  • One of the Go language's authors gave a good talk about this a few years ago; try searching YouTube for it. – Kai Sellgren Oct 09 '15 at 17:53
  • If you write a parallel mapreduce or use SIMD or do some parallel divide-and-conquer, you are only doing parallelism, and not concurrency. I should also note that concurrency is non-deterministic, whereas parallelism is entirely deterministic. – Kai Sellgren Oct 09 '15 at 17:56
  • 1
    I think @KaiSellgren was referring to [Concurrency is not parallelism](https://blog.golang.org/concurrency-is-not-parallelism). The talk says that concurrency is about design and parallelism is about the execution. Now parallelism without concurrency would be something that gets parallelized (for example, by using SIMD) but its design is not concurrent. – Theraot Sep 14 '16 at 05:36
3

Code can be concurrent, but not parallel.

Imagine multiple threads running on a single-core machine. The machine processes only one thread at a time, so there is no parallelism of operations. But because of how the OS schedules threads, each thread still has to assume that all the other threads are running at the same time.
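
A small sketch of that assumption (class and variable names are illustrative): even with only one core, the scheduler can preempt a thread in the middle of a read-modify-write, so shared state needs the same care as if the threads really ran in parallel.

```java
public class InterleavingDemo {
    private static int counter = 0; // shared and deliberately unsynchronized

    public static void main(String[] args) throws InterruptedException {
        Runnable increment = () -> {
            for (int i = 0; i < 100_000; i++) {
                counter++; // read-modify-write; a thread can be preempted mid-way
            }
        };
        Thread a = new Thread(increment);
        Thread b = new Thread(increment);
        a.start();
        b.start();
        a.join();
        b.join();
        // Even with a single core, the scheduler can switch threads between
        // the read and the write, so the result may be less than 200000.
        System.out.println(counter);
    }
}
```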

Euphoric
0

Parallelism simply means doing many tasks simultaneously; concurrency, on the other hand, is the ability of the kernel to make progress on many tasks by constantly switching among processes.

To achieve parallelism efficiently, the system needs multiple cores. Attempting parallelism on a single-core machine incurs a lot of overhead and takes a big hit on performance.

For example, earlier systems had only one core, and CPU schedulers would give an illusion of parallelism by constantly switching between processes, allowing each process to make progress.
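
As a small illustration (the class name is made up), a program can query the core count and size its thread pool to match; with one core the submitted tasks only time-slice, while with several cores they can actually run in parallel:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class CoreCountDemo {
    public static void main(String[] args) {
        int cores = Runtime.getRuntime().availableProcessors();
        System.out.println("hardware threads available: " + cores);

        // Size the pool to the core count: with one core the tasks merely
        // time-slice (concurrency); with several they can run in parallel.
        ExecutorService pool = Executors.newFixedThreadPool(cores);
        for (int i = 0; i < cores; i++) {
            final int id = i;
            pool.submit(() -> System.out.println(
                    "task " + id + " on " + Thread.currentThread().getName()));
        }
        pool.shutdown();
    }
}
```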

Theraot