The difference between “concurrent” and “parallel” execution?

What is the difference between the terms concurrent and parallel execution? I’ve never quite been able to grasp the distinction.

The tag defines concurrency as a manner of running two processes simultaneously, but I thought parallelism was exactly the same thing, i.e.: separate threads or processes which can potentially be run on separate processors.

Also, if we consider something like asynchronous I/O, are we dealing with concurrency or parallelism?


Concurrency and parallelism are two related but distinct concepts.

Concurrency means, essentially, that task A and task B both need to happen independently of each other, and A starts running, and then B starts before A is finished.

There are several ways of accomplishing concurrency. One of them is parallelism: having multiple CPUs working on the different tasks at the same time. But that’s not the only way. Another is task switching, which works like this: task A runs up to a certain point, then the CPU working on it stops and switches over to task B, works on it for a while, and then switches back to task A. If the time slices are small enough, it may appear to the user that both things are being run in parallel, even though they’re actually being processed in serial by a multitasking CPU.
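Task switching of this kind can be sketched with Python generators acting as cooperative tasks and a tiny round-robin scheduler that advances each task one step at a time (the task names and step counts here are made up for illustration):

```python
from collections import deque

log = []

def task(name, steps):
    # A cooperative task: hands control back to the scheduler after each step.
    for i in range(steps):
        log.append(f"{name}{i}")
        yield  # "time slice" over; let another task run

def run(tasks):
    # Round-robin scheduler: advance each task one step, then move to the next.
    queue = deque(tasks)
    while queue:
        current = queue.popleft()
        try:
            next(current)          # run until the task's next yield
            queue.append(current)  # not finished: back of the line
        except StopIteration:
            pass                   # task finished

run([task("A", 2), task("B", 2)])
print(log)  # ['A0', 'B0', 'A1', 'B1'] -- serial hardware, interleaved progress
```

Both tasks make progress within the same time frame (concurrency) even though only one instruction stream is ever executing (no parallelism).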


The two concepts are related, but different.

Concurrency means that two or more calculations happen within the same time frame, and there is usually some sort of dependency between them.

Parallelism means that two or more calculations happen simultaneously.

Put boldly, concurrency describes a problem (two things need to happen together), while parallelism describes a solution (two processor cores are used to execute two things simultaneously).

Parallelism is one way to implement concurrency, but it’s not the only one. Another popular solution is interleaved processing (a.k.a. coroutines): split both tasks up into atomic steps, and switch back and forth between the two.

By far the best known example of non-parallel concurrency is how JavaScript works: there is only one thread, and any asynchronous callback has to wait until the previous chunk of code has finished executing. This is important to know, because it guarantees that any function you write is atomic – no callback can interrupt it until it returns. But it also means that “busy loops” won’t work – you can’t set a timeout and then loop until it fires, because the loop will prevent the timeout callback from executing.
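The same single-threaded behaviour can be mimicked with Python’s asyncio event loop (a stand-in for JavaScript here, since the mechanics are analogous): a scheduled callback cannot fire while a busy loop holds the thread, and only runs once control is yielded back to the loop.

```python
import asyncio

order = []

async def main():
    loop = asyncio.get_running_loop()
    # Schedule a callback, analogous to setTimeout(fn, 0) in JavaScript.
    loop.call_soon(lambda: order.append("timeout fired"))
    # A busy loop: the single-threaded event loop is stuck here,
    # so the callback above cannot run yet.
    for _ in range(100_000):
        pass
    order.append("busy loop done")
    await asyncio.sleep(0)  # yield control back to the event loop
    order.append("after yield")

asyncio.run(main())
print(order)  # ['busy loop done', 'timeout fired', 'after yield']
```

Note how the callback fires only after the busy loop completes and the coroutine yields, exactly the behaviour the answer describes.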


I believe this answer to be more correct than the existing answers, and editing them would have changed their essence. I have tried to link to various sources and Wikipedia pages so that others can confirm its correctness.


Concurrency: the property of a system which enables units of the program, algorithm, or problem to be executed out-of-order or in partial order without affecting the final outcome [1][2].

A simple example of this is consecutive additions:

0 + 1 + 2 + 3 + 4 + 5 + 6 + 7 + 8 + 9 = 45

Due to the commutative and associative properties of addition, the order of these can be rearranged without affecting correctness; the following arrangement will result in the same answer:

(1 + 9) + (2 + 8) + (3 + 7) + (4 + 6) + 5 + 0 = 45

Here I have grouped numbers into pairs that will sum to 10, making it easier for me to arrive at the correct answer in my head.

Parallel Computing: a type of computation in which many calculations or the execution of processes are carried out simultaneously [3][4]. Thus parallel computing leverages the property of concurrency to execute multiple units of the program, algorithm, or problem simultaneously.

Continuing with the example of consecutive additions, we can execute different portions of the sum in parallel:

Execution unit 1:  0 + 1 + 2 + 3 + 4 = 10
Execution unit 2:  5 + 6 + 7 + 8 + 9 = 35

Then at the end we sum the results from each worker to get 10 + 35 = 45.

Again, this parallelism was only possible because consecutive additions have the property of concurrency.

Concurrency can be leveraged by more than just parallelism though. Consider pre-emption on a single-core system: over a period of time the system may make progress on multiple running processes without any of them finishing. Indeed, your example of asynchronous I/O is a common example of concurrency that does not require parallelism.


Confusion

The above is relatively straightforward. I suspect people get confused because the dictionary definitions do not necessarily match what was outlined above:

  • Concurrent: occurring or existing simultaneously or side by side [5].
  • Concurrency: the fact of two or more events or circumstances happening or existing at the same time (from Google: “define: concurrency”).

The dictionary defines “concurrency” as a fact of occurrence, whereas in the computing vernacular it is a latent property of a program, algorithm, or system. Though related, these things are not the same.


Personal Recommendations

I recommend using the term “parallel” when the simultaneous execution is assured or expected, and to use the term “concurrent” when it is uncertain or irrelevant if simultaneous execution will be employed.

I would therefore describe simulating a jet engine on multiple cores as parallel.

I would describe Makefiles as an example of concurrency. Makefiles state the dependencies of each target. When targets depend on other targets this creates a partial ordering. When the relationships and recipes are comprehensively and correctly defined this establishes the property of concurrency: there exists a partial order such that order of certain tasks can be re-arranged without affecting the result. Again, this concurrency can be leveraged to build multiple rules simultaneously but the concurrency is a property of the Makefile whether parallelism is employed or not.
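A minimal, hypothetical Makefile illustrates this: `app` depends on two object files that do not depend on each other, so the dependency graph is only a partial order.

```make
# foo.o and bar.o are unordered with respect to each other;
# only the link step must come after both.
app: foo.o bar.o
	cc -o app foo.o bar.o

foo.o: foo.c
	cc -c foo.c

bar.o: bar.c
	cc -c bar.c
```

Running `make` builds the objects serially in some valid order; `make -j2` exploits the very same graph to build foo.o and bar.o simultaneously. The concurrency lives in the Makefile either way.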

Concurrent execution is the generalized form of parallel execution. For example, every parallel program can also be called concurrent, but the reverse is not true.

  1. Concurrent execution is possible on a single processor (multiple threads, managed by the scheduler).
  2. Parallel execution is not possible on a single processor; it requires multiple processors (one process per processor).

For details, read the research paper Concepts of Concurrent Programming.


Parallel processing is a subset of concurrent processing.

Concurrent processing describes two tasks occurring asynchronously, meaning the order in which the tasks are executed is not predetermined. Two threads can run concurrently on the same processor core by interleaving executable instructions: for example, thread 1 runs for 10 ms, then thread 2 runs for 10 ms, and so on.
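That unpredictable ordering can be seen with two ordinary Python threads (the 10 ms slices above are illustrative; the real scheduler picks its own slice boundaries, so the exact interleaving varies from run to run):

```python
import threading

log = []

def worker(name, steps):
    for i in range(steps):
        log.append(f"{name}:{i}")  # appends from the two threads may interleave

t1 = threading.Thread(target=worker, args=("A", 3))
t2 = threading.Thread(target=worker, args=("B", 3))
t1.start()
t2.start()
t1.join()
t2.join()
print(log)  # all six entries are present, but their order is not predetermined
```

Every run produces the same set of entries; only the interleaving differs, which is exactly what “the order is not predetermined” means.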

Parallel processing is a type of concurrent processing where more than one set of instructions is executing simultaneously. This could be multiple systems working on a common problem as in distributed computing, or multiple cores on the same system.

Obviously, the terms are used differently in different cultures.

My understanding is the following:

Parallelism is a way to speed up processing. Whether you do matrix multiplication on a single core, on multiple cores, or even on the GPU, the outcome is the same (or else your program is broken). It doesn’t add new functionality to the program, just speed.

Concurrency, in contrast, is about things you couldn’t do sequentially: for example, serving 3 different webpages at the same time to 3 clients while waiting for the next request. (Though you could simulate this to some degree through interleaving, as was done in the old days.)
Note that the behaviour of concurrent programs is nondeterministic. It is, for example, not clear which of the 3 clients will be completely served first. You could run many tests and get a different result each time regarding the order in which the requests finish. The run-time system should guarantee that (a) all clients will be served and (b) within a reasonable amount of time.
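A small sketch of this nondeterminism, with a thread pool standing in for the web server and a random sleep standing in for each client’s I/O (the client names and delays are made up for illustration):

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor, as_completed

def serve(client):
    time.sleep(random.uniform(0, 0.01))  # simulated I/O wait
    return client

with ThreadPoolExecutor(max_workers=3) as pool:
    futures = [pool.submit(serve, c) for c in ("client 1", "client 2", "client 3")]
    finish_order = [f.result() for f in as_completed(futures)]

print(finish_order)  # the completion order can differ from run to run
```

All three clients are always served; which one finishes first is decided at run time, not by the program text.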

Usually, the workhorse of a parallel computation isn’t aware of, nor does it care about, parallelism. Concurrent tasks, by contrast, often explicitly employ inter-process or inter-thread communication, such as blocking queues, synchronization, and locking mechanisms.

In my opinion, from an application programming perspective there is no difference between these two concepts and having two words is confusing for confusion’s sake. I think thread interleaving was brought about to simulate multicore processing back in the days when multicore wasn’t a possibility. Why do we have a word for this outdated mindset?

Mason Wheeler and Penguin have given the same answer: one core with task switching and/or multiple cores is concurrent; strictly multicore = parallel.

My opinion is that these two terms should be rolled into one, and I make an effort to avoid saying “concurrent”. I guess at the OS programming level the distinction is important, but from the application programmer’s perspective it doesn’t matter too much. I’ve written MapReduce, Spark, MPI, CUDA, OpenCL, and multithreaded C++, and I’ve never had to stop and think about whether the job is running with interleaved threads or on multiple cores.

For example, when I write multithreaded C++ I’m sometimes not sure how many cores I’ll get, though there are ways to make demands on how many cores you get, as described here: https://stackoverflow.com/questions/2166425/how-to-structure-a-c-application-to-use-a-multicore-processor . In Spark I just do map and reduce operations and have no idea how the JVM is handling them at the hardware level. On GPUs I think every thread is assigned to its own simple processor, but I always sync my threads wherever a problem might arise. With MPI the communication between machines is specified explicitly, but we could interleave the functions running on multiple machines on a single core and combine the results via an appropriate single-threaded function. And what if we use MPI to coordinate a bunch of single-core machines, each one with multithreading? What difference does it make? I’d say none. Call it all “parallel” and be done with it.


tdammer’s statement comes close; the rest is all beside the point. He says:

“Put boldly, concurrency describes a problem (two things need to happen together), while parallelism describes a solution (two processor cores are used to execute two things simultaneously).”

Let’s just analyse the words.

Current means happening now, actual, relevant at this moment.
Con means against, counter, not aligning with.

Parallel means in the same direction without crossing, without being in each other’s way.

So, concurrency implies competing for the same resource. Parallelism does not. Parallel processes may be using the same resource but it is not considered a problem, it is not an issue. With concurrency, it is an issue to be dealt with.


Another common and specific use of the term “parallel” refers to array processors. (The GPU in your computer is a prime example of this.) Here, massively redundant computing units literally compute many results at the same instant.

In this context, “concurrency” is a little looser: we humans might say that over the course of the last second many tasks were worked-on, thus they were “handled ‘concurrently,'” although it may be the case that at any particular nanosecond only one of them was actually being worked on.

But also: “it depends.” The two words have the same colloquial meaning in common human conversation. So, you might need to ask for clarification to determine if some more-precise meaning was or was not intended.
