I've tried to keep this in mind: "Concurrency is a software mechanism and parallelism is a hardware concern."
thanks
They're both achieved through software
I think what he meant is that parallelism is done by CPU @@mindfulnessrock
@@mindfulnessrock If the hardware doesn't support it, no amount of software will help.
These diagrams helped me to visualise it better
Concurrency:
_ _
_ _ _
Parallelism:
_ _ _ _ _
_ _ _ _ _
i know this is maybe off-topic, but can you make a video about distributed systems and perhaps MPI ?
Keep uploading ... These videos are very insightful !
Glad to see you putting out more videos!
Amazing, tried to understand this a few months ago when my friends were experimenting with go
Finally understood it now, concurrency is achieved when the processor switches between tasks really quickly and parallel is basically when we do two tasks at the same time, could be me and my friend parallel writing down notes of different lectures.
Hey Jacob, really enjoy your videos. Could you make a video about RISC-V?
Multiprocessor execution surely does not imply parallelism. While I have 16 physical cores, I also have 587 processes running, so the OS has to resort to scheduling (what you define as concurrency).
Unless only one of the cores is working at each moment, surely you're executing more than one process at the same time, and thus have parallel execution?
You only get real parallelism in multicore bare metal processors and FPGA/ASIC. Everything else is concurrent or concurrently parallel. This is because there is a clear separation of concerns between the OS operation and software you develop on top of OS functionalities.
My interpretation has always been the same. I always think of "parallel processors" which requires more than one core. 🙂
Loved this explanation
This is the way I've learned and understood it as well, granted I've had to double check and google it a few times when I forgot 😁
Well explained... thank you !!!
The way I think of it is that parallelism is the parent of concurrency. That is, parallelism deals with multi-threading and is constrained by hardware implementations; to process tasks in parallel effectively, the tasks run at the same time and should be divided into units that are as atomic as possible, accessing no shared (or unprotected/unlocked) data simultaneously. Concurrency may or may not timeslice between tasks, depending on the task partitioner, hyperthreading, etc., but it does not concern itself with the partitioning of tasks/data; concurrency IS running things, while parallelism is the identification of how things can be run, concurrently, in a thread-safe manner.
It's been a while since I worked with the System.Threading namespace, but that's how I remember it.
Hi Jacob,
For the diagram at 3:25, can an application be parallel and not concurrent? That would contradict the diagram.
"An application can be parallel but not concurrent, which means that it processes multiple sub-tasks of a single task at the same time."
Great question! I think that's basically why threads were created: some tasks have multiple subtasks inside, and those subtasks are what actually interest you, not the big task. So you create threads of execution that run concurrently, and ideally in parallel on multithreaded processors.
Now, I guess if you have multiple cores with multiple hardware threads each, your processes are running in parallel, with software threads running concurrently on each individual hardware thread of the processor. For example, a processor with 2 cores and one thread of execution each can run one thread of one process on one core and a thread of another process on the other core, while the remaining threads execute concurrently, waiting for their turn on a core.
An example would help me understand this better
My summary of this video:
* Concurrency means: overlap in their turnaround periods
* Parallelism means: overlap in their active running time
Have you seen "Modern C and what we can learn from it"? I wonder what you think of it, as it's quite different from the usual way you'd use C.
Could you do a video on atomics sometime? I've only ever used mutexes for making multithreaded code safe and it seems like it's a bit limiting in certain circumstances
They did it :)
Check the latest video, "Making variables atomic in C".
If two completely independent tasks are run in parallel, is that (still) concurrent?
(Referring to the last diagram.)
How do we get to redefine English words to mean what we want? The definition of ‘concurrent’ says “two or more things happening at the same time.” 10/10 would fail this quiz question and be okay with it.
Indeed. Redefining imprecise natural language to meet our specific needs is kind of what we tech people do.
@@JacobSorber :(
@@JacobSorber tech language is so bad
is async computation concurrency?
Yes
@@The-Real-Jack This is incorrect. Asynchrony has nothing to do with concurrency in itself. Asynchrony is merely the absence or privation of synchrony.
It can be both based on the programmer's decision. It can run on a separate hardware thread or it can run on the same thread sequentially.
Async isn't necessarily "true" concurrency in the sense of multithreaded concurrency (though it can be), but even in the case of single-threaded async, it allows you to unblock further events and actions while your other instructions continue executing, practically allowing things to happen at the same time, even if they're not technically truly simultaneous. Whether you consider that concurrent or not is mostly semantics.
Yes and no, because the distributed task doesn't have to be run immediately.
Instead of using synonyms to confuse people, why didn't they just call "a single core juggling several processes" what it is: multitasking, as that is what it is called when a person does it. Or does that mean something else entirely in CS?
Do you know what a semaphore is?
Can two threads in the same process run on different cores and achieve parallelism? ❤
I'm downloading a file in my web browser and meanwhile I'm watching your YouTube video. Are both happening in parallel or concurrently?
It depends on your hardware and OS. If you're using a modern multicore system then most probably they're running in parallel. Especially since those two processes have nothing to do with each other and are independent.
DO NOT USE THESE DEFINITIONS
this is a REALLY, REALLY bad relationship of names and it causes everyone tangible harm to advocate these names for these concepts. "parallel" and "concurrent" in the context of time are synonymous in the English language, and so using "concurrent" as the name for something distinctly different from "parallel" is REALLY, REALLY confusing and semantically misleading. call this concept of rapid OS context switching "scheduling", "round robin", "taking turns", "context switching", "time sharing", "time-slicing", "time-rationing", "focus-swapping", "pseudo-parallelism", "virtual parallelism", "interleaving", "task dithering", "sequential sharing", or literally ANYTHING else.
PLEASE stop giving things horrible names. PLEASE. computer science has way too fucking many bad names already. "concurrent" is literally THE word in the English language for when multiple things are happening at the same time. PLEASE do not use that word to refer to something where multiple things are explicitly NOT actually happening at the same time.
this kind of thoughtless, rote repetition of extremely stupid traditions is exactly what's wrong with so much of programming.
Programmers use terms in set X, mathematicians use set Y, and "civilians" use set Z, and never the twain shall meet. That's life; get used to it and grow up, or get out while you can. Another thing that'll trip you up is thinking that terms stay the same for all time, because human beings can and will change the dictionary as often as they can. Go back to the 1950s and you'll more or less be able to have a casual conversation with someone, but go back another 70 years and you might not understand anyone. Something to ponder: if the average is at a specific level, then roughly half, give or take, will be below that average.
TF
Completely agree. Computer Science uses trash language
Thank you so much for your awesome videos.
Would you do a series on Rust?
I hope not. No one should be using it.