Clear and detailed explanation. Many thanks for your work.
glad you like it!
Very Impressive !!
Excellent work!!
concise, very clear, easy to understand examples
Keep going like this. You are doing great.
Thank you!
The explanation was really good; I now feel like I understand it somewhat better than before.
Good job.
Remember: the code will not run on Windows without
if __name__ == '__main__':
    from multiprocessing import freeze_support
    freeze_support()
Here is the code:

from multiprocessing import Process
import os
import time

def worker(name):
    print(f"Process {os.getpid()} is running on {name}")
    time.sleep(0.1)

if __name__ == '__main__':
    from multiprocessing import freeze_support
    freeze_support()

    processors = []
    num_processors = os.cpu_count()
    if num_processors is None:
        num_processors = 1

    # create processes
    for i in range(num_processors):
        p = Process(target=worker, args=["Feto"])
        processors.append(p)

    # start each process
    for p in processors:
        p.start()

    # join the processes so the main process waits for all of them to finish before exiting
    for p in processors:
        p.join()
very well😀
Excellent video
Thank you!
Hi! Just a bit more! How can I get the result value of a function executed in a Process? Or maybe I can write to a global list from the function executed in a process?
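A plain global list won't work here, because each process has its own memory space and changes are not visible to the parent. A minimal sketch of one common approach, passing results back through a multiprocessing.Queue (the square function and the small range are just placeholders):

from multiprocessing import Process, Queue

def square(n, queue):
    # put the result into the queue instead of returning it
    queue.put(n * n)

if __name__ == '__main__':
    queue = Queue()
    processes = [Process(target=square, args=(i, queue)) for i in range(4)]
    for p in processes:
        p.start()
    # drain the queue before joining; get() blocks until a result arrives
    results = [queue.get() for _ in processes]
    for p in processes:
        p.join()
    print(results)

A multiprocessing.Manager().list() would also work if you prefer the shared-list style.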
It's a great explanation, keep it up.
Thank you!
Thanks! Great explanation.
Glad you enjoyed it!
How do you open the activity manager? I don't understand that process.
I'm trying to wrap my head around multiprocessing. I have some code that seems ideal to run in parallel. I have a large number of files (500) that I loop through one at a time. Within each loop, a series of functions acts on the file, modifying it, and at the end of each loop a new file is created (think of a factory where raw material goes in and a finished product comes out at the end of the process). Using your code as a template, where would I loop through the 500 files (substituting square_numbers for my functions/methods...)? Thank you!
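One way to structure this is to let a process pool hand out the files instead of looping over them yourself. A minimal sketch, where process_file and the file list are placeholders for your own functions and paths:

from multiprocessing import Pool

def process_file(path):
    # placeholder: your own functions would read the file, modify it,
    # and write the finished product here
    return path + '.out'

if __name__ == '__main__':
    files = [f'raw_{i}.dat' for i in range(500)]  # placeholder paths
    with Pool() as pool:  # defaults to one worker per CPU core
        for finished in pool.imap_unordered(process_file, files):
            print('done:', finished)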
Question:
Why not
for p in processes:
    p.start()
    p.join()
?
Because then your processes do NOT run in parallel. By calling join() inside the loop you wait until that process terminates before the next one is even started, and your main thread is blocked in the meantime...
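A minimal sketch of the difference; the work function just sleeps so the timing is easy to see:

from multiprocessing import Process
import time

def work():
    time.sleep(1)

if __name__ == '__main__':
    # start + join inside one loop: each process must finish before the next starts,
    # so 4 workers take about 4 seconds
    start = time.time()
    for _ in range(4):
        p = Process(target=work)
        p.start()
        p.join()
    print('start and join in one loop:', round(time.time() - start, 1), 's')

    # start all first, then join all: the workers run in parallel,
    # so 4 workers take about 1 second
    start = time.time()
    processes = [Process(target=work) for _ in range(4)]
    for p in processes:
        p.start()
    for p in processes:
        p.join()
    print('start all, then join all:', round(time.time() - start, 1), 's')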
I am using macOS with a 6-core Intel i5, so I have 6 threads.
The code in the video would not work in my PyCharm IDE
using Python 3.9. It gave me MANY errors. I fixed them
using "if __name__ == '__main__'" and a def main():
_____________________________________________________
from multiprocessing import Process
import os
import time

def square_numbers():
    for i in range(100):
        i * i
        time.sleep(0.1)

def main():
    processes = []
    num_processes = os.cpu_count()

    for i in range(num_processes):
        p = Process(target=square_numbers)
        processes.append(p)

    for p in processes:
        p.start()

    for p in processes:
        p.join()

    print('end main')

if __name__ == '__main__':
    main()
Thank you!
It gives an error at the end of execution saying an attempt has been made to start a new process before the current process has finished its bootstrapping phase.
I don't understand whether p.start() will call the function square_numbers 4 times (the number of your CPUs), or partition the for loop inside square_numbers into 4 pieces and run each one on a CPU.
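It is the former: each Process runs the whole square_numbers function independently, so the full loop executes once per process. If you want to partition the work, you have to give each process its own slice explicitly. A minimal sketch, with the chunking kept deliberately simple (any remainder is ignored):

from multiprocessing import Process
import os

def square_numbers(start, end):
    for i in range(start, end):
        i * i

if __name__ == '__main__':
    num_processes = os.cpu_count() or 1
    chunk = 100 // num_processes
    processes = []
    for n in range(num_processes):
        # each process squares only its own range of numbers
        p = Process(target=square_numbers, args=(n * chunk, (n + 1) * chunk))
        processes.append(p)
    for p in processes:
        p.start()
    for p in processes:
        p.join()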
if __name__ == "__main__": is needed in the program on Windows.
It's weird: on my Windows PC, when I run the program, 'end main' at the end of the program gets printed 9 times. Could you please check and comment?
For me, num_processes is 8 (I think that is 8 logical cores).
from multiprocessing import Process
import os
import time

def square_numbers():
    for i in range(100):
        i * i
        time.sleep(0.1)

if __name__ == "__main__":
    processes = []
    num_processes = os.cpu_count()

    # create processes
    for i in range(num_processes):
        p = Process(target=square_numbers)
        processes.append(p)

    # start
    for p in processes:
        p.start()

    # join
    for p in processes:
        p.join()

print('end main')
Yes: you manually create 8 processes, and on Windows each child process re-imports this file when it starts, so module-level code runs again in every child, plus once in your main Python process, which gives 9 prints. The print('end main') should be placed inside the if __name__ == '__main__' block, because that block runs only in the main process.
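A sketch of the corrected structure, with the print moved under the guard so it runs only once:

from multiprocessing import Process
import os
import time

def square_numbers():
    for i in range(100):
        i * i
        time.sleep(0.1)

if __name__ == '__main__':
    processes = [Process(target=square_numbers) for _ in range(os.cpu_count() or 1)]
    for p in processes:
        p.start()
    for p in processes:
        p.join()
    print('end main')  # printed once, by the main process only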
@patloeber Thank you
thx bro
Have you investigated using CPU affinity with multiprocessing? I am using Windows 10 and am trying to get cpu affinity to work to set a specific core to run only a specific process.
I really enjoyed your videos.
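One common approach on Windows is the third-party psutil package, which exposes a cpu_affinity() method for a running process. A minimal sketch, assuming psutil is installed (pip install psutil) and that core 0 is the one you want:

from multiprocessing import Process
import time
import psutil

def work():
    time.sleep(5)

if __name__ == '__main__':
    p = Process(target=work)
    p.start()
    # pin the child process to CPU core 0; its pid only exists after start()
    psutil.Process(p.pid).cpu_affinity([0])
    p.join()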
Hey, great video.
I think it's important to say that although the interfaces are so similar in Python, from the OS standpoint these two are immensely different things.
Coming from .NET, where the API is drastically different, which helps to grasp/guess the distinction too.
Also, it's really interesting how join() on a process works on Windows; is it WinAPI's WaitForSingleObject() or something alike? Will read about it.
How can there be a race condition if only one thread executes at a time?
I have a written explanation on my website: www.python-engineer.com/blog/advancedpython16_threading#Race-condition
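In short: the GIL only guarantees that one thread executes Python bytecode at a time, but a thread can be paused between reading a shared value and writing it back, so another thread can sneak in with a stale copy. A sketch of the effect (the sleep just stands in for some processing time and makes the thread switch happen reliably):

from threading import Thread
import time

database_value = 0

def increase():
    global database_value
    local_copy = database_value   # read the shared value
    local_copy += 1               # modify the local copy
    time.sleep(0.1)               # the other thread runs here with the old value
    database_value = local_copy   # write back a now-stale result

if __name__ == '__main__':
    t1 = Thread(target=increase)
    t2 = Thread(target=increase)
    t1.start()
    t2.start()
    t1.join()
    t2.join()
    print(database_value)  # prints 1, not 2: one increment was lost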
Thank you very much for explaining why the GIL is used. It was very helpful.
Glad you like it!
Excellent video! Btw, which IDE and color theme are those? Thanks!
thanks! In this old video it's VS Code and Monokai theme
Can I ask some questions...
How many threads should we create?
How many threads can we create?
Please answer my questions.
Getting "PermissionError: [WinError 5] Access is denied"
Try to run the terminal as administrator, or move the file to a folder where you have full access rights
How to avoid broken process pool in multiprocessing?
After watching this video, I got confused, and the other videos I watched went to trash along with this one.
Sorry if it was not clear enough. It’s a difficult topic
Change the name for this video series. This is beginner stuff. Don't lie to people by calling it advanced. You lose your credibility when you LIE to people right from the start.