How FastAPI Handles Requests Behind the Scenes
- Published Mar 29, 2024
- Unleash the power of FastAPI! Discover how Asyncio and blocking I/O impact performance. Learn to handle requests concurrently for a blazing-fast API! In this video, we explore FastAPI's handling of concurrent requests. We'll compare Asyncio vs Blocking methods and see how normal functions differ. Understand when to use each approach for optimal performance. Optimize your FastAPI application and handle more requests efficiently. Subscribe for more FastAPI deep dives!
fastapi handle concurrent requests
fastapi async vs def performance
fastapi asyncio.sleep vs time.sleep
improve fastapi performance
fastapi endpoint order of execution
fastapi async def blocking io
fastapi thread pool vs event loop
fastapi best practices for concurrency
how to use asyncio in fastapi for database calls
fastapi handle multiple requests at once efficiently
choose async def vs def in fastapi endpoints
fastapi concurrency with external api calls
#FastAPI #Asyncio #Python #WebDevelopment
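The description contrasts asyncio with blocking I/O. A minimal stdlib sketch (no FastAPI required) shows why awaiting `asyncio.sleep` lets simulated requests overlap, whereas calling `time.sleep` inside `async def` would serialize them; `async_endpoint` is a hypothetical stand-in for an endpoint handler:

```python
import asyncio
import time

# Stand-in for an "async def" endpoint that awaits non-blocking I/O:
# while one request sleeps, the event loop serves the others.
async def async_endpoint(n: int) -> int:
    await asyncio.sleep(0.2)  # non-blocking: yields control to the loop
    return n

async def main() -> float:
    start = time.perf_counter()
    # Five concurrent "requests" finish in ~0.2 s, not 1.0 s.
    results = await asyncio.gather(*(async_endpoint(i) for i in range(5)))
    elapsed = time.perf_counter() - start
    print(results, f"{elapsed:.2f}s")
    return elapsed

if __name__ == "__main__":
    asyncio.run(main())
```

Replacing the `await asyncio.sleep(0.2)` with `time.sleep(0.2)` would block the loop and push the total toward one second, which mirrors the blocking-I/O pitfall discussed in the video.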
Great content brother
Quick correction: the sync router gives concurrency, not parallelism. In Python, parallelism is achieved only with multiprocessing.
Great video, but I think it’s important to mention that multi-threading in Python is not parallel
Multi processing*
@@bakasenpaidesu Actually you are wrong; multiprocessing is parallel because Python spawns an entirely new process with an entirely new interpreter.
Multi threading in python is technically also parallel programming whenever a thread releases the GIL, such as during time.sleep or open calls. In those specific instances, there can be two (or more) threads truly executing in parallel in the same python process, because one thread is waiting for the results of a system call, during which time it releases the GIL since reference counts don’t need to be updated, allowing another thread to acquire the GIL and execute a piece of code in parallel.
On this topic also: check out the beta of Python 3.13! There is a flag that can be passed when launching Python that removes the GIL, allowing truly parallelized execution with just threads. I've been playing around with asyncio and concurrent.futures.ThreadPoolExecutor -> noticeable speed-up. Shame that C-extension-based libraries like numpy are unusable with this setting.
@@benshapiro9731 This is called concurrency, not parallelism. Parallelism is when two or more tasks can run simultaneously, using the CPU, without waiting for I/O-bound operations. While the GIL doesn't prevent Python from switching context between threads waiting for I/O-bound operations, this is still considered concurrency, not parallelism.
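Whatever label one prefers, the overlap this sub-thread debates is easy to observe: while a thread waits inside a system call such as `time.sleep`, it releases the GIL, so other threads proceed. A minimal sketch:

```python
import threading
import time

def blocking_call() -> None:
    time.sleep(0.3)  # releases the GIL while waiting on the OS

start = time.perf_counter()
threads = [threading.Thread(target=blocking_call) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.perf_counter() - start
# The four 0.3 s waits overlap: total is ~0.3 s, not 1.2 s.
print(f"{elapsed:.2f}s")
```

If `blocking_call` instead did pure-Python CPU work, the GIL would keep the threads from overlapping and the total would be roughly the sum of the individual runtimes.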
OMG! This is so helpful and a great video. Thank you and please post more videos like this!
Great explanation, you should create more videos bro...
Nice explanation. Concise and to the point.
Hello, great video. I am super new to Python and recently started using it for a backend built with FastAPI. Can you please tell me some concepts/topics I should be aware of for writing efficient code?
For example, from this video I learnt which functions run concurrently and which don't; I honestly had no idea about this before.
If you could spare a few minutes and share some things I should know, it would be great.
Thanks for clearing this concept.
Beautifully explained!
Thank you for this, I always wondered the difference between async def and def
Clearly explained!! Thank you
My question would be how FastAPI then manages the workload once it's handed over to the worker thread, because I can only see one worker thread running while it handles 40 workloads concurrently.
I need some help. I want to create a FastAPI endpoint that calls a synchronous function with a lot of blocking I/O operations, but I want the endpoint to run asynchronously so it can accept many requests at the same time. How should I do this? Is there an alternative approach?
The only way to achieve that is to use multi-threading, which I advise against... instead, make the function asynchronous and try to find a non-blocking equivalent for what you want to do...
Better still, use the run_in_threadpool function from FastAPI to run the process in a different thread so that you don't block the event loop... better than implementing multi-threading on your own.
@@Praise-rs4mc thanks a lot, I will give it a try.
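FastAPI's `run_in_threadpool` is re-exported from Starlette; the standard library offers the same idea as `asyncio.to_thread`. A minimal stdlib sketch of the pattern suggested above, where `blocking_io` is a hypothetical stand-in for the commenter's synchronous function:

```python
import asyncio
import time

def blocking_io(x: int) -> int:
    time.sleep(0.2)  # stand-in for a blocking call (file, socket, driver)
    return x * 2

async def endpoint() -> list[int]:
    # Offload the blocking calls to worker threads so the event loop
    # stays free to accept other requests in the meantime.
    return await asyncio.gather(
        *(asyncio.to_thread(blocking_io, i) for i in range(3))
    )

print(asyncio.run(endpoint()))  # [0, 2, 4]
```

Inside a real FastAPI app, the same body would live in an `async def` route, with `run_in_threadpool(blocking_io, i)` playing the role of `asyncio.to_thread`.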
Great video. Thanks
Very well explained.
def endpoint3() is not running in parallel for me, as opposed to what you said in the video. Instead it is running one at a time. Do you know why?
I believe you are testing APIs in the browser. Sometimes, browsers like Chrome have limitations on making parallel requests to the same URL. In the video, if you look closely, I am using two different browsers to hit the same API in parallel. You can try the same approach.
@@codecollider Yes you are right. I tried from different browsers and it worked. Strange though. Thanks
Thanks, man, for the video. I am trying to use FastAPI for DB CRUD; which one do you think I should use for GET, POST, PUT, and DELETE?
It depends on whether your database library supports non-blocking queries. Ideally, for endpoints involving database calls, use async def if your library allows awaiting query execution (like await db.execute()).
If you're using SQLAlchemy, it provides both blocking and non-blocking methods for queries. It's generally recommended to use the non-blocking approach for better performance.
@@codecollider thank you. Everything I write is normal, non-async, and I am using the sqlite3 package. I think I will use a normal def for all of it, since they all run concurrently, and the locking of reads and writes on the database is handled by SQLite itself.
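As discussed above, FastAPI runs plain `def` endpoints in a threadpool, so a blocking sqlite3 query does not stall the event loop. A minimal sketch of such a query (in-memory database and `items` table are made up for illustration):

```python
import sqlite3

def get_items(conn: sqlite3.Connection) -> list:
    # Blocking query: fine inside a plain `def` endpoint, because
    # FastAPI executes such endpoints in a worker thread.
    return conn.execute("SELECT id, name FROM items ORDER BY id").fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO items (name) VALUES (?)", [("a",), ("b",)])
print(get_items(conn))  # [(1, 'a'), (2, 'b')]
```

Note that a sqlite3 connection is bound to the thread that created it by default; in a threadpool setting you would open a connection per request (or pass `check_same_thread=False` with your own locking).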
great video
❤
Can you make a video on how FastAPI runs under the hood and how @app.exception_handler works? Thanks, awesome content!