LangChain Streaming - stream, astream, astream_events API & FastAPI Integration
- Published Jul 8, 2024
- In this video I will show you how to perform streaming with LangChain. I will also show you the astream_events API. At the end I will show you how you can integrate both approaches with FastAPI.
Code: github.com/Coding-Crashkurse/...
Timestamps:
0:00 Introduction
0:27 stream & astream
2:00 astream_events API
4:52 FastAPI Integration (StreamingResponse)
7:42 Frontend Events
9:16 astream_events with FastAPI
#langchain #streaming #fastapi
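
For reference, a minimal sketch of the two streaming interfaces the video starts with, assuming the langchain-openai package and an OpenAI API key are available; the prompt and model name are just placeholders:

```python
import asyncio

from langchain_openai import ChatOpenAI

model = ChatOpenAI(model="gpt-4o-mini")

# Synchronous: .stream() yields message chunks as they arrive.
for chunk in model.stream("Tell me a joke about streaming"):
    print(chunk.content, end="", flush=True)

# Asynchronous: .astream() is the same idea inside an event loop.
async def main():
    async for chunk in model.astream("Tell me a joke about streaming"):
        print(chunk.content, end="", flush=True)

asyncio.run(main())
```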
This is so good! Thank you so much.
Wonderfully done
Thank you
Thanks a lot !! Love your stuff (:
Thank you
This is so cute
Spent so many hours on this a couple of weeks ago until I got it running. I think this video will help a lot of people!
One question on this: for astream_events, can you also include the source documents on "on_chat_model_end"? At the moment I have a second FastAPI endpoint which returns the source docs, but I think it would be more efficient to include them at the end of a streaming response.
When you know how to do it, it's not that hard actually :)
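
One possible way to do what the question above asks (a sketch, not code from the video): stream the tokens via astream_events and append the source documents once the model is done. The retrieval_chain here is assumed to be built elsewhere, e.g. with create_retrieval_chain:

```python
import json

from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()

retrieval_chain = ...  # ASSUMED: a retrieval chain (retriever + model), built elsewhere

async def tokens_then_sources(question: str):
    sources = []
    # version="v2" matches recent LangChain releases; event payload
    # shapes differ slightly between event-API versions.
    async for event in retrieval_chain.astream_events({"input": question}, version="v2"):
        if event["event"] == "on_chat_model_stream":
            # Token-by-token output from the model.
            yield event["data"]["chunk"].content
        elif event["event"] == "on_retriever_end":
            # In v2 the retriever output is the list of Documents.
            sources = event["data"]["output"]
    # Once the model is done, append the sources as one final JSON line.
    yield "\n" + json.dumps(
        [{"source": d.metadata.get("source"), "text": d.page_content} for d in sources]
    )

@app.get("/chat")
async def chat(question: str):
    return StreamingResponse(tokens_then_sources(question), media_type="text/plain")
```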
Yes, not that hard, you are right. I think there is a little mistake in the script. In "app_events.py" you are streaming the "model" and not the "retrieval_chain", or is this intended?
Can you tell me how to get the data source as output? I want the document name and the text from the document it referred to. I can't figure out how to access that.
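
For anyone with the same question, a small sketch: each retrieved result is a LangChain Document, and the name and text live in its metadata and page_content fields (the retriever here is assumed):

```python
retriever = ...  # ASSUMED: any LangChain retriever, e.g. vectorstore.as_retriever()

docs = retriever.invoke("your question")
for doc in docs:
    print(doc.metadata.get("source"))  # usually the file name or path
    print(doc.page_content)            # the text the answer referred to
```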
Great video! How could I do this with ConversationalRetrievalChain to maintain memory & the vector DB?
You can learn this in other videos from me, like the LCEL video, which is the basis for pretty much everything
I think we can only use streaming with OpenAI? I wish you could do this with a local model like Llama 2 without using OpenAI.
No, other models also offer streaming, not only OpenAI. Llama 3 SHOULD offer it, but I am not totally sure.
Can you tell me how to do streaming for a ReAct agent with a vector DB retriever tool and an internet tool?
They all share the same interface; streaming works the same for all custom runnables.
@codingcrashcourses8533 But I'm getting intermediate steps instead, not a token-by-token answer like with a runnable chain.
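
For the token-by-token case described above, one common pattern is to filter astream_events for the chat-model chunks instead of calling .stream() on the agent directly; agent_executor is assumed to be built elsewhere:

```python
import asyncio

agent_executor = ...  # ASSUMED: a ReAct-style AgentExecutor with its tools, built elsewhere

async def main():
    async for event in agent_executor.astream_events(
        {"input": "search the web for LangChain streaming"}, version="v2"
    ):
        # .stream() on an agent yields intermediate steps; filtering the
        # event stream for chat-model chunks gives token-by-token output.
        if event["event"] == "on_chat_model_stream":
            print(event["data"]["chunk"].content, end="", flush=True)

asyncio.run(main())
```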
How can we know if a model supports streaming or not?
You have to read the docs. No way around that
Hi, nice one! How can I stream using Flask, not FastAPI? Can you please provide the code?
flask.palletsprojects.com/en/2.1.x/patterns/streaming/
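
Following the streaming pattern from those Flask docs, a minimal sketch; Flask is synchronous, so this uses stream() rather than astream(), and the model name is a placeholder:

```python
from flask import Flask, Response, stream_with_context
from langchain_openai import ChatOpenAI

app = Flask(__name__)
model = ChatOpenAI(model="gpt-4o-mini")

@app.route("/chat")
def chat():
    def generate():
        # Yield each chunk's text as soon as the model produces it.
        for chunk in model.stream("Tell me a joke about Flask"):
            yield chunk.content
    return Response(stream_with_context(generate()), mimetype="text/plain")
```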
You made retriever_chain but never used it; since it is never used, how are we establishing the connection between the model and the retriever?
Yes, I used the wrong variable in that project. Sorry.
@codingcrashcourses8533 No worries, good stuff, man.
Full-stack streaming, beautiful! One suggestion, mate: give me a few seconds at the end so I can click like on my TV, then you can suggest subscribing. Thank you!
Yeah, I need some kind of outro.