Dynamic Few-shot Prompting with Llama 3 on local Environment | Ollama | Langchain | SQL Agent

  • Published 1 Jul 2024
  • This video teaches you how to implement dynamic few-shot prompting with open-source LLMs like Llama 3 using LangChain in a local environment.
    In this tutorial, we will follow these steps:
    1. Import Llama 3: Begin by importing the Llama 3 model via Ollama.
    2. Fetch SQL Data: Connect to your SQL database and fetch the data you need. This involves establishing a connection to a SQLite database.
    3. Initialize Few-Shot Examples: Set up the few-shot learning approach by initializing a set of examples that will guide the model.
    4. Convert Examples to Embeddings: Transform the few-shot examples into embeddings so the most relevant ones can be selected for each query.
    5. Create Custom Tools: Develop custom tools tailored to your specific needs (here, tools related to the SQL database).
    6. Create Prompt: Design a prompt that will be used to interact with the model.
    7. Create an Agent with ReAct Logic: Develop an agent that incorporates ReAct (Reasoning and Acting) logic. This agent will use the prompt and the few-shot examples to perform tasks interactively.
    8. Agent Executor: Implement the agent executor, which manages the execution of tasks by the agent. This component handles the flow of information between the agent and the other parts of your system.
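    The core of the steps above is "dynamic" example selection: embedding the few-shot examples and picking the most relevant ones per question. The sketch below is a minimal, dependency-free illustration of that idea only. It uses a bag-of-words cosine similarity as a stand-in for a real embedding model, and the example questions and SQL queries are made-up placeholders; the video itself uses LangChain's example-selector and agent machinery, not this code.

```python
# Minimal sketch of dynamic few-shot example selection (standard library only).
# A bag-of-words cosine similarity stands in for a real embedding model,
# and the example questions/queries below are illustrative placeholders.
import math
import re
from collections import Counter

EXAMPLES = [
    {"input": "How many employees are there?",
     "query": "SELECT COUNT(*) FROM Employee;"},
    {"input": "List all customers from Germany.",
     "query": "SELECT * FROM Customer WHERE Country = 'Germany';"},
    {"input": "What is the total revenue per country?",
     "query": "SELECT BillingCountry, SUM(Total) FROM Invoice "
              "GROUP BY BillingCountry;"},
]

def embed(text):
    """Toy 'embedding': a term-frequency vector over lowercase words."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def select_examples(question, examples, k=2):
    """Dynamically pick the k examples most similar to the user question."""
    q = embed(question)
    ranked = sorted(examples,
                    key=lambda ex: cosine(q, embed(ex["input"])),
                    reverse=True)
    return ranked[:k]

def build_prompt(question, examples):
    """Assemble the few-shot prompt that would be sent to the model."""
    shots = "\n\n".join(f"Question: {ex['input']}\nSQL: {ex['query']}"
                        for ex in examples)
    return f"{shots}\n\nQuestion: {question}\nSQL:"

question = "Show all customers located in Germany"
chosen = select_examples(question, EXAMPLES)
print(build_prompt(question, chosen))
```

    In the tutorial's setup, this selection step runs per query before the prompt reaches the model, which is what makes the few-shot prompting dynamic rather than a fixed set of hard-coded examples.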
    Code Link : github.com/TheAILearner/Langc...
    Ollama Github - github.com/ollama/ollama
    SQL Agent with Llama 3 (With Ollama Installation in Local) - • Build an SQL Agent wit...
    #dynamicfewshotprompting #sqlagent #llama3 #langchain #ollama #customtools #customagent #fewshotprompting #sql #database #machinelearning #nlp
  • Science & Technology

Comments • 10

  • @GordonShamway1984
    @GordonShamway1984 25 days ago

    Very nicely explained. You helped me a lot, thank you!

  • @umeshtiwari9249
    @umeshtiwari9249 several months ago

    Thanks for such a nice tutorial on a complex topic.

  • @MeTuMaTHiCa
    @MeTuMaTHiCa several months ago +1

    It would be good if this worked with the cloud.

    • @MeTuMaTHiCa
      @MeTuMaTHiCa several months ago

      By the way, thanks for the good AI work.

  • @NillsBoher
    @NillsBoher several months ago +1

    Great! Thanks for sharing your knowledge! However, I want to ask: isn't the prompt too long for Llama 3's context window?

    • @theailearner1857
      @theailearner1857 several months ago +1

      Not at all. Llama 3 has a context length of 8,192 tokens, while the prompt shown in the video is only about 450 to 500 tokens.
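      A quick way to sanity-check a prompt against this budget is the common rule of thumb of roughly 4 characters per token for English text. This ratio is an approximation, not the exact Llama 3 tokenizer, and the repeated stand-in prompt below is illustrative only:

```python
# Rough prompt-size sanity check. The ~4 characters-per-token ratio is a
# common rule of thumb for English text, not an exact tokenizer count.
CONTEXT_WINDOW = 8192  # Llama 3's context length, as stated in the reply

def rough_token_count(text, chars_per_token=4):
    """Approximate token count from character length."""
    return max(1, len(text) // chars_per_token)

# Stand-in prompt: a few-shot block repeated 40 times for illustration.
prompt = "Question: How many employees are there?\nSQL: ...\n" * 40
used = rough_token_count(prompt)
print(f"~{used} of {CONTEXT_WINDOW} tokens ({used / CONTEXT_WINDOW:.1%})")
```

      For exact numbers you would run the model's own tokenizer over the prompt, but the estimate is enough to see that a few hundred tokens leaves most of the window free.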

  • @MScProject-u9n
    @MScProject-u9n 29 days ago +1

    How can I run it in Colab instead of a local environment?

    • @MScProject-u9n
      @MScProject-u9n 28 days ago +1

      Can you also provide us with the source code?

    • @theailearner1857
      @theailearner1857 27 days ago

      You can check out this video to run Ollama-based models on Google Colab, after which the dynamic few-shot prompting steps can be easily implemented.
      th-cam.com/video/XDvTt_TOewU/w-d-xo.htmlsi=RkjXX-jO3VSA08Em

    • @theailearner1857
      @theailearner1857 27 days ago

      Code Link : github.com/TheAILearner/Langchain-Agents/blob/main/Dynamic%20Few-shot%20Prompting%20with%20Llama%203.ipynb