SmolAgents: How to Build Your Own Agentic LLM Flows!

  • Published Jan 20, 2025

Comments • 5

  • @chokshiroshan • 11 days ago • +1

    cool video!

    • @SmartAlexAU • 11 days ago

      Glad you enjoyed it

  • @Cysops • 16 days ago

    Can these smolagents run on AnythingLLM?

    • @SmartAlexAU • 16 days ago • +1

      Essentially, yes: since AnythingLLM can run its own agentic script in its 'AI Agent' tab, it can potentially do similar tasks to this. SmolAgents basically aims to replace tools like AnythingLLM, but it is currently not very fleshed out. To understand this agentic flow better, you are better off digging into the smolagents GitHub repo to see how the agents are customised to run multi-step, prompt-driven tasks: github.com/huggingface/smolagents/blob/main/src/smolagents/agents.py
      That file is the most important one in the src for understanding the agentic behaviour of smolagents, and you can most likely replicate this kind of agentic flow using AnythingLLM's 'AI Agents'. AnythingLLM essentially does similar things with locally installed LLMs through Ollama, such as llama3.3 or qwen-coder, but needs some finessing to make it more customised.
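
      For anyone following along, here is a minimal sketch of what driving that multi-step loop from Python can look like. It assumes the smolagents API shown in the video (CodeAgent, LiteLLMModel); the model name, Ollama endpoint, and prompt are illustrative assumptions, not taken from the video.

      ```python
      # Minimal sketch: a smolagents multi-step run against a local Ollama model.
      # Assumes `pip install "smolagents[litellm]"` and an Ollama server running
      # on its default port; model name and prompt below are illustrative only.
      from smolagents import CodeAgent, LiteLLMModel

      # Route requests through LiteLLM to a locally served model, e.g. llama3.3 via Ollama.
      model = LiteLLMModel(
          model_id="ollama_chat/llama3.3",    # any local chat/coder model works here
          api_base="http://localhost:11434",  # default Ollama endpoint
      )

      # CodeAgent runs the multi-step loop defined in agents.py: it plans, writes
      # Python snippets as actions, executes them, and iterates until it has an answer.
      agent = CodeAgent(
          tools=[],        # add tools (e.g. a web search tool) to extend what it can do
          model=model,
          max_steps=5,     # cap the number of think/act iterations
      )

      result = agent.run("List three ways an agent differs from a single LLM call.")
      print(result)
      ```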

    • @Cysops • 12 days ago

      @SmartAlexAU thank you