Rivet: How To Use Local LLMs & ChatGPT At The Same Time (LM Studio tutorial)

  • Published on 15 Oct 2024
  • This tutorial explains how to connect LM Studio with Rivet so you can run local models on your own PC (e.g. Mistral 7B) while still using OpenAI models (e.g. GPT-4) at the same time, mixing and matching them as needed (see the sketch after the links below).
    Shoutout to "Egalitaristen" from the Rivet Discord community, who helped lots of people set this up.
    Links:
    Get Rivet: rivet.ironclad...
    Get LM Studio: lmstudio.ai/
    Follow me on LinkedIn: / tim-k%c3%b6hler-ai-mad...
    If you enjoy my content, consider buying me a coffee: www.buymeacoff...
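
    A minimal sketch of the mix-and-match idea (not from the video itself; Rivet configures this through its chat nodes' settings rather than code): LM Studio's local server exposes an OpenAI-compatible /v1/chat/completions endpoint (on http://localhost:1234 by default), so the same request shape works against the local model and the OpenAI API; only the base URL, API key, and model name change. The model names and the port used below are assumptions based on LM Studio's defaults.

      // Sketch: one chat() helper that talks either to LM Studio's local server
      // or to the OpenAI API, depending on the chosen target. Runs on Node 18+.
      type Target = "local" | "openai";

      async function chat(target: Target, userMessage: string): Promise<string> {
        const baseUrl =
          target === "local" ? "http://localhost:1234/v1" : "https://api.openai.com/v1";
        const apiKey =
          target === "local" ? "lm-studio" : process.env.OPENAI_API_KEY ?? "";
        // Placeholder model names: whatever is loaded in LM Studio vs. an OpenAI model.
        const model = target === "local" ? "mistral-7b-instruct" : "gpt-4";

        const res = await fetch(`${baseUrl}/chat/completions`, {
          method: "POST",
          headers: {
            "Content-Type": "application/json",
            Authorization: `Bearer ${apiKey}`, // LM Studio ignores the key
          },
          body: JSON.stringify({
            model,
            messages: [{ role: "user", content: userMessage }],
          }),
        });

        const data = await res.json();
        return data.choices[0].message.content;
      }

      // Mix & match: a local model drafts, gpt-4 polishes.
      chat("local", "Summarize what Rivet is in one sentence.")
        .then((draft) => chat("openai", `Improve this summary: ${draft}`))
        .then(console.log);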

Comments • 8

  • @95jack44 5 months ago +2

    For people having issues: click the three-dots button in the top right corner, then switch "Executor" to "Node". Now it works...

    • @UTJK. 4 months ago

      Thanks. You saved me a headache.

  • @holgergelhausen8616 9 months ago

    Wonderful tutorial! I installed LM Studio, put the model on the GPU and in RAM, and the answers come very fast. Thanks!

  • @holgergelhausen8616 9 months ago

    Very good, I will write a new blog post about Rivet and your channel! Thank you very much for sharing.

  • @EagerEggplant 7 months ago

    Thank you for this. The YT algo read my mind.

    • @AIMadeApproachable 7 months ago

      Just as an update: maybe in the next 1-2 days you will no longer need my subgraphs to make the Ollama chat node work with any model. I just opened a pull request that adds a "promptFormat": "auto" option so Ollama handles the prompt formatting itself. It just needs to get approved :)
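
      Purely as an illustration of what that might look like in a node's settings (the surrounding field names are assumptions; only the "promptFormat": "auto" value comes from the comment above):

        // Hypothetical sketch of an Ollama chat node's data once such an option exists.
        // Only promptFormat is taken from the comment; every other field is assumed.
        const ollamaChatNodeData = {
          model: "mistral-7b",   // whatever model has been pulled into Ollama
          promptFormat: "auto",  // let Ollama apply the model's own prompt template
        };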