👍 Please like if you found this video helpful, and subscribe to stay updated with my latest tutorials. 🔔
❤ You can support this channel by buying me a ☕: buymeacoffee.com/codesfinance
Vincent, I must say that your TH-cam channel is truly exceptional and stands out among the rest. I just hit the subscribe button and I am eagerly looking forward to seeing more of your amazing content.
Thanks for the kind words, it's always nice to hear!
Hello Vincent, I hope you're well. I noticed that Msty has updated some software features and added RAG to their LLMs. Could you help us understand this? You're always my go-to for these things.
Already working on it! It will likely be more than one video given all the new features.
Keep going, you're the best!
Hi, thank you for the video. Is there any way to load a separately downloaded GGUF model into Msty? Just specifying the models folder doesn't make them show up in the list.
That I don't know. You should ask Ashok (the Msty developer) on their Discord channel; he is very responsive to user questions: discord.gg/2QBw6XxkCC
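I can't speak to how Msty scans its models folder, but since Msty can use Ollama as a backend, one common workaround is to register the GGUF file with Ollama first via a Modelfile. This is a sketch, not an Msty-specific feature — the file path and model name below are hypothetical:

```shell
# Sketch: import a separately downloaded GGUF into Ollama.
# "./models/my-model.gguf" and "my-model" are placeholder names.
cat > Modelfile <<'EOF'
FROM ./models/my-model.gguf
EOF

ollama create my-model -f Modelfile   # registers the GGUF with Ollama
ollama list                           # the new model should now be listed
```

Once the model shows up in `ollama list`, it should also appear in any frontend (Msty included) that talks to that Ollama instance.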
@VincentCodesFinance Thanks for the response
Absolutely, very versatile app.
Can you make a video about how to use a local LLM with MetaGPT or TaskWeaver?
Thanks for the tips. I've never used them, but I'll look into them. TaskWeaver seems like something that would be useful for my research work.
What is the configuration of your PC?
I'm using a MacBook Pro with an M3 Max chip and 64 GB of RAM. If you're using a PC, you'll want a fast GPU with a decent amount of VRAM; otherwise Ollama will fall back to the CPU (much slower).
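If you want to confirm whether Ollama is actually using your GPU rather than silently falling back to the CPU, `ollama ps` reports this for loaded models — a quick check, assuming Ollama is installed and a model has been run recently:

```shell
# Sketch: check GPU offload for currently loaded models.
# Run this while a model is loaded (e.g. right after a chat in Msty).
ollama ps
# The PROCESSOR column reads "100% GPU" when the model fits entirely
# in GPU memory, or shows a CPU/GPU split (much slower) when it doesn't.
```

A partial split usually means the model is too large for your VRAM; a smaller model or a more aggressive quantization (e.g. a Q4 GGUF) typically fixes it.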