How To Run ANY LLM Using Cloud GPU and TextGen WebUI Easily!

  • Published Nov 9, 2024

Comments • 9

  • @intheworldofai  8 months ago  +1

    💓Thank you so much for watching, guys! I would highly appreciate it if you subscribe (turn on the notification bell), like, and comment on what else you want to see!
    📅Book a 1-on-1 consulting call with me: calendly.com/worldzofai/ai-consulting-call-1
    🔥Become a Patron (private Discord): patreon.com/WorldofAi
    🚨Subscribe to my NEW channel! www.youtube.com/@worldzofcrypto
    🧠Follow me on Twitter: twitter.com/intheworldofai
    Love y'all and have an amazing day, fellas. ☕ To help and support me, buy a coffee or donate to support the channel: ko-fi.com/worldofai - Thank you so much, guys! Love y'all!

  • @intheworldofai  8 months ago

    Hyperwrite: Create and Deploy Personalized AI Agents on Your Operating System! - th-cam.com/video/VLBBeqhZmjc/w-d-xo.htmlsi=B0GJER43rFZJG1yr

  • @intheworldofai  8 months ago

    Mistral Large: NEW Mistral Model Beats GPT-3.5 and Llama2-70B on EVERY Benchmark!: th-cam.com/video/EgYWzwhHc_k/w-d-xo.html

  • @nhtna4706  7 months ago

    Any monthly affordable packages?

  • @DoctorMandible  6 months ago

    How can an A100 be "overkill" but also "more efficient"?

  • @sanjithsivendra2991  8 months ago

    Are there any cloud platforms where I can run LLMs for free?

    • @antdx316  8 months ago

      You don't really need it. ChatGPT, Claude, Gemini, Hugging Face Chat, etc. are all good.
      We would only need to do this if those servers were destroyed or switched off, during WW3 or whatever.
      This would be a good backup solution, but very costly.
      You can just run LLMs on your PC and connect in via port forwarding.

    • @sd5853  8 months ago

      You can use Google Cloud for free, but the GPU at your disposal is more on the lower tier.

    • @micropatato679  6 months ago

      You can use LM Studio or Ollama to run LLMs on your local computer, offline.
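As a minimal sketch of the local route mentioned above: Ollama serves a REST API on localhost port 11434 by default, and you can query it with nothing but the Python standard library. The model name `llama3` here is just an example and assumes you have already run `ollama pull llama3` and have the Ollama server running.

```python
import json
import urllib.request

# Ollama's default local API endpoint (assumes a locally running Ollama server)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks for a single JSON response instead of a stream of chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the model's reply."""
    body = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires `ollama pull llama3` and a running Ollama server.
    print(generate("llama3", "Say hello in one word."))
```

LM Studio offers a similar local HTTP server (OpenAI-compatible), so the same stdlib approach works there with a different URL and payload shape.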