Local LLM with Ollama, LLAMA3 and LM Studio // Private AI Server

  • Published Jul 23, 2024
  • Forget using public generative AI resources like ChatGPT. Instead, use local large language models (LLMs) to create a private AI server. In this video, we step through setting up a private AI server in Windows Subsystem for Linux (WSL), along with a few hacks needed to make the API accessible to the open-source frontend called Open WebUI.
    Written post on the steps: www.virtualizationhowto.com/2...
    Ollama download: ollama.com/download
    LM Studio: lmstudio.ai/
    ★ Subscribe to the channel: / @virtualizationhowto
    ★ My blog: www.virtualizationhowto.com
    ★ Twitter: / vspinmaster
    ★ LinkedIn: / brandon-lee-vht
    ★ Github: github.com/brandonleegit
    ★ Facebook: / 100092747277326
    ★ Discord: / discord
    ★ Pinterest: / brandonleevht
    Introduction - 0:00
    What are Large Language Models (LLMs) - 0:45
    Advantages of hosting LLMs locally - 1:30
    Hardware needed to run LLMs locally - 2:30
    Setting up Ollama - 3:18
    Looking at the Linux script for Ollama - 3:40
    Downloading and running popular LLM models - 4:18
    Command to download LLAMA3 language model - 4:31
    Initiating a chat session from the WSL terminal - 5:16
    Looking at Hugging Face open source models - 5:46
    Open WebUI web frontend for private AI servers - 6:20
    Looking at the Docker run command for Open WebUI - 6:50
    Accessing, signing up, and tweaking settings in Open WebUI - 7:22
    Reviewing the architecture of the private AI solution - 7:38
    Talking about a hack for WSL to allow traffic from outside WSL to connect - 7:54
    Looking at the netsh command for the port proxy - 8:24
    Chatting with the LLM using Open WebUI - 9:00
    Writing an Ansible Playbook - 9:20
    PowerCLI scripts - 9:29
    Overview of LM Studio - 9:42
    Business use cases for local LLMs - 10:16
    Wrapping up and final thoughts - 10:51
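    For reference, the core commands the chapters above walk through likely follow the standard ones from the Ollama and Open WebUI documentation; this is a sketch, and the `<wsl-ip>` value is a placeholder you would substitute with your own WSL address:

    ```shell
    # Install Ollama inside WSL using the official install script
    curl -fsSL https://ollama.com/install.sh | sh

    # Download the LLAMA3 model and start a chat session in the WSL terminal
    ollama run llama3

    # Run the Open WebUI frontend in Docker, pointed at the local Ollama API
    docker run -d -p 3000:8080 \
      --add-host=host.docker.internal:host-gateway \
      -v open-webui:/app/backend/data \
      --name open-webui --restart always \
      ghcr.io/open-webui/open-webui:main

    # WSL hack: forward traffic arriving at the Windows host into WSL.
    # Run from an elevated prompt; <wsl-ip> is the address reported
    # by `wsl hostname -I`.
    netsh interface portproxy add v4tov4 listenaddress=0.0.0.0 listenport=3000 connectaddress=<wsl-ip> connectport=3000
    ```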
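    Once Ollama is listening on its default port (11434), any frontend or script can reach it over HTTP, which is how Open WebUI connects. A minimal sketch of calling the Ollama REST API from Python, using only the standard library (the model name and prompt are illustrative):

    ```python
    import json
    import urllib.request

    OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default API endpoint

    def build_request(model: str, prompt: str) -> urllib.request.Request:
        """Build a non-streaming generate request for the Ollama REST API."""
        payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
        return urllib.request.Request(
            OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
        )

    def generate(model: str, prompt: str) -> str:
        """Send a prompt to a local Ollama server and return the response text."""
        with urllib.request.urlopen(build_request(model, prompt)) as resp:
            return json.loads(resp.read())["response"]

    # Usage (requires a running Ollama server with the model already pulled):
    # print(generate("llama3", "Write an Ansible playbook that installs nginx."))
    ```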
  • Howto & Style

Comments • 14

  • @kenmurphy4259 • months ago

    Thanks Brandon, nice review of what’s out there for local LLMs

  • @steheverod • months ago

    It's a great idea, thanks Brandon. I will test it on my homelab.

  • @fermatdad • 29 days ago

    Thank you for the helpful tutorial.

  • @romayojr • months ago • +1

    This is awesome and I can't wait to try it. Is there a mobile app for Open WebUI?

    • @jjaard • 22 days ago

      I suspect that technically it can easily run in any browser.

  • @trucpham9772 • months ago

    How do I run llama3 with Ollama on macOS? I want to expose localhost to use nextchatgpt. Can you share the commands for this solution?

  • @kderectorful • 8 days ago

    I am accessing the AI server via a Mac, and my guess is that the netsh command applies to the Windows workstation hosting the server. Is there a similar command that would need to be run, or if I do this on my Linux server via Firefox, will I still have the same issue? I cannot seem to get llama3:latest installed for Open WebUI. Any insight would be greatly appreciated, as this was the most concise video I have seen on the topic.

  • @mjes911 • 5 days ago

    How many concurrent users can this support for business cases?

  • @nobody-P • months ago

    😮I'm gonna try this now

  • @user-nh4kg2zj2q • months ago

    Nobody explained how to install Ollama and run it the proper way; it should be a list of steps. Is it important to install Docker before Ollama? I tried to install Ollama alone and it didn't install completely!! I don't know why.

    • @kironlau • months ago • +1

      1. You should mention what your OS is.
      2. Read the official documentation.
      3. If you run on Windows, just download the exe/msi file and install with one click (and click yes...).

  • @SyamsQbattar • 10 days ago

    Is LM Studio better than Ollama?

    • @camsand6109 • 4 days ago

      No, but it's a good option.

    • @SyamsQbattar • 4 days ago

      @camsand6109 Then Ollama is better?