Let's Update Ollama Everywhere

  • Published Nov 15, 2024

Comments • 20

  • @anjei929 • 1 day ago

    epic ending, Matt. Thanks, amazing job you're doing over here/there.

  • @FrankSchwarzfree • 3 days ago

    Matt, this video was incredibly helpful! I was one of those poor souls lost in the Discord trying to figure out updates. You just saved me (and probably a lot of others) a ton of time and frustration. The step-by-step for each platform was clear and concise - even *I* could understand it! Keep up the amazing work, and looking forward to more Ollama wisdom in your course. Subscribed! 😄 👍

  • @gemini1q • 2 days ago

    First introduction, presentation quality is like Tim Cook, video content is even better...

  • @brinkoo7 • 3 days ago • +1

    Idk why I didn’t think about just using docker 😂 thank you 🙏

    • @saurav1096 • 2 days ago

      using ollama in docker makes it slowww
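
    A note on the Docker route: updating is usually just pulling a newer image and recreating the container. The commands below follow the commonly used setup (named volume "ollama", port 11434, GPU flag only if you have an NVIDIA card with the container toolkit); adjust them to your own install. The models live in the volume, so recreating the container does not re-download them.

        docker pull ollama/ollama
        docker stop ollama && docker rm ollama
        docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama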

  • @Francotujk • 1 day ago

    hi Matt!
    Some time ago I asked you about integrating LLMs directly into an app - for example, how to include an LLM inside an Electron (desktop) app. You helped me a lot by pointing me to a Discord post on how to use the binaries, the way Llamacpp does.
    However, I haven't achieved the performance that Ollama has. Do you think it's possible to use Ollama as something like a "package/container" and include it directly in an Electron app? So instead of rebuilding what Ollama does, just use it directly.
    Thanks for all the content you post on your channel, it's really helpful!
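
    One pattern that may fit the Electron case: ship the ollama binary with the app, start "ollama serve" from the main process, and talk to its local HTTP API instead of re-implementing the runtime. A minimal TypeScript sketch; the binary path and model name are placeholders, and this is not an officially supported embedding API.

        // Electron main process; Node 18+ provides global fetch.
        import { spawn } from "node:child_process";
        import * as path from "node:path";

        // Assumption: the ollama binary is bundled under the app's resources folder.
        const ollamaBin = path.join(process.resourcesPath, "bin", "ollama");

        // Start the local server; by default it listens on 127.0.0.1:11434.
        const server = spawn(ollamaBin, ["serve"], { stdio: "ignore" });

        // Ask the server for a completion over its HTTP API.
        async function generate(prompt: string): Promise<string> {
          const res = await fetch("http://127.0.0.1:11434/api/generate", {
            method: "POST",
            headers: { "Content-Type": "application/json" },
            body: JSON.stringify({ model: "llama3.2", prompt, stream: false }),
          });
          const data = (await res.json()) as { response: string };
          return data.response;
        }

        // Stop the server when the app exits.
        process.on("exit", () => server.kill());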

  • @jamesspo • 3 days ago

    Very clear and helpful content. Thank you.

  • @enriquebruzual1702 • 2 days ago

    I have it installed on Win 11, and I really like how simple the app is. I have had some setbacks with the automatic updates, but so far they have always been fixable.

  • @emil8367 • 2 days ago

    thanks Matt, is there a way to do the update without losing the ollama.service configuration each time, e.g. on Ubuntu? (of course I have a backup and paste my old config back in, but I'd really like to keep what is inside ollama.service 🙂)
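
    On Ubuntu the install script rewrites /etc/systemd/system/ollama.service, so edits made directly to that unit file are lost on every update. As far as I know, settings kept in a systemd drop-in are left alone, so one approach (the environment variables below are just examples) is:

        sudo systemctl edit ollama.service

    which opens /etc/systemd/system/ollama.service.d/override.conf, where the custom settings live:

        [Service]
        Environment="OLLAMA_HOST=0.0.0.0"
        Environment="OLLAMA_MODELS=/data/models"

    then reload and restart:

        sudo systemctl daemon-reload
        sudo systemctl restart ollama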

  • @FahtheGamer • 2 days ago • +1

    Hi, I am running Ollama on Android and it seems to run fine for models of 8B and smaller.

  • @LawrenceOrsini • 1 day ago

    @technovangelist why not use WSL? You didn't give a reason why it is faster.

    • @technovangelist • 1 day ago • +2

      Using WSL means you are running Ollama in an Ubuntu container in the WSL VM on top of Windows. Native is 20-30% faster for most folks.

  • @diggajupadhyay • 3 days ago

    Thanks Matt ❤

  • @TheTechnicalMystique • 2 days ago

    Do you happen to know how to retrieve or change your WebUI password on Linux? I've been trying, and asked on Discord, but to no avail.
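
    Assuming the "WebUI" here is Open WebUI on a local (non-Docker) install: a commonly shared workaround is to overwrite the stored bcrypt hash directly in its SQLite database. The database path and account email below are placeholders; adjust them to your setup.

        # generate a bcrypt hash for the new password (htpasswd is in apache2-utils)
        htpasswd -bnBC 10 "" "new-password" | tr -d ':\n'

        # write the hash into the auth table of webui.db
        sqlite3 backend/data/webui.db \
          "UPDATE auth SET password='<paste-hash-here>' WHERE email='you@example.com';"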

  • @JanezNovak-u6y • 3 days ago

    How do you configure Ollama for Windows to use a proxy? My PC is behind a corporate proxy, but Ollama for Windows is not using it.
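
    For model pulls, Ollama reads the standard HTTPS_PROXY environment variable. On Windows one way to set it (the proxy URL below is a placeholder) is a user-level environment variable, after which the Ollama tray app needs to be quit and restarted:

        # PowerShell
        [Environment]::SetEnvironmentVariable("HTTPS_PROXY", "http://proxy.corp.example:8080", "User")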

  • @brinkoo7 • 3 days ago

    Funny this pops up just as I'm trying to figure out how to build the newest version on NixOS 😂😝
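
    If building it yourself gets tedious: recent nixpkgs releases carry an ollama package and a services.ollama module, so on a new enough channel enabling it can be as simple as the sketch below (the acceleration line is optional and hardware-dependent):

        # configuration.nix
        services.ollama = {
          enable = true;
          # acceleration = "cuda";  # or "rocm"
        };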

  • @AliAlias • 2 days ago

    How do you make Ollama auto-download and update?
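
    The macOS and Windows apps check for updates on their own. On Linux there is no built-in auto-update, but re-running the official install script upgrades in place, so one low-tech option (assuming a systemd-based install) is a scheduled job such as:

        # /etc/cron.d/ollama-update - re-run the installer every Sunday at 04:00
        0 4 * * 0  root  curl -fsSL https://ollama.com/install.sh | sh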

  • @HaydonRyan • 2 days ago

    I do wish the ollama team would build and distribute via the standard package managers. That said I understand it’s not the core work.
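
    Some packaging does exist outside the official installers; for example, Homebrew carries an ollama formula on macOS, so if you installed that way the update rides along with everything else:

        brew install ollama
        brew upgrade ollama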

  • @azharalibhutto1209 • 3 days ago

    ❤❤❤

  • @paulomtts • 3 days ago

    First!