Level Up Your Typescript Skills: Adding Ollama To Your Apps!

  • Published on Dec 27, 2024

Comments •

  • @userou-ig1ze 9 months ago +10

    This channel is top of my list, lol. I'm addicted to the casual-seeming yet perfect presentations (besides the interesting content and impressive successive iterations of the code...). The flow is clear, concise, and intuitive, and it's so calm that I find myself not always using 2x speed.

    • @technovangelist 9 months ago +1

      Thanks so much. Nice to know it’s resonating with folks

  • @DevynDowler-Lewis 9 months ago +7

    This is remarkably well presented

  • @yoemasey 9 months ago

    Thank you for explaining the details so carefully and in such an easy-to-understand way. Can't wait to try out Ollama :)

  • @brogeby 8 months ago

    This is exactly the channel I was searching for. Your videos are very well presented, with great examples, and you talk about very relevant topics. Thanks!

  • @jemilsuleimanov209 3 months ago

    Thanks a lot, Matt. I was looking for exactly this: how to set up a chat-like conversation with an LLM via Ollama in JS. Your video helped.

  • @lucioussmoothy 9 months ago +3

    Nice overview. I'd love to see a video on how we can use JS to train and fine-tune an Ollama model (if you haven't posted one yet).

    • @moonchildeverlasting9904 9 months ago

      This is great, Matt. You probably forgot that someone made a demo showing Mario Kart in JS. I think you get a bit further with knowledge than with AI, but we all think differently, right?

  • @ikibkilam8383 3 months ago

    Very informative! From one ex-Microsoftie (who also worked there three decades ago) to another.

  • @photorealm 9 months ago

    Best video I have seen on this subject by far. Great usable information.

  • @grigorikochanov3244 9 months ago

    Sorry, at 7:00 you say that in the chat endpoint the parameters are replaced with messages, but you show a code sample with the /api/generate endpoint. The chat endpoint for Ollama is /api/chat. Is it a typo?

    • @technovangelist 9 months ago

      D'oh. Good catch. Yes, a typo. It's right in the GitHub repo.
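For readers following the exchange above, here is a minimal sketch of the difference in question, assuming a local Ollama server on its default port (http://localhost:11434) and a model such as llama3 already pulled. The endpoint and field names follow Ollama's REST API: /api/generate takes a single prompt string, while /api/chat takes an array of role/content messages.

```typescript
// Sketch: the same question sent to both endpoints. Assumes Ollama is
// running locally and the "llama3" model has been pulled.
const OLLAMA = "http://localhost:11434";

async function compareEndpoints() {
  // /api/generate takes a single prompt string.
  const gen = await fetch(`${OLLAMA}/api/generate`, {
    method: "POST",
    body: JSON.stringify({
      model: "llama3",
      prompt: "Why is the sky blue?",
      stream: false,
    }),
  });
  console.log((await gen.json()).response);

  // /api/chat replaces prompt with a messages array, which is what
  // enables multi-turn conversations.
  const chat = await fetch(`${OLLAMA}/api/chat`, {
    method: "POST",
    body: JSON.stringify({
      model: "llama3",
      messages: [{ role: "user", content: "Why is the sky blue?" }],
      stream: false,
    }),
  });
  console.log((await chat.json()).message.content);
}

compareEndpoints();
```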

  • @jacknimble8331 9 months ago +4

    Thank you very much for these helpful videos!
    If you have the bandwidth, could you go over incorporating embeddings for RAG in this app?

    • @technovangelist 9 months ago +5

      I'll have that in a video coming next week
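In the meantime, here is a rough sketch of the embedding step such a RAG setup would need, assuming the /api/embeddings endpoint and an embedding model such as nomic-embed-text pulled locally. Chunking, persistence, and the final answer-generation step are left out; a real app would store these vectors somewhere.

```typescript
// Sketch: embed text chunks locally so they can be ranked against a question.
// Assumes the "nomic-embed-text" embedding model has been pulled.
async function embed(text: string): Promise<number[]> {
  const res = await fetch("http://localhost:11434/api/embeddings", {
    method: "POST",
    body: JSON.stringify({ model: "nomic-embed-text", prompt: text }),
  });
  const { embedding } = await res.json();
  return embedding;
}

// Cosine similarity is enough to rank a handful of chunks without a vector DB.
function cosine(a: number[], b: number[]): number {
  const dot = a.reduce((sum, v, i) => sum + v * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
}

// Example: score two stored chunks against a question.
async function demo() {
  const [q, c1, c2] = await Promise.all(
    [
      "How do I call the chat endpoint?",
      "Use /api/chat with a messages array.",
      "Llamas are mammals.",
    ].map(embed),
  );
  console.log(cosine(q, c1), cosine(q, c2));
}

demo();
```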

  • @user-he8qc4mr4i 9 months ago

    Thank you for sharing, Matt! I did not realize that JS was an option :-). Any plans to add a video showing Ollama + LangChain?

    • @technovangelist 9 months ago

      Yes. But for most projects it doesn’t add anything. I have to think of a good use case to leverage it.
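For anyone who wants to try the combination anyway, a hedged sketch of the wiring is below. It assumes the @langchain/ollama package and its ChatOllama class, which talk to the same local server as the plain client; check the package docs for the exact option names, and note the model name is just a placeholder.

```typescript
// A sketch only: assumes the @langchain/ollama package is installed and a
// local Ollama server is running with the "llama3" model pulled.
import { ChatOllama } from "@langchain/ollama";

const model = new ChatOllama({ model: "llama3", temperature: 0 });

async function run() {
  // invoke() is LangChain's standard entry point; the result is a message
  // object whose content field holds the reply.
  const reply = await model.invoke("Summarize what Ollama does in one sentence.");
  console.log(reply.content);
}

run();
```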

  • @MuhammadAzhar-eq3fi 9 months ago +1

    Thanks a lot, Sir. ❤

  • @adityasingh017 5 months ago

    Hey, does anybody know how I could use this library in my browser?

    • @technovangelist 5 months ago

      What do you want to do? What have you tried? Best to ask on the discord
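As a starting point, the REST API can also be called directly from browser code with fetch. The sketch below assumes Ollama is running on the same machine and that the page's origin is allowed by the server (browsers enforce CORS; non-local origins generally need to be added via the OLLAMA_ORIGINS environment variable). The element id is a placeholder.

```typescript
// Browser-side sketch: plain fetch against the local REST API. Assumes the
// page's origin is permitted (set OLLAMA_ORIGINS if it is not a local one).
async function askFromBrowser(question: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    body: JSON.stringify({
      model: "llama3",
      messages: [{ role: "user", content: question }],
      stream: false,
    }),
  });
  const data = await res.json();
  return data.message.content; // the assistant's reply text
}

// Example: wire it to a (hypothetical) button with id="ask".
document.querySelector("#ask")?.addEventListener("click", async () => {
  console.log(await askFromBrowser("Why is the sky blue?"));
});
```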

  • @CookerSingh 9 months ago

    Which models currently support function calling, and is there a function calling proxy available for any of these models, or does Ollama provide one?

    • @technovangelist 9 months ago +1

      All of them. Well except the embedding models. But it’s not a feature the model itself needs to support.
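One way to read that reply: at the time, "function calling" with Ollama was a prompting pattern rather than a dedicated per-model API, so any instruct model can do it. The sketch below asks the model to emit a JSON description of the call and parses it; the tool name and argument shape are hypothetical, and format: "json" is a request option that constrains the reply to valid JSON.

```typescript
// Sketch of prompt-based "function calling": the model is asked to emit JSON
// naming a tool, and the app dispatches it. The tool shape is hypothetical.
type ToolCall = { name: string; arguments: Record<string, string> };

async function pickTool(question: string): Promise<ToolCall> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    body: JSON.stringify({
      model: "llama3",
      format: "json", // constrain the reply to valid JSON
      stream: false,
      messages: [
        {
          role: "system",
          content:
            'Answer only with JSON like {"name": "getWeather", "arguments": {"city": "..."}}.',
        },
        { role: "user", content: question },
      ],
    }),
  });
  const data = await res.json();
  return JSON.parse(data.message.content) as ToolCall;
}

// Example: the caller inspects the name and runs the matching local function.
pickTool("What's the weather in Lisbon?").then((call) => console.log(call));
```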

  • @BoominGame 9 months ago

    Great video as usual, but did you say we can infer with fine-tuned models from ollama?

    • @technovangelist 9 months ago

      Yes, absolutely. Ollama doesn’t do the fine tuning but it can use the adapters.

    • @BoominGame 9 months ago

      @technovangelist How do you load them, with 'custom models'?

    • @technovangelist 9 months ago +1

      The docs show how: create a Modelfile and use the ADAPTER instruction (see the sketch after this thread).

    • @BoominGame 9 months ago

      @technovangelist OK, thanks!
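A sketch of the Modelfile approach mentioned above, following the ADAPTER instruction described in the Ollama docs; the base model name and the adapter path are placeholders for whatever your fine-tuning run produced.

```
# Modelfile (placeholder names throughout)
FROM llama3
ADAPTER ./my-lora-adapter.gguf

# Then, from the same directory:
#   ollama create my-tuned-model -f Modelfile
#   ollama run my-tuned-model
```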

  • @Pdizzle-ic5sk 9 months ago

    I downloaded the web UI Docker container (the ChatGPT-like interface) about 3 months ago.
    Everything worked great. Downloading models was super convenient and easy.
    I tried to download the latest image on a new machine, and the interface got a lot more complex. I don't even know how to download a model anymore.
    Any chance you can make a video on the web UI Docker container?
    Easy Docker Compose startup, downloading the different models, and prompting them?
    This tutorial was great!
    Thanks!

    • @technovangelist 9 months ago

      Personally I’m not a big fan of most of the web interfaces I have seen. But I should try again.

  • @mshonle 9 months ago

    This is random and only tangentially related, but: do you know how to do that thing where your browser sees the localhost as usernames dot machine and there’s also some local CA cert configuration so you can test using https? (Random, yes, but I comment because I care.)

    • @technovangelist 9 months ago

      Not sure, but I think you are talking about wildcard certs. You can do this with certs from Let's Encrypt using Traefik. TechnoTim did a video about this a couple of years ago, I think, as did Christian L-something; last name starts with L. Lempa? There are other alternatives, I think, with nginx instead of Traefik.

  • @AINMEisONE 9 months ago

    Hi Matt, here are some questions: 1. I have Ollama running but it will not see my eGPU, an RTX 4090. I use LM Studio, but it's always full of bugs and creates strange replies or does not even load the model. When it does work, it sees the CUDA GPU, uses 100% of it, and wow, it runs the 70b model perfectly... 2. I also want to be able to move the Ollama models I downloaded off the Intel MacBook Pro so I do not need to re-download them again and again. Can you make a video for this? You know that most apps now are not on Intel unless you do Boot Camp Windows; then it runs, and the best thing is the RTX 4090 eGPU works incredibly well! Help, please.

    • @technovangelist 9 months ago

      I don't have a good answer for you. I don't think the team has put any effort into supporting Intel Macs; Apple silicon is what inspired us to create Ollama when we were first building it. Supporting an eGPU may be a bit further out. If you wanted to load up Linux on there, it may have a chance.

    • @AINMEisONE 9 months ago

      @technovangelist I have an M2 Max with 96 GB RAM; the Intel machine with the RTX 4090 is faster, by a lot, with LM Studio for now, but there are too many bugs in their stuff. It has gotten a lot better, but it's still not as good as Ollama; big models, 34b on up, do not work at all: they get stuck or give very low-level responses. I may send some other questions in the meantime to fix some things.

    • @AINMEisONE 9 months ago

      @technovangelist OK, which Linux version should I install? I tried Ubuntu 20.04/22.04 and it said it did not support Ollama. I used the command from the Ollama site and it never worked. Thanks!

    • @technovangelist 9 months ago

      Asking in the discord will have better results. I know nothing about that.

  • @joshkz 6 months ago

    The link isn't working, but I was able to find it. Hopefully you see this comment so the link can be fixed.

    • @technovangelist 6 months ago

      Interesting. I think I fixed it, though.

  • @florentflote 9 months ago +1

  • @LucianoTonet 9 months ago

    I'm learning so much from this channel. @technovangelist, I think it would be better if you added some syntax highlighting to your code. Thanks.

  • @RickySupriyadi 9 months ago

    Matt Williams Channel just got EXP! 🆙
    💫 +100 EXP
    💫 +100 EXP
    💫 +100 EXP
    💫 +100 EXP
    💫 +100 EXP
    LVL 2 (100/1000) EXP
    Matt Williams Unlocked Titles:
    👐 **The Caring One** Lv. 1 (Cares for all types of audiences, taking care of newbies)
    🧑‍🏫 **The Guru** Lv. 1 (Makes complex topics easy to understand)
    My likes for this channel have skyrocketed!