Level Up Your TypeScript Skills: Adding Ollama to Your Apps!

  • Published Mar 27, 2024
  • The JavaScript library for Ollama makes it so much easier to build cool applications with AI. This video will get you up to speed on everything you need to know.
    Find the code here: github.com/technovangelist — open videoprojects, then find the folder for this video.
    Be sure to sign up to my monthly newsletter at technovangelist.com/newsletter
    And if you're interested in supporting me, sign up for my Patreon at patreon.com/technovangelist
  • Science & Technology
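For context, the library discussed in the video is the `ollama` npm package. A minimal sketch of what a chat request boils down to (the model name is only an example; the live call, shown in comments, assumes `npm install ollama` and a local Ollama server on the default port):

```typescript
// Sketch of a chat request for the ollama npm package.
// Live version (needs `npm install ollama` and a running local Ollama server):
//   import ollama from 'ollama'
//   const response = await ollama.chat(request)
//   console.log(response.message.content)
const request = {
  model: 'llama2', // example model name; substitute any model you have pulled
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
}
console.log(request.model, request.messages[0].role)
```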

Comments • 42

  • @userou-ig1ze
    @userou-ig1ze 2 months ago +6

    This channel is top of my list lol, I'm addicted to the casual-seeming yet perfect presentations (besides the interesting content and impressive successive iterations of code...). It has such a clear, concise, intuitive flow and is so calm that I find myself not always using 2x speed

    • @technovangelist
      @technovangelist 2 months ago +1

      Thanks so much. Nice to know it’s resonating with folks

  • @brogeby
    @brogeby a month ago

    This is exactly the channel I was searching for. Your presentations are very well done, you give great examples, and you talk about very relevant topics. Thanks!

  • @yoemasey
    @yoemasey 2 months ago

    Thank you for explaining the details so carefully and in a way that's easy to understand. Can't wait to try out Ollama :)

  • @DevynDowler-Lewis
    @DevynDowler-Lewis 2 months ago +6

    This is remarkably well presented

  • @photorealm
    @photorealm 2 months ago

    Best video I have seen on this subject by far. Great usable information.

  • @lucioussmoothy
    @lucioussmoothy 2 months ago +3

    Nice overview - I'd love to see a video on how we can use JS to train and fine-tune an Ollama model (if you haven't posted one yet)

    • @moonchildeverlasting9904
      @moonchildeverlasting9904 2 months ago

      This is great, Matt. You probably forgot that someone made a demo showing Mario Kart in JS. I think you get a bit further with knowledge than with AI, but we all think differently, right?

  • @MuhammadAzhar-eq3fi
    @MuhammadAzhar-eq3fi 2 months ago +1

    Thanks a lot, Sir. ❤

  • @jacknimble8331
    @jacknimble8331 2 months ago +4

    Thank you very much for these helpful videos!
    If you have the bandwidth, could you go over incorporating embeddings for RAG in this app?

    • @technovangelist
      @technovangelist 2 months ago +5

      I'll have that in a video coming next week

  • @DC-xt1ry
    @DC-xt1ry 2 months ago

    Thank you for sharing, Matt! I did not realize that JS was an option :-). Any plans to add a video showing Ollama + LangChain?

    • @technovangelist
      @technovangelist 2 months ago

      Yes. But for most projects it doesn’t add anything. I have to think of a good use case to leverage it.

  • @Pdizzle-ic5sk
    @Pdizzle-ic5sk 2 months ago

    I downloaded the web UI Docker container (ChatGPT-like interface) about 3 months ago.
    Everything worked great. Downloading models was super convenient and easy.
    I tried to download the latest image on a new machine and the interface got a lot more complex. I don't even know how to download a model anymore.
    Any chance you can make a video on the web UI Docker container?
    Easy Docker Compose startup, downloading the different models, and prompting them?
    This tutorial was great!
    Thanks,

    • @technovangelist
      @technovangelist 2 months ago

      Personally I’m not a big fan of most of the web interfaces I have seen. But I should try again.

  • @truehighs7845
    @truehighs7845 2 months ago

    Great video as usual, but did you say we can run inference with fine-tuned models in Ollama?

    • @technovangelist
      @technovangelist 2 months ago

      Yes, absolutely. Ollama doesn’t do the fine tuning but it can use the adapters.

    • @truehighs7845
      @truehighs7845 2 months ago

      @@technovangelist How do you load them, with 'custom models' ?

    • @technovangelist
      @technovangelist 2 months ago +1

      The docs show how. Create a modelfile and use the adapter instruction

    • @truehighs7845
      @truehighs7845 2 months ago

      @@technovangelist ok thanks!
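For anyone looking for the adapter instruction mentioned above: a Modelfile along these lines attaches a fine-tuned LoRA adapter to a base model (the base model name and adapter path here are placeholders, not from the video):

```
# Modelfile — base model and adapter path are placeholders
FROM llama2
ADAPTER ./my-lora-adapter.bin
```

Then `ollama create my-tuned-llama -f Modelfile` registers it as a local model (the model name is also a placeholder), and `ollama run my-tuned-llama` uses it.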

  • @CookerSingh
    @CookerSingh 2 months ago

    Which models currently support function calling? Is there a function-calling proxy available for any of these models, or does Ollama provide one?

    • @technovangelist
      @technovangelist 2 months ago +1

      All of them. Well except the embedding models. But it’s not a feature the model itself needs to support.
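The point in the reply is that "function calling" here is just constrained JSON output plus your own dispatch, not a per-model feature. A sketch of the pattern (the `getWeather` tool schema is made up for illustration; the live request is shown in comments and assumes the `ollama` package and a local server):

```typescript
// Function calling via JSON mode: describe the "tool" in the prompt,
// force JSON output, then dispatch on the parsed result yourself.
const toolPrompt =
  'Reply ONLY with JSON of the form {"name": "getWeather", "args": {"city": "<city>"}}. ' +
  'What is the weather in Paris?'

// Live request with a local Ollama server running:
//   import ollama from 'ollama'
//   const res = await ollama.chat({ model: 'llama2', format: 'json',
//     messages: [{ role: 'user', content: toolPrompt }] })
//   const call = JSON.parse(res.message.content)  // then invoke your own function

// The parse/dispatch step works the same on a canned reply:
const canned = '{"name": "getWeather", "args": {"city": "Paris"}}'
const call = JSON.parse(canned)
console.log(call.name, call.args.city)
```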

  • @mshonle
    @mshonle 2 months ago

    This is random and only tangentially related, but: do you know how to do that thing where your browser sees the localhost as usernames dot machine and there’s also some local CA cert configuration so you can test using https? (Random, yes, but I comment because I care.)

    • @technovangelist
      @technovangelist 2 months ago

      Not sure, but I think you are talking about wildcard certs. You can do this with certs from Let's Encrypt using Traefik. TechnoTim did a video about this a couple of years ago, I think, as did Christian L-something (Lempa?). There are other alternatives, I think with nginx instead of Traefik.

  • @abdulrahimannaufal5678
    @abdulrahimannaufal5678 2 months ago

    Ollama doesn't stream the output; instead it prints everything at once when I tried it with ngrok over https. Is it the same with Tailscale?

    • @technovangelist
      @technovangelist 2 months ago

      I haven't used ngrok before, so I don't know how to configure it correctly. You should work on fixing that configuration. Tailscale definitely doesn't mess with the output of applications.
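A likely culprit in cases like this is a proxy buffering the response: with `stream: true` the Ollama client yields partial responses as an async iterable, and anything that buffers the HTTP response makes them arrive as one blob. A sketch of the consumption pattern (live call in comments assumes the `ollama` package and a local server; `fakeStream` is a stand-in for illustration):

```typescript
// Live streaming call (needs a local Ollama server; 'llama2' is an example model):
//   import ollama from 'ollama'
//   const stream = await ollama.chat({ model: 'llama2', stream: true,
//     messages: [{ role: 'user', content: 'Tell me a joke' }] })
//   for await (const part of stream) process.stdout.write(part.message.content)

// The same consumption pattern over a stand-in async iterable:
async function* fakeStream() {
  for (const token of ['Hello', ', ', 'world']) {
    yield { message: { content: token } } // each chunk carries a partial message
  }
}

let output = ''
for await (const part of fakeStream()) {
  output += part.message.content // in the live version, print as tokens arrive
}
console.log(output)
```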

  • @florentflote
    @florentflote 2 months ago +1

  • @grigorikochanov3244
    @grigorikochanov3244 2 months ago

    Sorry, at 7:00 you say that in the chat endpoint the parameters are replaced with messages, and then show a code sample with the /api/generate endpoint. The chat endpoint for Ollama is /api/chat. Is it a typo?

    • @technovangelist
      @technovangelist 2 months ago

      Doh. Good catch. Yes, a typo. It's right in the GitHub repo.
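For readers following along, the difference the comment refers to is the request payload shape (model name below is just an example): /api/generate takes a single prompt string, while /api/chat takes an array of role-tagged messages, which lets you send conversation history on every turn.

```typescript
// POST http://localhost:11434/api/generate — one prompt string in, one completion out.
const generateBody = {
  model: 'llama2', // example model name
  prompt: 'Why is the sky blue?',
}

// POST http://localhost:11434/api/chat — role-tagged messages,
// so the whole conversation history can ride along on each turn.
const chatBody = {
  model: 'llama2',
  messages: [
    { role: 'system', content: 'Answer briefly.' },
    { role: 'user', content: 'Why is the sky blue?' },
  ],
}

console.log('prompt' in generateBody, 'messages' in chatBody)
```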

  • @LucianoTonet
    @LucianoTonet 2 months ago

    I'm learning so much from this channel. @technovangelist I think it would be better if you added some syntax highlighting to your code. Thanks

  • @AINMEisONE
    @AINMEisONE 2 months ago

    Hi Matt, here are some questions: 1. I have Ollama running but it will not see my eGPU, an RTX 4090. I use LM Studio, but it's always full of bugs and creates strange replies or does not even load the model; when it does, though, it sees the CUDA GPU, uses it 100%, and wow, it runs the 70b model perfectly. 2. I want to be able to move the Ollama models I downloaded on the Intel MacBook Pro so I do not need to re-download them again and again. Can you make a video for this? You know that most apps now are not on Intel unless you do Boot Camp Windows; then it runs, and the best thing is the RTX 4090 eGPU works incredibly! Help, please.

    • @technovangelist
      @technovangelist 2 months ago

      I don't have a good answer for you. I don't think the team has put any effort into supporting Intel Macs. When we were first building it, Apple silicon is what inspired us to create Ollama. But supporting eGPUs may be a bit further out. If you wanted to load Linux on there, it may have a chance.

    • @AINMEisONE
      @AINMEisONE 2 months ago

      @@technovangelist I have an M2 Max with 96GB RAM; the Intel one, with the RTX 4090, is faster by a lot, using LM Studio for now. But there are too many bugs in their stuff. It has gotten a lot better, but still not as good as Ollama: big models, 34b on up, do not work at all... they get stuck or give very low-level responses. I may send some other questions in the meantime while fixing some things.

    • @AINMEisONE
      @AINMEisONE 2 months ago

      @@technovangelist OK, which Linux version should I install? I tried Ubuntu 20.04/22.04 and it said it did not support Ollama. I used the command from the Ollama site and it never worked. Thanks!

    • @technovangelist
      @technovangelist 2 months ago

      Asking in the discord will have better results. I know nothing about that.

  • @joshuandala7669
    @joshuandala7669 14 days ago

    The link isn't working, but I was able to find it. Hopefully you see this comment so the error can be fixed

    • @technovangelist
      @technovangelist 7 days ago

      Interesting. I think I fixed it, though.

  • @DoNotKillThePresiden
    @DoNotKillThePresiden 2 months ago +1

    Thanks Matt 😁

  • @RickySupriyadi
    @RickySupriyadi 2 months ago

    Matt Williams Channel just got EXP! 🆙
    💫 +100 EXP
    💫 +100 EXP
    💫 +100 EXP
    💫 +100 EXP
    💫 +100 EXP
    LVL 2 (100/1000) EXP
    Matt Williams Unlocked Titles:
    👐 **The Caring One** Lv. 1 (Cares for all types of audiences, taking care of newbies)
    🧑‍🏫 **The Guru** Lv. 1 (Makes complex topics easy to understand)
    My likes for this channel have skyrocketed!