How to set up Ollama and run AI language models locally - Java Brains

  • Published Jan 20, 2025

Comments • 24

  • @mypage9389
    @mypage9389 5 months ago +1

    Coming back to your channel after 10 years. Feels like home ❤

    • @Java.Brains
      @Java.Brains 5 months ago

      Welcome home! 😊

  • @muhiuddinshahani349
    @muhiuddinshahani349 5 months ago

    I would say, after ages, finally something uploaded for free. Btw, thanks for teaching us, sir 🙂

  • @mojeime3
    @mojeime3 3 months ago

    I found this tutorial very helpful. Thank you for uploading.

  • @hariomkuntal9520
    @hariomkuntal9520 6 months ago +1

    I love your videos. They are very informative. Thanks for publishing them.

  • @sairajulu4395
    @sairajulu4395 6 months ago +1

    This is amazing, gonna help a lot of people, technology booming day by day...

  • @lifeofanitguy
    @lifeofanitguy 7 hours ago

    Can you do a video on how to pass an image through Ollama to an LLM and get a text response back for that image from the LLM, locally?
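
    [Editor's note] For anyone who wants to try this before a video appears: Ollama's HTTP API accepts base64-encoded images for vision-capable models such as llava. This is a minimal sketch, assuming you have run `ollama pull llava` and that `ollama serve` is listening on its default port 11434; the helper only builds the request payload, and the actual POST is shown commented out.

    ```python
    import base64

    # Ollama's default local endpoint (assumption: default port 11434)
    OLLAMA_URL = "http://localhost:11434/api/generate"

    def build_image_payload(image_bytes: bytes, prompt: str, model: str = "llava") -> dict:
        """Build the JSON body Ollama's /api/generate expects for a vision model.

        Images go in the `images` list as base64 strings; `llava` is one
        vision-capable model you can fetch with `ollama pull llava`.
        """
        return {
            "model": model,
            "prompt": prompt,
            "stream": False,  # ask for one JSON response instead of a token stream
            "images": [base64.b64encode(image_bytes).decode("ascii")],
        }

    # With `ollama serve` running, POST the payload and read the `response` field:
    # import json, urllib.request
    # body = json.dumps(build_image_payload(open("photo.png", "rb").read(),
    #                                       "Describe this image")).encode()
    # req = urllib.request.Request(OLLAMA_URL, data=body,
    #                              headers={"Content-Type": "application/json"})
    # print(json.load(urllib.request.urlopen(req))["response"])
    ```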

  • @phinehasenakireru430
    @phinehasenakireru430 6 months ago +1

    Super as always! Thanks.

  • @sumeetsood232
    @sumeetsood232 5 months ago

    Hey,
    I am planning to buy your all-access course, but I can see some courses for Spring and Spring Security are missing. When are you planning to update them?

  • @Nilcha-2
    @Nilcha-2 6 months ago +2

    Can you customize it to use local data and files? I think that would be the primary reason to run it locally.

    • @BarathGDKrishnan
      @BarathGDKrishnan 5 months ago

      You can't. It is a text-based AI model and doesn't have the capability to read/write files. It would also have security implications to give an AI read/write permissions. RE: Skynet

  • @aturan-fo1qt
    @aturan-fo1qt 5 months ago +2

    Hey mate, you seem too tired. Thanks for sharing.

  • @nagesh007
    @nagesh007 5 months ago

    Awesome , Thanks 😍

  • @Munawar-Java
    @Munawar-Java 5 months ago

    Is Ollama always local? If my data is sensitive and I don't want models to learn from my data, should I be using this? Is there any way my data can be leaked?

    • @Java.Brains
      @Java.Brains 5 months ago

      Yes, it strictly works off of local language models. Your data isn't leaving your computer here.
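
    [Editor's note] One way to sanity-check this reply: Ollama's API server listens on localhost port 11434 by default (the `OLLAMA_HOST` environment variable can change that), so requests from your tools never leave the machine. A small sketch under that default-configuration assumption:

    ```python
    from urllib.parse import urlparse

    # Default address of the local Ollama API server; OLLAMA_HOST can override it.
    OLLAMA_URL = "http://localhost:11434/api/chat"

    def is_local_endpoint(url: str) -> bool:
        """Return True when the URL points at this machine only."""
        return urlparse(url).hostname in ("localhost", "127.0.0.1", "::1")

    print(is_local_endpoint(OLLAMA_URL))  # → True

    # With `ollama serve` running, you can also list the models stored locally:
    # import json, urllib.request
    # print(json.load(urllib.request.urlopen("http://localhost:11434/api/tags")))
    ```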

  • @satyaranjannayak6283
    @satyaranjannayak6283 6 months ago +1

    Can you please create a playlist covering data ingestion as well as a chatbot?

  • @AbsTht
    @AbsTht 6 months ago +2

    React basics too please. Thank you.

    • @kokunut9209
      @kokunut9209 6 months ago

      Hey Java Brains,
      One video on Kafka and message brokers, please: when to use them, when not to use them, and when to "think" in terms of brokers.
      It's a request.

  • @syedmohdnoman
    @syedmohdnoman 4 months ago

    Too good

  • @goodcourseavailable
    @goodcourseavailable 6 months ago +1

    Classic!
    More Spring Boot / React AI projects, please.

  • @cv462-l4x
    @cv462-l4x 5 months ago

    The only thing I learned from the video is that the author has very good internet speed :)

  • @jyotiramkamble1842
    @jyotiramkamble1842 5 months ago +1

    Don't use the paid version, it's difficult to use, because we have the community version.

  • @arunbandari8936
    @arunbandari8936 5 months ago

    Hi sir,
    I have two YouTube accounts: one is personal and the second one is for education.
    I have taken a membership for your channel using my personal mail ID. How can I cancel my membership? I want to take the membership using my second, i.e. education, YouTube account. Please refund me. After the refund, I will take the subscription through my education account.

    • @Java.Brains
      @Java.Brains 5 months ago

      I don't have the ability to issue refunds for YouTube subscriptions. YouTube doesn't give that feature to creators. However, I can offer a free subscription on my website, where I do have the ability to do that. Would you mind writing an email by clicking the support menu on javabrains.io so that I can reply over there?