Install & Run Ollama on AWS Linux: Easily Install Llama3 or Any LLM Using Ollama and WebUI

  • Published on Jan 5, 2025

Comments • 36

  • @moizamjad1330
    @moizamjad1330 3 months ago +1

    Can I link the WebUI with my domain, so people can access the web UI through the domain and, of course, SSL? I'd be really thankful if you could explain or make a video on it.

    • @ScaleUpSaaS
      @ScaleUpSaaS  3 months ago

      I will make a video about this very soon.

  • @sudarshanggouda
    @sudarshanggouda 3 months ago +1

    I recently came across your video on installing and running Llama3 (or any LLM) using Ollama on AWS Linux. I was wondering if it's possible to interact with the deployed model programmatically by calling it as an API in code. Could you provide insights or a brief guide on how to achieve this?
    Thank you for the great content!

    • @ScaleUpSaaS
      @ScaleUpSaaS  3 months ago +2

      Yes, you can call it as an API.
      All you need to do is implement a Python FastAPI app, and once you get a request to FastAPI you can make an inner request to your local Ollama (a minimal sketch follows at the end of this thread).
      Would you like us to make a video about it?

    • @sudarshanggouda
      @sudarshanggouda 3 months ago

      @@ScaleUpSaaS Can you please make a video or explain how we can do it? I have written the FastAPI code but I'm not sure how to call the local Ollama API.

    • @sudarshanggouda
      @sudarshanggouda 3 months ago +1

      @@ScaleUpSaaS Yes please, a video would be really helpful.

    • @ScaleUpSaaS
      @ScaleUpSaaS  3 months ago +1

      Sure. We will be happy to share that with you.

    • @sudarshanggouda
      @sudarshanggouda 3 months ago

      @@ScaleUpSaaS Thank you
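
    A rough sketch of the "inner request" mentioned above: Ollama exposes a local REST API, by default on port 11434, and a FastAPI handler can simply forward the prompt to it over HTTP. The model tag below is only an example, and this assumes Ollama is already running on the same machine.

        # pull a model once so it is available locally ("llama3" is only an example tag)
        ollama pull llama3

        # call the local Ollama REST API directly; a FastAPI handler would send the
        # same HTTP request to http://localhost:11434/api/generate from Python
        curl http://localhost:11434/api/generate -d '{
          "model": "llama3",
          "prompt": "Why is the sky blue?",
          "stream": false
        }'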

  • @cjoshy
    @cjoshy 5 months ago +1

    I followed the entire tutorial, but when I type 'llama3' in 'Select a model', the 'Pull "llama3" from Ollama' option does not appear.

    • @ScaleUpSaaS
      @ScaleUpSaaS  5 months ago

      Please try the tutorial again from scratch. We have tried it many times with users, and it worked each time.

    • @squ34ky
      @squ34ky 18 days ago +1

      I was also facing the same issue. I solved it by logging into the container directly using
      docker exec -it open-webui bash
      then running
      ollama pull
      Then refreshing Open WebUI did the trick. The models were listed. (A condensed version of this fix follows at the end of this thread.)

    • @cjoshy
      @cjoshy 17 days ago

      @@squ34ky Thanks, bro, I was planning to install it again tomorrow, you came at the right time 🫂
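
    A condensed version of the fix described above, assuming the Open WebUI container bundles Ollama and is named open-webui, as in the comment; the model tag is only an example.

        # pull the model from inside the running container, then refresh Open WebUI in the browser
        docker exec -it open-webui ollama pull llama3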

  • @pushkarsawant9789
    @pushkarsawant9789 5 months ago +1

    Hello, the video was nicely and clearly explained, step by step. I would like to see the same Ollama setup but on a serverless architecture. Could you please post a video on the Ollama serverless setup?

    • @ScaleUpSaaS
      @ScaleUpSaaS  5 months ago

      Thanks for sharing. Appreciated. Can you elaborate more…

    • @pushkarsawant9789
      @pushkarsawant9789 5 months ago +1

      @@ScaleUpSaaS I mean setting up Ollama on serverless technology on AWS using Lambda or other services, or maybe on Google Cloud Functions for serverless.

    • @ScaleUpSaaS
      @ScaleUpSaaS  5 months ago +1

      We don’t know if it’s possible. But we will check and let you know 🫡

    • @ScaleUpSaaS
      @ScaleUpSaaS  5 months ago

      We tried to look for a solution for you. Unfortunately, we haven't found one yet. We will let you know if something comes up...

    • @pushkarsawant9789
      @pushkarsawant9789 4 months ago

      @@ScaleUpSaaS, Thank You

  • @ywueeee
    @ywueeee 5 months ago +1

    How do I run this privately? Could someone looking for those endpoints find it on the clear web?

    • @ScaleUpSaaS
      @ScaleUpSaaS  5 months ago

      You can run it on your computer using Docker, as we showed in the tutorial. Otherwise, do what we did in the video and restrict access to the server to your IP only (configure the security group; a sketch follows at the end of this thread).

    • @ywueeee
      @ywueeee 5 months ago +1

      @@ScaleUpSaaS but what if your Wi-Fi IP is not static and keeps changing, and you want to access the LLM from any device and any network, but still keep it safe and accessible only to you?

    • @ScaleUpSaaS
      @ScaleUpSaaS  5 months ago

      @wagmi614 in that case you can use an Elastic IP address (a sketch follows at the end of this thread). In this video you can see how we set an Elastic IP address in AWS:
      Full Node.js Deployment to AWS - FREE SSL, NGINX | Node js HTTPS Server
      th-cam.com/video/yhiuV6cqkNs/w-d-xo.html

    • @ScaleUpSaaS
      @ScaleUpSaaS  5 months ago

      Watch this. Full Node.js Deployment to AWS - FREE SSL, NGINX | Node js HTTPS Server
      th-cam.com/video/yhiuV6cqkNs/w-d-xo.html

    • @ywueeee
      @ywueeee 5 months ago +1

      @@ScaleUpSaaS wait, I don't get it. How does an AWS Elastic IP help when it's my IP that's changing, and I want input to be accepted from any IP?
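
    A sketch of the security-group restriction mentioned earlier in this thread, using the AWS CLI. The security group ID is a placeholder, and port 3000 assumes the usual Docker port mapping for Open WebUI.

        # allow inbound traffic to the WebUI port only from your current public IP
        MY_IP=$(curl -s https://checkip.amazonaws.com)
        aws ec2 authorize-security-group-ingress \
          --group-id sg-0123456789abcdef0 \
          --protocol tcp --port 3000 \
          --cidr ${MY_IP}/32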
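
    A sketch of attaching an Elastic IP to the instance, as suggested earlier in this thread; the instance and allocation IDs are placeholders. Note that an Elastic IP gives the server a fixed address; it does not change which client IPs are allowed in.

        # allocate an Elastic IP and associate it with the EC2 instance
        aws ec2 allocate-address --domain vpc
        aws ec2 associate-address \
          --instance-id i-0123456789abcdef0 \
          --allocation-id eipalloc-0123456789abcdef0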

  • @jimjohn1719
    @jimjohn1719 5 months ago +1

    Is this free to run on AWS? If not, can you comment on the AWS cost incurred to run this application?

    • @ScaleUpSaaS
      @ScaleUpSaaS  5 months ago

      Thanks for sharing. Ollama, Llama 3, or any other LLM that you can pull is free to use. But the server will cost you money on AWS, because we are not using a free-tier instance type.