Docker Networking Made Simple: Connecting Containers Like a Pro

  • Published Jan 2, 2025

Comments • 24

  • @JaredVBrown
    @JaredVBrown 4 months ago +1

    Such a calm, yet punctual voice.
    Like ASMR + tutorials.
    And the content is easy to follow and understand.
    Great Channel!

  • @shuntera
    @shuntera 4 months ago +5

    Yes, some Docker networking content would be most welcome.

  • @rundeks
    @rundeks 4 months ago +1

    Great overview and great job at keeping this simple for people.

  • @Aristocle
    @Aristocle 4 months ago

    1:30 It's a similar problem with LaTeX (pronounced "latech"). But in linguistics you can't use different alphabets to write the same word, so I pronounce latex like the material.

  • @psi4j
    @psi4j 4 months ago

    Yes, container networking. Podman would be awesome to cover too. We’ve all seen the “Hitler uses Docker” video.

  • @Bangs_Theory
    @Bangs_Theory 4 months ago

    Hi @Matt Williams if time permits, can you create a video showing how to deploy Dify to a cloud service like Render?

  • @NLPprompter
    @NLPprompter 4 months ago

    Since you're not on the Ollama dev team anymore, I don't know if this is ethical to ask... could you cover something about llamafiles? Is it really true, as they claim, that you get faster CPU inference using their llamafiles instead of default llama.cpp?

  • @dmbrv
    @dmbrv 4 months ago

    Great video

  • @hasaanirfan6073
    @hasaanirfan6073 4 months ago

    Great ❤

  • @zooloomuzik
    @zooloomuzik 4 months ago

    Hey Matt, I was really excited for this video, only to realize there's no mention of Ollama in it!?
    I have a project on the go where I'm trying to build a multi-container app using docker-compose, where the containers are backend (FastAPI), frontend (Next.js), and llmServer (Ollama). I'm running into problems having the backend connect to the Ollama server... I get the dreaded [Errno 111] Connection refused.
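    [Editor's note: In a docker-compose setup like the one described, [Errno 111] Connection refused often means the backend is dialing localhost, which inside a container refers to the container itself, not the Ollama service. Services on the same compose network reach each other by service name instead. A minimal sketch, with hypothetical service and env-var names matching the comment (`llmServer`, `OLLAMA_BASE_URL`) and Ollama's default port 11434:]

    ```yaml
    services:
      backend:
        build: ./backend
        environment:
          # Reach Ollama by its compose service name, not localhost
          - OLLAMA_BASE_URL=http://llmServer:11434
        depends_on:
          - llmServer
      llmServer:
        image: ollama/ollama
        ports:
          - "11434:11434"
    ```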

    • @technovangelist
      @technovangelist  4 months ago

      This was 100% about using tools with ollama and docker.

    • @technovangelist
      @technovangelist  4 months ago

      Ahh. I see how you can think that. But I wouldn’t use ollama in docker anyway. This is just about the UIs.

    • @zooloomuzik
      @zooloomuzik 4 months ago

      @@technovangelist wow, thanks for responding Matt, much respect for what you're doing! Considering your comment "... I wouldn’t use ollama in docker ...", might I be so bold as to ask: if you were me, and you needed to host this app on Azure (which I do), how would you go about hosting Ollama?

    • @technovangelist
      @technovangelist  4 months ago

      Got it. That makes sense. Docker on a remote host vs. Docker on localhost can behave differently. If you are running their container service rather than an instance, then that makes sense. Have you had success getting access to a real GPU? Last time I tried, I could only get their generically named cards and not a real AMD or Nvidia card.

    • @zooloomuzik
      @zooloomuzik 4 months ago

      @@technovangelist hi again Matt, and thanks for the continuing engagement! To answer your question: yes, we have had success getting a VM with a GPU. There are some "N series" options on Azure for us mere mortals, e.g. NV4as_v4, a 4-core, 14 GB server with an AMD GPU (the smallest of three in this series, at $170/month).
      I've stood up an Ollama server on one of these and can successfully connect to it over the internet. In my app the base URL of the Ollama server is set as an env, so I can swap it out... but when I do, I get connection issues :( Interestingly, yesterday I also set up a serverless endpoint on Azure for a Llama 2 model and ran into the same problems, so the issue might be totally unrelated to Ollama!?
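    [Editor's note: The swap-the-base-URL-via-env approach described above can be smoke-tested before wiring it into the app, so connection refused shows up as a clean boolean rather than a stack trace. A sketch, assuming a hypothetical `OLLAMA_BASE_URL` env var; `/api/tags` is Ollama's real model-listing endpoint:]

    ```python
    import os
    import urllib.error
    import urllib.request

    # Hypothetical env var name; the idea is swapping the server address without code changes.
    base_url = os.environ.get("OLLAMA_BASE_URL", "http://127.0.0.1:11434")

    def check_ollama(url: str, timeout: float = 3.0) -> bool:
        """Return True if an Ollama-style server answers at url, False on connection errors."""
        try:
            with urllib.request.urlopen(f"{url}/api/tags", timeout=timeout) as resp:
                return resp.status == 200
        except (urllib.error.URLError, OSError):
            # [Errno 111] Connection refused surfaces here as URLError(ConnectionRefusedError)
            return False

    print(check_ollama(base_url))
    ```

    Running this from inside the backend container (or the Azure VM) quickly distinguishes "server unreachable" from an application-level bug.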

  • @sammcj2000
    @sammcj2000 4 months ago

    Naming things is hard 😂

  • @SlykeThePhoxenix
    @SlykeThePhoxenix 4 months ago +1

    It's not pronounced "en-gin-ex", it's pronounced "en-gin-chi".

    • @technovangelist
      @technovangelist  4 months ago +1

      That is definitely en gin ex. It has been since it started out in Russia. I have spoken at nginx conf for many years.

    • @srivatsajoshi4028
      @srivatsajoshi4028 4 months ago

      @@technovangelist I think it was probably a joke

    • @SlykeThePhoxenix
      @SlykeThePhoxenix 4 months ago

      @@technovangelist I know haha, I was being facetious XD.

  • @archstanton
    @archstanton 4 months ago

    Ah yes, Chinese President “KAI JinPing”

    • @technovangelist
      @technovangelist  4 months ago

      I have no idea what this comment means