I'm sick and tired of you. You post new videos daily with even more good news for freebies like me. You're giving me new tools to play with and I'm extremely happy. Stay happy.
Hello! I have another video idea, if it's possible: containerizing Ollama and a light LLM inside it, together with an Electron desktop app. Basically the objective is to give the end user the advantages of running an LLM without needing to install Ollama and all the LLMs separately. I hope it's possible and that you think it's a good idea. Greetings!
I could see this being easily done using Docker images, and I'd be interested in the concept too. I'm bad at Docker, but its power is awesome. The catch is that users running the app would either need some insane hardware or have to settle for a smaller included model.
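For what it's worth, the idea above can be sketched with Ollama's published Docker image (a setup sketch under assumptions, not a full packaging solution: the `ollama/ollama` image name and default port come from Ollama's docs, and `llama3.2:1b` is just one example of a "light" model):

```shell
# Run the official Ollama image in the background, persisting models in a
# named volume and exposing the API on its default port 11434
docker run -d --name ollama \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama

# Pull a small model into the container (swap in whatever fits the
# target hardware)
docker exec ollama ollama pull llama3.2:1b
```

The Electron app could then talk to the local Ollama API at `http://localhost:11434`, so the user never installs Ollama directly. The hardware caveat still applies, since inference runs on the user's machine either way.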
I got the following 504 error in Mistral Canvas: "It appears that an unexpected issue has occurred. We apologize for the inconvenience. Please try again later or refresh the page. If the problem persists, feel free to contact our support team for assistance."
I'm really thrilled to see that the team at Mistral has stepped up their game to provide us with a really good model with greater context... for free!
Great find, AiCodeKing! Would be awesome to understand how to connect this to VS Code (Cline AI)
Yes 😊
Yes I was thinking the same connection with Cline
Very useful. Your Windsurf tutorial and this one were very useful for me. I deeply appreciate your efforts!
OK, the web search is indeed very good. What's also good is that everything is free! I'm loving Mistral for this already.
you are so fast! this just came out! are you AI?
Excellent free AI option for creating images.
Let's go, king! Thanks!
Thanks
Can you suggest a site where I can paste this to see a React preview?
Try the new F1 reasoning models
Can it use an uploaded image for image-to-image generation?
I don't think so... but I think it can convert the image you upload into a prompt and then create an image from that prompt.
@ thanks!
Connection with Cline please 🙏
how tf do these companies pay for all the servers wth
Magic VC money
@@ShaferHart E L A B O R A T E
👍👍👍👍👍👍👍👍👍👍👍👍
E
First!
Second