Ollama: How To Create Custom Models From HuggingFace ( GGUF )
- Published May 31, 2024
- In this video, I demonstrate how you can create a custom model locally with Ollama using a model from Hugging Face. By the end of the video, you will be able to run a simple ChatGPT-like UI locally on your computer. It has never been so easy to create custom models and run them locally. Ollama and Hugging Face rock 😎
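The overall workflow from the video can be sketched roughly as follows. The model name, file name, and download URL below are illustrative assumptions, not taken from the video; substitute the GGUF file you actually download:

```shell
# 1. Download a GGUF file from Hugging Face (names/URL are examples)
curl -L -o capybarahermes.Q4_K_M.gguf \
  "https://huggingface.co/TheBloke/CapybaraHermes-2.5-Mistral-7B-GGUF/resolve/main/capybarahermes-2.5-mistral-7b.Q4_K_M.gguf"

# 2. Write a minimal Modelfile pointing at the local GGUF file
cat > Modelfile <<'EOF'
FROM ./capybarahermes.Q4_K_M.gguf
EOF

# 3. Register the model with Ollama under a custom name
ollama create capybarahermes -f Modelfile

# 4. Chat with it locally
ollama run capybarahermes
```

This requires Ollama to be installed and running, and enough disk space for the quantized model file.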
👉🏼 Links:
Chainlit: docs.chainlit.io/get-started/...
LangChain: www.langchain.com/
Ollama: ollama.ai/
Huggingface model: huggingface.co/TheBloke/Capyb...
💻 GitHub repo for code: github.com/sudarshan-koirala/...
------------------------------------------------------------------------------------------
☕ Buy me a Coffee: ko-fi.com/datasciencebasics
✌️Patreon: / datasciencebasics
------------------------------------------------------------------------------------------
🔗 🎥 Other videos you might find helpful:
🔥 Databricks playlist: • 30 Days Of DataBricks
⛓️ Langflow: • ⛓️ langflow | UI For 🦜...
⛓️ Flowise: • Flowise | UI For 🦜️🔗 L...
🔥Chainlit playlist: • Chainlit
🦜️🔗 LangChain playlist: • LangChain
🦙 LlamaIndex Playlist: • LlamaIndex
------------------------------------------------------------------------------------------
🤝 Connect with me:
📺 TH-cam: www.youtube.com/@datascienceb...
👔 LinkedIn: / sudarshan-koirala
🐦 Twitter: / mesudarshan
🔉Medium: / sudarshan-koirala
💼 Consulting: topmate.io/sudarshan_koirala
#langchain #chainlit #ollama #capybarahermes #chatui #chatgpt #datasciencebasics - Science & Technology
Excellent tutorial !!
Glad you liked it!
It would be interesting to see if you could use Open WebUI instead of "Chat UI using ChainLit, LangChain, Ollama and Gemma"
How can I edit the Modelfile, so that it includes within it a context, a personality, or a precise way of answering questions?
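On the question above about giving the model a context, personality, or answering style: Ollama's Modelfile supports a SYSTEM instruction for exactly this, alongside PARAMETER lines. A minimal sketch, assuming Modelfile syntax as in the Ollama docs (the GGUF file name is a placeholder):

```
FROM ./capybarahermes.Q4_K_M.gguf

# SYSTEM sets a persona and a precise answering style
SYSTEM """You are a concise assistant. Answer in short bullet points
and explain your reasoning in at most one sentence."""

# Sampling parameters tune how deterministic the answers are
PARAMETER temperature 0.3
PARAMETER num_ctx 4096
```

After editing, re-run `ollama create <name> -f Modelfile` so the changes take effect.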
GGUF files can be run directly with Ollama, but how can models in other formats be converted for use with Ollama?
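For the conversion question above: one common route is llama.cpp's conversion script, which turns a Hugging Face checkpoint (safetensors/PyTorch) into GGUF. A hedged sketch; script and binary names have changed across llama.cpp versions, so check the repo you clone:

```shell
# Get llama.cpp and its Python requirements
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
pip install -r requirements.txt

# Convert a Hugging Face model directory to a GGUF file
python convert_hf_to_gguf.py /path/to/hf-model --outfile model-f16.gguf

# Optionally quantize the result to shrink it (binary name may vary)
./llama-quantize model-f16.gguf model-Q4_K_M.gguf Q4_K_M
```

The resulting `.gguf` file can then be referenced from a Modelfile with `FROM ./model-Q4_K_M.gguf`.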
Phenomenal
Is there any way we can use Exl2 on Ollama?
How do I change the Chainlit logo to my own, please?
Hey mate, very good video with a clear explanation. Do you mind sharing what terminal app you are using? It seems very convenient, with all that autocompletion and the hints.
I am using Warp, but you can watch this video for tips on making your terminal better:
How To Make Your Mac Terminal Amazing
th-cam.com/video/ycapVWVl98M/w-d-xo.html
Could you please use LlamaIndex with Hugging Face?
Thanks for sharing it. Could you please create a video where we can do Q&A with local documents using Ollama models?
Kindly do it, it's really needed.
Please check the other videos on this channel. Here you go ->
Chat With Documents Using ChainLit, LangChain, Ollama & Mistral 🧠
th-cam.com/video/2IL0Sd3neWc/w-d-xo.html
Good explanation! Is there a list of model architectures that are supported by Ollama?
I don't think there is a fixed list of architectures for Ollama; you can download any model and use it with Ollama. Give it a try !!
Nice video,
Can you make a video on changing the system prompt and temperature,
if it's possible?
Thank you
It's possible, just provide those parameters in the Modelfile.
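As the reply says, these settings live in the Modelfile. A hedged sketch, assuming Ollama's SYSTEM and PARAMETER instructions (the base model name is an example):

```
FROM capybarahermes

# Temperature and related sampling parameters
PARAMETER temperature 0.8
PARAMETER top_p 0.9

# A custom system prompt
SYSTEM """You are a helpful assistant who answers briefly."""
```

Then rebuild the model with `ollama create my-custom-model -f Modelfile` and run it with `ollama run my-custom-model`.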
How to create the Modelfile on Windows?
Just open VS Code and create a file called Modelfile (not sure about the capitalization) and insert the content into it. It doesn't need any extension.
Good. How to create the Modelfile on Windows? 😁
It should work in the same way, give it a try !!
Hello sir, can we deploy this model?
Yes you can !!
@@datasciencebasics Sir, if possible could you please create a tutorial for deploying this model?
Hello, deploying a model is use-case specific. It can be deployed locally, on different cloud services, etc. Please refer to other videos on my channel for help.
Far more hassle than it's worth. Just use any other app that uses normal GGUF files, like normal people.
Thanks for the comment. There are many ways to do it, and it's just a preference which one to use 🙂
Excellent tutorial !!
Thanks !!