Inference API: The easiest way to integrate NLP models for inference!
- Published Oct 1, 2024
- Learn how to use the Hugging Face Inference API to easily integrate NLP models for inference via simple API calls.
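As a rough sketch of the kind of call the video covers (the model name below is an illustrative sentiment model, and you need your own Hugging Face token):

```python
# Minimal sketch of calling the Hugging Face Inference API over HTTP.
# The model in the URL is just an example; swap in any hosted model ID.
API_URL = "https://api-inference.huggingface.co/models/distilbert-base-uncased-finetuned-sst-2-english"

def build_headers(token: str) -> dict:
    """Bearer-token header the Inference API expects."""
    return {"Authorization": f"Bearer {token}"}

def query(payload: dict, token: str):
    """POST a JSON payload to the model endpoint and return the parsed response."""
    import requests  # imported lazily so the helpers above work without it
    response = requests.post(API_URL, headers=build_headers(token), json=payload)
    return response.json()

# Usage (requires a real token from huggingface.co/settings/tokens):
# result = query({"inputs": "I love this video!"}, token="hf_...")
```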
Code: github.com/Pra...
NLP Beginner to Advanced Playlist:
• NLP Beginner to Advanced
I am a Freelance Data Scientist working on Natural Language Processing (NLP) and building end-to-end NLP applications.
I have over 7 years of experience in the industry, including as a Lead Data Scientist at Oracle, where I worked on NLP and MLOps.
I share practical, hands-on tutorials on NLP and bite-sized information and knowledge related to Artificial Intelligence.
LinkedIn: / pradipnichite
#nlp #bert #transformers #machinelearning #artificialintelligence #datascience
📌 Hey everyone! Enjoying these NLP tutorials? Check out my other project, AI Demos, for quick 1-2 min AI tool demos! 🤖🚀
🔗 YouTube: www.youtube.com/@aidemos.futuresmart
We aim to educate and inform you about AI's incredible possibilities. Don't miss our AI Demos YouTube channel and website for amazing demos!
🌐 AI Demos Website: www.aidemos.com/
Subscribe to AI Demos and explore the future of AI with us!
Great bro
Helpful❤
Is there any free text-to-text generation API? I want it for my hackathon project.
I tried a model, and the API said max 10 candidates.
Simple one, but valuable for someone making the transition from intermediate (codes their own model) to advanced (uses pretrained models). Kudos🎉
Hello sir, can you please tell us how to generate a Hugging Face API key for building a React app? Please reply with an elaborate answer, sir.
Can you ask this thing in discord?
discord.gg/teBNbKQ2
Fantastic. The likes on the video don't do justice to how great and helpful it is.
Glad it was helpful!
thanks for sharing :)
Thanks man! Didn't know it was this easy.
Glad I could help!
Hey, I am using an API that takes text as input and returns its summary, but I am not getting the correct output.
output = query({
"inputs": text,
})
summary = output
I'm not getting output in this case.
Even when I use
summary = output[0]
I still don't get it.
Can you post in discord?
SO USEFUL. THANK YOU SO MUCH!
Hey Pradip, that's very well presented. Good job!
Thank you so much 🙂
Nice
Hello, I would like the photos it describes to go directly to Google. Could that work with the API?
Thanks a lot
Please reply, I don't see the deploy button on any model page.
Thanks, sir, for the work you do. However, I want to know:
1. How can one integrate a specific set of pre-trained models into RStudio, so that one can simply run examples on data (proprietary, in my case) locally within R?
2. Is there a way to ask the Inference API for tasks other than the typical sentiment classification of text, for example "multi-entity tagging", "modalities", etc.?
Your input is highly appreciated.
Hi, not sure about RStudio, but since it's an API, we should be able to use it anywhere we want.
Regarding different tasks: if we use the pipeline class, then we have to use one of the valid tasks, but you can always use model inference directly and then apply your own custom logic.
You should fine-tune the model if you have different requirements.
Hello, can it handle real-time speech-to-text with ASR models?
Hi, how can I use a Hugging Face API in my Flutter app?
It's an API, so we can use it in any language. Here is an example in Python and JavaScript: huggingface.co/docs/api-inference/quicktour
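For reference, a text-classification call typically returns a nested list of label/score dicts, one inner list per input. The sample response below is illustrative; a small helper can pull out the top label:

```python
# A text-classification response from the Inference API usually looks like a
# list (one entry per input) of lists of {"label", "score"} candidates.
sample_response = [[
    {"label": "POSITIVE", "score": 0.9998},
    {"label": "NEGATIVE", "score": 0.0002},
]]

def top_label(response, index=0):
    """Return the highest-scoring label for the input at `index`."""
    candidates = response[index]
    best = max(candidates, key=lambda c: c["score"])
    return best["label"]

print(top_label(sample_response))  # POSITIVE
```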
@@FutureSmartAI thank you so much for your reply, and I really find your video so interesting!
How do I give multiple inputs in a query?
output = query({
"inputs": "I like you. I love you",
})
I am trying this:
output = query({
"inputs": "I like you. I love you", "I hate you",
})
But I am getting an error. Please help me.
Can you ask this in Discord? We also published a blog post on the Inference API: blog.futuresmart.ai/mastering-hugging-face-inference-api-integrating-nlp-models-for-real-time-predictions
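For what it's worth, the Inference API generally accepts a JSON list under "inputs", so the comma-separated strings in the question need to be wrapped in a list. A quick sketch:

```python
import json

# Passing several texts in one request: "inputs" takes a list.
# Comma-separating bare strings inside the dict, as in the question,
# is a Python syntax error rather than an API issue.
payload = {
    "inputs": ["I like you. I love you", "I hate you"],
}

# This payload serializes to valid JSON for the POST body:
print(json.dumps(payload))
```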
Can you explain read and write tokens?
Can you please show how I can use a code-generation model from Hugging Face, like DeepSeek-Coder or StarCoder, via the Inference API? I am getting an error.
Can you share the error?
If you can't find the code snippet, what does that mean?
How about the pricing? I didn't understand. I'd like to take it live; if 10,000 users use this API, how much would it cost?
The Inference API is free to use, and rate limited. If you need an inference solution for production, check out our Inference Endpoints service.
huggingface.co/docs/inference-endpoints/index
Thank you so much, brother! My only question is that I don't have a Colab notebook. Do you know where I could access one, and is it also usable for Mac users, or are there any alternatives? Blessings
You can use a Colab notebook, which is free if you log in with Google.
Hey, can you please answer this: if we want to use a custom model for inference, where do I specify my inputs and outputs? Where do I host the code for the model?
You can push the model to the Hugging Face Hub. I have shown this in this video: th-cam.com/video/HRGc6QFA_YU/w-d-xo.html
Thank you brother for the great tutorial 🙏
Glad it was helpful!
You left it halfway; it is difficult for new users.
Hi, you can ask in Discord if you have any doubts. We also recently published a blog on the Inference API that should be helpful:
blog.futuresmart.ai/mastering-hugging-face-inference-api-integrating-nlp-models-for-real-time-predictions
Hello sir, can you please illustrate how to use a question-answering model like T5 via JS inference? Thank you.
You mean JavaScript inference?