Custom AI Chatbot for Websites using any LLM | No-Code | Open-Source
- Published Jul 7, 2024
- In a world of pay-per-token LLMs, you have probably looked everywhere for a way to make your local or cloud-hosted LLM act as an AI assistant on your website. With AnythingLLM, it is now free and open source!
Today, we are going to create an AI chatbot for our website, trained on data from that same website! All of this is built into AnythingLLM Cloud or our self-hosted version you can run anywhere.
The embed takes minutes to set up: it is a single HTML snippet you paste into your website to get a powerful chatbot that handles customer questions.
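As a rough illustration of what such a snippet looks like (the real one is generated for you in the embed settings screen; the embed ID, host URL, and script path below are placeholders, not the official values):

```html
<!-- Illustrative AnythingLLM embed snippet. Copy the real version from
     your instance's embed settings screen; every value here is a
     placeholder. -->
<script
  data-embed-id="YOUR-EMBED-ID"
  data-base-api-url="https://your-anythingllm-host/api/embed"
  src="https://your-anythingllm-host/embed/anythingllm-chat-widget.min.js">
</script>
```

Pasting that single tag anywhere in your page's HTML is all the integration work required; the widget loads and renders itself.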
With AnythingLLM, you can now publish smarter chatbots using any LLM, trained on your custom documents, and show them on any webpage with all the controls and tools you need to keep questions on topic and costs under control!
AnythingLLM supports link scraping, PDFs, text documents, and more than you can throw at it!
Learn about AnythingLLM Hosted: useanything.com
Open-source self-hosted AnythingLLM: github.com/Mintplex-Labs/anyt...
Chapters
0:00 Introduction to AnythingLLM Embeds
0:34 AnythingLLM Cloud or Github setup
0:52 Chatbots are everywhere - intro to demo!
1:34 What is an AnythingLLM workspace?
2:19 Aside: How to use any LLM for your AI chatbot
2:45 Create the workspace
3:25 Add custom documents to a workspace for knowledge
3:42 AnythingLLM has a free embedder? WOW
4:15 Showing how much smarter our workspace is
4:49 Creating the embed & available controls and settings
7:07 How to copy the embed to paste into your site
7:21 How to do this with your site (or via Framer)
8:03 Success!
8:39 Customizations for your embed
9:31 Thank you!
This is really something big!! Amazing tool.
This is legendary. Thanks for your great work, you’re amazing!
Amazing ⭐⭐⭐⭐⭐
That's cool.
Tried running AnythingLLM Desktop v1.5.5 on my MacBook. Seemed to install without errors.
Unfortunately, the Document import feature didn't work, which meant I couldn't import local files or websites.
Fortunately, the AnythingLLM Docker image worked. It allowed me to import local documents for analysis.
Easy loading of local documents is wonderful.
One more question, can I connect Claude 3 Opus API to AnythingLLM?
Hello Tim. I don't see much info about the "agents" in AnythingLLM. How can we use them? Secondly, the Groq integration seems to work only for Mixtral; new models like Llama 3 don't appear in the Groq option inside AnythingLLM. Thanks.
I tried to use AnythingLLM in chat mode, thinking I could just interact with the LLM without any docs instead of going over to ChatGPT or whatever. 5:28 did not work. Any solution?
First, great video
🏅
Can you show how to do it with Ollama? I'm not sure what to put in for the Base URL or token numbers.
I can't find the option; is it because I'm on Windows?
I love it! I already have it set up, and it surprised me that it deduced from context something I didn't explicitly mention in the document I'm testing (I'm using LM Studio with a Mistral model). But I don't know if I'm missing something or doing something wrong: I'm missing the "embedded chat" tab in the settings under "data connectors", and in the appearance tab I'm missing the option to change the branding, logo, or pics for the chatbot or for my icon.
The desktop app does not have those features, which is why they are missing. The desktop app is purely focused on a local experience, while the embed chat and similar features require a server running AnythingLLM.
Sorry, I don't get it. Does that mean I have to pay for something, or is it still open source? A tutorial would be awesome. Thanks, and sorry, I'm new to all this @@TimCarambat
@@jiuvk8393 The docker image (github.com/Mintplex-Labs/anything-llm) is the FULL AnythingLLM app that supports multi-user. That is the only difference from desktop.
You can self-host it; we also offer hosting, but that is only if you don't want to set it up yourself. It is open source all the same.
Damn it, I just found out I don't have Hyper-V (which is needed for Docker) since I have Windows Home edition. Any other way I can do this? @@TimCarambat
Great app, love it. I'm not seeing the chat embed option in 1.5.
Chat embed is only for Docker. You cannot publish a widget to the public web from a local desktop app.
docs.useanything.com/features/embedded-chat
Hi Tim. Question: is there any way to customize the 1000-token limit that we seem to be encountering?
A 1000-token limit? Where in the app?
Can it be in a Flutter app?
Do I need to buy a computer with GPU power? What are the hardware specifications needed?
credit card
How do I configure this for a Hugging Face open-source model, without the OpenAI API?
thanks for the insight anyway
LM Studio does this! You can download any HF model and then connect it to AnythingLLM. No OpenAI key needed.
Thanks, I don't seem to have the chat embed feature. Do I have to activate it somewhere?
If you are on Desktop, it is not available (because how would it run if everything is on your local machine?). If running on Docker or from GitHub, it is present and has been since Feb 5.
Make sure you have the latest docker image running or latest code from Github.
OK thanks, I'll try to figure out the Docker @@TimCarambat
The embedding option for publishing is not visible.
You are probably using the desktop app. We don't allow you to expose your desktop to the world, as that would not be safe. Use the docker image; it has what you are looking for.
Hi, this is awesome, and I am a newbie. Does this bot have voice-to-text functionality? Will it also help with integrating streaming avatars such as the HeyGen API?
Yes, we support native TTS and STT (if available); for TTS we currently support OpenAI and ElevenLabs. HeyGen I am unfamiliar with.
Can you tell us the specs of your laptop or CPUs to run this demo? Thanks.
The laptop was just a MacBook Pro (Intel) with 16GB of RAM, but if using a cloud LLM you could run this on a Raspberry Pi with 4GB of RAM.
Can I create this chatbot for messaging services like Telegram and WhatsApp?
I am waiting for an online version: no download or installation, no registration, and no usage limit, for free.
We have an online version we host on our cloud. We will not give it away for free; it costs us money to run and maintain.
If you want free, we have the docker version you can run on a server you maintain.
@@TimCarambat OK, I can wait. I am a poor man on an Android tablet.
Linux, please!
We are working on an AppImage client, _but_ we do have a docker image that is easy to run for those more technical and comfortable with a CLI:
github.com/Mintplex-Labs/anything-llm/blob/master/docker/HOW_TO_USE_DOCKER.md#recommend-way-to-run-dockerized-anythingllm
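For reference, the linked guide boils down to a single docker run; the port and storage paths here are a sketch following the guide's general shape, so check the guide itself for the current flags on your platform:

```shell
# Run the full (server) AnythingLLM image; the embed-chat feature lives
# here, not in the desktop app. Chat data is persisted to a host folder
# so it survives container restarts.
mkdir -p ~/anythingllm
docker run -d -p 3001:3001 \
  -v ~/anythingllm:/app/server/storage \
  -e STORAGE_DIR="/app/server/storage" \
  mintplexlabs/anythingllm
# The UI should then be reachable at http://localhost:3001
```

From there, the workspace and embed settings described in the video are available in the web UI.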
@@TimCarambat Done, sir.