Should I do a follow up tutorial and show how to do everything locally?
Of course, now you have a powerful AI machine i don't understand why you even bother using paid api.
Full local would be awesome sauce
Absolutely. I am going to try this next week. I have a 4060 Ti (16Gb) GPU and am loving running Llama 3 8b uncompressed. So good! Hope I can get this running. I've not set up a python environment before by myself so it should be interesting.
Of course , we would like to see detailed demo.
No, but you should always post the link to the product or service you're demoing. I know you think that by not posting the link it forces people to come to your website and then find it, but in reality people just google the name and go directly to it.
🎯 Key points for quick navigation:
00:00 *🧠 Introduction to STORM AI*
- New Stanford project for creating comprehensive Wikipedia pages on any topic
- AI agents research and reference sources automatically
- Can be run locally with limited functionality
02:05 *🛠️ Installation and Setup Process*
- Cloning the GitHub repository
- Creating and activating a Conda environment
- Installing required dependencies
- Setting up secrets.toml file with API keys
04:50 *🖥️ Running STORM AI*
- Using the streamlit-based UI for easier interaction
- Demonstrating the research process on the topic "dogs"
- Showcasing the generated article with references
06:53 *🏠 Local Setup Options*
- Possibility of running without OpenAI API
- Support for vLLM with Mistral
- Potential for using Llama (not fully implemented)
Made with HARPA AI
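For the secrets.toml step in the setup chapter above, a quick Python sanity check can confirm the keys are in place before launching the Streamlit UI. This is only a sketch: the .streamlit/secrets.toml path and the key names OPENAI_API_KEY and YDC_API_KEY are assumptions based on what the video shows, not taken from the project docs.

```python
# Sketch: verify the API keys mentioned in the video exist in secrets.toml
# before starting the Streamlit UI. Path and key names are assumptions.
import tomllib            # stdlib in Python 3.11+; use the third-party "toml" package on older versions
from pathlib import Path

SECRETS_PATH = Path(".streamlit/secrets.toml")       # assumed location
REQUIRED_KEYS = ("OPENAI_API_KEY", "YDC_API_KEY")    # assumed key names

with SECRETS_PATH.open("rb") as f:
    secrets = tomllib.load(f)

missing = [key for key in REQUIRED_KEYS if not secrets.get(key)]
if missing:
    raise SystemExit(f"Missing keys in {SECRETS_PATH}: {', '.join(missing)}")
print("All required keys found; the Streamlit app should be able to start.")
```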
Ollama tutorial would be amazing. Please consider doing it. Also, can you go through the STORM paper? That could be an interesting series going through key AI papers like ReAct, STORM, etc.
yes, please
Nice! It would be good to know how much the API calls cost for the dogs wiki just for a sense of scope.
I have been loving STORM. Although, I wish I could modify that format further and not just generate a wikipedia style output.
I bet you could relatively easily attach a web front-end to generate better-looking output. He was raving about the wiki output, and I was thinking it looks like a blob of text; but if the information is good, then formatting should be fairly easy to solve.
I've been using it for a couple of days and it's amazing. You need to rewrite it, of course, but it can write amazing articles.
I found it awful. Asked it about a specific fish and it wrote about another animal altogether yet claimed it was about the animal I asked for, faked its references (which lead to 404 pages), and added articles that don't even mention the animal while claiming they do. 😅 Nice idea, terrible execution.
I ran into a problem that indicated a couple of modules were not available. I tried to install them but found they only existed in Python 3.10; when I recreated the conda environment with python=3.10, everything worked fine. Great results!
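For anyone who hits the same missing-module errors, a tiny guard at the top of whatever script you launch makes the interpreter mismatch obvious up front. A sketch only: the 3.10 requirement here comes from this commenter's experience, not from the project's own documentation.

```python
# Sketch: fail fast if the conda env isn't on the Python version the
# dependencies were installed against (3.10, per the comment above).
import sys

EXPECTED = (3, 10)  # assumption based on the commenter's fix, not official docs
if sys.version_info[:2] != EXPECTED:
    raise SystemExit(
        f"Expected Python {EXPECTED[0]}.{EXPECTED[1]}, got "
        f"{sys.version_info.major}.{sys.version_info.minor}; "
        "recreate the conda environment with python=3.10."
    )
```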
Thanks! Do you know approximately what the total token usage or API cost is to run each research instance?
Incredible. What he was able to achieve in a short time is 99% usable; as always, you have to read it and correct it, but it was damn good.
Amazing!!! Thesis research is never gonna be the same!!!
There is a good bit of overlap with this and Perplexity AI. I use Perplexity non stop!
Same, Pro is a beast. I stopped paying for the GPT sub and rock with Perplexity, Claude Sonnet, and the Claude API with Claude Engineer non-stop.
Perplexity is awesome
i talk to perplexity more than i do humans nowadays lmao
It was awful a few months back, all incorrect info, and their devs got very defensive when I highlighted the errors. Did they finally manage to get it working with a decent level of accuracy?
What I want to know is whether it can fact-check itself so its output is reliable. How much does it suffer from hallucinations and errors?
If an article is wrong, the result might be wrong too I guess.
Really bad, just from trying their website demo. It hallucinated references leading to 404 pages and gave incorrect info. I asked it to write about the lunare wrasse and it wrote most of the article by copying a blog post about a six line wrasse 😅 then banged on about marine mammal conservation when this is a fish, and stitched my fish into articles that don't mention it at all. 2 out of 10.
4:40 that is a problem. Sorry, but I can't trust an API service that doesn't allow for the deletion of keys. That is pretty basic functionality that should be there.
But did anyone manage to point it to another "knowledge base"? The DSPy framework under the hood is still a black box for me. But the approach is really awesome; I like to cite STORM as an example of what content can be created by AI.
Excellent data. I'm impressed by the versatility of SmythOS's pre-built AI templates. It's like having a personal helper! #SmythOS
Super cool, but a wiki is not Wikipedia; Wikipedia is a wiki. Please don't mix up these terms.
Difference?
If they make a tool that uses papers from top journals as references, it could be valuable.
This is great. Amazing technology and a wonderful quick video showing it in action.
Matt, please do an Ollama or llama.cpp tutorial. I really want to run this totally locally, and I don't have an OpenAI API key anyway.
While this is cool, what is the use case here? Just building another wiki page that sources wiki pages? I think modifying this to web scrape and build blog posts or knowledge bases for internal company trainings would be interesting, but it needs a good bit of work to modify it.
I really like how you speak and how you always try to be as objective and impartial as possible. Thank you! 😅
Stop asking and DO FULL TUTORIALS !!!
Yea, full local tutorials please. Even though Matt has a beast comp and I don't.
Hehe, it's easy engagement call to action fodder... Carrot on a stick if you will. If he doesn't ask I am let down. 😂
Engagement... Blame the Google smooth brains for this useless algorithm.
The smooth brains are the ppl like y'all who encourage it. This is a cancer.
At least pretend to have some self respect
Use this prompt to test reasoning with a known quantity with contextual inference
John observed two snakes headed for his yard from the driveway. One was red and the other was blue. The blue snake passed his yardstick and was wider than the yardstick. The red one passed the yardstick and was longer but not wider. John caught one of the snakes and it measured 9.75 inches. Which snake did John catch?
OooooooOOOoooo yeah need this local. This structure is really helpful for a few other greater implementations.😮
Would be interesting to see how good the results are using a smaller local 7B-9B parameter model. Do smaller models affect the quality of the subtopics or reasoning the program uses to generate the research pieces, or how well the paper is cleaned and deduplicated?
This is awesome! I have a goal to get some AI tools running locally, but still learning how to use them on web
Your videos are always enjoyable. I have some critiques of the app related to authoritativeness, but guys smart enough to create the app are almost certainly also smart enough to anticipate such issues. Quibbles aside, it's potentially a wikipedia-killer.
Why are my comments getting deleted when they mention that the search engine requires you to provide a card and says it's free for 60 days? What happens after that?
not sure why...and after 60 days my guess is you have to pay
@matthew_berman I guess so, but how much? Paranoid me says 1 million, still... I had to try it, so I am in the 60 days now. 🙈
The YT comment filter has been f'ing up a lot in recent times; there are some channels I'm not even allowed to comment on anymore, despite the channel owner being completely fine with me. On other channels I have to keep checking in a private-mode window whether my comments have survived. To make matters worse, direct links to comments don't always work anymore, so sometimes I also have to sort by recent and scroll through to find them, and sometimes there's a delay before they show, or before they're gone...
How many OpenAI credits did you end up spending on the dog research wiki?
I suspect as many as he spent on the meth research article.
Great find! Let's see if you can do a local one; using Ollama with something like Llama 3 would be amazing!
Yes a local tutorial for this would be great!
Thanks, great as usual. I'd sure like to see the full local setup (local model) of that project.
Thanks! Please do local installation!
How does it compare to GPT Researcher? I would say that this is probably more comprehensive.
Not sure, but this generated an article for me that was full of hallucinated 404 references and incorrect info. I asked it about a specific fish, the lunare wrasse, and it wrote about another fish entirely, mostly from one person's blog on that other animal, yet claimed otherwise.
How does the research this performs compare to doing research with Perplexity AI’s Pro Search with a paid plan?
Yes, please do local with ollama... I want to then see it running as a tool for a mixture of agents within something like AutoGPT
This is great, I'd love to know how to create an agent swarm workflow with this included.
Thanks Matthew, appreciate your kind efforts. Hope to see the local video made by you and it would be great❤️
Hi Matthew. Do a tutorial using Gemma 2 9B
This is fantastic! I can already think of using this for blogs and articles. Hopefully the source it referenced isn't full of misinformation! 🤣
Does it work with Groq?
Race ya to a fully local version :)
Hi bro, I am watching the Groq+CrewAI Rerun and didn't know where to comment, but just saying THANK YOU!
full local tutorial would be great!
Great video Matt, can you please do a video on STORM working with Mixtral and Ollama?
Wow, impressive; the most valuable work, done by yet another AI tool.
I would appreciate it if you could also disclose the rates of these API calls. You have an amazing hands-on channel, and costs are an important factor in production; with your entrepreneur background, you understand 😉
What is your terminal setup?
Thanks again. Local version if you can please.
I wonder if this could work with LM Studio and Firecrawl. 🤔
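On the LM Studio side at least, its local server speaks the OpenAI-compatible chat completions API (by default at http://localhost:1234/v1), so in principle any OpenAI-style client in the project could be pointed at it. A rough sketch, where the model name is just a placeholder for whatever model you have loaded; whether STORM itself accepts a custom base URL is a separate question.

```python
# Sketch: send a chat request to LM Studio's local OpenAI-compatible server.
# The base_url/port are LM Studio defaults; the model name is a placeholder.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model="local-model",  # placeholder: use the identifier of the model loaded in LM Studio
    messages=[{"role": "user", "content": "Summarize the STORM approach in two sentences."}],
)
print(response.choices[0].message.content)
```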
Would you happen to know if it handles SerpAPI as opposed to You.com?
Wow! How much RAM would this take, please?
Amazing! what a time to be alive
Hey, why would I need it locally? I just tried it and it works as is. Do I get better results when it's local?
Hi Matthew, how was your visit to Qualcomm? Any video on that?
Thanks for this video, looking forward to using ollama and local models.
is there a tool like this but for creating code documentation for your applications?
Yes please do the local tutorial!
Yea, local. Thanks for all the great work you do. All the best.
hey! what about VectorVein ?
It would be extremely useful for literature review when writing an article. I can't wait to see how AI is going to change the way journal papers are published.
It's not good; try it on their website. It looks good until you read the content of the references and find it's unrelated info that it tries to claim as relevant (well, the references that actually exist and aren't 404 links to imagined ones).
FULL LOCAL, including SCRAPE pretty please
I would like to see a full tutorial for example with groq
Please upload a local LLM set-up with Ollama support.
I think it needs to output in paragraph form
If they add SearXNG support, you can even self-host the web search.
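If the retriever can be swapped out, a self-hosted SearXNG instance exposes a simple search endpoint you could wrap. A sketch under two assumptions: the instance has JSON output enabled in its settings, and the URL below is a placeholder for your own deployment; it says nothing about the result format STORM itself expects.

```python
# Sketch: query a self-hosted SearXNG instance for web results.
# Assumes JSON output is enabled in the instance's settings; URL is a placeholder.
import requests

SEARXNG_URL = "http://localhost:8080/search"  # placeholder for your own instance

def searxng_search(query: str, max_results: int = 10) -> list[dict]:
    response = requests.get(
        SEARXNG_URL,
        params={"q": query, "format": "json"},
        timeout=30,
    )
    response.raise_for_status()
    results = response.json().get("results", [])[:max_results]
    # Keep only the fields a retriever typically needs.
    return [
        {"title": r.get("title"), "url": r.get("url"), "snippet": r.get("content")}
        for r in results
    ]

if __name__ == "__main__":
    for hit in searxng_search("lunare wrasse"):
        print(hit["title"], "-", hit["url"])
```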
I tried the demo, and even though I asked it to do the research with an Italian term, it replied to me in English. The contents were excellent, but is there a way to keep the original language?
Extremely helpful! Thank you! 😎🤖
Please create more videos about STORM. Thank you
You did not say how much it would cost on OpenAI.
Thanks for the video! Let's try to run it with LM Studio)
ADD THE PUBLIC TEST LINK TO THE DESCRIPTION
Fix your audio please, it's been very quiet in the last few videos. When I put it in DaVinci and set it to the standard level, I have to add +10 or +12 dB to make it sound as loud as your other videos.
Imagine trusting an LLM, especially GPT-4o, and comparing it with Wikipedia; that's mental. You guys trust LLMs too much.
Nah we are good. Just remember: “LLMs can make mistakes, double check important info.”
Tried the demo site; it never completed my request, even after waiting all night.
Worked today, but the article was full of inaccuracies when I read the referenced articles, which are often unrelated to what it's paraphrasing; if he checked some of the article links, they might actually be about cats, not dogs. It also imagines links which go to 404s.
In for the full tutorial
Hi sir, can you please pin this message and create a video on how to use micro-agent, gptengineer, and agency swarm with open-source LLMs like Groq, Gemini, and Ollama.
Please, make a full tutorial. It'll be interesting.
please set it up with ollama server too
nice. I would run this in a heartbeat if it works with ollama
pls make a deep local tutorial
Absolutely..Local!!
Please, full local tutorial!
FULL LOCAL NOW
AIpedia is cool. )))
please create full local tutorial.
Do I need a paid ChatGPT account?
I created a blog from your video using my AI agent platform in minutes...
A comment for Ollama tutorial!😄
I feel like you should've shown it making an article that doesn't already exist.
Local tutorial yes
The ironic thing is that installing any AI project feels like something from the '80s.
I know, right? It's pathetic
Not impressed. Of course, I only used the online demo and I have very high standards (30+ years as a magazine writer and editor), but it failed to follow the details of my prompt and went its own way on the general topic. I'll install it locally (please do that tutorial!) and keep testing, but the output I got wouldn't pass a high school English class. Maybe good enough for the average blog, but not much more. Eager to see how it develops.
Where's the Ewok you promised?
I'll be honest, as much as I like knowing about these new technologies, I really couldn't care less about using them unless they're open source and/or run 100% locally on my machine, and ideally the tech should have a GUI. The only thing stopping me from using CrewAI is just that: there's no easy-to-use GUI. I'm an average guy, I don't want to have to learn command-line interfaces for a dozen new AI apps lol
Tbh they already have a UI; it's not like you have to do it all in a terminal.
When it's consumer-ready, you will have your GUI.
Off topic: Claude has an Android app now.
Instead of running STORM locally, you can try it out on the website, storm.genie.stanford.edu/
anyone have problems getting this to work based on his instructions?
So scientific review articles now have no value, right?
No, they still do, because this can be wrong and there's no liability for anyone. But if you publish something that's wrong, it can have bad consequences for your life, depending on what it is and how important it was.
what for?