This Chrome Extension Surprised Me
- Published May 12, 2024
- I didn't think a Chrome-extension-based Ollama front end could be this good. It does the basics really, really well.
github.com/n4ze3m/page-assist
You can find the code for every video I make at github.com/technovangelist/vi.... Then find the folder whose name starts with the date this video was published and has a title that matches what the video covers.
Be sure to sign up for my monthly newsletter at technovangelist.substack.com/... - Science & Technology
Hey, I'm the creator of Page Assist. This is a really amazing video. Thanks for the feedback; I'll improve it :).
Also, regarding the side panel, there are issues with the Chrome API, which is why the side panel is a little buggy compared to the web UI.
As for the internet search, it's just a fun feature added; I'll improve it in the upcoming releases :)
Thanks. We briefly exchanged a few messages on your discord and I look forward to seeing the improvements.
Thanks for creating it. It has been my go-to Ollama client for a while. My favourite Page Assist feature is that when you go to the Ollama website and navigate to any model, it adds a button to pull the model directly from the website.
WOW You really surprised us! Keep up the fantastic work! Your dedication and innovative features are truly appreciated!
Thank you so very much for your work. Sorry, I can't use it.
Pretty please, support Firefox. I'm done with Chrome.
@@BillyBobDingledorf I just installed it in Firefox; there is a link on the GitHub page.
I recently discovered this channel and I have to admit it is one of the most interesting ones about LLMs. The way you explain everything and give unbiased reviews makes this some of the best content you can find on YouTube in this niche!
Thanks for your hard work; keep going.
Wow, thank you!
Nice find Matt, please keep sharing these.
Thanks, will do!
I literally installed this earlier today. I was looking for a copilot substitute that I could use for note taking around my hobbies. Let’s hope it keeps taking steps forward.
Another great video! Glad you saw my recommendation on this one, and appreciated the simple installation and functionality as I do. Not everyone is interested in messing around with Docker, so it's a nice option. There are a few others I've seen that look interesting, but the installation process doesn't make them appealing to test. It's funny, I also passed on the side panel once I saw it didn't work with Arc, but thanks for showing me I'm not missing much there!
Thanks, a few mentioned this. I look forward to seeing how the author improves on it
A lot of excellent ideas in this; it intelligently takes advantage of being a browser extension.
Most of the shortcomings you've shown look like bugs or work in progress, it's certainly promising.
Yup, and the author is working on fixing them.
It was released 2 weeks ago and you can see how much it has already improved!
I am right and lucky to have subscribed to your channel. Thanks for the innovative Chrome extension. Your work impressed me with its unique features and functionality. I appreciate your dedication to providing users like me with a seamless browsing experience.
Thank you, Matt. I wish there were a TTS option and RAG functionality for Msty. I really like it; for me, it's the easiest option for inference with a local LLM.
RAG apparently is coming very soon. I am on a Mac, and Superwhisper is an amazing speech-to-text solution, but I haven't found a great tool for the other direction. Things that are platform agnostic are probably going to be a least-common-denominator offering. Many link to the system method, which on the Mac sounds like you're back in 2000.
@@technovangelist I discovered OUI and LobeChat, but they're not as straightforward as the plugin you mentioned in the video. Regarding the browser extension, I'm impressed with the inference speed of Yi 1.5, which is quite fast on my 4GB VRAM card, achieving about 40 tokens per second, which isn't too bad. So it seems we'll continue to wait for language-learning TTS.
I look forward to Firefox (or Safari) getting this! Thank you for the video, Matt.
How does data privacy factor into extensions like these? Part of the benefit of running LLMs locally is to segregate your data and prevent third parties from using or accessing your data, prompts, results (e.g., if you're using it to create proprietary code). I'm curious if all of these extensions are simply too risky to use for proprietary information or if they're as "local" as a locally downloaded model is and don't have the ability to send data externally?
I haven't looked at the code, but any app carries the risk of doing things you don't want. I am fairly confident this app is not sharing anything, but you have to make your own decision. The safest way is to always use the built-in UI from Ollama on the command line.
Thank you for the good review :)
Thank you so much for sharing!
I tested on my M1 with 8GB using Microsoft Edge, and it works great; this is the simplest way to add a web UI :)
Is there any UI that allows Ollama models to interact with the internet?
This one does web searching but it’s not that great yet
This is freakishly impressive. Wow.
Yeah, it’s really cool
thanks!
How do you treat to openai new model?
I don’t understand the question
@@technovangelist I mean the new model for GPT-4o
What does treat mean here. I still don’t know what you are asking
This is the most newbie friendly UI, installation? chrome plugin! 🤯
I'm rapidly getting to the point where I'm going to need to set up a host specifically to handle running and serving from Ollama. A Chrome extension that's presumably going to be available over 7 or more computers that I'm using web browsers on seems like a really good incentive to get that done. And of course the bad part is that none of my servers have GPUs in them, so either I pull one out of a gaming or streaming system, or spend money I don't really have available. Oh well. Or I take the system I'm streaming with now and turn it into my Ollama host, after moving the stream rendering over to a system that's been sitting waiting for me for way too long... :-) Or I run it on my gaming platform or the streaming server, either way, and just not do a lot of research while I'm gaming and streaming. :-) That would work...
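For a dedicated-host setup like the one described above, Ollama can be made reachable from other machines by binding to all interfaces. A minimal sketch, assuming Ollama's documented `OLLAMA_HOST` variable and default port 11434 (the client IP shown is a hypothetical example):

```shell
# On the dedicated host: listen on all interfaces, not just localhost
OLLAMA_HOST=0.0.0.0:11434 ollama serve

# Then, in Page Assist (or any other client) on another machine,
# point the Ollama URL at the host, e.g. http://192.168.1.50:11434
```

Note that exposing the port this way makes the server reachable by anyone on the local network, so a firewall rule restricting access may be worth adding.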
For me, open-webui is still the best (it supports YouTube video summarization), but a browser extension can also be useful.
It does? Like bulk videos or just one at a time?
I wish we could use extensions on an iPad!
Matt Williams Channel just got EXP! 🆙
💫 +100 EXP X 9 🎬 Live Streams
💎🆙️ Level Up! ‼️‼️
💫 +100 EXP
💫 +100 EXP
💫 +100 EXP
💫 +100 EXP
LVL 3 (400/1000) EXP
Matt Williams Unlocked Titles:
👐 The Caring One Lv. 9 (Cares for all types of audiences, taking care of newbies)
🧑🏫 The Guru Lv. 9 (Makes complex topics easy to understand)
🧘 The Zen Lv. 1 (Consistently on the Ollama topic)
Please make a video on AnythingLLM
Coming this week
@@technovangelist thank you so much! Great content! And really opened a world for me with ollama and local LLMs🙏
It's hard to please everyone :) For instance, I agree that I don't like the idea that this extension also brings Ollama with it. Why on earth, if I already have it? But on the contrary, I kind of understand the logic: it was targeted at people who don't have it and want a one-click-install thing.
Not sure I understand. Msty included ollama but this does not.
@@technovangelist Ah, I see. Sorry, I misheard you; I thought this one comes with Ollama.
Wow, I very nicely asked for Firefox support and youtube instantly deleted my comment.
I see they've added Firefox support now.
Btw: This is my fourth attempt to thank the author and the prior three were automatically deleted by an unnamed super power.
I haven’t seen anything like that come across. YouTube wouldn’t delete a comment
None of them are being deleted as far as I can see
@@BillyBobDingledorf Hey, Page Assist is supported on Firefox.
@@technovangelist I don't think you're doing it. It's happening too quickly for a human to be doing it.
I also like Danswer, but it's a bit heavy for self-hosting unless you have 32GB+ RAM or a capable graphics card. But Danswer and AnythingLLM are my current favorites for ease of use and features without too much initial setup.
I have heard a few suggest Danswer.
Google stole the AI contest idea and posted a 1,000,000 prize for the winner. Your advice to tell as many people about my idea as possible was the worst advice. Within less than two weeks, they go live today.
They most certainly didn’t get the idea from you. A project like that takes a month to organize. My advice was to get as much feedback on an idea to develop a software project before you start to build it. That’s great advice many folks will give you. If the idea was just a contest, just do it. Of course you also had the crazy suggestion that I somehow locked you out of your machines.
All these free versions are total crap. Use paid products, if only because the developers have an incentive to maintain and develop them!
I have used some of the paid products too. They are often worse.
Crap for you, but for those of us who know how to use these things, it's a godsend.
I'm pretty sure you used a q4 version; that's why it generates crap, lol.
thanks!