Using Ollama to Run Local LLMs on the Steam Deck
- Published on 7 Jul 2024
- The Steam Deck is a fully-fledged PC, which means it's possible to run Ollama on it too. Here I install it, run a few models and compare the speeds to those on the Raspberry Pi 5 (a rough way to measure this yourself is sketched after the links below).
00:00 Intro
00:31 Installation
04:38 Model Runs
10:04 Conclusion
ollama.com
store.steampowered.com/steamdeck
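If you want to reproduce the speed comparison, here's a minimal sketch that times generation through Ollama's local REST API. It assumes Ollama is already installed and serving on its default port (11434); "llama3" is just an example model name, so swap in whatever you've pulled.

```python
# Rough tokens-per-second check against a local Ollama server.
# Assumes Ollama is running on its default port (11434) and that
# the "llama3" model has already been pulled -- both are assumptions.
import json
import urllib.request

payload = json.dumps({
    "model": "llama3",
    "prompt": "Why is the sky blue?",
    "stream": False,
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

# eval_count and eval_duration come back in the API response;
# eval_duration is in nanoseconds, hence the 1e9 factor.
tps = result["eval_count"] / result["eval_duration"] * 1e9
print(f"{result['model']}: {tps:.1f} tokens/sec")
```

The eval_count and eval_duration fields in the response make a tokens-per-second figure easy to derive, which is handy when comparing devices like the Deck and the Pi 5.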
Support My Work:
Check out my website: www.ianwootten.co.uk
Follow me on twitter: twitter.com/iwootten
Subscribe to my newsletter: newsletter.ianwootten.co.uk
Buy me a cuppa: ko-fi.com/iwootten
Learn how devs make money from Side Projects: niftydigits.gumroad.com/l/sid...
Gear:
RPi 5 from Pimoroni on Amazon: amzn.to/4aoalOd
As an affiliate I earn from qualifying purchases at no extra cost to you.
2:54 love the honesty of hitting the bug and keeping it in the video ❤
Thanks Dave. Thought it was important to show that if you do go ahead and install stuff, it's going to potentially get wiped out with SteamOS updates.
Cool
I just read your article on running the brew package manager on the Deck!
Does it run on the CPU or GPU? Thx
Hi there, I mentioned it toward the end, but yeah, it's running on the CPU.
@IanWootten I've also experimented with it and unfortunately wasn't able to get it running on the GPU. If you're successful, I'd be very interested in the results.
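For anyone else checking this: recent Ollama builds expose a /api/ps endpoint that reports how much of a loaded model sits in VRAM, which is a quick way to confirm whether you're on the CPU or the GPU. A sketch, assuming a build that includes the endpoint and the default port:

```python
# Ask a local Ollama server where its loaded models live.
# Assumes a recent Ollama build that exposes GET /api/ps.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:11434/api/ps") as resp:
    status = json.load(resp)

for model in status.get("models", []):
    # size_vram is the portion of the model held in GPU memory;
    # zero means the model is running entirely on the CPU.
    vram = model.get("size_vram", 0)
    where = "GPU" if vram > 0 else "CPU"
    print(f"{model['name']}: {model['size']} bytes loaded, on {where}")
```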