Requiring 48GB of VRAM makes this impractical for most local use cases at the moment, but it is the best quality I've seen from a local video generator, so that is encouraging. Still nowhere close to the big models, but I imagine their VRAM requirements are even more extreme.
Good insights, though these days you can always rent a GPU. I really don't see any reason for individuals to buy GPUs unless you are training models in your garage. Thanks.
Can you please do a video about a model like Pixtral, or a quantized version of Pixtral that works on low VRAM? Thanks for all the free tutorials 😊
Sure, noted.
Fahad, please talk louder and have some zeal in your voice... Great videos, thanks!
Noted
Huge difference in quality between what you generated and what they have on their website.
Yep, I mentioned that.
Good job on your channel, especially for covering some rare hidden gems. But at some point, I believe you'll be more selective about which models you cover, as some are not really worth covering.
Exactly
cheers
Not for Windows?
I don't think so.
How do I install this on Windows 10?
I don't think that would work.
An H100? OMG, that's a crazy amount of money...
Not if you are just renting the GPU.