Install Reflection-70B with Ollama Locally and Test Reflection-Tuning
- Published 16 Sep 2024
- This video shows how to install Reflection Llama-3.1 70B locally, test it on various benchmarks, and understand how reflection tuning works.
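A minimal sketch of the local install flow with Ollama. The model tag below is an assumption; check the Ollama library for the exact name and quantization before pulling (the 70B download is tens of gigabytes):

```shell
# Sketch only: the tag `reflection:70b` is an assumption; verify it first.
ollama pull reflection:70b          # fetch the quantized weights
ollama run reflection:70b \
  "How many times does the letter 'r' appear in 'strawberry'?"
```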
🔥 Buy Me a Coffee to support the channel: ko-fi.com/fahd...
🔥 Get a 50% discount on any A6000 or A5000 GPU rental; use the following link and coupon:
bit.ly/fahd-mirza
Coupon code: FahdMirza
▶ Become a Patron 🔥 - / fahdmirza
#reflection70b #reflectiontuning
PLEASE FOLLOW ME:
▶ LinkedIn: / fahdmirza
▶ TH-cam: / @fahdmirza
▶ Blog: www.fahdmirza.com
RELATED VIDEOS:
▶ Resource huggingface.co...
All rights reserved © 2021 Fahd Mirza
They gave this specific example in the demo and it worked :-))
Reflection succeeded in its goal: millions in funding for a dataset that was trained on benchmarks.
And even the benchmark results are not reproducible.
Oh, and it's not even an FFT (full fine-tune), it's a LoRA.
I turned on all notifications!
cheers
Love when you give your opinions
cheers
Nice one - thanks for the great channel :) BTW, I was playing with Gemma2:27B while listening to this video, so I tried the "10 sentences ending with the word 'beauty'" test. It succeeded. It failed the "how many 'r's in 'strawberry'" test, but was able to count letters correctly if I first asked it to list each letter of the word line by line, as a numbered list. Anyway, just fun :)
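The commenter's workaround can be reproduced deterministically in plain shell: list each letter on a numbered line, then count (a sketch of the check itself, not of any model's output):

```shell
word=strawberry
# list each letter on its own numbered line, as the commenter prompted
echo "$word" | fold -w1 | nl
# count the r's: one match per line, then count lines
echo "$word" | grep -o r | wc -l   # → 3
```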
Hi Fahd, well said about the hype :)
thanks.
I smell progress.
are you using proxmox for virtualization ?
It won't create with a Modelfile... why?
It works with llama-cpp-python, but not with an Ollama Modelfile.
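For anyone hitting the Modelfile issue, here is a minimal sketch of an Ollama Modelfile for a local GGUF build. The GGUF filename is a placeholder, and the SYSTEM text follows the prompt that circulated with the Reflection-70B model card (verify both against your own files):

```shell
# Sketch: write a minimal Modelfile pointing at a local GGUF.
# The filename is a placeholder; the system prompt should match the model card.
cat > Modelfile <<'EOF'
FROM ./reflection-70b.Q4_K_M.gguf
SYSTEM """You are a world-class AI system, capable of complex reasoning and reflection. Reason through the query inside <thinking> tags, and then provide your final response inside <output> tags. If you detect that you made a mistake in your reasoning at any point, correct yourself inside <reflection> tags."""
EOF
# then: ollama create reflection-local -f Modelfile && ollama run reflection-local
```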
why is mine running so slow?
A simple prompt fails:
"Write a script that implements the “tree -L 2” functionality, but in bash without using tree. make a one-line script, without arguments, for the current directory."
ANY other LLM can do this more or less correctly, except Reflection (70b-q8_0 tested). Reflection's code just does something useless.
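For comparison, one conventional bash one-liner that approximates `tree -L 2` for the current directory (a sketch: real `tree` draws branch characters, which this version replaces with plain depth indentation):

```shell
# Indent each entry by its depth, roughly mimicking `tree -L 2`.
find . -maxdepth 2 -print | sort | sed -e 's|[^/]*/|  |g'
```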
thanks for sharing.
Stopped watching 5 minutes in, but I think it would help if you actually used the system prompt that the model card says you should use.