How to Fine-tune Mixtral 8x7B MoE on Your Own Dataset

  • Published Sep 15, 2024
  • In this video, we show you how to fine-tune Mixtral, Mistral's 8x7B MoE (Mixture of Experts) model, on your own dataset.
    You'll be directed to another video where we fine-tune Mistral 7B (the standard Mistral model) on your own dataset; follow that walkthrough, but use the Mixtral notebook here (a rough code sketch of the setup also follows after these links):
    github.com/bre...
    My explanation on how QLoRA works: brev.dev/blog/...
    Fine-tune Mixtral MoE on a HuggingFace Dataset: • Fine-Tune Mixtral 8x7B...
    More AI/ML notebooks: github.com/bre...
    Join the Discord: / discord
    Connect with me on 𝕏: x.com/HarperSC...
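
To make the QLoRA recipe from the video concrete, here is a minimal sketch of the setup, assuming the Hugging Face transformers, peft, and bitsandbytes libraries. The LoRA rank, target modules, and other hyperparameters shown are illustrative assumptions, not the exact values from the notebook linked above.

    # Minimal QLoRA-style setup sketch for Mixtral 8x7B (hyperparameters are assumptions)
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
    from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

    model_id = "mistralai/Mixtral-8x7B-v0.1"

    # 4-bit NF4 quantization keeps the frozen base weights small (the "Q" in QLoRA)
    bnb_config = BitsAndBytesConfig(
        load_in_4bit=True,
        bnb_4bit_quant_type="nf4",
        bnb_4bit_use_double_quant=True,
        bnb_4bit_compute_dtype=torch.bfloat16,
    )

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, quantization_config=bnb_config, device_map="auto"
    )
    model = prepare_model_for_kbit_training(model)

    # Attach low-rank adapters to the attention projections; rank and
    # target modules here are illustrative, not the notebook's exact choice
    lora_config = LoraConfig(
        r=16,
        lora_alpha=32,
        lora_dropout=0.05,
        target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
        task_type="CAUSAL_LM",
    )
    model = get_peft_model(model, lora_config)
    model.print_trainable_parameters()

From here, the adapter-wrapped model can be trained with a standard Trainer or SFT loop over your tokenized dataset; only the small LoRA adapter weights are updated while the 4-bit base model stays frozen.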
