(AI Tinkerers Ottawa) Fine tuning Whisper with PEFT LORA w/ Rishab Bahal

  • Published Jan 27, 2025

Comments • 7

  • @SureshBikmal · 13 days ago

    Good presentation. Will you be able to post your program?

  • @ishansharma6198 · 1 month ago +1

    At 15:18, the author says that full fine-tuning is a subset of the LoRA technique, when in reality, I believe, it should be the opposite: in full fine-tuning, all of the model's parameters are trained, whereas LoRA trains only a subset.

    • @rishabbahal · 1 month ago +2

      In full fine-tuning, all parameters are trained. In LoRA we train a subset of the parameters, whose size is set by the rank parameter r: as r increases, the number of trainable parameters also increases. But when not doing transfer learning, we end up choosing r so high that hypothetically we are training 100% of the parameters, which is equivalent to full fine-tuning. That's why LoRA only works well for transfer learning; otherwise you will end up with an r that makes you train all the parameters (full fine-tuning).
      Hence full fine-tuning is a special case of LoRA (see the code sketch after this thread).
      I hope I was able to explain.

    • @ishansharma6198 · 18 days ago +1

      @rishabbahal Hi Rishab,
      Thanks for your response. I think I understand now: in LoRA, the number of parameters being trained is directly proportional to the value of r.
      Hypothetically, if the value of r is high enough, 100% of the parameters become trainable, which is just full fine-tuning.
      So, all that said, full fine-tuning becomes a special case of LoRA where the value of r is very high.
      I believe I understand now. Thanks for your insights!
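
A minimal sketch of the r-versus-trainable-parameters point discussed in the thread above, using the Hugging Face Transformers + PEFT stack named in the talk title. The checkpoint, target modules, and r values below are illustrative assumptions, not taken from the presentation:

```python
# Sketch: how LoRA's rank r drives the trainable-parameter count on Whisper.
# Assumes the standard Hugging Face stack: pip install transformers peft
from transformers import WhisperForConditionalGeneration
from peft import LoraConfig, get_peft_model

for r in (8, 64, 512):  # illustrative ranks, not values from the talk
    # Reload the base model each iteration; get_peft_model wraps it in place.
    base = WhisperForConditionalGeneration.from_pretrained("openai/whisper-small")
    config = LoraConfig(
        r=r,                                  # LoRA rank: adapter size grows linearly with r
        lora_alpha=2 * r,                     # common convention: scale alpha with r
        target_modules=["q_proj", "v_proj"],  # attention projections, as in the PEFT Whisper examples
        lora_dropout=0.05,
        bias="none",
    )
    peft_model = get_peft_model(base, config)
    print(f"r={r}:")
    # Prints trainable vs. total parameters; the trainable share rises with r,
    # approaching full fine-tuning as r nears the full rank of the weights.
    peft_model.print_trainable_parameters()
```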

  • @Mohankumardash · 3 months ago +2

    Great presentation! Can we have the notebook for this application?

  • @AI_DOL_ · 2 months ago

    Source code, please.

  • @thomashuynh6263 · 3 months ago +1

    Do you have a demo notebook or the source code? Thanks.