How to Add Attention to LSTM Network! : PyTorch Deep Learning Tutorial Section 13

  • Published 19 Jun 2024
  • TIMESTAMPS:
    0:00 - Introduction
    0:25 - Previous video overview: Attention Mechanism
    1:43 - LSTM's memory buffer limitations
    3:02 - Incorporating attention with LSTM
    4:56 - Diagram: Storing LSTM outputs for attention
    6:45 - Architecture overview: Multi-headed attention
    8:23 - Training loop adjustments
    10:12 - Text generation examples
    12:58 - Using attention alone in future
    14:37 - Conclusion
    In this video I show how we can add the Attention Mechanism to our LSTM text generator to improve its performance!
    Donations
    www.buymeacoffee.com/lukeditria
    The corresponding code is available here!
    github.com/LukeDitria/pytorch...
    Discord Server:
    / discord
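    The full code is in the linked repository; as a rough sketch of the idea covered in the video (storing all per-step LSTM outputs and attending over them with multi-head attention, rather than relying only on the LSTM's limited memory buffer), a minimal PyTorch module might look like the following. The class name, layer sizes, and head count here are illustrative assumptions, not the video's exact code.

    ```python
    import torch
    import torch.nn as nn

    class LSTMWithAttention(nn.Module):
        """Hypothetical sketch: keep every LSTM step output and re-weight
        them with multi-head self-attention before predicting tokens."""
        def __init__(self, vocab_size, embed_dim=128, hidden_dim=256, num_heads=4):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
            self.fc = nn.Linear(hidden_dim, vocab_size)

        def forward(self, tokens):
            x = self.embed(tokens)         # (batch, seq, embed_dim)
            lstm_out, _ = self.lstm(x)     # (batch, seq, hidden) - all step outputs stored
            # Attend over the stored LSTM outputs instead of depending
            # solely on the final hidden state
            attn_out, _ = self.attn(lstm_out, lstm_out, lstm_out)
            return self.fc(attn_out)       # (batch, seq, vocab) logits per step

    model = LSTMWithAttention(vocab_size=100)
    logits = model(torch.randint(0, 100, (2, 16)))
    print(logits.shape)  # torch.Size([2, 16, 100])
    ```

    For autoregressive text generation you would additionally pass a causal `attn_mask` so each position only attends to earlier LSTM outputs.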
  • Science & Technology
