Thank you for this clear explanation! I found it really helpful. Could you please make a video covering multi-head attention? I'd greatly appreciate it! 🙏
Hi, you're doing great work, please keep uploading more videos. By the way, I have a small doubt: at 11:36, for multi-head attention, why do you pass (x, x) instead of (x) or (x, x, x), since three lines go into the multi-head attention block as input?
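For anyone with the same doubt, a likely explanation (assuming the video uses `tf.keras.layers.MultiHeadAttention`): that layer's call signature is `(query, value, key=None)`, and when `key` is omitted it defaults to `value`, so `layer(x, x)` is ordinary self-attention where query, key, and value are all `x`. A minimal pure-Python sketch of that same convention (illustrative only, not the Keras implementation):

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(query, value, key=None):
    # Mirrors the Keras convention: if key is omitted, it defaults to value,
    # so attention(x, x) is self-attention over x.
    if key is None:
        key = value
    d = len(query[0])  # feature dimension, used for scaling
    out = []
    for q in query:
        # scaled dot-product scores of this query against every key
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in key]
        w = softmax(scores)
        # weighted sum of value rows
        out.append([sum(wi * v[j] for wi, v in zip(w, value))
                    for j in range(len(value[0]))])
    return out

x = [[1.0, 0.0], [0.0, 1.0]]
# Passing (x, x) gives the same result as passing (x, x, x):
assert attention(x, x) == attention(x, x, key=x)
```

So the two-argument call in the video is just the three-argument self-attention call with the key left to its default.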
Thank you, brother, well explained.
Thank you very much. Hope to see many more videos.
Thanks for this amazing work! Waiting for more work on Transformers!
Thank you for your great work!
Great explanation! Please make a video on multi-head attention.
Great work! Most of your projects are implemented in TensorFlow; if possible, could you upload PyTorch implementations as well?
you are the best
Is it a very big model? When I run it in Kaggle it gives a "resource exhausted" error. I used a P100 GPU, and Kaggle provides 29 GB of RAM.
Thank you so much!
How can we adjust the image size without getting errors?
Thank you!
You need a patch extractor, I think, for an input image to be fed to the model?
yeah
@@arhamfarooq2267 I guess we can make one, Arham.
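Yes, a ViT-style model needs the image cut into fixed-size patches first, and the image height and width must be divisible by the patch size, which is also why resizing to arbitrary sizes can throw errors. A minimal plain-Python sketch of a patch extractor (assuming non-overlapping square patches; real TensorFlow code would typically use `tf.image.extract_patches` or tensor reshapes instead):

```python
def extract_patches(image, patch_size):
    """Split an H x W image (given as a list of rows) into non-overlapping
    patch_size x patch_size patches, each flattened to one vector."""
    h, w = len(image), len(image[0])
    # ViT-style constraint: dimensions must divide evenly into patches
    assert h % patch_size == 0 and w % patch_size == 0, \
        "image dimensions must be divisible by the patch size"
    patches = []
    for i in range(0, h, patch_size):          # top-left row of each patch
        for j in range(0, w, patch_size):      # top-left column of each patch
            patch = [image[i + di][j + dj]
                     for di in range(patch_size)
                     for dj in range(patch_size)]
            patches.append(patch)
    return patches

img = [[r * 4 + c for c in range(4)] for r in range(4)]  # 4x4 toy image
patches = extract_patches(img, 2)
assert len(patches) == 4          # (4/2) * (4/2) = 4 patches
assert patches[0] == [0, 1, 4, 5]  # top-left 2x2 block, flattened
```

The assertion inside the function is also the answer to the earlier image-size question: resize the input so both dimensions are multiples of the patch size.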
@@arhamfarooq2267 Could you tell me how to use this with a custom dataset and walk through the full training process?
@@SethmiyaAbeyrathna Yeah, could you share your email ID and LinkedIn?