Great channel! These are some of the *best* videos on the topic! Subbed!!!
Glad that you are enjoying the content. Welcome to the community:)
Good explanation 👍
Thanks :) Glad you liked it, and stay tuned for more interesting topics 👍
Love you mam ❤
Hi Manisha, great content. Hope we can chat over email, as we are searching for an AI researcher who can build models that can be applied to financial markets. Thanks and warm regards, Lim
Hi Lim, thanks for reaching out. Your interest is appreciated.
I have a project codebase and want to build an AI that indexes my project files and analyses the code in a local environment, and can then give responses based on prompts. Can you make a video on that?
Sure, I will make a video on that. Thanks for the suggestion. Keep watching:)
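In case it helps before the video is out, here is a rough sketch of one way the indexing and retrieval part could look, assuming Python with scikit-learn installed. The project path, the file extensions, and the idea of passing the retrieved snippets to a locally hosted LLM afterwards are all assumptions on my part, not a definitive setup:

```python
# Rough sketch: index local source files with TF-IDF and retrieve the files
# most relevant to a prompt. A full pipeline would feed these snippets to a
# locally hosted LLM; only indexing and retrieval are shown here.
from pathlib import Path
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def build_index(project_dir, extensions=(".py", ".js", ".java", ".cpp")):
    """Read all source files under project_dir and build a TF-IDF index."""
    paths, docs = [], []
    for path in Path(project_dir).rglob("*"):
        if path.suffix in extensions:
            paths.append(path)
            docs.append(path.read_text(errors="ignore"))
    vectorizer = TfidfVectorizer(token_pattern=r"[A-Za-z_][A-Za-z0-9_]+")
    matrix = vectorizer.fit_transform(docs)
    return paths, vectorizer, matrix

def query(prompt, paths, vectorizer, matrix, top_k=3):
    """Return the files most similar to a natural-language prompt."""
    scores = cosine_similarity(vectorizer.transform([prompt]), matrix)[0]
    ranked = scores.argsort()[::-1][:top_k]
    return [(paths[i], scores[i]) for i in ranked]

paths, vec, mat = build_index("./my_project")   # placeholder path
for path, score in query("where is the database connection configured?", paths, vec, mat):
    print(f"{score:.3f}  {path}")
```

TF-IDF is just the simplest retrieval choice here; embedding-based search over code chunks would slot into the same structure.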
How can I implement the KAN model in place of the MLP in a CNN? If it's possible, please answer. Thank you.
That's an interesting question. I think replacing the MLP with a KAN in a CNN would need significant modifications to adapt to KAN's unique function approximation capabilities: a KAN learns activation functions on its edges rather than combining fixed activations with linear weights. As a KAN is fundamentally different from an MLP, I'm not sure it can be implemented in exactly the same way.
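To make the idea concrete, here is a minimal sketch (assuming PyTorch) of swapping the CNN's fully connected MLP head for KAN-style layers. The `KANLayer` below is a simplified stand-in that parameterises each edge's learnable function with fixed Gaussian radial basis functions plus a SiLU base term, rather than the B-splines used in the original KAN paper, so treat it as an illustration rather than the reference implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class KANLayer(nn.Module):
    """Simplified Kolmogorov-Arnold layer: every input-output edge learns its own
    1-D function, parameterised as a weighted sum of fixed Gaussian basis functions."""
    def __init__(self, in_features, out_features, num_basis=8, grid_range=(-2.0, 2.0)):
        super().__init__()
        centers = torch.linspace(grid_range[0], grid_range[1], num_basis)
        self.register_buffer("centers", centers)              # fixed basis centres
        self.gamma = (num_basis / (grid_range[1] - grid_range[0])) ** 2
        # One coefficient per (output, input, basis function) triple.
        self.coeff = nn.Parameter(0.1 * torch.randn(out_features, in_features, num_basis))
        self.base_weight = nn.Parameter(0.1 * torch.randn(out_features, in_features))

    def forward(self, x):                                     # x: (batch, in_features)
        rbf = torch.exp(-self.gamma * (x.unsqueeze(-1) - self.centers) ** 2)   # (B, in, K)
        spline_out = torch.einsum("bik,oik->bo", rbf, self.coeff)  # learned edge functions
        base_out = F.silu(x) @ self.base_weight.t()                # residual base term
        return spline_out + base_out

class ConvKAN(nn.Module):
    """Small CNN whose usual MLP classifier head is replaced by KAN-style layers."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            KANLayer(32 * 7 * 7, 64),
            KANLayer(64, num_classes),
        )

    def forward(self, x):
        return self.head(self.features(x))

# Quick shape check on an MNIST-sized input.
model = ConvKAN()
print(model(torch.randn(4, 1, 28, 28)).shape)   # torch.Size([4, 10])
```

The convolutional feature extractor is left untouched; only the fully connected classifier is swapped out, which is the most direct way to try a KAN inside a CNN.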
Given the propensity for gradient vanishing in deep recurrent neural architectures, how would one hypothetically mitigate catastrophic forgetting in a sequence-to-sequence model while maintaining stochastic gradient descent convergence and ensuring orthogonality in the weight matrices of a bidirectional LSTM with layer normalization?
Hi, I think that to address catastrophic forgetting in a bidirectional LSTM with layer normalization, while ensuring SGD convergence and maintaining weight matrix orthogonality, you could implement Elastic Weight Consolidation for stability, use orthogonal regularization, adjust learning rates, and apply gradient clipping. I hope this paper is helpful: Kirkpatrick, J., Pascanu, R., Rabinowitz, N., Veness, J., Desjardins, G., Rusu, A. A., ... & Hadsell, R. (2017). Overcoming catastrophic forgetting in neural networks. Proceedings of the National Academy of Sciences, 114(13), 3521-3526.
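For anyone who wants to see how those pieces fit together, here is a minimal sketch assuming PyTorch. The `BiLSTMSeq2SeqEncoder`, the data loaders, and the hyperparameters (`ewc_lambda`, `ortho_lambda`, `clip_norm`) are illustrative placeholders rather than a tuned recipe, and the Fisher information is approximated simply by squared gradients on the old task:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiLSTMSeq2SeqEncoder(nn.Module):
    """Toy stand-in for the encoder of a seq2seq model: BiLSTM + layer norm."""
    def __init__(self, vocab_size=1000, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, bidirectional=True, batch_first=True)
        self.norm = nn.LayerNorm(2 * hidden_dim)
        self.out = nn.Linear(2 * hidden_dim, vocab_size)

    def forward(self, x):
        h, _ = self.lstm(self.embed(x))
        return self.out(self.norm(h))

def estimate_fisher(model, old_task_loader, n_batches=10):
    """Diagonal Fisher approximation: average squared gradients on the old task."""
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    model.eval()
    for i, (x, y) in enumerate(old_task_loader):
        if i >= n_batches:
            break
        model.zero_grad()
        logits = model(x)
        F.cross_entropy(logits.view(-1, logits.size(-1)), y.view(-1)).backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2 / n_batches
    return fisher

def ewc_penalty(model, fisher, old_params):
    """EWC: quadratic penalty anchoring parameters important for the old task."""
    return sum((fisher[n] * (p - old_params[n]) ** 2).sum()
               for n, p in model.named_parameters())

def orthogonality_penalty(model):
    """Soft penalty pushing the LSTM's recurrent weight matrices toward
    orthonormal columns (W^T W close to the identity)."""
    penalty = 0.0
    for n, p in model.named_parameters():
        if "weight_hh" in n:                       # recurrent matrices of the (Bi)LSTM
            gram = p.t() @ p
            eye = torch.eye(gram.size(0), device=gram.device)
            penalty = penalty + ((gram - eye) ** 2).sum()
    return penalty

def train_step(model, optimizer, batch, fisher, old_params,
               ewc_lambda=100.0, ortho_lambda=1e-4, clip_norm=1.0):
    x, y = batch
    logits = model(x)
    loss = F.cross_entropy(logits.view(-1, logits.size(-1)), y.view(-1))
    loss = loss + ewc_lambda * ewc_penalty(model, fisher, old_params)
    loss = loss + ortho_lambda * orthogonality_penalty(model)
    optimizer.zero_grad()
    loss.backward()
    torch.nn.utils.clip_grad_norm_(model.parameters(), clip_norm)   # gradient clipping
    optimizer.step()
    return loss.item()
```

After training on the first task you would snapshot `old_params = {n: p.detach().clone() for n, p in model.named_parameters()}` and compute `fisher = estimate_fisher(model, old_task_loader)`, then call `train_step` on batches from the new task.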
@@airesearcher24 Thank you
Hi … I am from Pakistan, a mathematician and a Python/AI aspirant. I'm good at Python, C, C++, and ML … I want to land my first job as an ML candidate. Could you please help me, or is there any way we can connect? I would be very thankful to you. I wish to grow with you and your team. Thanks. Waiting
I am sorry, this might be disrespectful and I don't mean it to be; it's just a question I am curious about: do you even have jobs in Pakistan, and that too high-end jobs like AI/ML?
It's good to hear about your skills. Sure, let's connect on LinkedIn.
@@airesearcher24 Thanks. Please let me know your LinkedIn…
I am not sure what to make of the information in this video, since I am an average layman when it comes to neural networks, but I am interested in the real truth of this science.
Thank you for sharing your thoughts. It's great that you're interested in neural networks. I understand that the technical details of neural networks can be a bit complex, but I will make a detailed video on them in the future 👍
@@airesearcher24 I am looking forward to it; I am only now stumbling across this information. Your presentation was great and makes the topic understandable. Keep up the good work. I am soon going to start studying computer science.
Please ma'am, make videos in Hindi
Definitely, I will, and keep visiting for more interesting videos on AI :)