I'm still a beginner, I'm studying statistics at USP Brazil and I'm interested in machine learning. Your videos are excellent and this topic covered in this video seems like it will revolutionize the field. I'll be waiting for the next videos.
I think it's really fascinating, honestly. The Kolmogorov-Arnold representation is something that is entirely new to me, even though it's been around since the 1950s. KANs have renewed interest in this representation, and I think there is a lot to do with it, both on the AI implementation side and on the theory side.
This could allow training lower-parameter models and then scaling them up by adding more spline points, effectively increasing the resolution of the neural network.
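That scaling-up idea can be sketched in a toy way. Below is a minimal numpy illustration (not the actual KAN implementation, which uses cubic B-splines and trains end-to-end): fit a single 1D "edge" function on a coarse spline grid, then extend to a finer grid, warm-starting the new coefficients from the coarse fit. The grid sizes and target function here are arbitrary choices for the sketch.

```python
import numpy as np

def hat_basis(x, grid):
    """Design matrix of piecewise-linear 'hat' basis functions on a uniform grid.
    (KANs use cubic B-splines; degree-1 splines keep this sketch short.)"""
    h = grid[1] - grid[0]
    return np.maximum(0.0, 1.0 - np.abs(x[:, None] - grid[None, :]) / h)

def fit(x, y, grid):
    """Least-squares spline coefficients for targets y sampled at points x."""
    coef, *_ = np.linalg.lstsq(hat_basis(x, grid), y, rcond=None)
    return coef

x = np.linspace(-1.0, 1.0, 200)
y = np.sin(np.pi * x)                # the 1D function this "edge" should learn

coarse = np.linspace(-1.0, 1.0, 5)   # few spline points -> few parameters
fine = np.linspace(-1.0, 1.0, 21)    # grid extension: same edge, more resolution

coef_c = fit(x, y, coarse)
# Warm-start the finer grid by interpolating the coarse coefficients,
# then refit -- this is the "add more spline points" step.
coef_f0 = np.interp(fine, coarse, coef_c)
coef_f = fit(x, y, fine)

err_coarse = np.max(np.abs(hat_basis(x, coarse) @ coef_c - y))
err_fine = np.max(np.abs(hat_basis(x, fine) @ coef_f - y))
print(f"max error: coarse={err_coarse:.3f}, fine={err_fine:.4f}")
```

The warm-start line is the key point: because the coarse and fine models represent the same function family, you can grow resolution without retraining from scratch.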
Eagerly awaiting your next video! I wonder: if an assumption like symmetry (f(x_1, x_2) = f(x_2, x_1)) is made on the multivariable function, does the KA representation simplify? Maybe all the "inner" functions would turn out identical? I don't know. Do let me know if you guys know anything! :)
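For reference, the representation being discussed writes any continuous function on the unit cube as

```latex
f(x_1, \dots, x_n) = \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \phi_{q,p}(x_p) \right)
```

If f is symmetric, a natural guess is that the inner functions \phi_{q,p} could be chosen independent of p (one shared \phi_q per outer term), since permuting the x_p would then leave each inner sum unchanged. But as far as I know the standard proof doesn't guarantee that such a symmetric choice exists, so this is speculation, not a known result.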
Arnold loved taking the piss out of French mathematicians, especially those of the Bourbaki school. When he was at a conference in San Francisco he went for a swim (water was about 10 deg C or so) and there are very strong currents. He said it was "refreshing".
The hack for the whiteboard on the tv was ingenious 😅. I'll steal the idea. Keep feeding us KANs :)
Haha thanks! Not a lot of options in a hotel room! More KAN to come
@@peterhall6656 haha that’s hilarious!
Is it just me, or does this sound too good to be true? There must be a catch or a big downside...
@@alxlg it’s still really early days for this method, and we haven’t quite figured out all the limitations yet. It has great potential and I think there is a lot to explore here.