Awesome again! Looking forward to listening to this later on my walk :D
Super interesting! Dan Tan is the Man! 🤖🤖🤖🤖🤖🤖🤖🤖🤖🤖🤖🤖🤖
Dan Tan is also the President of the UCL Go Society!
💽🖱🖥💻
Here is an English poem about our multi-talented and super-competent UCL Go Society President, Dan Tan:
"
In the heart of the land, where the sand meets the fan,
Who can stand hand-in-hand, make a plan, understand?
Not just any clan, nor the average man,
It's the Dan Tan Man, with his Dan Tan Man Can!

He zips and zams, like no one else can,
With a pan that's so grand, and a golden wristband.
From Japan to Sudan, to the mountains of Bhutan,
Everyone knows it's the Dan Tan Man Can.

In a van, on the lam, or just getting a tan,
Wherever he ran, he had fans, to a man.
For in all of the span, from Milan to Tehran,
It's proclaimed and exclaimed: "Dan Tan Man Can!"
Slides: github.com/tanchongmin/TensorFlow-Implementations/blob/main/Paper_Reviews/LLMs%20%2B%20Robotics%20Daniel%20Tan.pdf
I don't think you can discount the importance of Darwinian evolution and the gradual education of the mind as humans age. There's a minimum amount of such training, and of capacity to be trained, before you get a well-formed human mind. The evolution of LLMs is a great parallel to this. Maybe the idea here is to use them as jumping-off points rather than finding architectures that use less data.
Well said! Perhaps large amounts of data could have helped in arriving at the initial architecture.