Arbitrary model differentiation is a major paradigm shift that I don't think people realize. ML libraries/services are extremely declarative and/or black-box right now. Even something like TensorFlow's GradientTape relies on a lot of magic, and it often gets stuck when your use case is less than standard. This makes it hard to:
1) Deeply customize your model without working in odd or byzantine declarative paradigms (LOOKING AT YOU TENSORFLOW)
2) Learn a model that's not already in the well-known machine learning canon
This truly is a game changer, and I think those of us who are learning how to do it now have a head start.
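To make the point concrete: "arbitrary differentiation" means you can differentiate ordinary code, including control flow, without declaring a static graph first. Here's a minimal sketch of the idea using forward-mode dual numbers in plain Python (a toy illustration, not any particular library's implementation; the `Dual`, `grad`, and `dsin` names are made up for this example):

```python
import math

class Dual:
    """Forward-mode dual number: carries a value and its derivative."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (fg)' = f'g + fg'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__
    def __gt__(self, other):
        return self.val > (other.val if isinstance(other, Dual) else other)

def dsin(x):
    # chain rule through sin
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

def grad(f):
    """Differentiate an arbitrary one-argument Python function."""
    return lambda x: f(Dual(x, 1.0)).der

# Ordinary Python code with a branch -- no declarative graph needed.
def f(x):
    if x > 0:
        return x * x   # derivative: 2x
    return dsin(x)     # derivative: cos(x)

df = grad(f)
print(df(3.0))   # 6.0
print(df(-1.0))  # cos(-1) ~ 0.5403
```

The branch on `x > 0` is exactly the kind of thing that trips up graph-building frameworks; here the derivative just follows whichever path the code actually takes. Real systems (Julia's Zygote, JAX) do this far more generally, including reverse mode, but the principle is the same.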
So few people here. Surprised.
Julia is cool!!
Awesome stuff!
Is Julia the same thing as Common Lisp?
"Ruby is rubbish! PHP is phpantastic!" ~ Nikita Popov