Amazing! The concepts of memorization and generalization have been explained superbly. I am really fascinated by what we can achieve when the concepts are used together.
So, the model tries various possibilities and checks whether the output is 1 or 0. If the output is 0, another combination is tried until the output is 1, and only then is the app finally recommended to the user? Is my understanding correct?
Thanks so much to you guys and google for open-sourcing this!
Loved this beautiful way of explaining such an elegant concept!
Basically, the wide model is a factorization machine, right?
Thank you! Do you have a sample code implementation for a wide & deep network? I couldn't find one.
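Not the official code, but here is a minimal sketch of the idea in plain NumPy: a linear "wide" part over (crossed) sparse features and a small "deep" MLP over dense features, with their logits summed before a single sigmoid. All dimensions and weights below are made-up toy values, not anything from the video; for a real implementation, TensorFlow ships a combined wide-and-deep estimator (`tf.estimator.DNNLinearCombinedClassifier`).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def wide_deep_forward(x_wide, x_deep, w_wide, W1, W2, w_out, b=0.0):
    """Joint wide & deep scoring: one sigmoid over the SUM of both logits.

    x_wide : (n_wide,) binary crossed features (the wide/memorization part)
    x_deep : (n_deep,) dense features (the deep/generalization part)
    """
    # Wide component: plain linear model over the crossed features
    wide_logit = x_wide @ w_wide
    # Deep component: tiny two-layer MLP with ReLU activations
    h = np.maximum(0.0, x_deep @ W1)
    h = np.maximum(0.0, h @ W2)
    deep_logit = h @ w_out
    # Logits are added *before* the sigmoid, so both parts share one loss
    # when trained jointly (unlike an ensemble of separate models).
    return sigmoid(wide_logit + deep_logit + b)

# Toy example with hypothetical dimensions and random weights
rng = np.random.default_rng(0)
p = wide_deep_forward(
    x_wide=rng.integers(0, 2, 8).astype(float),
    x_deep=rng.normal(size=4),
    w_wide=rng.normal(size=8),
    W1=rng.normal(size=(4, 16)),
    W2=rng.normal(size=(16, 16)),
    w_out=rng.normal(size=16),
)
print(p)  # a single probability, e.g. of an app install
```

Training would then backpropagate one logistic loss through both branches at once, which is the "joint training" point raised further down in the thread.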
Is the Google play store example on Github?
Hi, Did you find the example? Thanks.
Lovely! Like RNNs.
Really interesting idea. In essence, what they are doing is ensemble learning, but one function is complex while the other is not.
NVM, they train jointly.
Funk SVD might be a better approach.
15:02 Source code for the Wide & Deep model...
I can see this doing wonders with NLP.
Can you guide me on building a chatbot with NLP on the Windows platform?
It's like how genetic features are decided for a to-be-born baby: mostly from the parents, plus a few mutations 😜
nice explanation
I want to develop a chatbot with deep learning on the Windows platform. Does TensorFlow help?
yesss