Well done Professor Susan, Thank you Harvard
Much easier to understand than the original paper >_
She is really good at her explanation!
Really passionate presentation
I'm curious how causal trees compare with targeted maximum likelihood estimation developed by Mark van der Laan at Berkeley?
Do you have a paper to look at?
thanks for uploading this!
Are these slides accessible to the public?
Keval shah Sure, you are free to download those papers from her personal website
thanks for the upload!
Who is behind this camera? Please, please, just zoom out a bit so you don't have to constantly keep moving it. Very annoying and disruptive.
It was me, sorry. I'm only a teaching assistant. I'll do better next time.
"Running regressions in parallel" aka neural networks(?)
thanks for helping me notice :))
Yes, but note that after she said, "there is a better way of doing this" ;)
No, it is literally several regressions independently trained on subsets or clusters of the data, I think.
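A minimal sketch of that idea, using made-up data and cluster labels (everything here is hypothetical, not from the talk): one ordinary least-squares regression is fit independently per cluster, rather than a single shared model like a neural network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: two features, an outcome, and a cluster label per row.
X = rng.normal(size=(300, 2))
clusters = rng.integers(0, 3, size=300)
y = X @ np.array([1.5, -2.0]) + clusters + rng.normal(scale=0.1, size=300)

# "Running regressions in parallel": an independent OLS fit per cluster.
coefs = {}
for c in np.unique(clusters):
    mask = clusters == c
    Xc = np.column_stack([np.ones(mask.sum()), X[mask]])  # prepend intercept
    beta, *_ = np.linalg.lstsq(Xc, y[mask], rcond=None)
    coefs[c] = beta  # [intercept, slope_1, slope_2] for this cluster

for c, beta in coefs.items():
    print(c, np.round(beta, 2))
```

Each cluster's intercept and slopes are estimated without sharing any parameters across clusters, which is what distinguishes this from jointly trained models.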