This is a clip from a conversation with Jeremy Howard from Aug 2019. New full episodes every Mon & Thu and 1-2 new clips or a new non-podcast video on all other days. If you enjoy it, subscribe, comment, and share. You can watch the full conversation here: th-cam.com/video/J6XcP4JOHmk/w-d-xo.html
(more links below)
Podcast full episodes playlist:
th-cam.com/play/PLrAXtmErZgOdP_8GztsuKi9nrraNbKKp4.html
Podcasts clips playlist:
th-cam.com/play/PLrAXtmErZgOeciFP3CBCIEElOJeitOr41.html
Podcast website:
lexfridman.com/ai
Podcast on iTunes:
apple.co/2lwqZIr
Podcast on Spotify:
spoti.fi/2nEwCF8
Podcast RSS:
lexfridman.com/category/ai/feed/
Rust is the future of deep learning, isn't it?
I wish I were at a level where I could say "I am limited by Python" LOL
I totally did not expect to hear him mention Swift
I'm not sure what Jeremy is thinking. Swift doesn't support Windows and probably never will; I'm not even sure how good the Linux support is. He said it himself at the end: Apple doesn't care. I think Swift is just an awful choice, especially considering fastai's goal of making AI more accessible. Accessibility is the complete opposite of what Apple stands for.
Sub, that is not true: Swift 5.3 was just released, and Windows will be officially supported in the coming weeks.
Plus Swift is now open source.
"The whole model makes sense in PyTorch" beautiful words
Tensorflow code is quite the mess. The whole tree structure of your model makes so much more sense in pytorch.
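To illustrate the "tree structure" point, here is a tiny pure-Python mock (not actual PyTorch; all class names are invented for the sketch) of the style PyTorch popularized: submodules are ordinary attributes, so the model's structure is spelled out directly in the code.

```python
# Pure-Python mock of the PyTorch-style module tree: submodules are
# plain attributes, so the code mirrors the model's structure.

class Module:
    def children(self):
        # Any attribute that is itself a Module is a child in the tree.
        return [v for v in vars(self).values() if isinstance(v, Module)]

class Linear(Module):
    def __init__(self, n_in, n_out):
        self.n_in, self.n_out = n_in, n_out

class MLP(Module):
    def __init__(self):
        # The model's structure is visible at a glance: two layers.
        self.fc1 = Linear(784, 128)
        self.fc2 = Linear(128, 10)

model = MLP()
print(len(model.children()))  # 2
```

The point of the sketch: traversing the model is just walking object attributes, which is why the whole thing "reads" like ordinary Python.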
When TF engineers made their libraries, they did not expect it to be used by millions of people all around the world one day.
Who needs TensorFlow when you have assembly
Assembly is for pussies, I train my own biological neural network!
😅😂
@@juliansuse1 Personally, I don't like biological neural networks. They are too biased, and half of the time they output gibberish.
@@juliansuse1 only takes 8 billion years
Who needs assembly when you can move electrons with your bare hands
If you want advice as a new student, go to 6:16.
I’ve been noticing lately that Pytorch seems to be gaining popularity and Tensorflow is getting less so. Maybe just me though, idk.
Which one would you prefer?
Yes.
@@WW-to5rc Why?
Industry is TensorFlow; research is PyTorch.
Trying to put pytorch into production is a hassle.
@@billy818 Which one is good for hobbyists?
The only thing keeping me around TF is its compatibility with large TPUs. If PyTorch can start nicely interfacing with something like a V3-512 then that'd be great.
I've been using TF 2.0 for a while, and after hearing this I'm switching to PyTorch. I didn't know it was 10x slower...
Eddie Huang I think he was specifically talking about eager executions in TF.
This is a very quick decision :D
Eddie Huang That would be a HUGE mistake! I develop in TF, fastai, PyTorch and Matlab, and for most things beyond eager experimentation and prototyping, or poorly written TF 2.0 function code, TF is on par with most things out there, and much better supported in several critical areas for GPU/TPU training. Now, the "best" choice depends on use case, and on whether training, inference, maintainability, deployment, scalability or something else is your bottleneck or overriding priority; but dropping TensorFlow at this time is like dropping Java in the 90s: a huge mistake.
Also, forget about what he said about being able to migrate over to "pretty much any other library" within a few days. We work with a lot of new as well as seasoned AI and data science people, and it takes a seasoned AI person weeks to truly get a decent grip on how to use a modern, complex framework on a professional level (i.e. knowing the landscape of how to semi-optimally use that framework). We can definitely tell the difference in the development time and quality of work done by a professional, let alone a newer AI developer, who has worked in something like TF for several months versus a few years. It takes several thousand hours of commitment in any of the major frameworks to begin to truly start to master it. You can do great work well before that, but the real curve is long.
The most important thing is to choose a good framework that (1) supports what you plan to do, and (2) has legs and will stay at least relevant and ideally dominant, as that will ensure a solid base of support on many levels. TensorFlow, especially with Google and Chris Lattner (the creator of Swift) behind it, definitely fits that bill. So it is more important to choose well and then go deep and long with a major framework first. Things like training time, inference performance, eager execution performance and such will be taken care of by the industry, as frameworks have to be competitive to survive and thrive; and if you don't think that Google has made, and will continue to make, a major long-term commitment of top resources, you're mistaken.
It is not. TensorFlow 2.0 is a fantastic, extremely fast framework with no regression relative to PyTorch. I don't even understand that statement from Howard...
It's pretty good if you don't use eager execution.
With the introduction of the Keras API, TensorFlow coding has become a lot easier and more accessible.
Yeah. TF might be slow, but in all honesty a beginner will see no difference. A 6-hour training for me in TF is still a 5.5-hour training in Torch... 10% faster != 10x...
@@MsFearco You can take advantage of free GPUs and TPUs with TensorFlow.
I agree. It's great for beginners to cut their teeth on, but with PyTorch I feel you get a much better idea of what you're doing, and your knowledge is tested and increased.
Sir Jeremy Howard always reminds me of Gale Boetticher from Breaking Bad.
The Advil in the background explains a lot
Almost everything I know about deep reinforcement learning came from reading code in PyTorch... thanks, PyTorch.
I wonder if TensorFlow's customization features have changed anything in the TF vs. Torch equilibrium.
Can anyone answer me? What deep learning framework should I install if my laptop has a Core i3 and my graphics are Intel integrated plus an NVIDIA 820M? I want to work on my thesis, but unfortunately I just realized that my laptop is not good enough to train with some frameworks. Can anyone suggest what I should do?😭
Just try using keras/tensorflow/pytorch in Google Colab. You can get quite a bit of work done using their GPUs. It's free upto a certain level.
Google Colab
I'm happy to hear that Jeremy shares my opinion of Tensorflow.
My experience usually was that splitting the definition of a computation from its execution is beneficial.
Can someone shed light on why it wasn't the case with TensorFlow? And how is tf.eager slow when, if I understand correctly, the Python overhead may only apply to the construction of a computation graph, which is not that crucial?
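As a rough illustration of the define-then-run split the question refers to, here is a toy pure-Python sketch (not TensorFlow; all names are invented for the sketch): the computation is described once as a graph of nodes, and execution happens separately with concrete inputs, which is what lets a runtime optimize the graph before any data arrives.

```python
# Minimal define-then-run sketch: building the graph is separate from
# executing it, which is the split TF 1.x made and eager mode removed.

class Node:
    def __init__(self, op, inputs=(), name=None):
        self.op, self.inputs, self.name = op, inputs, name

    def __add__(self, other):
        return Node("add", (self, other))

    def __mul__(self, other):
        return Node("mul", (self, other))

def placeholder(name):
    return Node("placeholder", (), name)

def run(node, feed):
    """Execute the graph once, pulling input values from the feed dict."""
    if node.op == "placeholder":
        return feed[node.name]
    vals = [run(i, feed) for i in node.inputs]
    return vals[0] + vals[1] if node.op == "add" else vals[0] * vals[1]

# Define the computation once...
x, y = placeholder("x"), placeholder("y")
z = x * y + y

# ...then run it many times with different inputs.
print(run(z, {"x": 2, "y": 3}))  # 9
print(run(z, {"x": 5, "y": 1}))  # 6
```

In eager mode there is no separate graph: each operation runs through the Python interpreter as it is encountered, so the per-operation Python overhead is paid on every step rather than once at graph-construction time, which is the slowdown being discussed.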
I love MXNet more than any DL framework, by far. It's the fastest framework I've ever used. It also comes with both imperative (useful for research) and declarative (useful for production and deployment) paradigms.
Not only that: you can convert code written with the dynamic graph to a static graph in just one line.
Also, it's got more pretrained models than any other framework.
I'm glad that I made my switch to MXNet.
The problem is that the last version running on Windows is too ancient
Support for CUDA is also limited if you choose to run it in a container
Can someone summarize what they say in the video?
English is not my first language, and I understood only a small piece of what they said.
I understood that Python is limited and TensorFlow is a mess.
But if he created fastai in Python, and he already says Python can be slow, what is the purpose of fastai? Is it fast? Can it use the speed of GPUs?
Thanks Lex, very informative
Gosh, Lex. I was just thinking. What are you doing with all this information you are receiving? The great minds of the modern era.
A Practical Deep Learning for Coders course focusing on Swift for TensorFlow would be amazing!
He was correct then, but now TensorFlow kicks butt. Although we have used several of the major AI frameworks, 90% of what we and most others do in AI is increasingly best done with TF. This video is a few years old, and it has only become more the case.
After listening to that, I need some Advil.
Right behind Jeremy :D
With a nice cold bottle of Essentia water.
A great example of product placement ;)
I have tried both TF and PyTorch. TF was easy to get into but hard to progress in; I like and use PyTorch these days.
I used FANN a lot. Which framework can do the same in an equally easy way?
I'm assuming this is based on TensorFlow 1 and not TensorFlow 2.
"Swift does dumb things" and "Python isn't going to cut it". Julia it is! Please just invest your OSS time in Julia!
Super-informative, thank you!
Why would you execute it with TF eager activated? I would use that only for debugging until I get a stable version, then disable all the sugarcoating and stay with the static graph that is so well optimized in TF 1.x.
Why Swift, why not Rust or Go?
JAX is very flexible but also fast, faster than PyTorch. And you can still run on TPUs. Yes, TPU support has now been added to PyTorch, but only recently; I tried it, and it's still raw.
I like the honesty. Hard to come by such people these days in the era of political correctness.
TensorFlow's eager execution is what bugs me. It limits what you can do in calculating gradients and layers.
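For context on how eager-mode gradients work at all: eager frameworks record each operation as it executes and then walk that record backwards. Here is a toy scalar reverse-mode sketch (pure Python, not TF or PyTorch; all names are invented for the sketch).

```python
# Toy reverse-mode autodiff over scalars: each op records how to push
# gradients back to its inputs, the way eager "tape" autograd works.

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # (parent, local_gradient) pairs
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, upstream=1.0):
        # Accumulate the incoming gradient, then propagate to parents.
        self.grad += upstream
        for parent, local in self.parents:
            parent.backward(upstream * local)

x = Var(3.0)
y = Var(4.0)
z = x * y + x          # z = x*y + x, so z.value = 15.0
z.backward()
print(x.grad)          # dz/dx = y + 1 = 5.0
print(y.grad)          # dz/dy = x = 3.0
```

Because the tape only knows about operations that actually ran, anything the recorder doesn't capture can't be differentiated, which is one way eager-style gradient machinery can feel limiting.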
Almost all reinforcement learning libraries are written in PyTorch now... I think that says something.
Does anyone use Julia programing language?
Yeah. Flux.jl is more awesome than pytorch. Just needs more development!
Yes, it works great for me to write non-standard neural networks from scratch for research purposes.
Mostly great information, but it seems like he was speaking about 4-year-old TensorFlow and Keras about 2 years ago. We use 12 different frameworks, 4 on a regular basis. Many of the supporting libraries and extensions can be used across most of the frameworks in fairly equivalent form. The appropriate development frameworks and supporting libraries really depend on the specific problem space, the requirements of the delivered solutions, and other factors. Swift and various ancillary language environments, including Node.js (used correctly), enable amazing productivity and performance. I totally do NOT agree with starting on fastai and PyTorch; it is like when they taught Pascal because it was a better learning language with fast ramp-up, even though it was built with limited professional carry-forward in mind. The basis of PyTorch is somewhat self-limiting, and even Jeremy asserts the same in this and many videos.
I have generally found that the best tool at any given time is the one you know best that can solve the given problem. If that is the case, why not learn the most robust, long-term-viable solution set first, and become familiar with secondary environments like PyTorch along the way? That way you get the best possible long-term reusability of skills and are buying into an architecture that is not edging towards marginalization or end-of-life. You can, and many should, learn Python-based frameworks along the way, while building a development skill set you can grow with for many years.
I didn't notice upspeak until watching Joe Rogan. The upspeak is strong here.
Any comments on Julia and its future in the machine learning ecosystem?
Isn't Swift from Apple? Why not take Kotlin, which has native support? It's open source too.
Even though TF is slower, it is okay since TF Lite is more useful; the thing is, TF is very difficult to debug.
I started with TF but later switched to PyTorch. Maybe Lex can improve that .
PyTorch is a beast.
Sometimes when you cage the beast, the beast gets angry.
By now, I think a lot has changed.
What do you think? Has TensorFlow overcome the disadvantage?
Why have you not changed your suit during quarantine...!
Man, I was interested in Python until now.
Use C bro.
What C deep learning frameworks are out there?
@@alefratat4018 write your own, bro
@@keedt Nevermind, I found one that is really good
@@alefratat4018 PyTorch does have a C++ interface, though.
Ouch! 3:58
Umm! TensorFlow provides much more customization than PyTorch.
PyTorch is okay if you're doing engineering, but for experimentation TensorFlow is a gem!
Swift for Tensorflow is officially closed. Kind of disheartening tbh :/
Even after 3 years, Swift for Tensorflow hasn't caught up 😅
PyTorch is much simpler and more powerful than TF.
This guy is honest. TensorFlow is still difficult to use, especially when you are a Data scientist with no solid programming background.
How are you a data scientist without a solid programming background?
MultiMediaUploads Probably by using just a GUI data visualization tool.
Just say programming is hard.
I am an aspiring data scientist. I have invested a lot of time in coding. I have seen a lot of people saying that TensorFlow doesn't help you understand what happens behind the scenes, while with PyTorch the model is much more transparent. I am not an expert in either library, but if this is true, I'd pick PyTorch.
TF 2.0 still suxx compared with PyTorch
A new vista
I feel sad he is working on Swift. Switch to Flutter, where the phone market shines :)
eww swift..