I was inspired by and learned the basics of TensorFlow from the TensorFlow specialization on Coursera. Personally, I think these videos I created give a similar understanding, but if you want to check it out you can. Below you'll find both affiliate and non-affiliate links. The pricing for you is the same, but a small commission goes back to the channel if you buy through the affiliate link, which helps me create more videos.
affiliate: bit.ly/3JyvdVK
non-affiliate: bit.ly/3qtrK39
Lucky that I bumped into this video; I wish to see more code-oriented deep learning material like this one. Great job!
Thanks for the kind words 🙏
I searched a lot for PyTorch-style subclassing in TensorFlow.
I finally found the best one.
Thanks!
Thank you very much for the content! It is better than some articles on the net!
This Tutorial was AWESOME!!!! TNX A TON!!!
Amazing, I hated TensorFlow before but now I really like it. Its logic looks like PyTorch's.
Thanks for the great video! I am wondering: is the call() method in a class a built-in function that converts a class instance into a function? I know there is a __call__() method, but is the call() method the same as the __call__() method?
Can anyone answer this question? How is the call method invoked automatically without being called explicitly outside the class?
Man, I can't believe you don't have more followers. I love your videos and I'll be here until you reach 3,000,000 subs.
To get a summary when using subclassing, you can use the build method and specify the shape as a parameter:
# example:
dtc2vec.build(input_shape=(BATCH_SIZE, 2))
dtc2vec.summary()
For some reason I still get the output "multiple" for embedding layers, but the other layers show the output shape.
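The build-then-summary trick above can be sketched with a toy subclassed model (the model and layer sizes here are made up for illustration, not taken from the video):

```python
import tensorflow as tf
from tensorflow.keras import layers

class SmallModel(tf.keras.Model):
    def __init__(self):
        super().__init__()
        self.dense1 = layers.Dense(64, activation="relu")
        self.dense2 = layers.Dense(10)

    def call(self, x):
        return self.dense2(self.dense1(x))

model = SmallModel()
# build() creates the weights for the given input shape, so summary()
# can report parameter counts without ever running a forward pass.
model.build(input_shape=(None, 2))
model.summary()
```

With None as the batch dimension, the same built model accepts any batch size.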
Just what I was looking for. Thanks ✌
Thanks for the amazing videos of this playlist!
May I ask which IDE you are using and what theme that is?
I remember when I first saw TF allowing this class subclassing like PyTorch; I immediately fell in love with it after first usage. Still lovin' it. The only complaint I have is that when we summarize the model we don't get to see the shapes of intermediate blocks :} That's why I sometimes prefer the functional API. Anyways, great video. 🙂
Yeah it reminds me so much of PyTorch.. so I love it 😅 Thank you for the comment, let me know if you have any feedback, what I can improve in making these videos, etc.
@@AladdinPersson Your videos are exceptionally excellent and unique already. I even planned to create such a channel, but I'm kinda busy for now; we'll see. :)
Newbie question:
@Aladdin Persson
Why are they printed 3 times in a row?
I was guessing
(None, 28, 28, 32)
(None, 28, 28, 64)
(None, 28, 28, 128)
would be printed once per epoch.
I never thought that the MNIST datasets could also be used this way. Haha. Taking it to the next level.
Haha, I guess it's a bit of an overkill :)
@@AladdinPersson That's an understatement lol
@7:05 Why is it rerun again?
Thanks a ton for this excellent series of tutorials. I have a question: can we make the execution of some layers in the subclass conditional? For example, if I added a maxpool layer to the subclass but didn't want to run it for the last CNNBlock, how could I do that?
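One way to do the conditional execution asked about above (just a sketch; the use_pool flag is my own invention, not something from the video): pass a flag to the block's constructor and branch on it in call:

```python
import tensorflow as tf
from tensorflow.keras import layers

class CNNBlock(layers.Layer):
    def __init__(self, out_channels, use_pool=True):
        super().__init__()
        self.conv = layers.Conv2D(out_channels, 3, padding="same")
        self.bn = layers.BatchNormalization()
        # The flag is fixed at construction time, so the last block can
        # simply be created with use_pool=False.
        self.use_pool = use_pool
        self.pool = layers.MaxPooling2D() if use_pool else None

    def call(self, x, training=False):
        x = tf.nn.relu(self.bn(self.conv(x), training=training))
        if self.use_pool:
            x = self.pool(x)
        return x

out = CNNBlock(32, use_pool=False)(tf.zeros((1, 28, 28, 1)))
```

Since the flag is a Python boolean known at trace time, the if works fine even inside tf.function/AutoGraph.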
Can you please make a video on "custom padding", like adding a single row above and a single column to the left of a 28 × 28 × 1 image, since Keras does not seem to give such flexibility? Thanks in advance.
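For what it's worth, Keras does expose asymmetric padding through layers.ZeroPadding2D, which takes ((top, bottom), (left, right)) tuples; a small sketch for the 28 × 28 × 1 case described above:

```python
import tensorflow as tf
from tensorflow.keras import layers

# ((top, bottom), (left, right)): one row above, one column to the left.
pad = layers.ZeroPadding2D(padding=((1, 0), (1, 0)))
x = tf.ones((1, 28, 28, 1))
y = pad(x)  # shape becomes (1, 29, 29, 1)
```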
Thanks for the video. Just one question: normally we always need to specify the input shape/input_dim of the data we feed into the model. Here we did not do it and it still ran. Can you elaborate on that?
From my understanding, we do not need to specify the input shape for the model to run; for example, if we create a simple sequential model it is still able to run without specifying the input with keras.Input. However, if you do specify the input you're able to run (for example) model.summary(), and I'm assuming other functionality that I'm not familiar with.
@@AladdinPersson Thanks, do you think this is true for all network layers when we use them as the first layer (e.g. Embedding, LSTM, GRU, ...)?
What is the use of the call function? Please explain.
I found the answer here.
stackoverflow.com/questions/59473267/why-in-keras-subclassing-api-the-call-method-is-never-called-and-as-an-alternat
The __call__ method is implemented in the Layer class, which the CNNBlock class inherits.
So what we actually do is override the inherited call method; our new call method is then invoked from the inherited __call__ method.
So when we call our model instance, the inherited __call__ method executes automatically, which calls our OWN call method.
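The mechanism described in that answer is plain Python inheritance; stripped of everything Keras-specific, it looks roughly like this toy sketch (not the real Layer code):

```python
class Layer:
    # Stand-in for keras' base Layer: __call__ does the bookkeeping,
    # then delegates to the call() that subclasses override.
    def __call__(self, *args, **kwargs):
        # (the real Layer also builds weights, traces graphs, etc.)
        return self.call(*args, **kwargs)

    def call(self, *args, **kwargs):
        raise NotImplementedError

class Doubler(Layer):
    def call(self, x):
        return 2 * x

block = Doubler()
print(block(21))  # calling the instance runs __call__, which runs our call()
```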
Hi Aladdin! I was going through the GitHub repo so that I could go over the code you wrote in this tutorial; it seems the code is missing for this one. Could you please post it on GitHub as well? Thank you.
Hi, I constructed the subclassed layer/block and tried using it in the functional API (like you used it in the sequential API). It throws an error for me (graph disconnected). Basically, it's not accepting the output of the block as an input for the next layer. Can you please help me with it, or better still, make a video on it?
How does the call method work ? I know that you can call a class with MyClass(), but how does it know to run the call method ?
The __call__ method of the parent class layers.Layer calls the call() function; hopefully that wasn't confusing lol
@Aladdin Persson: amazing video once again... thanks for this great TensorFlow playlist. One question: when we subclass tf.keras.Model, can we use tf.keras.Sequential() inside that class? And second question: can we use __call__ instead of call()?
Yes, from my understanding and experiments you can mix sequential, functional, and subclassing seamlessly, and that's really powerful in my opinion. I just tried the second part of replacing call with __call__ and it doesn't work, but I guess it's more interesting to ask why, skimming through the source code for layers.Layer. It does a lot of things under the hood, such as creating the graph, context managers, and a bunch of other things. You can read some more about its __call__ here: github.com/tensorflow/tensorflow/blob/b36436b087bd8e8701ef51718179037cccdfc26e/tensorflow/python/keras/engine/base_layer.py#L875
So the self.call() that we've implemented is just a small part of the __call__ of the layers.Layer we're inheriting, and in that sense it makes sense that we cannot simply replace call with __call__.
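A minimal sketch of that mixing (the layer sizes are made up for illustration): a keras.Sequential stored as an attribute of a subclassed model behaves like any other layer:

```python
import tensorflow as tf
from tensorflow.keras import layers

class MixedModel(tf.keras.Model):
    def __init__(self):
        super().__init__()
        # A Sequential used as a building block inside a subclassed model.
        self.feature_extractor = tf.keras.Sequential([
            layers.Dense(32, activation="relu"),
            layers.Dense(16, activation="relu"),
        ])
        self.head = layers.Dense(1)

    def call(self, x):
        return self.head(self.feature_extractor(x))

out = MixedModel()(tf.zeros((4, 8)))  # one output per batch element
```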
Please, please, can you explain how to export a Keras subclassed model with 2 args as a SavedModel? I can't save my subclassed model; can you help me?
Could you please specify why you have used channels[1] rather than channels[0] or channels[2]?
For example,
self.identity_mapping = layers.Conv2D(channels[1], 3, padding='same')
Sir, make videos on question answering in PyTorch and seq to seq in Keras.
Will look into it!
Aladdin Persson sequence to sequence from scratch in Keras.
thank you
Just what I was looking for. I was wondering, maybe you could make a video about how to use
from tensorflow.keras.utils import plot_model
M = tf.keras.Model()
plot_model(M, to_file='Example.png')
Maybe that would show the dimensions?
How to train with mixed inputs and outputs ?
What is the meaning of super(CNNBlock, self).__init__()?
How do we know our CNNBlock class has conv and bn attributes? Where is the base CNNBlock class?
The class CNNBlock inherits all the attributes of layers.Layer from the line:
class CNNBlock(layers.Layer)
and then when you call super, you're initializing each instance of CNNBlock with all the attributes of layers.Layer
I was pretty confused when he called the base class constructor by passing the subclass name and self to super(); that form is only needed in Python 2.x, we don't need to pass those arguments to super() in Python 3.x anymore.
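Both spellings are equivalent in Python 3; a small toy sketch (not keras code) of what the super().__init__() call buys you:

```python
class Layer:
    def __init__(self):
        # Base-class state that subclasses rely on (keras' Layer sets up
        # weight tracking, naming, etc. here).
        self.built = False

class CNNBlock(Layer):
    def __init__(self):
        # Python 3 short form; super(CNNBlock, self).__init__() is the
        # older, equivalent long form.
        super().__init__()
        self.filters = 32

block = CNNBlock()
print(block.built, block.filters)  # base and subclass attributes both exist
```

Skipping the super().__init__() call would leave the base-class attributes uninitialized, which is exactly why keras requires it in subclassed layers and models.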
So what would be the problem with writing this in functional API?
The skip connections for example
@@AladdinPersson that should be ok: stackoverflow.com/a/42391339/336527
Hi! Thank you for this amazing series. However, when applying the ResNet-like model, I get the following warning:
WARNING:tensorflow:AutoGraph could not transform and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Any idea what would be causing it? Should I ignore it?
I'm running TF 2.1.
Thank you!
Pedro
I think you need to install your tensorflow-estimator from Anaconda 3 Navigator
Did you ever figure out the issue? because I am having the same problem
Adobe Premiere suddenly popping up really shifted my focus away from what I was studying.
Can you suggest some udemy courses?
I haven't done any Udemy courses so I don't have any suggestions there. I don't know, this might just be me, but I don't like the feeling I get from Udemy. Feels scammy in my opinion. Just this "5 hours left at this price!!!, 99% off, BUY now!" shit makes me super annoyed. But I should probably get some experience with it before I judge it :)
Books + Internet >> Udemy course
def call(self, input_tensor, training=False):
    y = self.cnn1(input_tensor, training=training)  # CNN layer: Conv, BatchNorm, ReLU; y is the ReLU output (this is its call)
    x = self.cnn2(y, training=training)
    x = self.cnn3(x + self.identity_mapping(y), training=training)
I think in ResNet it would be like this, as I understand it; not self.identity_mapping(input_tensor), as that would be a[0], not a[1], since we have 3 layers.
I copied your code from the beginning part (CNNBlock), and I get:
E:/pythonProject1/subclass.py:20 call *
x = self.conv(inputs)
AttributeError: 'CNNBlock' object has no attribute 'conv'
Can't figure out what's wrong in the class part.
Never mind, I just found out that I mistyped __init__ as __int__, and it took me half a day to figure out, fml.
@@HallockW All part of the process. The half a day spent figuring out that mistake will save you tons of time finding similar mistakes in the future.
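The failure mode from this thread is easy to reproduce in isolation: a misspelled __init__ is just an ordinary method that Python never calls, so the attributes are never created (toy example; the attribute name is made up):

```python
class Broken:
    def __int__(self):  # typo: should be __init__, so this never runs
        self.conv = "conv layer"

obj = Broken()
print(hasattr(obj, "conv"))  # the constructor body never executed
```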