After struggling for almost one year, I finally understood LSTMs clearly today :) Thanks, sir.
Glad to hear that
At the start, a voice that comes from beyond.
Thank you for this video . This is one of the best explanations given on input and output shapes
Thanks.
After a long time, all my doubts about the practical side of LSTMs are cleared... thanks a lot.
Glad to hear that
Units parameter was so confusing. Thanks for making it clear.
Glad it helped!
Honestly, by far the best video explaining input shape and output shape. Well done!! Freakin' brilliant. Could you explain the other parameters/arguments used in LSTM and Dense layers, please? It would really help me a lot!!
Thanks. Will try to create some videos on Keras in April.
Thanks, sir, for the wonderful explanation.
Welcome.
Thank you so much, sir! You gave the best and simplest explanation, which made me understand the Embedding layer, LSTM layer, and Dense layer in the code.
Glad it helped.
Just the video that's needed. Thanks
Glad it helped
Brilliant! Looking forward to the next video.
Brilliant. Thanks
Thanks.
Great explanation. With a plug-and-chug approach using the library, I didn't really understand what it was doing; it had been a black box to me the whole time until I found your video.
Glad it helped!
Thank you for this clear explanation
Glad it was helpful!
this cleared things up for me. Thanks!
Glad to hear it!
At last someone explained the input and output shapes and the batch_size, sequence_length, and input_dim arguments of the LSTM cell. This is not explained in the documentation, and it took me 2 days, until I watched this video, to understand what these dimensions and shapes mean.
Nicely explained, Thank You!
Glad it was helpful!
Great video for the detailed explanation. Thanks.
Thanks.
Can you please make a series on how to build simple RNNs and LSTMs from scratch? It would be very helpful for beginners. I agree with what you said: the Keras documentation is poorly written. Luckily, people like you, Ahlad Kumar, and others help us understand the practical aspects.
One more great video, as always...Thank you so much.
Thanks.
Awesome.. it helped
Glad to hear that
thank you so much
Glad it helped.
At 10:40 you say this is the output of the first word. Do you mean the first group of 10 words, i.e., the first review sentence?
Beautifully explained.
I have a question: how do we reshape our data, which has shape (11800, 400) for X_train and (11800, 2) for y_train? The purpose is to convert the data into 3 dimensions to feed into a TimeDistributed layer and then an LSTM. The model is CNN + TimeDistributed + LSTM.
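One way to do the reshape, sketched minimally below, is a plain numpy reshape. The split of the 400 columns into 20 time steps of 20 features is purely an assumption for illustration; the real split must match how the data was originally flattened.

```python
import numpy as np

# Dummy array with the X_train shape from the question.
X_train = np.zeros((11800, 400))

# Assumed split of the 400 columns: 20 time steps x 20 features each.
timesteps, features = 20, 20
X_train_3d = X_train.reshape(-1, timesteps, features)

print(X_train_3d.shape)  # (11800, 20, 20)
```

y_train of shape (11800, 2) can stay 2D if each sequence has a single label vector.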
Excellent video, sir, and an excellent explanation. If you ever do a video on the TimeDistributed Dense layer or the RepeatVector layer, please let me know. Thank you very much, sir.
When we get the output of the LSTM as a 7x64 matrix, does it get flattened before being fed to the fully connected layer?
Amazing... At 2:30, in line 3, are "units" the same as "neurons"?
Do we need to flatten (7, 64) before sending it to the Dense layer? Please help.
Thank you, sir, for the helpful video. I have a doubt, however: does the predict() function in Python give the output of the LSTM layer? I do not understand why this function should give the LSTM output...
Hello, Thank you for your video and I have one question:
When 1 of the 7 sentences goes into the LSTM, does it read the first column, then the second column, until it reaches the 10th column? Then does it repeat the process with sentence 2 of 7, until it has read all 7 sentences?
Yes, it goes word by word through a sentence.
Very nice video, thank you. Can you please do this for some other types of data, like stock forecasting, temperature prediction, etc.? It would be really helpful.
At the timestamp 5:34, why do you keep the batch shape as (3, 80, 100)? Shouldn't it be (3, 100, 80)? BTW, thanks for the explanation!
It is the Keras convention, maybe!
very useful
Glad to hear that
Awesome video! Thank you so much. If you write model.add(Embedding(1000, 500, input_length=X.shape[1])), is the number of neurons in the embedding layer 500 or 1000? Also, is the embedding layer the same as the input layer? Thanks so much!
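For reference, a minimal numpy sketch of what Embedding(1000, 500, ...) does conceptually: 1000 is the vocabulary size (rows of a lookup table) and 500 is the embedding dimension (length of each word vector). The Embedding layer sits directly after the input and performs a table lookup; the shapes below are illustrative.

```python
import numpy as np

# Embedding(1000, 500, input_length=10) is conceptually a lookup table
# of 1000 rows (vocabulary size) by 500 columns (embedding dimension).
vocab_size, embed_dim, input_length = 1000, 500, 10
table = np.random.rand(vocab_size, embed_dim)

# A batch of 3 sequences of 10 token ids each.
ids = np.random.randint(0, vocab_size, size=(3, input_length))
embedded = table[ids]  # the lookup: each id becomes a 500-dim vector

print(embedded.shape)  # (3, 10, 500)
```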
Thank you for this informative video, but I have an issue: when I fit my LSTM model, the input_shape is divided by the batch_size. I am not able to find a solution for that. I would be glad if you could help me find one.
Check the input shapes in the plot_model() diagram. What is the input shape shown there?
The input shape of an LSTM is a 3D tensor with shape (batch_size, time_steps, input_dim).
from tensorflow.keras.utils import plot_model
plot_model(model, show_shapes=True, to_file='model.png')
I still don't understand why the output dimension is 64, if an LSTM cell outputs only a hidden state, which is only one value, not 64.
Thanks a lot for this wonderful video! I have a question regarding the size of the input. In my data, I have a time series of size 8x12000 (where 8 is the number of features and 12000 is the number of time samples). I also have corresponding labels or targets of size 1x12000. I would like to predict the labels for a new test time series. How should I reshape this, given that it also has a temporal dependency? Is it okay to split it into samples of 20 steps each (for example), so that it becomes 600 observations, each of size 8x20? What would you suggest? Thanks in advance!
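A minimal numpy sketch of the non-overlapping windowing described in the question. The window length of 20 is the questioner's example, not a prescribed value, and taking each window's last label is just one assumed labeling choice.

```python
import numpy as np

# Dummy series matching the question: 8 features over 12000 time steps.
series = np.random.rand(8, 12000)
labels = np.random.rand(12000)

window = 20  # assumed window length; a design choice to tune

# Transpose to (time, features), then cut into non-overlapping windows.
X = series.T.reshape(-1, window, 8)    # (600, 20, 8) for the LSTM
# One label per window, e.g. the label of the window's last step.
y = labels.reshape(-1, window)[:, -1]  # (600,)

print(X.shape, y.shape)  # (600, 20, 8) (600,)
```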
No videos on this topic. Thanks.
You're welcome
Hi, I'm getting this error: ValueError: Input 0 of layer lstm_5 is incompatible with the layer: expected ndim=3, found ndim=4. Full shape received: (None, 64, 13, 250).
my train_X.shape is (1130, 13, 250)
train_Y.shape is (1130, 59)
My code:
model = Sequential()
model.add(LSTM(64,return_sequences=False,input_shape=(64,13,250)))
model.add(Dropout(0.3))
model.add(LSTM(64,return_sequences=False))
model.add(Dense(59, activation='softmax'))
I'm not sure where I'm going wrong; could you please take a look? Thanks in advance.
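A likely fix, as a hedged sketch: in Keras, input_shape excludes the batch dimension (Keras prepends None itself), so for train_X of shape (1130, 13, 250) it should be (13, 250), not (64, 13, 250). Also, stacking a second LSTM requires return_sequences=True on the first so it emits a 3D sequence rather than a 2D final state.

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout, Dense

model = Sequential()
# input_shape excludes the batch dim: (timesteps, features) = (13, 250).
# return_sequences=True so the next LSTM receives 3D input.
model.add(LSTM(64, return_sequences=True, input_shape=(13, 250)))
model.add(Dropout(0.3))
model.add(LSTM(64, return_sequences=False))
model.add(Dense(59, activation='softmax'))

model.summary()  # final output shape: (None, 59)
```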
Why 100 embedding dimensions? Why 64 outputs? Just ad hoc?
Freakin' brilliant.
Thanks.