Take my courses at mlnow.ai/!
I like how you explain what each portion of the code means and where it comes from as you write it. It makes understanding what everything is doing so much better!
I’m using this for a CNN. Thanks for the clear presentation and walk through. You’re the man
Sounds great. No problem! Thanks!
Did you get good results?
thank you for this extremely useful video!! it cleared up all my confusions regarding hyperparameter tuning!!
Thank you Greg. That's very helpful.
Great to hear!
actually incredibly useful thank you so much
Oh I'm really glad to hear that! :)
Brilliant thanks Greg👍
You're very welcome!
Amazing video! Thank you so much! Saved my ass on my assignment 😂
So glad to hear it :)
Great session
Excellent topic
Thanks!
Very nice.
How can I extract the best parameters for each layer? For example: how many filters in each CNN layer, what is the chosen activation function, what is the size of the filters? Etc.
Basically, get the structure of the model.
Thanks
Eran
Hello. Did you get the answer? If yes, please share it. Thanks :)
@@mohammeddanishreza4902 No, I did not. However, I figured it out by myself. I uploaded some videos about it.
@@eranfeit Please share the link of the video. It would be helpful. Thank you :)
@@mohammeddanishreza4902 Please see this playlist:
th-cam.com/play/PLdkryDe59y4ZO3WjBjcEUP1drjPcVvRBA.html
The third video deals with this subject.
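For anyone else landing on this question, here is a minimal sketch of pulling the winning settings out of a finished search. The keras_tuner calls are commented out because they assume a completed tuner.search() run, and names like conv_1_filters are hypothetical — they match whatever names your build_model registered:

```python
# Assumed to run after tuner.search(...) has finished:
# best_hp = tuner.get_best_hyperparameters(num_trials=1)[0]
# print(best_hp.values)               # flat dict, e.g. {"conv_1_filters": 64, ...}
# tuner.get_best_models(num_models=1)[0].summary()  # full layer-by-layer structure

def group_by_layer(hp_values):
    """Group flat names like 'conv_1_filters' into {'conv_1': {'filters': 64}}."""
    grouped = {}
    for name, value in hp_values.items():
        prefix, _, param = name.rpartition("_")
        grouped.setdefault(prefix or name, {})[param] = value
    return grouped

# Hypothetical values standing in for best_hp.values:
example = {"conv_1_filters": 64, "conv_1_kernel": 3, "conv_2_filters": 128}
print(group_by_layer(example))
# {'conv_1': {'filters': 64, 'kernel': 3}, 'conv_2': {'filters': 128}}
```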
Great video! Thank u!
You're very welcome :)
Your coding tutorials are way better than that review content.
Thanks for the feedback!
Great video
this is 🔥
Great to hear!
Thanks a lot!
You're welcome!
This was a fantastic tutorial. I am curious to know if I could do the same with signal data using a 1D CNN Model? If so, how would the input shape differ in that scenario since I am not working with photo data?
Yes, you could use something like this for 1D CNNs.
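A sketch of what that could look like, assuming Keras and keras_tuner are installed. The key difference from image data is the input shape: a sample is (timesteps, channels) rather than (height, width, channels). The model in the comments is illustrative, not from the video; the helper is a pure-Python check of the output length:

```python
# Input shape for signals: (timesteps, channels) per sample, NOT (H, W, C).
#   mono signal of length 1000       -> input_shape=(1000, 1)
#   3-axis accelerometer, 1000 steps -> input_shape=(1000, 3)
#
# def build_model(hp):
#     model = keras.Sequential([
#         keras.layers.Input(shape=(1000, 1)),
#         keras.layers.Conv1D(
#             filters=hp.Int("filters", 16, 128, step=16),
#             kernel_size=hp.Choice("kernel_size", [3, 5, 7]),
#             activation="relu"),
#         keras.layers.GlobalMaxPooling1D(),
#         keras.layers.Dense(1, activation="sigmoid"),
#     ])
#     model.compile(optimizer="adam", loss="binary_crossentropy")
#     return model

def conv1d_output_length(timesteps, kernel_size, stride=1, padding="valid"):
    """Length of the sequence a Conv1D layer produces (pure-Python check)."""
    if padding == "same":
        return -(-timesteps // stride)          # ceil(timesteps / stride)
    return (timesteps - kernel_size) // stride + 1

print(conv1d_output_length(1000, 5))            # 996 positions with 'valid' padding
```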
Hey Greg, I'm applying your techniques to model time series data and I'd like to ask: when we call tuner.search() and pass it the training data and validation data, does it randomly shuffle the data? You probably already understand why I'm asking. Thank you.
And if it does, how can we preserve the time series ordering?
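For what it's worth: the extra keyword arguments passed to tuner.search() are forwarded to model.fit(), which shuffles training batches by default, so you can turn that off and split chronologically yourself. A sketch (the tuner.search call is commented out and assumes your own arrays):

```python
def chronological_split(x, val_fraction=0.2):
    """Hold out the most recent slice for validation; no shuffling."""
    cut = int(len(x) * (1 - val_fraction))
    return x[:cut], x[cut:]

train, val = chronological_split(list(range(10)))
print(train, val)   # [0, 1, 2, 3, 4, 5, 6, 7] [8, 9]

# Then pass the pre-split data in and disable batch shuffling:
# tuner.search(x_train, y_train,
#              validation_data=(x_val, y_val),
#              shuffle=False)        # forwarded to model.fit
```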
Hello.
What if I want to optimize the batch size, the number of epochs or use callbacks such as ReduceLROnPlateau or EarlyStopping? Can I use keras tuner for that? Or is it already included in the search procedure?
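As far as I know, callbacks can simply be passed to tuner.search() (each trial gets its own copies), and batch size can be made tunable by overriding HyperModel.fit — that override is sketched in the comments and assumes keras_tuner 1.1+. The runnable part below only illustrates the early-stopping logic itself:

```python
# Callbacks: hand them straight to the search.
# tuner.search(x_train, y_train, epochs=50,
#              callbacks=[keras.callbacks.EarlyStopping(
#                  monitor="val_loss", patience=3)])
#
# Batch size (keras_tuner 1.1+): override HyperModel.fit so it is tuned too.
# class MyHyperModel(kt.HyperModel):
#     def build(self, hp): ...
#     def fit(self, hp, model, *args, **kwargs):
#         return model.fit(*args,
#                          batch_size=hp.Choice("batch_size", [16, 32, 64]),
#                          **kwargs)

def early_stop_epoch(val_losses, patience=3):
    """1-based epoch at which patience runs out, or None if it never does."""
    best, wait = float("inf"), 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best, wait = loss, 0
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return None

print(early_stop_epoch([1.0, 0.8, 0.9, 0.85, 0.9]))   # stops at epoch 5
```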
If I want to add k-fold cross-validation during hyperparameter tuning, do I add tuner.search() (i.e., simply replacing the model.fit() command with it) inside my k-fold loop?
Thanks for the great vid!
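One hedged sketch of the idea: rather than calling tuner.search() inside the fold loop (each call would resume the same search rather than score one trial on all folds), cross-validation is usually run inside each trial, e.g. in an overridden HyperModel.fit that returns the mean fold score. The fold-index helper below is plain Python; the tuner wiring is only sketched in comments and assumes keras_tuner 1.1+:

```python
def kfold_indices(n, k):
    """Yield (train_idx, val_idx) for k contiguous folds over n samples."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        val = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n))
        yield train, val
        start += size

for train, val in kfold_indices(10, 5):
    print(val)          # each sample lands in exactly one validation fold

# Sketch of the tuner side (x, y assumed to be numpy arrays):
# class CVHyperModel(kt.HyperModel):
#     def build(self, hp): ...
#     def fit(self, hp, model, x, y, **kwargs):
#         scores = []
#         for train_idx, val_idx in kfold_indices(len(x), k=5):
#             fold_model = self.build(hp)   # fresh weights each fold
#             hist = fold_model.fit(x[train_idx], y[train_idx],
#                                   validation_data=(x[val_idx], y[val_idx]),
#                                   **kwargs)
#             scores.append(min(hist.history["val_loss"]))
#         return sum(scores) / len(scores)  # mean fold score as the objective
```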
Thank you!!
Hi Greg!
Thanks for creating this tutorial, it helped me so much.
But I'm curious about the possibility of using keras-tuner after I augment the number of image samples with ImageDataGenerator. Is it possible?
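It should work, since tuner.search() accepts the same inputs as model.fit(), so a generator can be passed directly. A sketch (the ImageDataGenerator lines are commented out and assume your own arrays; the runnable helper just shows that augmentation changes what each batch contains, not how many batches an epoch has):

```python
# gen = ImageDataGenerator(rotation_range=15, horizontal_flip=True)
# train_flow = gen.flow(x_train, y_train, batch_size=32)
# tuner.search(train_flow,
#              validation_data=(x_val, y_val),   # validate on UN-augmented data
#              epochs=10)

import math

def steps_per_epoch(n_samples, batch_size):
    """Batches one epoch sees; augmentation varies the images, not the count."""
    return math.ceil(n_samples / batch_size)

print(steps_per_epoch(5000, 32))   # 157
```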
It should be I think?
Hi, I am having problems with the epochs and the number of test runs. For example, when I want it to adjust the dropout from 0 to 0.5 in steps of 0.1, it runs the search for each option with 2 epochs and then stops. So basically nothing is really tested and it stops prematurely.
max_epochs could be set to 100 or even 1000000000 and it does nothing. Setting the epochs in search does literally nothing; I could even set it to -1 or a string of gibberish.
But when I run a search where I configured a lot in Hyperband (including CNN stride, filter size, kernel, etc.), it works fine and does what it should.
The more I try, the more I'm confused...
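A possible explanation, hedged: Hyperband deliberately trains early rounds for only a few epochs and promotes only the best candidates to longer runs, so the epochs argument to search() is overridden by the bracket schedule — which would explain why even gibberish there changes nothing. A rough pure-Python sketch of the per-round budget, assuming the default factor=3:

```python
def bracket_epochs(max_epochs, factor=3):
    """Rough epoch budget per successive-halving round in one Hyperband bracket."""
    rounds, e = [], max_epochs
    while e >= 1:
        rounds.append(e)
        e //= factor
    return rounds[::-1]

# Most configurations only ever see the first (tiny) budgets; only the
# survivors of each round are retrained with more epochs.
print(bracket_epochs(100))   # [1, 3, 11, 33, 100]
```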
Hi Greg!
I hope you are doing well. I wanted to ask if there is a way to use Keras Tuner with a stacked model that comes from two models.
And if there is, what is the best way to do it?
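A hedged sketch of one way this could work: build both sub-models inside a single build(hp) with the functional API and merge them, so one hp object tunes both branches. The Keras lines are comments (model_a_branch / model_b_branch are hypothetical helpers, the input shape is illustrative); the runnable part only shows what concatenation does to the feature vectors:

```python
# def build_model(hp):
#     inp = keras.Input(shape=(32,))
#     a = model_a_branch(inp, hp)     # hypothetical helper: first sub-model
#     b = model_b_branch(inp, hp)     # hypothetical helper: second sub-model
#     merged = keras.layers.Concatenate()([a, b])
#     out = keras.layers.Dense(1, activation="sigmoid")(merged)
#     model = keras.Model(inp, out)
#     model.compile(optimizer="adam", loss="binary_crossentropy")
#     return model

# Concatenate joins the two branches' feature vectors end to end:
a, b = [0.1, 0.2], [0.3, 0.4, 0.5]
print(a + b)   # 5 merged features
```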
Hey Greg, do you prefer more tensorflow or pytorch and why? Sorry for bothering you
I haven't learned pytorch.
Hi, I want to know how I can get the name of the activation function and other info about the best model. For example, for the best model, which activation function was used?
How could I implement this with a CNN? I'm working with my own dataset and it seems like the Keras tuners don't like tf.data.Datasets yet. They're still expecting (x_train, y_train), (x_test, y_test). Is my thinking correct there? Essentially I'm loading my data using tf.keras.preprocessing.image_dataset_from_directory and would like to feed this into the tuner.
How could I split my own data in (x_train, y_train), (x_test, y_test)?
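For what it's worth: newer keras-tuner releases forward whatever model.fit() accepts, so passing the tf.data.Dataset directly may just work. If your version really insists on arrays, you can collect the dataset back into memory. The tf.keras lines are commented out (the "data/" path is a placeholder); the helper is a plain-Python stand-in for flattening batches:

```python
# ds_train = tf.keras.preprocessing.image_dataset_from_directory(
#     "data/", validation_split=0.2, subset="training", seed=42,
#     image_size=(128, 128))
# ds_val = tf.keras.preprocessing.image_dataset_from_directory(
#     "data/", validation_split=0.2, subset="validation", seed=42,
#     image_size=(128, 128))
# tuner.search(ds_train, validation_data=ds_val, epochs=10)  # may just work

def unbatch(batches):
    """Flatten batched (x, y) pairs back into two flat sequences."""
    xs, ys = [], []
    for xb, yb in batches:
        xs.extend(xb)
        ys.extend(yb)
    return xs, ys

x, y = unbatch([([1, 2], [0, 1]), ([3], [0])])
print(x, y)   # [1, 2, 3] [0, 1, 0]
```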
Is it possible to use this method with LSTM and xLSTM?
Is there a way to get best number of layers also?
Yeah I didn't do it but I'm guessing you could mix the functional API with one of those random integers as the number of iterations of a loop. I don't think this is often done in practice though
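That's roughly the standard pattern: make the depth itself a hyperparameter with hp.Int and loop over it when adding layers. The Keras part is sketched in comments (layer sizes are illustrative); the runnable bit just shows that each loop iteration registers its own named hyperparameter:

```python
# def build_model(hp):
#     model = keras.Sequential([keras.layers.Input(shape=(784,))])
#     for i in range(hp.Int("num_layers", 1, 4)):   # depth is tuned as well
#         model.add(keras.layers.Dense(
#             units=hp.Int(f"units_{i}", 32, 256, step=32),
#             activation="relu"))
#     model.add(keras.layers.Dense(10, activation="softmax"))
#     model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
#     return model

depth = 3   # stand-in for what hp.Int("num_layers", 1, 4) might return
names = [f"units_{i}" for i in range(depth)]
print(names)   # ['units_0', 'units_1', 'units_2']
```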
Would have been nice to see model.summary()
You can run the colab code and do that if you'd like :)
max_epochs is 10 but epochs is 50. Isn't it contradictory?
Hi may i know how to contact you?
Check the about section of my TH-cam channel
Emperor