Thank you so much for sharing this valuable learning resource.
My pleasure!
nice
Great tutorial, and thank you so much!
Questions were sent via the email box :)
Hi, I would appreciate it if you could share how you collected your training data, or a link to a video that explains that. Thanks
@@reginaldotoo3155 Hi, I simply visualized the image as a true-colour composite (TCC), randomly picked a few points for each class, and exported them to GCS :)
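For reference, a minimal sketch of how a few hand-picked points per class could be sampled from an image and exported to GCS with the Earth Engine Python API; the scene ID, coordinates, class codes, and bucket name below are placeholders, not the ones used in the tutorial.

import ee
ee.Initialize()  # assumes you have already authenticated

# Placeholder surface-reflectance scene to sample from
image = ee.Image('LANDSAT/LC09/C02/T1_L2/LC09_188055_20230115')

# A few hand-picked points per class; coordinates and class codes are made up
points = ee.FeatureCollection([
    ee.Feature(ee.Geometry.Point([7.01, 6.02]), {'landcover': 0}),  # e.g. water
    ee.Feature(ee.Geometry.Point([7.20, 6.30]), {'landcover': 1}),  # e.g. built-up
])

# Attach the band values of the image to each labelled point
samples = image.sampleRegions(collection=points, properties=['landcover'], scale=30)

# Export the labelled samples to a GCS bucket as TFRecord (bucket name is an assumption)
task = ee.batch.Export.table.toCloudStorage(
    collection=samples,
    description='training_points',
    bucket='my-bucket',
    fileFormat='TFRecord')
task.start()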
Update:
Watch this awesome tutorial to understand the TensorFlow part of this code
th-cam.com/video/HMR_2VkDE9s/w-d-xo.html&ab_channel=RobinCole
Would you make a video about how to cloud-mask a single image rather than an ImageCollection? I am processing a Landsat 9 image using an expression, and I recently found out that clouds are affecting the min, max, and mean values, but after cloud masking the ImageCollection can't be processed with the expression. Where can I contact you? Thanks
muddasirshah@outlook.com
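For reference, a minimal sketch of how a single Landsat 9 Level-2 image could be cloud-masked with its QA_PIXEL band before running an expression, so clouds no longer distort the min, max, and mean; the scene ID is a placeholder and the bit positions follow the standard Collection 2 QA layout.

import ee
ee.Initialize()  # assumes you have already authenticated

img = ee.Image('LANDSAT/LC09/C02/T1_L2/LC09_188055_20230115')  # placeholder scene ID

# In Collection 2, QA_PIXEL bit 3 flags cloud and bit 4 flags cloud shadow
qa = img.select('QA_PIXEL')
clear = qa.bitwiseAnd(1 << 3).eq(0).And(qa.bitwiseAnd(1 << 4).eq(0))
masked = img.updateMask(clear)

# The expression now only sees cloud-free pixels, so region statistics are cleaner
ndvi = masked.expression(
    '(nir - red) / (nir + red)',
    {'nir': masked.select('SR_B5'), 'red': masked.select('SR_B4')})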
Thank you for the tutorial! Just curious, will it work if we use a T4 GPU instead of an A100 GPU? It seems that the A100 is not free.
Yes, it will work; it will even work on a CPU, but that will take time.
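For reference, a quick way to confirm which accelerator the Colab runtime actually provides before training:

import tensorflow as tf

# An empty list means the runtime is CPU-only; training still works, just more slowly
print(tf.config.list_physical_devices('GPU'))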
Thank you for this tutorial; it is my guide for a project I am currently working on. But I have a question: in a situation where the query returns many tiles, with multiple tiles covering the same spot within the AOI but with varying image quality, how do I identify the image covering a specific part of my AOI? Is there a script to display the tile number on the map when it is visualized?
Are you talking about TFRecord files or filtered images?
Thanks for your prompt response. I am talking about filtered images. In your case you have two tiles, but in my case I have 2,041 tiles, and I want to select specific tiles based on their position within my AOI.
@@OlawaleArowolo Email me at muddasirshah@outlook.com with screenshots of the error you're facing. I'll suggest a solution, but right now I don't clearly understand your question.
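For reference, a minimal sketch of one way to list and filter Landsat scenes by their WRS path/row (the "tile") over an AOI with the Earth Engine Python API; the AOI, dates, and path/row values below are placeholders.

import ee
ee.Initialize()  # assumes you have already authenticated

aoi = ee.Geometry.Rectangle([7.0, 6.0, 8.0, 7.0])  # placeholder AOI

scenes = (ee.ImageCollection('LANDSAT/LC09/C02/T1_L2')
          .filterBounds(aoi)
          .filterDate('2023-01-01', '2023-12-31'))

# Print the WRS path/row of every scene that touches the AOI
print(scenes.aggregate_array('WRS_PATH').getInfo())
print(scenes.aggregate_array('WRS_ROW').getInfo())

# Keep only the scenes from one specific tile, e.g. path 188 / row 55
tile = (scenes
        .filter(ee.Filter.eq('WRS_PATH', 188))
        .filter(ee.Filter.eq('WRS_ROW', 55)))

Each scene's footprint is also available via image.geometry(), which can be drawn on the map to see which tile covers which part of the AOI.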
Hi, how are you?
Congratulations on the tutorial; it is a great help to the remote sensing community. I would like to know whether you paid anything for running the code in the cloud and for the bucket? I need to know because I want to test a classification for my area of study. Thanks
Yes, I paid a very small amount for the GCS bucket; it's about $0.003/GB/month, I believe.
Also, purchasing Colab Pro was entirely intentional, to speed up the process.
One last thing: you can do it without GCS buckets by mounting Google Drive and exporting the imagery and train/test samples to Drive as TFRecords.
@@MuddasirShah Thanks! I will try running a similar code!
Hi sir, could you please guide me through the procedure for using Google Drive instead of Google Cloud Storage?
You just have to define the I/O paths. I am sorry, I am very busy these days; otherwise I would have loved to set up a meeting with you.
Just remove the gcloud credentials and authentication part, i.e. manually remove each variable associated with gcloud, and supply the Google Drive folder path by copying it from the Colab sidebar. It's easy, I am sure you can do it.
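For reference, a minimal sketch of the Drive-based workflow described above; the folder name, output file name, and the placeholder sample collection are assumptions.

from google.colab import drive
drive.mount('/content/drive')

import ee
ee.Initialize()  # assumes you have already authenticated

# Placeholder samples; in the real workflow these are the labelled training points
samples = ee.FeatureCollection([ee.Feature(ee.Geometry.Point([7.0, 6.0]), {'landcover': 0})])

# Export straight to Drive as TFRecord instead of Export.table.toCloudStorage
task = ee.batch.Export.table.toDrive(
    collection=samples,
    description='train_points',
    folder='ee_exports',   # folder name is an assumption
    fileFormat='TFRecord')
task.start()

# Once the export finishes, read it from the mounted Drive path (exact file name may differ)
import tensorflow as tf
path = '/content/drive/MyDrive/ee_exports/train_points.tfrecord.gz'
dataset = tf.data.TFRecordDataset(path, compression_type='GZIP')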
@@MuddasirShah thank you so much for your valuable response