Thanks a lot again! Awesome, straightforward video 🙌
Thanks a lot for helping, bro. Starter guides are usually not actually made for starters, but you made this simple for us. Thanks :)))))
Amazing, thank you! I was struggling with this for quite some time =D
Glad you found it helpful.
Great job man. Thank you for the great content.
Glad you enjoy it!
Thanks again Mihir
Awesome, your video recording is really helpful. As a complete beginner, it would be even more helpful if you could add some tips on which buttons to click in your video.
First of all, I would like to commend you, Mihir, for such a brilliant video explaining the GSC API; I was able to automate a lot of the work my team does by following you. Just one suggestion: the two conditions you put at the beginning, startRow == 0 or startRow % 25000 == 0, are redundant. You can use only startRow % 25000 == 0, because when startRow is zero, 0 % 25000 also evaluates to 0, so the two separate conditions are not required.
Hi Mudit, thanks for liking the content.
I completely agree with you, it's a great catch. When I published this code, I was at a very early stage of my coding journey; eventually I got a little better and structured my code better.
I haven't published a new version of the GSC code yet, but the new one takes care of this.
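For readers following along with the code, here is a minimal sketch of the simplified check, assuming the same startRow counter and 25,000-row page size used in the video:

```python
startRow = 0  # first page of results

# 0 % 25000 == 0, so the modulo check alone already covers the first page;
# an extra "startRow == 0" condition is redundant.
if startRow % 25000 == 0:
    print(f"Requesting a full page of results starting at row {startRow}")
```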
Really helpful 😄
Glad it was helpful!
Thank you so much for sharing.
You are so welcome!
@TheMihirNaik Thanks again for putting this together. Really insightful. Quick question: does the 25k limit apply per query sent to the API, or per property? Please also share what the hard cap on the query limit is.
The 25k row limit is per query; it is the maximum number of rows of data Google will return per API call.
But there are other property-level, account-level and project-level limits as well. I will have to check them, and I will cover them in an upcoming video on sampling.
@@TheMihirNaik thanks for the reply.
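For anyone curious what working within that per-call cap looks like, here is a rough sketch of a pagination loop, assuming an already-authorized Search Console client built with google-api-python-client; the property URL and date range are placeholders:

```python
from googleapiclient.discovery import build

# `creds` is assumed to be an authorized credentials object for the Search Console API.
service = build("searchconsole", "v1", credentials=creds)
site_url = "https://www.example.com/"  # placeholder property

all_rows, start_row, row_limit = [], 0, 25000
while True:
    body = {
        "startDate": "2024-01-01",
        "endDate": "2024-01-31",
        "dimensions": ["query", "page"],
        "rowLimit": row_limit,   # maximum rows Google returns per call
        "startRow": start_row,
    }
    response = service.searchanalytics().query(siteUrl=site_url, body=body).execute()
    rows = response.get("rows", [])
    all_rows.extend(rows)
    if len(rows) < row_limit:  # a short page means we have reached the end
        break
    start_row += row_limit
```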
Thank you so much for this great content! Can you please do a tutorial next showing how we can push this into Adobe Analytics, i.e. import organic Google Search data into Adobe Analytics?
I don't think you can import GSC data into Adobe Analytics. I also don't know Adobe Analytics very well, so I likely won't be doing that one.
Keep up the good work, my friend. How do you like to work with Python? I have tried options like PythonAnywhere and running a Linux system on my Windows PC.
Thanks Marc!
For quick scripting I use Colab.
I am on Windows 10 and have Python installed locally as well. I use VS Code to write code with the Flask or FastAPI frameworks, and GitHub Desktop for version control and push/pull requests.
I use Heroku for deployment and spend around $7 a month on Heroku dynos. I tried PythonAnywhere, but I like Heroku better because it lets me use Redis and a bunch of other things.
Should the email used to create the Google Search Console account and the email used to create the GCP account be identical, or can we use a separate email to create the GCP account?
You can create the GCP account with a separate email address.
Thank you for the helpful video. Could you help me with extracting data for pages that have been 'Excluded by the noindex tag'? Can you provide guidance on the appropriate filter and dimensions to use?
There is no way to pull that data directly yet, but you can take all the URLs, get their inspection status, and then filter by the status you want.
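As a rough sketch of that workaround, here is one way it could look, assuming an authorized searchconsole v1 client (`service`), your own property URL (`site_url`) and list of URLs (`urls`); the coverageState field name comes from the URL Inspection API response format and should be verified against the current docs:

```python
# Inspect each URL and keep the ones whose coverage state mentions 'noindex'.
# Note the URL Inspection API has its own, much lower, per-property quota.
noindex_urls = []
for url in urls:
    result = service.urlInspection().index().inspect(
        body={"inspectionUrl": url, "siteUrl": site_url}
    ).execute()
    coverage = (
        result.get("inspectionResult", {})
              .get("indexStatusResult", {})
              .get("coverageState", "")
    )
    if "noindex" in coverage.lower():
        noindex_urls.append(url)
```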
Great video, thank you for sharing it with us. I have created the file, but it's not downloading.
What happened?
And how do I pull this data into BigQuery? Kindly help.
I don't really know BigQuery very well, but there are tutorials on that.
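For anyone landing here with the same question, one common approach (a sketch only, not from the video) is to load the DataFrame produced by the script into BigQuery with the google-cloud-bigquery client; the project, dataset and table names below are placeholders:

```python
from google.cloud import bigquery

# `df` is assumed to be the pandas DataFrame of GSC rows built earlier in the script.
client = bigquery.Client()  # uses your default GCP credentials
table_id = "my-project.my_dataset.gsc_search_analytics"  # placeholder destination

job = client.load_table_from_dataframe(df, table_id)
job.result()  # wait for the load job to finish
print(f"Loaded {job.output_rows} rows into {table_id}")
```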
Can you share this Colab with us?
Yes, the GitHub link is in the description.