Well done Nate. Can you please also share the code for the pagination node and the extract URL node? Thanks in advance. Keep it up.
Please join my free Skool community, I post the workflows from my videos in there! If you have further questions shoot me a message and I can get you what you’re looking for
Found it. I was able to debug it.
EXTRACT code (in case of the Google Search API)

const results = $json; // This represents the JSON response from the Google API.

if (!results.items || results.items.length === 0) {
  return []; // Return an empty array if there are no results.
}

// Parse the 'items' array, keeping only each result's URL
const parsedResults = results.items.map(item => {
  return {
    json: {
      url: item.link,
    }
  };
});

return parsedResults; // Return the parsed array of results.
PAGINATION code (adjust the batch size based on your need)

let start = 0;        // Initial start offset
const limit = 50;     // Fetch up to 50 profiles (adjust as needed)
const batchSize = 10; // Google Search returns 10 results per page
const output = [];

// Loop through pages in increments of 10
while (start < limit) {
  output.push({ json: { start: start } });
  start += batchSize; // Increment by 10 for each page
}

return output; // One item per page, each carrying its start offset
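A quick usage note for anyone wiring this up: each item the pagination code returns carries a start value, so the HTTP Request node that follows can read it with an expression. This is a sketch, assuming the node's query parameters look roughly like this (everything except start is a placeholder):

// Example HTTP Request node query parameters:
// key   = YOUR_API_KEY
// cx    = YOUR_SEARCH_ENGINE_ID
// q     = your search query
// start = {{ $json.start }}  <- pulls the offset from each pagination item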
@lemateu Awesome man, glad you got that figured out
Problem is: the Google API costs money, and the costs can get very big, very fast, because every request costs money. You can get around this using rotating proxies and user agents and setting a random delay of a couple of seconds on every request, but even then you can get blocked. Google is very clever in that regard. Knowing this is basically web scraping 101.
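In case it helps anyone, here is a minimal sketch of that rotation-plus-delay idea in plain Node.js (18+, so fetch is built in). The user-agent strings and URL are placeholders, not anything from the video, and a real setup would also route each request through a different proxy:

// Rotate user agents and wait a random couple of seconds between requests
const userAgents = [
  'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36',
  'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36',
  'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36',
];

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function fetchPage(url) {
  // Pick a random user agent for each request
  const ua = userAgents[Math.floor(Math.random() * userAgents.length)];
  const response = await fetch(url, { headers: { 'User-Agent': ua } });
  // Random delay of 2-5 seconds before the next request goes out
  await sleep(2000 + Math.random() * 3000);
  return response.text();
}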
Hey Nate, I was doing something similar and getting blocked. The fix for me was to send headers with the HTTP requests
Gotcha. Thanks for sharing, this will come in handy
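For anyone who wants a concrete starting point, these are the kinds of browser-like headers that comment is talking about. A sketch only, with placeholder values; in n8n they would go under the HTTP Request node's header parameters:

// Browser-like headers to attach to the HTTP request (values are examples)
const headers = {
  'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36',
  'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
  'Accept-Language': 'en-US,en;q=0.9',
};

const response = await fetch('https://example.com/search', { headers });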
Amazing! Is it possible to get sponsored page results using the Google Custom Search Engine? I would like to check competitors' ads...
That’s a great question. I will have to look into this. Thank you!
Amazing video Nate, thanks for sharing, bro. Just one question: which URL do you use in your configuration when creating the new search engine? Please tell me!
Shoot me an email, I’m not sure exactly what you’re asking. nateherk@uppitai.com
Your idea is good, but unfortunately this will not work. The HTTP request will fail with "Too many requests" even with headers; n8n is a bit limited in that regard.
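For what it's worth, the usual way to soften a "Too many requests" (429) response is to back off and retry instead of firing the next request immediately. A minimal sketch in plain Node.js (18+), not n8n-specific, with a placeholder URL; n8n nodes also have a Retry On Fail setting that covers the simple cases:

// Retry with exponential backoff when the server answers 429 Too Many Requests
async function fetchWithBackoff(url, maxRetries = 5) {
  for (let attempt = 0; attempt < maxRetries; attempt++) {
    const response = await fetch(url);
    if (response.status !== 429) return response;
    // Wait 1s, 2s, 4s, 8s, ... before trying again
    const waitMs = 1000 * 2 ** attempt;
    await new Promise((resolve) => setTimeout(resolve, waitMs));
  }
  throw new Error(`Still rate limited after ${maxRetries} attempts`);
}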
Keep up the consistency Nate, great videos 🫡
Thank you much!
Hey Nate I just wanted to