Scrape Google for LinkedIn Profiles in Seconds with n8n

  • Published Jan 3, 2025

Comments • 16

  • @lemateu • 2 months ago • +2

    Well done, Nate. Can you please also share the code for the pagination node and the extract URL node? Thanks in advance. Keep it up.

    • @nateherk • 2 months ago • +1

      Please join my free Skool community; I post the workflows from my videos there! If you have further questions, shoot me a message and I can get you what you're looking for.

    • @lemateu • 2 months ago • +1

      Found it. I was able to debug it.

      EXTRACT code (for the Google Search API):

      // This represents the JSON response from the Google API.
      const results = $json;
      if (!results.items || results.items.length === 0) {
        return []; // Return an empty array if there are no results.
      }
      // Parse the 'items' array, keeping only the result URL.
      const parsedResults = results.items.map(item => {
        return {
          json: {
            url: item.link,
          }
        };
      });
      return parsedResults; // Return the parsed array of results.

      PAGINATION code (adjust the batch size based on your needs):

      let start = 0;          // Initial page offset
      const limit = 50;       // Limit to fetch 50 profiles (adjust as needed)
      const batchSize = 10;   // Google Search returns 10 results per page
      const output = [];
      // Loop through pages in increments of 10.
      while (start < limit) {
        output.push({ json: { start: start } });
        start += batchSize;   // Increment by 10 for each page
      }
      return output;
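
      A minimal sketch of how each start value from the pagination node could feed the Custom Search request URL; the API key, engine ID, and query below are placeholders, not values from the video. The +1 reflects that the API documents start as a 1-based index.

      // Sketch: build one Custom Search URL per page offset produced above.
      // GOOGLE_API_KEY, SEARCH_ENGINE_ID, and the query are placeholders.
      const apiKey = 'GOOGLE_API_KEY';
      const cx = 'SEARCH_ENGINE_ID';
      const query = 'site:linkedin.com/in "marketing manager"';

      return $input.all().map(item => ({
        json: {
          url: 'https://www.googleapis.com/customsearch/v1'
            + `?key=${apiKey}&cx=${cx}`
            + `&q=${encodeURIComponent(query)}`
            + `&start=${item.json.start + 1}`, // the API's start index is 1-based
        }
      }));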

    • @nateherk • 2 months ago

      @lemateu Awesome man, glad you got that figured out

  • @foofourtyone • 18 days ago

    The problem is: the Google API costs money, and the costs can add up very fast because every request costs money. You can get around this by using rotating proxies and user agents and setting a random delay of a couple of seconds on every request, but even then you can get blocked. Google is very clever in that regard. Knowing this is basically web scraping 101.
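
    For what it's worth, a rough sketch of the random-delay and user-agent-rotation idea as an n8n Code node; the agent strings and delay range are arbitrary examples, and the delay still needs something downstream (such as a Wait node) to take effect.

    // Rough sketch: attach a random User-Agent and a random delay to each item.
    // The agent strings and the 2-5 second range are arbitrary examples.
    const userAgents = [
      'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36',
      'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36',
      'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36',
    ];

    return $input.all().map(item => ({
      json: {
        ...item.json,
        userAgent: userAgents[Math.floor(Math.random() * userAgents.length)],
        delayMs: 2000 + Math.floor(Math.random() * 3000), // pass to a Wait node
      }
    }));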

  • @OceanCityNJ-ie2hg • 1 month ago

    Hey Nate, I was doing something similar and getting blocked. The fix for me was to send headers with the HTTP requests.
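
    For anyone hitting the same blocks, a sketch of the kind of browser-like headers that are typically sent; these values are illustrative, not the exact ones the commenter used.

    // Illustrative browser-like headers; pass them along so a downstream
    // HTTP Request node can reference them. Values are examples only.
    const headers = {
      'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36',
      'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
      'Accept-Language': 'en-US,en;q=0.9',
    };

    return $input.all().map(item => ({
      json: { ...item.json, headers },
    }));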

    • @nateherk • 1 month ago

      Gotcha. Thanks for sharing, this will come in handy.

  • @louisevasconcelos9649 • 2 months ago

    Amazing! Is it possible to get sponsored page results using the Google custom search engine? I would like to check competitors' ads...

    • @nateherk • 2 months ago • +2

      That’s a great question. I will have to look into this. Thank you!

  • @tobiasreyna-ai • 2 months ago

    Amazing video Nate, thanks for sharing, bro. Just one question: what URL do you use in your configuration when creating a new search engine? Please tell me!

    • @nateherk • 2 months ago

      Shoot me an email, I’m not sure exactly what you’re asking. nateherk@uppitai.com

  • @SamuelEdi80 • 1 month ago

    Your idea is good, but unfortunately this will not work. The HTTP request will fail with "Too many requests" errors, even with headers; n8n is a bit limited in that regard.
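
    If you do run into 429s, one common workaround (not shown in the video) is to spread the requests out in batches; the batch size and delay below are arbitrary examples, and the HTTP Request node's built-in Retry On Fail setting is also worth enabling.

    // Rough sketch: assign each item a batch number and a growing delay so
    // requests can be throttled instead of fired all at once. Example numbers.
    const batchSize = 5;         // requests per batch
    const baseDelaySeconds = 10; // extra pause added per batch

    return $input.all().map((item, index) => {
      const batch = Math.floor(index / batchSize);
      return {
        json: {
          ...item.json,
          batch,
          delaySeconds: batch * baseDelaySeconds, // feed into a Wait node
        }
      };
    });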

  • @lars15499 • 2 months ago • +1

    Keep up the consistency Nate, great videos 🫡

    • @nateherk • 2 months ago

      Thank you much!

  • @oyamasjourney34 • 16 hours ago

    Hey Nate I just wanted to