This Upwork Automation System Will Change Your Business (Make.com Tutorial)

  • Published Jan 13, 2025

Comments • 29

  • @muadh5942 2 months ago

    Thanks Jono! The Apify Upwork Jobs Scraper is under maintenance, so I'll wait until the maintenance is done and then test this again.
    Keep going ⚡

    • @jonocatliff 2 months ago

      No worries, my pleasure! Happy to hear, best of luck :)

  • @ehsanveisi3291 1 month ago

    Thanks man!
    Helped me a lot.

    • @jonocatliff 27 days ago

      No worries, happy to hear it helped! Best of luck :)

  • @scherpea1 8 days ago

    So do we remove the JSON module after setup is complete? I was assuming the JSON parser was just for testing while building this automation.

  • @konstantin2090 3 days ago

    Nice!

  • @galneryus5678 2 months ago

    Thanks for all the value you provide🥰

    • @jonocatliff 2 months ago

      My pleasure, hope it helps :)

  • @ImmovableMasonry 2 months ago

    Great video! I'm a little confused as to why we don't have to use an iterator to iterate over the results one by one.

    • @jonocatliff 2 months ago

      Hey, this is a great question - and something that stumped me when I first saw this too. Make.com doesn't make this very clear, but it's because the web scrape already returns its data as an iterable array, meaning the scenario is already going through the results one at a time.
      So instead of having to throw all the data into an iterator to step through it, some modules already return multiple bundles, and when they do, every module after them runs once per bundle automatically.
      Another example of this is Google Sheets: when you search for rows and it returns multiple results (say 5), the scenario automatically iterates through all of them without you having to add an iterator.
      Personally, I think they just made this super confusing. The sketch below illustrates the idea. Hope this helps :)
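      If it helps to see the idea outside of Make.com, here's a rough Python sketch - conceptual only, not Make.com's internals, and the sample jobs are made up:

        # A module that returns multiple bundles behaves like a list, and
        # every downstream module runs once per element - exactly what an
        # explicit iterator would otherwise do.

        def scraper_module():
            # Stand-in for the Apify scrape output: already a list of bundles.
            return [
                {"id": 1, "title": "Build a Make.com automation"},
                {"id": 2, "title": "Scrape Upwork job posts"},
            ]

        def downstream_module(bundle):
            # Runs once per bundle - no iterator module needed.
            print(f"Processing job {bundle['id']}: {bundle['title']}")

        for bundle in scraper_module():  # implicit per-bundle iteration
            downstream_module(bundle)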

  • @crisbot666 2 months ago

    Hi, great ideas! I'm trying to figure out the Parse JSON code in step 3 (after Apify get dataset items). You copied and pasted the output from Apify (via download output) but didn't show what JSON string is needed for the Parse JSON module.
    Can you provide the JSON string that goes into the Parse JSON? Thanks

    • @jonocatliff 2 months ago +1

      Hey there! Thanks for the comment. Think of the Parse JSON module as a 'test trigger': it holds the exact same data as the 'Apify get dataset items' module. I'm just copying everything from the Apify 'get dataset items' module and pasting it directly into the JSON module, without changing anything. The only purpose is so that while we test, we don't have to wait 2-5 minutes every run for the web scraper to finish.
      So all that's needed is to copy the data from the Apify 'get dataset items' module, paste it into the JSON module, then run it. The results will be identical in the Apify 'get dataset items' module and the Parse JSON block.
      I have a link to the free blueprint in the description with the test data there!
      Hope this helps :)
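      If you'd rather grab that test data programmatically, here's a minimal Python sketch using the apify-client library; the token and dataset ID are placeholders you'd swap in from your own run:

        # pip install apify-client
        import json
        from apify_client import ApifyClient

        client = ApifyClient("YOUR_APIFY_TOKEN")            # placeholder token
        items = client.dataset("YOUR_DATASET_ID").list_items().items

        # This dump is the JSON you paste into the Parse JSON module
        # as static test data while building the scenario.
        with open("test_data.json", "w") as f:
            json.dump(items, f, indent=2)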

  • @woody31676 2 months ago

    Exactly what I've been trying to figure out! Thank you! I'm experimenting with adding multiple keywords. I wonder if there's a way to keep the scraper from returning jobs it found in previous runs?
    I'm running into a problem where it returns the same job over and over again, sometimes multiple times in a row.

    • @jonocatliff 2 months ago +1

      Hey there! Thanks for bringing this up. Excellent question - I should've included this in my video; I'll keep that in mind for next time. You have two options:
      1. You can create a two-part setup where you schedule the scraper to run daily and filter the results to those less than one day old. That way, you shouldn't receive the same job twice.
      2. You can add a Google Sheets search step before publishing to Google Sheets, where you look up the ID or the title of the job post. If the search step returns a result, meaning the job already exists, you filter it out so you don't publish a duplicate to your Google Sheet. There's a rough sketch of this below.
      Hope this helps, and best of luck!
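      Here's a minimal Python sketch of option 2 using the gspread library; the sheet name, column layout, and job fields are hypothetical stand-ins for whatever your scenario actually writes:

        # pip install gspread; assumes service-account credentials are set up.
        import gspread

        gc = gspread.service_account()              # hypothetical credential setup
        ws = gc.open("Upwork Jobs").sheet1          # hypothetical sheet name

        def append_if_new(job):
            """Append a job row only if its ID isn't already in column A."""
            existing_ids = set(ws.col_values(1))    # IDs published so far
            if job["id"] in existing_ids:
                return False                        # duplicate: filter it out
            ws.append_row([job["id"], job["title"], job["url"]])
            return True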

  • @heyhey5963 1 month ago

    Thanks Jono - are you at all concerned about getting banned for using Apify, given Upwork has a no-scraping clause in their ToS?

    • @jonocatliff 1 month ago +1

      Hey there! Personally, no, as I don't use it myself. However, for anyone watching/implementing this, the answer is still no, because you're not web scraping with your own credentials - you're using an external service.

  • @ataimebenson 27 days ago

    It's not getting the most recent posts; it's bringing in posts from days and weeks back.

    • @jonocatliff 24 days ago

      Hey there, thanks for the question! You'll want to set up the "schedule" feature in Make.com so the scenario runs every so often (i.e. once per day). In the Apify module, you'll want to set the scraper to pull data for the same period (i.e. the past day). That way, every day you find jobs from the past day - see the sketch below. You can also add the two-step approach of looking up existing jobs in your Google Sheets module and filtering them out if they already exist. Best of luck :)
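      To illustrate the date filter, a rough Python sketch - "publishedDate" is a guessed field name, so check your scraper's actual output for the real key and format:

        from datetime import datetime, timedelta, timezone

        def jobs_from_last_day(jobs):
            """Keep only jobs posted within the past 24 hours."""
            cutoff = datetime.now(timezone.utc) - timedelta(days=1)
            recent = []
            for job in jobs:
                # "publishedDate" is hypothetical; adjust to your data.
                posted = datetime.fromisoformat(
                    job["publishedDate"].replace("Z", "+00:00"))
                if posted >= cutoff:
                    recent.append(job)
            return recent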

    • @ataimebenson 24 days ago

      @jonocatliff Thank you, I appreciate the response.

  • @hammadyounas2688 2 months ago

    Great video, sir.

    • @jonocatliff 2 months ago +1

      Thank you very much Hammad :)

    • @hammadyounas2688 2 months ago

      @jonocatliff You're welcome. Can I paste my request here?

    • @jonocatliff 2 months ago

      Hey, yes of course

  • @Arkfiit 2 months ago

    Amazing❤❤❤❤❤❤

    • @jonocatliff 2 months ago

      Thank you!

  • @martinsoluwatobi60 2 months ago

    Thank you for this

    • @jonocatliff 2 months ago

      No worries, best of luck :)