My new step-by-step program to get you your first automation customer is now live! Results guaranteed.
Apply fast: skool.com/makerschool/about. Price increases every 10 members 🙏😤
I feel so lucky to have discovered your channel right at the beginning. Learning with you.
I was thinking the same. Was looking for someone who would teach Make without the bullshit. This dude's a legend 💜🔥
Appreciate the support! Good day to you 🙏
That's truly mind-blowing!
What would you do if you needed to add only new, unique listings to Google Sheets, ignoring the ones that are already there? You still have to run the same search, but would you use a filter step to look up existing Google Sheet records and add only the records that aren't already in the sheet?
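In Make this is typically a Search Rows module plus a filter before the Add Row step. The same dedupe logic, sketched in plain Python with made-up field names (assuming each listing has a stable ID):

```python
def filter_new_listings(existing_rows, scraped, key="id"):
    # IDs already present in the sheet (the key name "id" is hypothetical)
    seen = {row[key] for row in existing_rows if key in row}
    # keep only scraped listings whose ID hasn't been seen before
    return [item for item in scraped if item.get(key) not in seen]

existing = [{"id": "A1"}, {"id": "B2"}]
scraped = [{"id": "B2"}, {"id": "C3"}]
filter_new_listings(existing, scraped)  # -> [{"id": "C3"}]
```

The key insight is building the set of existing IDs once, so each new listing is a constant-time lookup instead of a scan of the whole sheet.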
Taught myself to do a similar thing with Selenium and Python the other day (w/ ChatGPT)… but I had to get it to send a 2FA code to an email and then fill that in to log in. Then it goes into the site and scrapes the data I need daily, and sends it to a spreadsheet via Make. Your process seems easier… and you don't have to run it locally on your computer like I do. Great video as always! Gives me new ideas on how I can approach future projects.
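For anyone wiring up a similar 2FA-via-email flow: once the code email lands in an inbox you control, extracting the code is usually just a regex. A minimal sketch (the message wording and 6-digit format are assumptions, check your actual emails):

```python
import re

def extract_2fa_code(email_body, digits=6):
    # grab the first standalone run of N digits; format varies by site
    match = re.search(rf"\b(\d{{{digits}}})\b", email_body)
    return match.group(1) if match else None

extract_2fa_code("Your verification code is 493027. It expires soon.")  # -> "493027"
```

Fetching the email itself would be done with `imaplib` or a mail API before calling this helper.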
I just stumbled upon your channel and watched a couple of videos. This was great and I learned something new, even though I have done some scraping for clients. I think you know how to dump all that data into Google Sheets using only 2-4 operations. There's an idea for one of your future videos 👌🏻
I really appreciate this 🙏 Yes, I was considering recording a vid on efficient Google Sheets operations (CSVs let you do so many workarounds), but thought I'd keep it simple for now. Thank you!
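The CSV workaround hinted at here boils down to serializing all rows into one file and importing it in a single operation, instead of burning one Add Row operation per record. A sketch in plain Python (field names are made up):

```python
import csv
import io

def records_to_csv(records, fieldnames):
    # one CSV string = one bulk import instead of one Add Row per record
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

rows = [{"title": "Office A", "price": 100}, {"title": "Office B", "price": 200}]
csv_text = records_to_csv(rows, ["title", "price"])
```

In Make you'd then hand `csv_text` to a single upload/import step rather than iterating rows.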
Bro I’m a sales guy from India and I love your upwork RSS video ❤ thank you so much all this is WILD
As a sales guy from Canada I love this comment ❤️ best of luck man
I've used a couple of hidden APIs with Insomnia + free online tools to convert JSON to CSV. Never thought of using Make 🤔. Thanks for sharing
Insomnia works too, and there are defo better solutions for high-volume scraping. But I have a hammer (a Make.com subscription) and it makes everything look like a nail :-)
Thanks for this! I hit {"error":"Anti forgery validation failed"} ... thoughts on how to go around it?
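That error usually means the site expects a CSRF/anti-forgery token to be sent back with the request. A common pattern is to fetch the page first, pull the token out of the HTML, and resend it as a form field or header. The input name below is ASP.NET's default; other stacks name it differently, so check your own page source:

```python
import re

def extract_antiforgery_token(html):
    # "__RequestVerificationToken" is ASP.NET's default hidden-input name;
    # other frameworks use e.g. "csrf_token" or a meta tag instead
    match = re.search(
        r'name="__RequestVerificationToken"[^>]*value="([^"]+)"', html
    )
    return match.group(1) if match else None

page = '<input type="hidden" name="__RequestVerificationToken" value="abc123">'
extract_antiforgery_token(page)  # -> "abc123"
```

In Make, you'd do the same with two HTTP modules: one GET to fetch the page, a text parser to grab the token, then the real POST including it.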
Just stumbled across your video, freaking awesome. Can I scrape anything within a website? How hard would it be to set up something where we could tell it exactly what we wanted from the website, like specific data?
Thanks for including the part about the headers, this was the missing piece of the puzzle for me.
Same here! Blew my mind when I figured it out
@@nicksaraev how can one get into your mentorship/course, please?
Thank you man! Thank you for everything you're doing.
My pleasure bro!
I feel like Neo after he had all that stuff downloaded into his brain...
One question, Nick; at 13:28, for Request Content, did you paste the information from "copy value" (which in this case was the same as "copy object")?
First, shame on these JavaScript devs for putting the API calls in the front-end like that. Second, not sure if I'd be making a video explaining how to steal other people's API keys like that. This poor site owner is probably going to get an $80k API bill next month now, lol. Anyway, hello from Calgary.
Hi fellow Calgarian! Hope you enjoyed the snow dump today 😤
Agree it's bad programming practice. You'd be surprised at how many sites still do this though (majority of listing services atm).
To your second point, you're referring to the fact anyone can query Crexi without authentication? If so ya that sucks and they should fix ASAP.
Could you help think of a way to take in a google doc with headers and how to extract each section? Or make a video on it? I’ve tried with JSON and text parser but no dice.
Hmm. I'm not sure if you mean "Heading" or "Header" here, but here's a Loom video I just recorded for you that walks through how I personally work with Google Docs modules:
www.loom.com/share/c95078e365be4855bb1eb52cd030cd15?sid=7cb55fd1-3593-4ac1-8798-621f7eca0679
I hope this helps you! 🙏
Hey Nick, you are amazing! Yes, this is exactly what I was looking for. I unfortunately do need all of the text under a specific header section, so I will play around with this to see if I can figure it out. My goal is to take the paragraphs and put them into a Google Sheet (with the source document being a Google Doc). I have spent hours playing with this using parsing and converting to HTML, but it was a mess and didn't work in the end. I will let you know if I cannot figure it out. Thank you again and your videos are fire! Keep it up, these are super helpful!!! @@nicksaraev
You are awesome and kind! Thank you for filming the Loom.
I am doing exactly what you wrote and even understand the path, and I can see the HEADER_1 in the get contents from the Google Doc, but it keeps coming back empty in the Get Multiple Variables. Any ideas? @@nicksaraev
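For anyone else stuck on this: the Google Docs API returns the document body as a list of structural elements, and you can walk it, switching "capture" on at the heading you care about and off at the next heading. A sketch against a simplified version of that shape (not a full Docs API client, just the walking logic):

```python
def text_under_heading(doc_content, heading_text):
    # doc_content mirrors the Docs API body.content list (simplified shape)
    collected, capturing = [], False
    for element in doc_content:
        para = element.get("paragraph")
        if not para:
            continue
        style = para.get("paragraphStyle", {}).get("namedStyleType", "")
        text = "".join(
            el.get("textRun", {}).get("content", "")
            for el in para.get("elements", [])
        ).strip()
        if style.startswith("HEADING"):
            # switch capture on at our heading, off at any other heading
            capturing = (text == heading_text)
        elif capturing and text:
            collected.append(text)
    return collected
```

Each item in the returned list is one paragraph of the section, ready to map onto Google Sheet rows.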
Quick Question -- How do you resolve a status code 202 error on this flow?
Oh wow, refreshing the token is genius!!!!
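The refresh trick generalizes to a small retry-on-401 pattern: make the call, and if the token has expired, fetch a fresh one and replay the same request once. Everything below is a mock sketch of that pattern, not any specific site's API:

```python
class TokenClient:
    # fetch_token and do_request stand in for real HTTP calls
    def __init__(self, fetch_token):
        self.fetch_token = fetch_token
        self.token = fetch_token()

    def request(self, do_request):
        status, body = do_request(self.token)
        if status == 401:
            # token expired: refresh once and replay the same request
            self.token = self.fetch_token()
            status, body = do_request(self.token)
        return status, body
```

In Make, the equivalent is a router branch that catches the 401, re-runs the token request, and retries the original HTTP module.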
Hey man, another great video - absolutely insane! Is there a way to do this if the content-type is "application/x-www-form-urlencoded"?
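Yes, the same approach works; the only change is how the body is encoded. Instead of a JSON object, the payload is percent-encoded key=value pairs, and the Content-Type header is set to `application/x-www-form-urlencoded` (which Make's HTTP module offers as a body type). The field names below are made up:

```python
from urllib.parse import urlencode

# JSON body -> form body: same fields, different encoding
payload = {"searchTerm": "office space", "page": 1}
body = urlencode(payload)
# body == "searchTerm=office+space&page=1"
```

Note that spaces become `+` and special characters get percent-encoded, which `urlencode` handles for you.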
Great video. Is there a way to use the Hugging Face API instead of OpenAI?
Yes for sure! You'd sign up for a Hugging Face account and use their Hosted Inference API. Here's a guide: huggingface.co/docs/api-inference/en/quicktour
TLDR: you get your API key and then call this URL: api-inference.huggingface.co/models/
There are thousands of models you can choose from and you just swap the model name as needed. I'd use the "Make a request" HTTP module in Make like I showed you.
Hope this helps bro 🙏
@@nicksaraev I did give it a try using the Hugging Face module, but it seems to be giving a 404 route issue. I'll try the HTTP module. Thanks for sharing.
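If the dedicated module 404s, the raw HTTP request is easy to assemble yourself from the quicktour linked above. This sketch just collects the three pieces the HTTP module needs; the model name and API key are placeholders:

```python
def build_hf_request(model_id, api_key, prompt):
    # URL, auth header, and JSON body for the HF Inference API
    return {
        "url": f"https://api-inference.huggingface.co/models/{model_id}",
        "headers": {"Authorization": f"Bearer {api_key}"},
        "json": {"inputs": prompt},
    }

req = build_hf_request("gpt2", "hf_xxx", "Hello there")
# then POST req["url"] with req["headers"] and req["json"] as the body
```

A 404 on this endpoint usually means the model ID in the URL is misspelled or the model doesn't exist on the Hub.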
STRONG CONTENT❤🔥❤🔥
STRONGER COMMENT❤️🔥❤️🔥
What is the tool you use for browser automation?
Apify! Added to the description.
Where can I find a list of sites with hidden APIs like this? Are there Facebook groups for this topic? Can anyone recommend one?
Pretty much every site has a hidden API! It's just a question of how much they safeguard their data, and whether the rate limits make scraping logistically feasible. I would jump into network requests ("Inspect Element") next time you're on any site that might be worthwhile.
@@nicksaraev is Zillow worth looking into, or do the rate limits not make it worth it?
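Following the Inspect Element advice above: once you've spotted a JSON endpoint in the Network tab, replaying it usually just means copying its URL, query params, and headers. A generic sketch; the endpoint and header values here are placeholders copied-from-DevTools style, not any real site's API:

```python
def replay_request_kwargs(endpoint, browser_headers, params):
    # mirror what the browser sent; all values below are placeholders
    return {
        "url": endpoint,
        "headers": {"Accept": "application/json", **browser_headers},
        "params": params,
    }

kwargs = replay_request_kwargs(
    "https://example.com/api/v1/search",
    {"User-Agent": "Mozilla/5.0"},
    {"q": "listings", "page": 1},
)
# then e.g. requests.get(kwargs["url"], headers=kwargs["headers"], params=kwargs["params"])
```

Whether a given site is "worth it" mostly comes down to how aggressive its rate limiting and bot detection are once you start replaying requests at volume.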
That's fucking class, Nick! How did you figure that out?
First! Thanks for the awesome content
First reply! Thanks for the awesome comments
May God guide you and be pleased with you, my dear
Amazing 🔥🔥🔥🔥🔥🫡❤