Thank you so much bro. I'm brand new to this stuff, ngl it was overwhelming. You simplified it so much!!! 🎉 Thank you, keep up the great content ❤❤
Great content, man!! Will try this week!!
Great tutorial!
Man!!! Really great and Thanks for sharing your knowledge... Much appreciated!!!
Awesome content. Extremely helpful!
Glad it helped!
Man, you are so smooth. I am wondering what screen recording software you use to record yourself and the screen.
Thanks for the content 🎉
Appreciate that! I use OBS to record my screen
@nate The question you asked about the HTTP request has to do with pagination. I'm not sure if n8n supports pagination in an HTTP GET request, but that would let you go further down the results when you want more than 10 at one go.
Ah okay, I tried adding a paginated URL, but you may be right; it's something I'll have to play around with again. Thanks for the feedback!!
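For anyone wanting to try the pagination idea: Google's result pages can be walked with the start query parameter (10 results per page). A minimal sketch of building the paginated URLs in a Code node, where the search query itself is just a hypothetical placeholder:
// Build paginated Google search URLs, 10 results per page
// The query below is an example; substitute your own search
const query = encodeURIComponent('site:linkedin.com/in "software engineer"');
const pages = 3; // number of result pages to fetch
const urls = [];
for (let page = 0; page < pages; page++) {
  // Google's "start" parameter offsets the results: 0, 10, 20, ...
  urls.push(`https://www.google.com/search?q=${query}&start=${page * 10}`);
}
// One item per URL, so a downstream HTTP Request node fetches each page
return urls.map(url => ({ json: { url } }));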
Thanks 👍 Probably if you add one more feature that analyzes the profile and prepares a unique outreach message for the person, some services will hate you 😅
That is definitely possible! Check out this video where I show AI generating personalized messages for leads: th-cam.com/video/fz1RduhHeks/w-d-xo.htmlsi=id3TDgLdMMtGsJx6
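As a rough sketch of that idea (not the exact workflow from the video): a Code node can build a personalized prompt per lead, which an OpenAI node can then turn into a message. The field names name and headline are assumptions about what the upstream scraper provides:
// Build one outreach prompt per lead item
// "name" and "headline" are assumed fields from the previous node
return $input.all().map(item => {
  const { name, headline } = item.json;
  return {
    json: {
      prompt: `Write a short, friendly 2-3 sentence outreach message to ${name}, whose LinkedIn headline is "${headline}". Reference one specific detail from the headline.`
    }
  };
});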
What other free LLM would you prefer instead of GPT?
Could we use Gemini?
You can test out all of them and they should all get the job done. I personally have only tested out OpenAI and Ollama, and I've just been sticking with OpenAI because I think long term it will be the most advanced and most consistent.
Oh hey, I am trying to build this tool. Would you mind sharing the code you used to get the LinkedIn URLs from the scraped HTML? Hitting a snag here with GPT.
I can get this to you tomorrow
@@nateherk Thanks Nate
Here's the code!
// Get the HTML content from the previous HTTP Request node
const htmlContent = $node["HTTP Request"].json.data; // Adjust if the content is in a different field
// Convert the content to a string (in case it isn't already)
const htmlString = String(htmlContent);
// Regular expression to match LinkedIn profile URLs in href attributes
const regex = /https:\/\/www\.linkedin\.com\/in\/[a-zA-Z0-9\-_]+/g;
let matches;
const linkedinUrls = [];
// Extract all LinkedIn URLs found in the HTML
while ((matches = regex.exec(htmlString)) !== null) {
  linkedinUrls.push(matches[0]); // Add the LinkedIn URL to the array
}
// Remove duplicates by converting the array to a Set and back to an array
const uniqueLinkedinUrls = [...new Set(linkedinUrls)];
// Return the list of extracted LinkedIn URLs
if (uniqueLinkedinUrls.length > 0) {
  return uniqueLinkedinUrls.map(url => ({
    json: {
      linkedinUrl: url
    }
  }));
} else {
  return [{ json: { message: 'No LinkedIn profiles found' } }];
}
@@nateherk Thank youuuu!!!!!!
Thank you for this tutorial. It's a great basis for developing leads. I assume I can add more information to be gathered, such as company name, contact details, business address, etc.?
The big issue I currently have is parsing the HTML. I have tried using the ChatGPT node in n8n and it failed on quite a few occasions. Can you please provide the code that you used? Thank you so much.
Here's the code! (It's the same snippet posted in the reply above.)
@@nateherk Legend! Thank you so much
@@JonathanBarber-hi3vj No problem
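On the earlier question about gathering extra fields: company names and addresses don't have a fixed shape, so a regex won't catch them reliably, but anything with a predictable pattern (such as email addresses) can be pulled the same way as the LinkedIn URLs. A sketch extending the approach above, not from the original video:
// Extract email addresses from the same scraped HTML
const htmlString = String($node["HTTP Request"].json.data);
// Simple email pattern; good enough for scraping, not full validation
const emailRegex = /[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}/g;
const emails = [...new Set(htmlString.match(emailRegex) || [])];
return emails.length > 0
  ? emails.map(email => ({ json: { email } }))
  : [{ json: { message: 'No email addresses found' } }];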
I made a similar one a couple months ago that scrapes REALLY well. But my method was WAY WAY different than your method. I would love to connect with you about a current challenge I am facing though and maybe we could work through it together?
Hey man! Thanks for the feedback. Feel free to shoot me an email and we can set something up! nateherk@uppitai.com
@@AutoFlow_AI Yes, that will happen with APIs; it's usually not too expensive with some API services. Definitely worth it when you think about how much time you're saving the business!
Please can you share the JavaScript code to scrape the URLs? I am facing an issue with it.
Hey! You can find the code in my free Skool community. Just navigate to the classroom and look for Google Scraping AI Agent!
Thank you so much for such great content. Appreciate it!!!!!!
Thank you!!
Amazing work bro! Is it easy to convert to an Instagram version?
By instagram version do you mean scraping google for Instagram profiles? If so, yes! You would just have to change the parameters a bit within the HTTP Request node.
@@nateherk Thanks for the quick response brother!! I will test it out this week!
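For the Instagram variant, the change is essentially the Google query in the HTTP Request node plus the URL pattern in the extraction code. A sketch assuming the same setup as the LinkedIn version:
// Extract Instagram profile URLs instead of LinkedIn ones
// Pair with a Google query like: site:instagram.com "fitness coach"
const htmlString = String($node["HTTP Request"].json.data);
// Instagram usernames: letters, digits, dots, underscores
const regex = /https:\/\/www\.instagram\.com\/[a-zA-Z0-9._]+/g;
const profiles = [...new Set(htmlString.match(regex) || [])];
return profiles.length > 0
  ? profiles.map(url => ({ json: { instagramUrl: url } }))
  : [{ json: { message: 'No Instagram profiles found' } }];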
Nice stuff. But just out of curiosity: I see you always using GPT for parsing. Why not just use a Function node and save the GPT credits?
So, like:
// Get the query string from the input
const queryString = $json["query"];
// Create an object to hold the parameters
const params = {};
// Split the query string into key-value pairs
queryString.split('&').forEach(pair => {
  const [key, value] = pair.split('=');
  if (key) {
    params[key] = decodeURIComponent(value || '');
  }
});
// Return the parsed parameters, wrapped in "json" the way n8n items expect
return [{
  json: {
    jobTitle: params["jobTitle"],
    companyIndustry: params["companyIndustry"],
    location: params["location"]
  }
}];
This is very true and something I should’ve mentioned. I try to make these tutorials as “Low code” as possible, but you’re 100% correct. Thank you!
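One refinement to the Function node idea above: JavaScript's built-in URLSearchParams already handles the splitting and decoding (including '+' as a space), so the manual loop can be dropped. A sketch:
// Parse the query string with the built-in URLSearchParams
const params = new URLSearchParams($json["query"]);
return [{
  json: {
    jobTitle: params.get("jobTitle"),
    companyIndustry: params.get("companyIndustry"),
    location: params.get("location")
  }
}];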