Not only is this incredibly useful, it is also wonderfully explained. Thanks man!
My pleasure. Thank you very much.
Same here... but your reaction while deleting the URLs in the table!!
This video tutorial is now about 16 months old, but it's going to help me pull in the needed data from my UberEats deliveries. I realized I needed to automate the data to help maximize my tax deductions, and using a Power Query function to bring in multiple pages is exactly what I need. So thank you very much. I look forward to getting it implemented so that I can link the Excel files in MS Access for further data analysis.
Excellent! Thank you.
You, sir, ARE A LEGEND. I searched for ages to accomplish what this allows me to do and came up empty. When I landed on you, I was home free. Thank YOU!!!
Great to hear, Kurt. Happy to help.
Searched for this all over . This has been the best example I've seen and very well explained. Thank you very much
Glad it was helpful! Thank you 😊
Clear. Concise. To the point. This might be 5 years old, but it just helped pull data from 150 pages and saved me who knows how much time. Thanks!
You've become one of my top instructors and YouTube channels with only one single video! Thanks a million!
Thank you Isparoz.
Hello.
I had difficulty self-studying because there were not many Korean books that taught Power Query editing or parameters.
Thanks to your video, I have integrated multiple APIs into a single query.
I clicked Like and Subscribe.
Have a nice day!!
You are a professional data scientist and we are looking for people like you to join us in the Ummah.
I can't thank you enough for this detailed, step-by-step video. It's 3 years old but still valid and valuable. God bless you!
Glad it helped! Thank you 💓
This video is still the best when it comes to Excel web scraping! For me it was a breakthrough. Many thanks!
You're very welcome! Thank you for your comments Johann.
I can't thank you enough, Computergaga! My work project just became a breeze.
You're very welcome.
I never see videos with this many views that have 0 dislikes, but after trying this solution, I understand why. Great video!
Thank you very much.
Dude, these steps still work even now and are easy to understand. Thanks man!
Glad it helped!
I'm overwhelmed by the incredible level of clarity this tutorial displays! Kudos
Wow, thanks!
I normally don't leave comments but just hit the "like" (or not) button. But this video honestly, is very well made and helped me a lot. Thank you Computergaga.
You're very welcome. Thank you for the comment Luigi.
You made my day, arriving right on time when I was looking for an immediate resolution. Thanks!
Excellent video. I used it to download 150 tabs of info on the WSJ website with all public companies listed in USA and it worked like a charm. Thanks a lot.
Awesome! Thanks for sharing.
I couldn't be more grateful to find this channel. Dear Computergaga, you explain everything so clearly and understandably :) A real life-saver!
Thank you very much 😊 That's great!
You saved me several hours of manual operational work
Great! 👍
Awesome video! Seriously, days of searching and then you explained everything I needed in 12 minutes!
To fix "Expression.Error": if you're pulling a lot of dates or websites, and maybe some of them don't exist (or don't exist yet), you'll get this error and it'll mess up everything.
Solution: @9:20 in the video, simply right-click your FetchMovie-equivalent column header and select Remove Errors.
Excellent Dusty.
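For anyone curious, the Remove Errors step Dusty describes compiles down to a single line of M. A minimal sketch of the whole pattern, assuming a source table named URLs and a separate function query FetchMovie like the one built in the video (both names hypothetical, matching the comment above):

```
let
    // Hypothetical source: an Excel table named URLs with a URL column.
    Source = Excel.CurrentWorkbook(){[Name = "URLs"]}[Content],
    // Invoke the custom fetch function once per row, as in the video.
    Invoked = Table.AddColumn(Source, "FetchMovie", each FetchMovie([URL])),
    // Remove Errors generates this step: rows whose fetch failed (a page
    // that doesn't exist yet, say) are dropped instead of breaking the refresh.
    RemovedErrors = Table.RemoveRowsWithErrors(Invoked, {"FetchMovie"})
in
    RemovedErrors
```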
Thank you! Thank you! Thank you!
WOW. It works perfectly. It is exactly what I need. Such a perfect example. You saved me hours and hours of a boring task. Many thanks.
Great to hear. Thank you 👍
Easily one of the best videos on YouTube
Thank you very much 😊
This was my first fx table. Thank you so much. Please follow this video up with a 5min time intelligence table video if you want. I'm going to watch your other videos.
One of the best tutorials I've ever seen! Thanks a lot!
You're welcome. Thank you.
Wow, excellent information through simple teaching... even a layman can understand this module... it creates more interest to learn... Great work! From a non-computer-literate man from South India, Coimbatore, Tamil Nadu... slowly learning computers.
So nice of you. Thank you Niranjan.
One of the best tutorials. Does the job in a matter of minutes.
Thank you, Chirag.
Superb! The way you taught it is really great. I was searching for this all over the net.
You're welcome. Thank you.
I am grateful for this... I had a dataset of 40,000+ rows across 82 pages and it worked perfectly. Thank you for sharing the knowledge. I'm gonna try this in Power BI too and see if it works. Thanks again!
How much time does it take to fetch the data?
I am also REALLY happy, as it helped me with the following use case.
I have addresses I want to scrape off the web. They are nicely laid out in a table, but there is a hierarchy: umbrella company -> daughter companies -> branch addresses. I got the URLs from a sitemap and needed to load everything at once... and this helped me to do exactly that. Hopefully it helps someone else too!
Awesome! Thank you for leaving that comment.
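For anyone adapting this sitemap use case, the whole pattern rests on one small function query that takes a URL and returns the table found on that page. A minimal sketch in M with hypothetical names (FetchPage as the query name, and the addresses assumed to sit in the first table on each page):

```
// Saved as a query named FetchPage, then run once per sitemap URL via
// Add Column > Invoke Custom Function, as demonstrated in the video.
(url as text) as table =>
    let
        // Download the page and let Power Query parse out its tables.
        Page = Web.Page(Web.Contents(url)),
        // Assumes the data sits in the first table on the page;
        // adjust the index if the layout differs.
        FirstTable = Page{0}[Data]
    in
        FirstTable
```

Expanding the resulting column then stacks every page's rows into a single table, which is what loads the whole hierarchy of branch addresses at once.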
I'm not one for writing comments very often, but I have to say this video is so useful, interesting and well explained that I had to.
Thank you 👍 I appreciate your comments, very much.
Worked like a charm!! No tutorial has ever worked for me in one go!! Thanks!
Happy to help, Gaurav. Excellent!
Perfectly explained! Why would anyone dislike this!?
Thank you Adr 😀
This is gold, man, thank you. I was able to fetch currencies from the last 7 years from the XE site and they love me now. I was also able to get data from steamship lines and more. Thank you so much, man. God bless you.
You're very welcome. Nice work Kenneth.
@@Computergaga Can you show us how to do the same thing but with PDF files? Meaning the source would be the folder where the PDFs are located.
Awesome tutorial, breaking down this complex process into easy-to-follow steps along with a good example. Thank you, sir.
This is so easy that it seems impossible. I've tried so many solutions from the internet that were not even close to what you've demonstrated. Huge credit! Thx!
You're welcome Szymon.
Thank you very much, just spent hours trying to get this to work on other sites.
No worries, Paul. Very glad to help.
I searched all over the internet and finally found your video. It is exactly what I need. I can't thank you enough for this well-explained and detailed video. However, when I try to create my own version, I get an issue when there are pages with no data: the queries stop working from that date. I hope you can help me with this issue. Many thanks in advance.
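A common workaround for those empty pages, for anyone hitting the same wall: wrap the fetch in try ... otherwise so a page with no data returns null instead of an error that halts the refresh from that date onward. A hedged sketch, assuming a URLs source table and a fetch function named FetchPage (both hypothetical):

```
let
    Source = Excel.CurrentWorkbook(){[Name = "URLs"]}[Content],
    // try ... otherwise turns a failed or empty fetch into null rather than
    // an error, so one dataless date no longer stops the whole query.
    Added = Table.AddColumn(Source, "Data", each try FetchPage([URL]) otherwise null),
    // Drop the dates that returned nothing before expanding the rest.
    NonEmpty = Table.SelectRows(Added, each [Data] <> null)
in
    NonEmpty
```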
Love it ! Thank you so much! With the new Web Connector, this became so much better!
Oh yes! 👍Thank you. You're welcome.
This was a huge help! Needed to download over 5000 rows of data. You saved me so much time. Thank you!
Awesome to hear. Happy to help Brock.
Thank you! Very useful and it changes the structure of my work a lot. More efficient.
Great to hear. Thank you.
YouTubers should look to this to see how to do it properly..... excellent.
Thank you, Don.
this is amazing, easy and powerful, many thanks
You're very welcome! Thank you 😊
Thank you so very much... beautifully explained. A highly underrated video. Just brilliant.
Thank you 😊
Thank you so much! I was able to compile 28k records... each page had only 100 records. It was so useful and so well explained.
Brilliant! Glad it helped!
I have never seen a like-to-dislike ratio like this on any video. Well done, sir.
Thank you so much Afzal 😀
Thanks, you helped me finish an important part of my Business Intelligence internship project.
Happy to help 👍
Mahalo! You solved my whole problem just a few minutes into looking into how this works.
👍
Hi, I can only say thank you!! You're an amazing teacher. Cheers.
Thank you! 😃
What a piece of knowledge... very useful for me... thank you very much.
You are most welcome. Glad that it was useful.
Life saver.... Can't thank you enough... Been using this for months....
You're welcome, Serkan 👍
Thank you so much for such an incredibly informative, easy-to-follow and helpful tutorial. It was absolutely brilliant for helping me challenge my council tax band increase. Liked and Subscribed.
Glad it was helpful, Darren.
Very helpful video! Useful technique, and explained in an easy to understand way.
Thank you, Bill.
I was desperately in need of this and your video incredibly came my way. Besides, your presentation is perfect... thank you.
You're very welcome 😊
Very clearly explained. Very good presentation.
Thank you 🙂
You guys are awesome. This video is what I needed and it completely blew my mind. I got quite confused regarding the URLs table and the URL column, but I handled it and applied it exactly as you did to the example that concerned me. Big THX!
Cool tutorial! Thank you. Bombastic! I have learned how to make a function out of a procedure!
Excellent! No worries, buddy.
This is an excellent video. It helped me overcome a major hurdle today. Thank You!
Great to hear. Thank you, James 😊
Brilliant! Excellent teaching skills; very clear and understandable.
Would it be possible to do a video showing how to get data from a username- and password-protected website, please?
This is brilliant! I love how you go through the steps; it seems like you're commentating on a football match! :) Don't get me wrong, it's really well explained! Thanks for that.
Thanks! 😃
Fantabulous....... Great explanation & it worked on the first go.... Thanks a ton.
Most welcome, Hari 👍 Thank you.
This is incredibly helpful and useful. It really helped me solve a 2-year headache. Thank you so much. I appreciate the detailed guide.
You're very welcome!
Hitting Subscribe after watching your very first video.... ultimate... the way you explain...
👍
Man should think, machines should work. You have done an excellent job. Hats off!
Thank you Narayan.
Thanks, this solved a data input issue I was having.
Very well explained.
Thank you. Glad to be able to help.
Excellent & smart stuff, saving tons of work. Grateful for sharing!!!
Very welcome! Thank you.
You have a great teaching style
Thank you, Viktor.
FANTASTIC vid. Thank you!!!
I'm scanning for Zip Codes and the address in one of the URLs isn't valid, so I get an error. How do I tell the function to ignore it and continue?
Thank you! Took me way too long to find this gem
You're welcome.
You have saved me a huge amount of time - thank you so much!
You're very welcome.
That is sick! ;) I've never seen anyone add text to Power BI code like that before. It looks very advanced and powerful. I really love it. It opens eyes ;)
Thank you, Enrike 😊
Excellent!!!! This is what I was looking for. Very well explained, step by step.
Thank you, Dhananjay
Thank you Computergaga. This was a superb explanation and tutorial.
You are welcome, Robert.
Thank you so much! You really saved me here (And taught me how to use a great and powerful tool)!
Awesome! Good to hear.
Thanks so much for your excellent tutorial!!! It really helped get the job done!!! 👍👍👍
Hi, thank you for the video. I have a quick query: if I want to fetch a table after clicking on "Jun. 28-Jul. 4" @0:53 (and do the same for all weeks dynamically), which gives me another table... how do I handle this? Please advise. Thank you.
WOW!! I know this video is old.. But thank you for making it. I never would've figured this out on my own 😅
You're very welcome! Glad that it helped.
Excellent, easy and smooth explanation 👌
Thanks a lot 😊
Great video and a great explanation!!! Congratulations.
Thank you, Xavier.
Many thanks, Mr Computergaga - awesome!
Thank you so much, Ian.
This is super useful. I won't remember it, but I will come back again when I need it.
Thank you!
Great info! What if my URLs are in 2 columns instead of 1? In my example, I have one column for the actuals and one for the goals... Thank you, I've been searching for this solution for a while.
It is an amazing technique, very easy to manage. By the way, could we build this same form of query for a local Excel table?
This is a great tutorial with a very clear explanation
Thank you very much.
This is an excellent tutorial - clear and easy to follow, thank you! It's saved me a ton of time.
You're welcome, Matthew. Thank you.
This tutorial is pure Gold! Thanks! Looking forward to more videos from you. :)
Thank you very much Subanta.
Thanks, Alan. You are awesome!
Thank you 😊 you're welcome
This is brilliant mate. Many thanks
Hi, I know this video is a few years old, but hopefully you occasionally check in for questions/comments. By the way, thank you for this tutorial; it has made my Excel life so much easier.
I was trying to create a new worksheet using over 300 URLs, and I'm wondering if that is just too many, because it always gets stuck about a third of the way through loading the rows.
Any ideas why?
Thanks a lot! This is perfect. However, I was wondering if there's a workaround for when the columns from the web are variable.
Depends on the variable. If you're referring to the URL, we will need a way to calculate it using this method.
If you're referring to the page, we would build into the query a method to find the table.
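As a sketch of that second case, one way to build a table-finding method into the query is to select the table by a column it must contain rather than by position, so a shifting page layout still resolves. The URL and the "Actual" column name below are hypothetical:

```
let
    // Web.Page lists every table it finds on the page.
    Source = Web.Page(Web.Contents("https://example.com/report")),
    // Instead of hard-coding Source{0}, keep only the tables that contain
    // an expected column, then take the first match.
    Candidates = Table.SelectRows(Source, each Table.HasColumns([Data], {"Actual"})),
    Data = Candidates{0}[Data]
in
    Data
```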
Many thanks for your great help ❤️❤️❤️❤️❤️❤️❤️
You're welcome 😊
Creative solution to a problem I was having! Amazing job and very helpful.
Thank you.
@@Computergaga You're welcome. Do you have any courses/tutorials outside of YouTube that I can check out, or is all your stuff on your channel?
@@jkkslldsl Sure. I have courses and tutorials at my blog - www.computergaga.com
This tutorial is so clear and easy to understand. I have a problem, though: the resulting table only shows 10 rows of data (because the website's default is to show 10 rows, though it can be changed to show all). How can we fetch all the data?
Thank you for this video.. Saved me so much time
Great to hear! Thank you.
Fantastic!! It saved me so much time. Very well explained
Glad it helped! Thank you, Luis.
Hi Alan, happy new year, I hope you have a good one. You've been a great help so far, but on this I'm stuck: I want to get certain pages of a paper sent to my Excel sheet, but all I'm getting is the hyperlink to the paper. Can you help? Also, I have 6 sets of 4 figures that I want to check against each other and then backtrack 1,000 times. Do you think I'm being over-optimistic in trying this exercise?
Incredible video. I have been trying to learn how to do this for ages now, and no other video I've seen is nearly as helpful.
Great! Happy to help, Nathan.
This was a great video. Thanks so much for the tutorial
You are so welcome!
Excellent video. A question about the data source: if I use an API URL with api-key headers and get results and all, and then pass this file to someone else, would they be able to extract the API key? In this case the api-key is not part of the URL, but sent as separate headers.
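For context on that question: if the key is typed straight into the query, it is stored as a text literal in the M code, so anyone who opens the file can read it via Home > Advanced Editor. A hypothetical sketch of such a call is below; by contrast, a key entered through Data Source Settings using the Web API credential type stays in the local credential store and does not travel with the file.

```
let
    // Hypothetical endpoint and key. Written this way, the api-key is part
    // of the query text itself and is visible in the Advanced Editor.
    Source = Json.Document(
        Web.Contents(
            "https://api.example.com/v1/results",
            [Headers = [#"api-key" = "abc123-not-a-real-key"]]
        )
    )
in
    Source
```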
Good one, bro! I got some manual work automated now... thanks very much.
Great 👍
This video was amazing and so easy to follow. Thanks!!!
You're welcome, Sam. Thank you 😊