Fetching data using Web Scraping | Day 18 | 100 Days of Machine Learning

  • Published on 1 Apr 2021
  • Machine learning algorithms are powerful tools for analyzing large amounts of data. Developers who need more training data than they have access to can use a web scraping tool to extract the right kind of information from publicly available websites.
    Web Scraping is a technique to extract data from websites, automating the process of fetching information for analysis.
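The fetch-and-parse workflow described above, in a minimal sketch (the HTML snippet and class names are placeholders, not the actual page from the video; a real run would first download the page with requests):

```python
from bs4 import BeautifulSoup

# Placeholder HTML standing in for a downloaded page; a real scraper
# would obtain this with requests.get(url, headers=...).text
sample = """
<div class="company"><h2>Acme Corp</h2><span class="rating">4.1</span></div>
<div class="company"><h2>Globex</h2><span class="rating">3.8</span></div>
"""

soup = BeautifulSoup(sample, "html.parser")
# find_all returns every tag matching the given name and class
names = [card.h2.get_text() for card in soup.find_all("div", class_="company")]
print(names)  # -> ['Acme Corp', 'Globex']
```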
    Code used:
    github.com/campusx-official/1...
    ============================
    Do you want to learn from me?
    Check out my affordable mentorship program at: learnwith.campusx.in/s/store
    ============================
    📱 Grow with us:
    CampusX's LinkedIn: / campusx-official
    CampusX on Instagram for daily tips: / campusx.official
    My LinkedIn: / nitish-singh-03412789
    Discord: / discord
    Instagram: / campusx.official
    E-mail us at support@campusx.in

Comments • 204

  • @learnenglish699
    @learnenglish699 2 years ago +114

    I cried when I understood all the concepts... teachers like you should exist... belated happy Teachers' Day... take a bow

    • @snehaltheprogrammer
      @snehaltheprogrammer 4 months ago

      Can you please tell me where he took this header URL from? Which URL is this?

    • @RohitKumar-wb4pe
      @RohitKumar-wb4pe 3 months ago

      @@snehaltheprogrammer He has given his GitHub link in the description, where you can find the header URL.

    • @AnshAneja-cf3le
      @AnshAneja-cf3le 1 month ago

      @@snehaltheprogrammer It's just a random header. You can go online and use any random headers.

  • @Shubham-su7sm
    @Shubham-su7sm 1 year ago +13

    I was so frustrated because I couldn't find any good web scraping tutorials. Thanks man, you're a legend 🙏🏻🙏🏻

  • @simranagichani4943
    @simranagichani4943 4 months ago +6

    No words to describe how clear and descriptive your videos are. You're just legen-dary!

    • @TheNeu2ron
      @TheNeu2ron 2 months ago

      Yes, he is just amazing. After watching the Day 17 video, it just blew my mind.

  • @kalam_indian
    @kalam_indian 1 year ago +38

    I don't know which words I should use to express my respect and love for you. You are the best teacher and trainer I have ever seen: soft-spoken, calm, down to earth, and mature, with a charming smile throughout. We can all do the one thing you will surely appreciate: pray for you and your family, and we will do this regularly. You are just above all...

  • @walidadmahtam7512
    @walidadmahtam7512 1 year ago +1

    Great, Nitish bro. Very late, but after researching and analyzing for six months, I found you as my first teacher. I am happy now that I am following the path in the right direction 🙏😊

  • @user-bs5ck1yh8u
    @user-bs5ck1yh8u 5 months ago

    Well done, Nitish; you are such a great teacher. I have been following your playlist. I usually don't watch lengthy videos, but the way you teach is incredible and I love watching your videos. You have proved to be the greatest support for data science enthusiasts.

  • @paragvachhani4643
    @paragvachhani4643 1 year ago +1

    You remind me of my school-days teachers... the ones who were good at teaching with ample emotion, like you.
    Only when there is emotion can someone really teach somebody something...
    India needs teachers like you in higher-level education...
    Have a great day... Guruji

  • @sid9137
    @sid9137 1 year ago +13

    Man, this is top-notch content; I feel lucky to have found this playlist of yours, sir.
    Thank you for helping people like me improve our data skills. I really wonder why I couldn't find this playlist earlier; the content is pure quality and your teaching skills are mind-blowing. Thank you once again, sir.

  • @deb1995
    @deb1995 1 year ago +3

    Your videos are like oxygen for the data science & data engineering domain. The way you teach is highly appreciated.
    I was literally stuck trying to understand APIs, so I visited multiple websites and YouTube channels but didn't get any satisfaction or confidence. Finally, after 3 months of extreme searching on the internet, I landed on your video... thanks to you & the YouTube recommendation engine 🎉

  • @gourabguha3167
    @gourabguha3167 4 months ago +4

    Sir, I'm getting addicted to your channel; I wish I had seen your tutorials earlier

  • @te_b4_73_sushant_yelurkar4
    @te_b4_73_sushant_yelurkar4 2 years ago +4

    Brother, you are number one
    Good teacher
    There is no one like you ❤️❤️❤️

  • @zunaidqureshi8521
    @zunaidqureshi8521 1 year ago +4

    I struggled to find these web scraping concepts, but you explained them in just one video.
    Thanks, sir, for this wonderful explanation

  • @kasiviswanath7981
    @kasiviswanath7981 1 year ago +1

    The best YouTube channel for machine learning and data science. Thank you, sir, for your efforts

  • @mathics2869
      @mathics2869 2 months ago

      Best channel for machine learning. Thank you so much

  • @wrenchfusion
    @wrenchfusion 3 years ago +9

    You are the best of the best teachers. A great explanation; all the concepts became clear. Thanks a lot

  • @BollywoodDuffer
    @BollywoodDuffer 7 months ago

    The way you teach is superb. Thanks!

  • @TheChocovini
    @TheChocovini 1 month ago

    The way you teach is amazing, and even your editing skills are awesome!!

  • @rahulmungali7810
    @rahulmungali7810 1 year ago

    Best video for web scraping. 👍👍👍

  • @sameergupta150
    @sameergupta150 1 year ago

    Highly underrated channel.

  • @mohitjoshi8818
    @mohitjoshi8818 3 months ago

    Thank you for posting this, sir; very well explained.

  • @zaidnadeem4918
    @zaidnadeem4918 1 year ago +1

    Amazing teacher.
    My all-time favourite.
    MAY ALLAH BLESS YOU, SIR.

  • @mangeshtakras1889
    @mangeshtakras1889 1 year ago

    Best teaching of all 💐💐💐

  • @viditvaish7317
    @viditvaish7317 7 months ago

    Amazing, sir; you explain things very well. Best video, sir

  • @AbhishekKumar-eh6zy
    @AbhishekKumar-eh6zy 2 years ago +1

    Thank you so much, sir, for teaching such an important topic

  • @user-lg7xg9mn5o
    @user-lg7xg9mn5o 4 months ago

    Wow, amazing video. I have watched many videos but couldn't get clarity, so this man helped me with this web scraping skill

  • @anandtale1147
    @anandtale1147 1 year ago +2

    You are awesome. I usually do this with regular expressions, and I didn't realize there was another way using the logic of the HTML itself. Now I have one more way to scrape, and it's a simple and easy one. Thank you, God bless you 🙌🙌

  • @depalvveturkar
    @depalvveturkar 1 year ago

    WOW... I have not seen such a course... awesome... may God bless you with all the success..

  • @ankurshivhare8132
    @ankurshivhare8132 1 year ago

    Very nice web scraping tutorial; I learned from it and scraped data from the Internshala website.
    Thank you

  • @Aestheticdeeps
    @Aestheticdeeps 1 month ago

    Really the best piece 🙌 thank you so much, sir

  • @dailyenglishvocabulary3116
    @dailyenglishvocabulary3116 2 years ago

    Sir, your channel deserves millions of subscribers

  • @narendraparmar1631
    @narendraparmar1631 6 months ago

    Good lecture, sir. Thanks 😀

  • @bitanbarman7930
    @bitanbarman7930 11 months ago +2

    If anyone has trouble extracting the 'no. of reviews' of companies:
    for i in range(0, 115):
        if i % 6 == 0:
            print(soup.find_all('span', class_='companyCardWrapper__ActionCount')[i])
    The website has since changed its formatting and now uses the same class for 'salaries', 'interviews', 'jobs', 'benefits', and 'photos'.
    Every 6th element is the company's review count, so run a loop that extracts every 6th element.

    • @tarungoyal9395
      @tarungoyal9395 10 months ago

      Can you explain this code to me?
      Why did you use range(0, 115)?

    • @bitanbarman7930
      @bitanbarman7930 10 months ago

      @@tarungoyal9395 Because the webpage in question has since been updated, and within the same 'span' and class there are now 6 elements per company. On page 1 there are 20 companies listed; 6 * 20 = 120, and every 6th element is the 'no. of reviews': 0, 6, 12, ... 114 (the last one). You could actually use 120; since indexing starts at 0, the 114th element is the last review count for the companies on that page.
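The every-6th-element pattern discussed in this thread can also be written as a list slice; the class name in the comment is taken from that discussion and may have changed again since:

```python
# Six counts (reviews, salaries, interviews, jobs, benefits, photos)
# share one CSS class, so the review count is the first of each group.
def review_counts(action_counts, per_card=6):
    """Keep the first of every group of `per_card` elements."""
    return action_counts[::per_card]

# e.g. counts = soup.find_all('span', class_='companyCardWrapper__ActionCount')
# review_counts(counts) would then give one value per company
print(review_counts(list(range(12))))  # -> [0, 6]
```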

    • @tyetc206prathamsaboo2
      @tyetc206prathamsaboo2 6 months ago

      Thank you, buddy...

    • @PraveenKumar-vd8ev
      @PraveenKumar-vd8ev 26 days ago

      Thanks, bro...

  • @anandtalware2283
    @anandtalware2283 10 months ago

    Thank you 🙏🙏.. Your explanation is awesome...

  • @ArunKumar_237
    @ArunKumar_237 1 year ago

    Great class, thank you, sir

  • @tajveertyagi3084
    @tajveertyagi3084 2 years ago +1

    Great!! The concepts are well explained in all the videos.

  • @ShivamKumar-zq8jv
    @ShivamKumar-zq8jv 1 year ago

    Thank you so much for your efforts, sir 🙏🙏🙏

  • @karansen9101
    @karansen9101 1 year ago

    I have no words to express my happiness. Thank you very much, sir

  • @AbdurRahman-lv9ec
    @AbdurRahman-lv9ec 6 months ago +2

    Kudos, Nitish ! Your teaching style is phenomenal, and I've learned so much from your Python and Machine Learning tutorials. As I delve deeper into my passion for data engineering, I can't help but wish for a mentor like you in this field. Your guidance would be invaluable. Any plans to explore data engineering topics?

  • @samtyagi6754
    @samtyagi6754 5 months ago

    As I progress through this playlist, my respect for you, sir, grows day by day ❤

  • @travelogue.brothers
    @travelogue.brothers 8 months ago

    Nitish sir, because of you I was able to get a good internship. I just don't know how I can thank you. Thanks a lot, sir 🥺 Love from Jammu.

  • @grandson_f_phixis9480
    @grandson_f_phixis9480 1 month ago

    Thank you very much, sir

  • @azqamubeen3934
    @azqamubeen3934 1 year ago

    Thanks, sir; wonderful work you are doing

  • @danishthev-log2264
    @danishthev-log2264 6 months ago

    I really like your way of teaching

  • @webscrapingfreelancer76
    @webscrapingfreelancer76 1 year ago

    Very well explained

  • @HappyHumbleHopefulHelpKey
    @HappyHumbleHopefulHelpKey 5 months ago

    Love you and your dedication sir ❤

  • @siddhant_n
    @siddhant_n 1 year ago

    Thank you, sir.

  • @sagarkhule6439
    @sagarkhule6439 1 year ago

    Best explanation of web scraping, thank you so much, buddy

  • @MohdRafi_Lover
    @MohdRafi_Lover 7 months ago

    Nice teaching, sir.

  • @ashutoshyadav8653
    @ashutoshyadav8653 1 year ago

    Very knowledgeable video 👍👍👍👍👍

  • @surajnikam3327
    @surajnikam3327 6 months ago

    Sir, you are just wow ♥♥♥. I learned a lot and enjoyed this video. Thank you, sir

  • @learnomics
    @learnomics 1 year ago

    Also, I'm a web developer, so this will be very easy for me. Now I can extract any data from a web page!

  • @TheNeu2ron
    @TheNeu2ron 2 months ago

    Your API video (Day 17) is just amazing

  • @aayushisolanki2935
    @aayushisolanki2935 1 year ago

    Most underrated channel

  • @atifsalam6802
    @atifsalam6802 1 year ago

    Excellent, bro, very well done; keep it up 👍

  • @mainakseal5027
    @mainakseal5027 5 months ago

    What an amazing explanation, sir!!! Loved it!!! Loved it!!

    • @AryanGuleria-kj1gt
      @AryanGuleria-kj1gt 5 months ago

      Hey!!! If you are interested in learning and sharing knowledge on web scraping, please get in touch

  • @-mnv
    @-mnv 11 months ago +1

    For anyone trying this in 2023: as of pandas 2.0, append (previously deprecated) has been removed. So at 36:37, instead of final = final.append(df), use final = pd.concat([final, df], ignore_index=True)
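A minimal before/after sketch of that change (toy frames, not the video's data):

```python
import pandas as pd

final = pd.DataFrame({"name": ["Acme"], "rating": [4.1]})
df = pd.DataFrame({"name": ["Globex"], "rating": [3.8]})

# pandas < 2.0: final = final.append(df)   (DataFrame.append was removed in 2.0)
# pandas >= 2.0: concat the frames; ignore_index renumbers the rows 0..n-1
final = pd.concat([final, df], ignore_index=True)
print(final["name"].tolist())  # -> ['Acme', 'Globex']
```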

    • @kaushalsurana6336
      @kaushalsurana6336 11 months ago

      Still my program is not giving output; it is showing an empty dataframe

    • @kaushalsurana6336
      @kaushalsurana6336 11 months ago

      Please reply if you know how to solve it

    • @mohammadaffan5001
      @mohammadaffan5001 8 months ago

      @@kaushalsurana6336 It's working, dude... you might have made some mistake..

    • @kapilraisinghani8105
      @kapilraisinghani8105 4 months ago

      Thanks, bhai, for the help

  • @JACKSPARROW-ch7jl
    @JACKSPARROW-ch7jl 1 year ago

    Thank you, bro; you are doing a great job. I will certainly pay you something if I get a job in the data science field in the next two years 🙌🙌🙌

  • @grithijain132
    @grithijain132 6 months ago

    The best. Thank you so much, CampusX...

  • @sudhanshusingh5594
    @sudhanshusingh5594 1 year ago

    "My feature extraction" (it's a nickname I gave you, and I really, really, really respect you a lot). Once again, thank you so much.

  • @heetbhatt4511
    @heetbhatt4511 10 months ago

    Thank you, sir

  • @_._Harsh_._
    @_._Harsh_._ 8 days ago

    "Bhagwaan ka naam leke" was epic 🤣🤣
    Anyway, a great video for understanding the basics of web scraping.
    Thank you for this!!!

  • @darkshadowgaming1102
    @darkshadowgaming1102 3 months ago

    You are an angel, sir... I salute you... thanks, sir...

    • @ritikkumarsah7034
      @ritikkumarsah7034 14 days ago

      The headers link is not working for me; I am trying it in my Jupyter Notebook.
      Any solution?

  • @debashispatra2368
    @debashispatra2368 1 year ago

    The ending was very funny... I have understood all the concepts that you taught... I wish I could join your live sessions

  • @vijayjadonthakur
    @vijayjadonthakur 1 year ago +1

    The way of teaching is awesome. Thanks

    • @aisharawat9102
      @aisharawat9102 1 year ago

      Hi, I actually tried the code, but when I use the requests library it shows: name 'requests' is not defined. Have you faced this? Can you tell me what I can do?

    • @adarshkumarsingh1509
      @adarshkumarsingh1509 1 year ago

      @@aisharawat9102 Maybe you don't have the requests library installed on your system; in that case, run pip install requests to install it. If you use an Anaconda environment, look up the equivalent conda command. Otherwise, try clearing all outputs and restarting the kernel; that may solve it.
      Extra tip: check for spelling errors; sometimes the problem is so tiny that we can't think of it.

  • @anniedhawan7945
    @anniedhawan7945 2 years ago

    Amazing !!!!

  • @MuhammadJunaid-yr8jd
    @MuhammadJunaid-yr8jd 1 year ago

    Thank you so much

  • @sandipansarkar9211
    @sandipansarkar9211 1 year ago

    Finished watching

  • @siddhigolatkar8558
    @siddhigolatkar8558 6 months ago

    Thank you, sir... God bless you

    • @ritikkumarsah7034
      @ritikkumarsah7034 14 days ago

      The headers link is not working for me; I am trying it in my Jupyter Notebook.
      Any solution, please?

  • @laurkids
    @laurkids 3 months ago

    Superb

  • @youtubekumar8590
    @youtubekumar8590 1 year ago

    Thank you, sir, bhaiya

  • @shubham-yy8tj
    @shubham-yy8tj 7 months ago

    Thank You....

  • @anshulsharma7080
    @anshulsharma7080 1 year ago

    Unfortunately, I got 'list index out of range', so I was unable to see the magic at the end; still, great work by this man. Boom 💫💫

  • @rajkumarshedage3200
    @rajkumarshedage3200 1 year ago

    Great❤

  • @Dipenparmar12
    @Dipenparmar12 7 months ago

    Thanks.

  • @raj4624
    @raj4624 2 years ago

    Gem video

  • @HetkumarPatel-bf9sc
    @HetkumarPatel-bf9sc 3 months ago

    The ending was great

  • @salonijain6565
    @salonijain6565 1 year ago

    Amazing explanation. How do we know which websites provide APIs and which do not?

  • @ramangupta4061
    @ramangupta4061 1 year ago

    I love you Brother❤❤❤❤❤

  • @thomasmuller7733
    @thomasmuller7733 2 years ago +3

    Hi sir, nice video and great explanation.
    I have a question.
    Some tags show up in Inspect but not in the page source. Is there any way to extract those too, or am I missing something?
    Thanks in advance.

  • @madhavneupane168
    @madhavneupane168 5 months ago

    Thank you so much for this; it's a very remarkable video to learn from. What would the code look like when a single webpage has tabular data spread across many "next" pages?
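The multi-page pattern shown in the video boils down to putting the page number in the URL's query string and looping over it; the base URL below is a placeholder, not the site used in the video:

```python
# One URL template covers every page of the listing; scrape each URL in
# turn and accumulate the rows into one table.
BASE = "https://example.com/companies?page={}"

def page_urls(n_pages):
    """Build one URL per page, 1-indexed like most listing sites."""
    return [BASE.format(i) for i in range(1, n_pages + 1)]

print(page_urls(3))
# -> ['https://example.com/companies?page=1',
#     'https://example.com/companies?page=2',
#     'https://example.com/companies?page=3']
```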

  • @Tech_insider168
    @Tech_insider168 5 months ago

    00:02 Today's video discussed web scraping and its application
    02:47 Web scraping for fetching company data
    08:51 Introduction to web scraping and Beautiful Soup
    11:22 Using BeautifulSoup to locate specific elements in a web page.
    16:55 Using web scraping to extract specific data from a web page
    19:13 Data extraction challenges using web scraping
    24:15 Understanding data scraping and its applications
    26:33 Using web scraping to fetch data and finalize a single item list
    30:53 Using web scraping to fetch data

  • @tejasburadkar189
    @tejasburadkar189 2 years ago +1

    Hi sir, thank you for this tutorial. One question: you created a variable called headers, and inside it the dict contains a string. Where did you get that? I didn't get that point... waiting for your response
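For the several questions in this thread about the headers dict: it simply identifies the script to the server as a normal browser via a User-Agent string. Any current browser's string works; the one below is illustrative, not the exact string from the video (you can copy your own from the browser's developer tools, Network tab):

```python
# Many sites refuse the default "python-requests" User-Agent, so we
# present a browser-like one instead.
headers = {
    "User-Agent": (
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
        "AppleWebKit/537.36 (KHTML, like Gecko) "
        "Chrome/120.0 Safari/537.36"
    )
}
# Usage (requires the requests package):
# resp = requests.get(url, headers=headers)
```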

  • @osho_magic
    @osho_magic 1 year ago

    I also feel lucky to have found you

  • @somnathghosh854
    @somnathghosh854 1 year ago

    Hello sir, thanks for this video. My question is: how do we get the 'headers' value that you used in this video?

  • @shwetkumar8370
    @shwetkumar8370 1 year ago +2

    FeatureNotFound: Couldn't find a tree builder with the features you requested: lxml. Do you need to install a parser library?
    I'm getting this error. I also tried to install lxml via the !pip install lxml command, but it returns
    Requirement already satisfied: lxml in c:\users\name\anaconda3\lib\site-packages (4.9.1)
    What to do??

  • @munisyamputtur9673
    @munisyamputtur9673 1 year ago

    Could you provide both the website and the additional link for access, so we can practise web scraping?

  • @ahmbilal
    @ahmbilal 4 months ago +1

    If the website has no restrictions, do we still need to write that header code?

  • @kalam_indian
    @kalam_indian 1 year ago

    After seeing Krish Naik's video and now seeing yours: oh my god, the difference is like earth and sky. You are just above all, and it is very difficult to reach your level...

  • @krishnakanthmacherla4431
    @krishnakanthmacherla4431 2 years ago

    Done

  • @puspanjalimuduli9159
    @puspanjalimuduli9159 5 months ago

    Why does my data not appear in DataFrame format? It appears without the bold, dark column headings.

  • @NehaSen10
    @NehaSen10 1 year ago

    Sir, how did you get the header code to access another website's data?
    I tried many times to copy this code and fetch the data, but I was unable to, so please help me scrape that data

  • @IEI-ECE
    @IEI-ECE 1 year ago

    Nice explanation, sir. I want to know how to get the headers information

  • @roselinamoven7986
    @roselinamoven7986 1 month ago

    What happens when the lists are of different lengths, i.e., if there are missing values for some fields?

  • @bhushanmali7635
    @bhushanmali7635 1 year ago

    How did we get the header in the second cell of the notebook at 7:02?

  • @kidzz5153
    @kidzz5153 1 year ago +1

    Now the URL format is different; it's no longer page=1 and so on...
    What to do then...

  • @Superjhonwick
    @Superjhonwick 2 months ago

    How did you get this header? Could you please give a detailed explanation?

  • @rohitjadhav2270
    @rohitjadhav2270 11 months ago

    Sir, nice video!! Just one question: for some products all the information is available, but for others the rating is missing. How do we deal with that? For example, the product-name list has length 267 but the rating list has length 247, because some products have no rating. How do we handle this to get a proper DataFrame? Thank you!!
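A common fix for this length mismatch, which a couple of other comments also ask about: parse each product card individually so a missing rating becomes None instead of shifting every later value. The class names here are hypothetical, not the actual page's:

```python
from bs4 import BeautifulSoup

def parse_cards(html):
    """One dict per card; missing fields become None, keeping rows aligned."""
    soup = BeautifulSoup(html, "html.parser")
    rows = []
    for card in soup.find_all("div", class_="card"):
        name = card.find("h2")
        rating = card.find("span", class_="rating")
        rows.append({
            "name": name.get_text(strip=True) if name else None,
            "rating": rating.get_text(strip=True) if rating else None,
        })
    return rows

html = ('<div class="card"><h2>A</h2><span class="rating">4.1</span></div>'
        '<div class="card"><h2>B</h2></div>')  # second card has no rating
print(parse_cards(html))
# -> [{'name': 'A', 'rating': '4.1'}, {'name': 'B', 'rating': None}]
```

Equal-length columns then drop straight into pd.DataFrame(rows) without an IndexError.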

  • @aymanijaz
    @aymanijaz 1 year ago

    How to fix: IndexError: list index out of range...?
    I'm getting this error on the ratings

  • @parthpatel473
    @parthpatel473 1 year ago

    Sir, are the headers different on different laptops, or the same for all? If anyone knows, please reply.

  • @jangiramitkumar439
    @jangiramitkumar439 1 year ago

    While watching your videos, we feel that the problems you run into are the ones we actually face too; it means we can really connect