Webscraping with Python How to Save to CSV, JSON and Clean Data

  • Published on 2 Nov 2024

Comments • 12

  • @AliceShisori
    @AliceShisori 11 months ago +1

    I really enjoy this series and will probably need to replay it in the future. This is helpful and practical, as it shows the whole process of how to approach it.
    Thank you, John.

  • @bakasenpaidesu
    @bakasenpaidesu 1 year ago +5

    Still waiting for the neovim set up video ❤

  • @PanFlute68
    @PanFlute68 1 year ago +5

    Thanks for another informative video!
    There is one tiny concern with the append_to_csv code. The file lacks the normal (but optional per RFC 4180) header line that some apps expect, or that may be needed if there were more fields in the file. This small change would create the header line just once, when the file is created. Before the with block, simply add this little bit of code:
    import csv
    import os

    # Check if the file exists
    if not os.path.exists('append.csv'):
        # Open file in write mode to write the header line
        with open('append.csv', 'w', newline='') as f:
            writer = csv.DictWriter(f, field_names)
            writer.writeheader()
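    Pulling the suggestion above together, a complete append helper could look like the sketch below. The function name append_to_csv matches the comment, but the field names and the exact shape of the video's version are assumptions here:

    ```python
    import csv
    import os

    # Hypothetical field names; the video's actual fields may differ
    field_names = ['name', 'price']

    def append_to_csv(row, filename='append.csv'):
        # Write the header line only once, when the file is first created
        write_header = not os.path.exists(filename)
        with open(filename, 'a', newline='') as f:
            writer = csv.DictWriter(f, fieldnames=field_names)
            if write_header:
                writer.writeheader()
            writer.writerow(row)
    ```

    Checking for the file before opening in append mode means repeated calls keep adding rows under a single header line.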

  • @andrepereira1807
    @andrepereira1807 11 months ago

    John, thanks a lot for your videos! They are really interesting and well made; I learnt a lot from you! Many thanks! CHEERS!

  • @thebuggser2752
    @thebuggser2752 9 months ago

    John,
    Another great presentation!
    Also, the program is very logically developed.
    I liked seeing the list comprehensions.
    One more idea: you could have a GUI front end where the user inputs some conditions, product categories, names, or whatever, and the program returns records based on the conditions, either one at a time or in a table on the form. Just a thought.
    Thanks!

  • @rajatkumar35
    @rajatkumar35 10 months ago

    Wouldn't the clean_data function also remove the word "Item" and the "$" from the name of the product?
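    That risk only arises if the same replacements are applied to every field. A minimal sketch of a per-field cleaner that strips the currency symbol from the price alone (the name clean_data matches the video, but the 'name'/'price' field names and this exact logic are assumptions):

    ```python
    def clean_data(row):
        # Convert the price to a float, stripping only the "$" there
        row['price'] = float(row['price'].replace('$', '').strip())
        # Drop a leading "Item " label from the name, if present
        row['name'] = row['name'].removeprefix('Item ').strip()
        return row
    ```

    Targeting each field separately keeps a "$" that legitimately appears in a product name from being wiped out by the price cleanup.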

  • @mohammedaldbag9827
    @mohammedaldbag9827 1 year ago

    Thanks for the information, but I have a question about something similar to this topic. If I have a local web page with some graphics in jpg format, how do I scrape them and store them in a specific folder using a web scraper? Thanks a lot for all the info.
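    One common approach to the question above is to collect the <img> sources from the page's HTML and then fetch each one as bytes. A sketch using BeautifulSoup and requests (the function names and output folder here are illustrative, not from the video):

    ```python
    import os
    from urllib.parse import urljoin

    import requests
    from bs4 import BeautifulSoup

    def image_urls(html, base_url):
        # Collect an absolute URL for every <img> tag that has a src
        soup = BeautifulSoup(html, 'html.parser')
        return [urljoin(base_url, img['src'])
                for img in soup.find_all('img') if img.get('src')]

    def save_images(urls, out_dir='images'):
        # Fetch each image and write its raw bytes into out_dir
        os.makedirs(out_dir, exist_ok=True)
        for url in urls:
            name = os.path.basename(url.split('?')[0])
            with open(os.path.join(out_dir, name), 'wb') as f:
                f.write(requests.get(url, timeout=10).content)
    ```

    For a local page, the same parsing works on the file's contents; the only difference is reading the HTML from disk instead of over HTTP.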

  • @lordlegendsss7776
    @lordlegendsss7776 11 months ago +1

    I am scraping an online shopping site.
    For the last 10 days it hasn't worked properly:
    after 3-4 scans it takes about 15-20x more time to scan,
    then it works smoothly again for 2-3 runs before slowing down again.
    Why is this happening?
    I'm using Scrapy.
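    That pattern of running fast at first and then slowing sharply often points to the site rate-limiting or throttling repeated requests. Scrapy ships an AutoThrottle extension that backs off automatically; a sketch of the relevant settings.py entries (the values are illustrative and should be tuned per site):

    ```python
    # settings.py (illustrative values, not a recommendation for any site)
    AUTOTHROTTLE_ENABLED = True         # adapt the delay to server response times
    AUTOTHROTTLE_START_DELAY = 1.0      # initial download delay, in seconds
    AUTOTHROTTLE_MAX_DELAY = 30.0       # cap on the delay under heavy latency
    DOWNLOAD_DELAY = 0.5                # baseline delay between requests
    CONCURRENT_REQUESTS_PER_DOMAIN = 4  # keep parallelism modest
    ```

    Slowing the crawl down deliberately usually costs less wall-clock time than triggering the server's defenses and being throttled 15-20x.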

  • @adarshjamwal3448
    @adarshjamwal3448 1 year ago

    Thanks, bro, for sharing the great content. If it's no trouble, could you make the same or another web scraping tutorial using object-oriented programming concepts?

  • @chamikagimshan
    @chamikagimshan 1 year ago

    🧡

  • @theclam1338
    @theclam1338 11 months ago

    Can you scrape bet365?