OMG... this just saved me potentially hours, days, weeks, etc. I figured that even if I spent the next couple of hours trying to find a way to copy the 17,000 comments (not nested) I need from this one video for my project and it didn't work out, I would still be better off, considering my PC can only load about 10 comments per second when scrolling and then crashes after a maximum of around 1,000. YOU ARE AN ABSOLUTE LIFESAVER, and if I wasn't so broke I would genuinely tip you heavily for this. Thanks brother!
Glad it helped.
Man, thank you very much! Your code really helps me with my bachelor's degree! It is my second day with this code and you really did a good job for a rookie like me!
You're such a blessing.
I spent ages trying to figure out how to extract comments on a topic on Twitter.
I recently thought about extracting comments from a YouTube video about that topic and came across your video.
Thank you so so much.
It's one of the only free sources left, because everyone is cutting off access to their APIs due to LLM projects scraping the data. Glad you found it useful. 👍
Thank you so much for this video, you saved me on a university project after the site X blocked its API for all of us. Thank you a lot!
No worries, it's hard to get API data these days.
Wowsers. This went right over my head. It assumes lots of prerequisite knowledge.
There were a couple of videos before this one. Also, I have provided the code, so essentially all you need to do is change the playlist ID and it should work.
Man thanks a ton for this bro. Great stuff. Didn't realize it was this easy
Interesting. I think it would take me a bit to fully understand how to do this. Is there a program out there to scan all that data and find keywords, most frequent phrases, etc., or would you just filter in Excel to find such data?
You would use Python or Excel
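A minimal sketch of the Python route, assuming the comments have already been exported to a CSV called comments.csv with a Comment column (both names are just placeholders for whatever your export produced): it counts the most frequent keywords and two-word phrases with pandas and collections.Counter.

import re
from collections import Counter

import pandas as pd

# Assumed file and column names from the export step; adjust to match your CSV
df = pd.read_csv("comments.csv")

words = []
for text in df["Comment"].dropna().astype(str):
    # Lowercase and keep only word characters so "Great!" and "great" count together
    words.extend(re.findall(r"[a-z']+", text.lower()))

# Most common single keywords (you would normally drop stop words like "the" and "and")
print(Counter(words).most_common(20))

# Most common two-word phrases (bigrams)
bigrams = zip(words, words[1:])
print(Counter(bigrams).most_common(20))

In Excel the rough equivalent would be a helper column of extracted words plus a pivot table or COUNTIF.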
Do you know if it is possible that the number of replies shown in the browser doesn't match the number of comments fetched using the API? I am trying to get the comments for a specific video, and the number of comments I receive is 5-6 more than what is shown in the browser. My theory is that since YouTube hides certain comments (such as ones containing URLs), those are not shown in the browser but the API can still access them. I still need to test it.
Did you ever try it yourself, like posting some spam-looking comments from your own accounts and then checking whether they appear in the API but not in the browser UI?
@@ecqmjr No I haven't, but what would such things even be?
@@JwalinBhatt I don't know; I notice some specific patterns when it comes to spam, like URL sharing or comments saying "use this app and earn money fast", etc. But if you were able to see comments appearing in the API that don't show up in the UI, I think you have run into what YouTube defines as spam.
I have a project in mind that I'm about to start; if I run into the situation you mentioned and figure out what the reason might be, I'll keep you updated here.
@@ecqmjr hmm yeah that could be the reason. That would be nice, I'll look forward to hearing back from you. Thanks Dan 👍🏼
Hey, do you know a way to find all the videos with a phrase like "this is the best song ever" written in the comments? Thank you.
You could if you limited it to specific channels, but you can't do a general search, as you can only take down 10k comments at a time.
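If you do scope it to specific videos or channels, here is a rough sketch of that idea; it assumes you already have a list of video IDs to check (video_ids and the API key below are placeholders) and leans on the searchTerms parameter of commentThreads.list so the API does the phrase filtering for you.

from googleapiclient.discovery import build

API_KEY = "Replace with your own API Key here"
youtube = build('youtube', 'v3', developerKey=API_KEY)

# Hypothetical list of video IDs gathered beforehand, e.g. from a channel's uploads playlist
video_ids = ["VIDEO_ID_1", "VIDEO_ID_2"]
phrase = "this is the best song ever"

matching_videos = []
for vid in video_ids:
    request = youtube.commentThreads().list(
        part="snippet",
        videoId=vid,
        searchTerms=phrase,   # API-side filter: only threads containing the phrase
        maxResults=1,
        textFormat="plainText"
    )
    response = request.execute()
    # If at least one matching thread comes back, this video has the phrase in its comments
    if response.get("items"):
        matching_videos.append(vid)

print(matching_videos)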
thanks so much, let me try
You are truly amazing! Thankyou so much.
Thank you, it works well.
It doesn't, however, handle the scenario of a playlist containing a video marked as "private".
Can't post the error log here on YT :(
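One possible way around that, as a rough sketch rather than a fix in the original script: wrap the per-video call in a try/except on googleapiclient's HttpError and skip anything that errors out. Here get_comments_for_video refers to the function from the script, and video_ids is assumed to be the list of IDs already pulled from the playlist.

from googleapiclient.errors import HttpError

all_comments = []
for video_id in video_ids:
    try:
        all_comments.extend(get_comments_for_video(youtube, video_id))
    except HttpError as e:
        # Private, deleted, or comments-disabled videos raise an error here;
        # log the video ID and keep going instead of crashing the whole run
        print(f"Skipping {video_id}: {e}")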
Hello, thank you for the great informative video. I will need to extract about 4 million comments from a video. Do you think it would be possible on my local computer with this code?
I am currently trying but taking a lot of time for sure.
Should be. Throw me the link for the video, I'd be interested in testing the limits.
@@analyticswithadam th-cam.com/video/kJQP7kiw5Fk/w-d-xo.html
th-cam.com/video/RgKAFK5djSk/w-d-xo.html
@@analyticswithadam Or I could get them partially, but I don't know how to set a number or range.
I guess I cannot share the link: the Despacito and See You Again official song videos.
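For a rough sense of scale, a back-of-envelope estimate, assuming the default quota of 10,000 units per day and that each commentThreads.list call costs 1 unit and returns up to 100 top-level comments (replies need extra calls on top):

total_comments = 4_000_000
comments_per_request = 100      # maxResults cap per commentThreads.list call
units_per_request = 1           # approximate quota cost of one list call
daily_quota = 10_000            # default daily quota for a new project

requests_needed = total_comments / comments_per_request          # 40,000 calls
days_needed = requests_needed * units_per_request / daily_quota  # roughly 4 days of quota

print(requests_needed, days_needed)

So the download itself is light work for a local machine; the bottleneck is the daily quota, which spreads roughly 40,000 calls over about four days unless you request an increase.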
Thanks man, it really works, have a nice day :)
Thanks for the nice comment
Man it works only for a video??
For a number of playlists
@@analyticswithadam No worries, in around 15 minutes ChatGPT created code for me to get all the comments from a video. I used the YouTube API and everything works correctly. Thanks for the video and the idea.
@@RonalRomeroVergel hellooooo listen
The problem is that CSV files don't show all the comments, and they don't even display properly, just broken characters and symbols,
and only the last 200 fields show up.
I wish they would bring back an HTML file instead, it would be better than this.
Try the method in this video
Not the Gemini part, just the retrieving-comments part… This should come out correctly in the Google Sheet with all the symbols intact
How to Extract YouTube comments in Sheets and analyse with AI with 1 click
th-cam.com/video/1KwFFb_tofY/w-d-xo.html
Thanks a lot for the video. Is there a way to know how long the code will take to get all the comments, something like a tqdm? The infinite while loop makes things tricky, but maybe you can find a way :)
It's not an infinite loop; it executes while there is a next page of comments on the videos chosen from the playlist. Tricky to count how many that is from the outset.
@@analyticswithadam Of course, yes, that's what I meant: the terminating condition is inside the loop because we don't know when it will end. But thanks anyway :)
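One possible workaround, assuming the approximate count from the video's public statistics is good enough for a progress bar: fetch statistics.commentCount first and hand it to tqdm as the total. Here youtube and video_id are assumed to be set up as in the script further down, and the count includes replies, so treat it as an estimate.

from tqdm import tqdm

# Approximate total from the video's public statistics
stats = youtube.videos().list(part="statistics", id=video_id).execute()
total = int(stats["items"][0]["statistics"]["commentCount"])

next_page_token = None
with tqdm(total=total) as progress:
    while True:
        response = youtube.commentThreads().list(
            part="snippet",
            videoId=video_id,
            pageToken=next_page_token,
            maxResults=100,
            textFormat="plainText"
        ).execute()
        # Advance the bar by however many threads came back on this page
        progress.update(len(response["items"]))
        next_page_token = response.get("nextPageToken")
        if not next_page_token:
            break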
Great content! I have 2 questions:
1. Will this work on YT shorts as well?
2. What's the limit/quota for the free TH-cam API, something like 10,000 comments per day?
10,000 requests a day, I believe, and 1 request per video covers it. It should work for Shorts too.
@@analyticswithadam Hi! Thanks for the video. Does it have a 100 comment limit per request? I'm trying to get more than 100 comments and can't do it.
@@gabrielcroquer7399 I'm having the same issue. Looks like it's capped at 100 comments... :-(
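The 100 cap is per request rather than overall: each commentThreads.list call returns up to 100 threads plus a nextPageToken, and you pass that token back in to get the next page until no token comes back, which is what the full script further down does. A minimal sketch of the mechanics, assuming youtube and video_id are already set up as in that script:

# First page: up to 100 comment threads
first_page = youtube.commentThreads().list(
    part="snippet", videoId=video_id, maxResults=100, textFormat="plainText"
).execute()
token = first_page.get("nextPageToken")

# Second page: pass the token back to get the next 100; keep looping until no token is returned
if token:
    second_page = youtube.commentThreads().list(
        part="snippet", videoId=video_id, maxResults=100, textFormat="plainText", pageToken=token
    ).execute()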
Can I also use this for single videos?
Make a playlist with one video
Hi from Brazil, I loved the video. I would really like to know how to extract it from the YouTube chat.
I don't have Google Cloud, isn't there another way?
It's free to use the API, you just need to log in to Google Cloud with a Gmail account.
thanks it is really helpful 😘
How do I do it for just a video and not a playlist?
Put the video in a playlist
@@analyticswithadam Even I did the same
I tweaked the code to retrieve comments and replies from just a single video. To keep things simple, I am just printing the comments and replies rather than writing them to a CSV.
All credit goes to @analyticswithadam.
PS: The indentation might have got corrupted. Fix the indentation if you were to grab a copy of the modified code from below.
from googleapiclient.discovery import build

API_KEY = "Replace with your own API Key here"

# Get the Video Id
video_id = input("VideoId: ")

# Build the YouTube client
youtube = build('youtube', 'v3', developerKey=API_KEY)

# Function to get replies for a specific comment
def get_replies(youtube, parent_id, video_id):  # Added video_id as an argument
    replies = []
    next_page_token = None

    while True:
        reply_request = youtube.comments().list(
            part="snippet",
            parentId=parent_id,
            textFormat="plainText",
            maxResults=100,
            pageToken=next_page_token
        )
        reply_response = reply_request.execute()

        for item in reply_response['items']:
            comment = item['snippet']
            replies.append({
                'Timestamp': comment['publishedAt'],
                'Username': comment['authorDisplayName'],
                'VideoID': video_id,
                'Comment': comment['textDisplay'],
                'Date': comment['updatedAt'] if 'updatedAt' in comment else comment['publishedAt']
            })

        next_page_token = reply_response.get('nextPageToken')
        if not next_page_token:
            break

    return replies

# Function to get all comments (including replies) for a single video
def get_comments_for_video(youtube, video_id):
    all_comments = []
    next_page_token = None

    while True:
        comment_request = youtube.commentThreads().list(
            part="snippet",
            videoId=video_id,
            pageToken=next_page_token,
            textFormat="plainText",
            maxResults=100
        )
        comment_response = comment_request.execute()

        for item in comment_response['items']:
            top_comment = item['snippet']['topLevelComment']['snippet']
            all_comments.append({
                'Timestamp': top_comment['publishedAt'],
                'Username': top_comment['authorDisplayName'],
                'VideoID': video_id,  # Directly using video_id from function parameter
                'Comment': top_comment['textDisplay'],
                'Date': top_comment['updatedAt'] if 'updatedAt' in top_comment else top_comment['publishedAt']
            })

            # Fetch replies if there are any
            if item['snippet']['totalReplyCount'] > 0:
                all_comments.extend(get_replies(youtube, item['snippet']['topLevelComment']['id'], video_id))

        next_page_token = comment_response.get('nextPageToken')
        if not next_page_token:
            break

    return all_comments

# List to hold all comments from all videos
all_comments = []
video_comments = get_comments_for_video(youtube, video_id)
all_comments.extend(video_comments)

for comment in all_comments:
    print(comment["Comment"])
Hello
@@rrc012 I need your help, can u help me?
✅ Perfect
u r amazing 😍, but can u suggest how to retrieve all comments for just one video?
Make your own playlist with one video and you can follow the script here
@@analyticswithadam I can't create a playlist, because I want to analyze the comments of a single video with millions of views from a third party. It doesn't have a playlist, it has one isolated video; I wanted to export the comments from that video, and I can't create a playlist.
@@player-xm9qm Try this method in Sheets.
Leave out the Gemini bit.
th-cam.com/video/1KwFFb_tofY/w-d-xo.html