what is Apache Parquet file | Lec-7
- Published on 14 Oct 2024
- In this video I have talked about reading parquet files in Spark. If you want to optimize your files and processing in Spark, you should have a solid understanding of the Parquet file format. Please ask your doubts in the comment section.
Directly connect with me on:- topmate.io/man...
Download Parquet Data:- github.com/dat...
Download parquet-tools on your local machine to run all the commands below.
parquet-tools can be installed using pip.
Run the below command in cmd or terminal
pip install parquet-tools
Run the below commands inside Python
import pyarrow as pa
import pyarrow.parquet as pq

# Point this at wherever you saved the downloaded parquet file
parquet_file = pq.ParquetFile(r'C:\Users\nikita\Downloads\Spark-The-Definitive-Guide-master\data\flight-data\parquet\2010-summary.parquet\part-r-00000-1a9822ba-b8fb-4d8e-844a-ea30d0801b9e.gz.parquet')
parquet_file.metadata                                    # file-level metadata
parquet_file.metadata.row_group(0)                       # first row group
parquet_file.metadata.row_group(0).column(0)             # first column chunk
parquet_file.metadata.row_group(0).column(0).statistics  # min/max, null count, etc.
Run the below command in cmd/terminal
parquet-tools show C:\Users\manish\Downloads\Spark-The-Definitive-Guide-master\data\flight-data\parquet\2010-summary.parquet\part-r-00000-1a9822ba-b8fb-4d8e-844a-ea30d0801b9e.gz.parquet
parquet-tools inspect (path of your file location as above)
parquet.apache...
For more queries reach out to me on my below social media handle.
Follow me on LinkedIn:- / manish-kumar-373b86176
Follow Me On Instagram:- / competitive_gyan1
Follow me on Facebook:- / manish12340
My Second Channel -- / @competitivegyan1
Interview series Playlist:- • Interview Questions an...
My Gear:-
Rode Mic:-- amzn.to/3RekC7a
Boya M1 Mic-- amzn.to/3uW0nnn
Wireless Mic:-- amzn.to/3TqLRhE
Tripod1 -- amzn.to/4avjyF4
Tripod2:-- amzn.to/46Y3QPu
camera1:-- amzn.to/3GIQlsE
camera2:-- amzn.to/46X190P
Pentab (Medium size):-- amzn.to/3RgMszQ (Recommended)
Pentab (Small size):-- amzn.to/3RpmIS0
Mobile:-- amzn.to/47Y8oa4 (You should definitely not buy this one)
Laptop -- amzn.to/3Ns5Okj
Mouse+keyboard combo -- amzn.to/3Ro6GYl
21 inch Monitor-- amzn.to/3TvCE7E
27 inch Monitor-- amzn.to/47QzXlA
iPad Pencil:-- amzn.to/4aiJxiG
iPad 9th Generation:-- amzn.to/470I11X
Boom Arm/Swing Arm:-- amzn.to/48eH2we
My PC Components:-
intel i7 Processor:-- amzn.to/47Svdfe
G.Skill RAM:-- amzn.to/47VFffI
Samsung SSD:-- amzn.to/3uVSE8W
WD blue HDD:-- amzn.to/47Y91QY
RTX 3060Ti Graphic card:- amzn.to/3tdLDjn
Gigabyte Motherboard:-- amzn.to/3RFUTGl
O11 Dynamic Cabinet:-- amzn.to/4avkgSK
Liquid cooler:-- amzn.to/472S8mS
Antec Prizm FAN:-- amzn.to/48ey4Pj
I said 500 GB in the video by mistake. It is supposed to be 500 MB, and when dividing 500/128, we get 4 partitions.
This tutorial is too good. What a detailed demo based insight. I could never forget this anymore. Thank you for your efforts. These tutorials are literal Gold.
Literally the best and most detailed video on parquet file format on yt. Thank you!
Wow... so much clarity... really enjoyed it... I didn't even realize when the video ended!!! Excellent explanation.
Such a detailed and amazing video. I have been working in big data for many years, but I never knew this level of detail. Thank you so much for this detailed video; your way of teaching encourages and excites many learners. Hats off.
Predicate pushdown - Rows filtering, Projection Pruning/Pushdown - Column filtering. Thanks for the session bro!!
Someone ask me in interview about internals of parquet file format and i couldn't answer it,Then i found your video.Now i can explain easily.Best video on parquet file format.
This is so far the best video in which I got to know in depth knowledge of parquet and very easy to understand.
Thank you so much for sharing your knowledge!!
Could you please share the video having optimization of parquet?
Hi Manish,
the detailed parquet file class was a classic example of how to present something. If Avro and ORC file format classes could be covered in the same style, it would be really helpful. Nowadays interviewers are asking about those as well.
Thank you bro you are helping a lot of DEs good work 🎉
Never saw such a detailed video for parquet file, these videos are really valuable. Really appreciate the efforts put in making these videos
I love how indept you are going, please keep doing it ! We are loving it.
The best explanation!!! Your videos are giving me motivation and inspiration to keep learning spark!
You are like gold mine in terms of knowledge
This is the best explanation for parquet file format available online. Thanks Manish.
Hi Manish, great explanation of parquet. I'm using parquet but didn't know about these features that make things fast. How were you able to learn all this? Please suggest any documentation/resources to get a deep understanding like this. You made my day. Thank you 😊
Hi Manish, thanks for such smooth explanation of not just information related to parquet but also things related to it, kudos to your efforts :D
You are really awesome. Thank you. You are adding an infinite value.
the best explanation ! you are a wonderful teacher
Hey Manish, I've been following your channel for a very long time, and thanks for the awesome videos. At the end of this video you said you would discuss how to optimize the parquet file format, but I don't see that video added to this playlist. Am I missing something?
In this video only, the last 10 mins.
Highly informative video, super!
highly informative and detailed video on parquet. Thanks a lot Manish!
Also please explain Bucketing and partitioning
Awesome explanation
Please make a video on the Avro file format in detail, because I faced challenges when interviewers asked Avro file format questions.
Hello Manish, excellent explanation, hats off to you. When will you release the optimization video on parquet? Eagerly waiting for it.
Please make video on ORC and Avro as well
Thank you so much for putting in so much time for making this video
Hats off Manish. Please keep doing the good work :)
Hi Manish, thank you for the information. Can you please elaborate on the default value of 128 MB, and when we have 500 GB of data how does that convert to 4 row groups? Thank you
It is 500 MB, not GB. 500 divided by 128 gives about 4.
4 blocks of data will be created. We have a default block size of 128 MB in HDFS, and multiple cloud service providers use the same block size. So if you have 140 MB of data, one partition will be 128 MB and the next partition will hold just 12 MB.
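The arithmetic from this thread, sketched in Python:

```python
import math

block_size_mb = 128   # default HDFS block size, also common on cloud storage
file_size_mb = 500    # the file size from the video's example

# Each full 128 MB chunk makes one partition; any remainder makes one more
num_partitions = math.ceil(file_size_mb / block_size_mb)
print(num_partitions)  # 4 (three 128 MB partitions + one 116 MB partition)

# The 140 MB example from the reply: one 128 MB partition + one 12 MB partition
print(math.ceil(140 / block_size_mb))  # 2
```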
@@manish_kumar_1 thank you so much!
How will one know which 1L (100,000) records contain the data we need to fetch? Don't we still need to scan the complete file?
Kindly explain
Wow super info
Where is the nested JSON video? You said you would make a separate video on it in the previous lecture, "how to read json file in PySpark".
Lec 23
Hi Manish sir, thanks for bringing such valuable info. I have a question: how can we handle schema evaluation in parquet?
I didn't get you
I think he wants to know about schema evolution
@@manish_kumar_1 I think @pankajsolunke3714 is asking how to handle schema evolution in parquet, if we can?
Hello Manish, I really like your videos, thanks for the effort you put in. I have a question: can you please recommend a good tutorial/course to get really good at PySpark? If not a single resource, then what are the various resources we can go through to get good at PySpark coding?
You don't need a course. Still, if you want to go for one, you can buy the Udemy course by Prashant Pandey titled PySpark for Beginners. The rest depends on how many problems you want to solve. Solve more problems rather than chasing multiple courses. Practice is the key to success, not the number of courses you have done.
sparkbyexamples is the RESOURCE we need for practice!! :)
Where can we practice spark from?
@@manish_kumar_1 where can we practice spark from??
@@royalkumar7658 From LeetCode. Follow the playlist from the start and you will get to know where and how.
Manish, if we write the parquet files with the data already sorted asc or desc, then retrieval would be faster, right? Because each row group's metadata would have min and max values within a narrow range? Please correct me if I am wrong.
Very good video!
One doubt: I did not understand the logical partitioning completely; it resembles the file size we can set in the Spark config. Please help me understand it.
Superb bro … very helpful thanks 🙏🏻
Bhai, why were 4 row groups kept for 500 GB of data? And Manish bhai, isn't it 128 MB? Can you explain why, bhai?
Manish bhai, nested json?
Hi Manish,
In the last video, you told that u will explain about nested json in further videos. Where can i find that?
I have not done yet. I will try to make one soon.
Bhaiya, please make that one. The industry runs on exactly that.
Please explain nested JSON to DataFrame.
Hi bro, does each row group store 128 MB or 128 GB of data? You said 128 MB,
but for 500 GB you said 4 row groups.
Are you talking about 500 MB or 500 GB?
Unable to install parquet-tools. Can you help or point me in the right direction?
Very detailed awesome video!! Thanks
Excellent video. Thank you :)
Sir, please make a short video on how to download and install parquet-tools.
Sure
Do I need to install Python first before downloading parquet-tools?
Think so
nice explanation brother
can you make one video on ORC also
Hi Manish, what is projection pruning? Unable to find it on Google. Or is it Partition Pruning*? Can you please explain/clarify?
Projection pushdown is where columns get pruned. So projection pushdown and projection pruning are the same thing.
@@manish_kumar_1 thanks for clarifying!
Thanks! for this masterpiece
@manish_kumar_1 Sir, what great content you've made, really enjoyed it, thank you!
"error occurred while processing file"??
This error keeps coming continuously... help:
Thanks for this Manish! Great Work!
You should have spent more time and detail on projection pruning and pushdown
Bhai, really enjoyed it, thanks bro
Thank you 😊
Bro nested json video is not available. Please provide in depth if feasible.
It is there. It should be in the practical playlist. Please check once.
Bro, did you install Java to read the parquet file in the command prompt?
Not specifically for this. I already had it installed on my laptop.
what a gem you are
Informative, thanks
How does Null get written to disk??
Very nice.
Thanks Manish 🙂
Sir, the parquet file is not downloading from GitHub.
Your content is good. Why don't you make videos in English?
I don't know English 😒. Just joking, I may record a session in the future but not for now.
th-cam.com/video/zM2OAAvJItQ/w-d-xo.html&ab_channel=knowledgeEpicenter (We are making videos for those people for whom no one is making videos.)
😇
Thanks Manish
Example of Modi ji for finding age >18 was highlight of the video
😂😂
Hey @manish_kumar_1, I was able to use the modes (append, overwrite, etc.) using this command:
df.write.option("header", first_row_is_header) \
.option("sep", delimiter) \
.mode("Overwrite") \
.csv(file_location)
All other ways of writing return an error on Databricks if the file exists, even when we're trying to append the data..! :| Unsure why this is happening...! :\
Same here. Maybe due to Community Edition. It does work in a production environment.
@@manish_kumar_1 oh..okay! Still weird, though!!! I'm yet to try databricks in production..
Thanks sir
Cmd-parquet-tool issue
❤
Directly connect with me on:- topmate.io/manish_kumar25
21:25 500 GB or 500 MB?
500 MB
Anybody help me please, I can't read the parquet file using the command prompt.
There is no issue. Just read it directly in Databricks. Please just watch the video carefully once.
@manish_kumar_1 I did it in Databricks; it is not working in the command prompt.
It shows a stage failure error.
Okay sir, I won't insist on the notebook from you.
🤥😔
Thanks 😂
Please drop your Twitter or X account. I will promote you. You are the only person on YouTube who is actually teaching something useful in the DE field. That TOO IN HINDI. Great work and great effort.
God Bless You !!
I don't have an ex 😂😂. Sorry, I mean this X
@@manish_kumar_1 Hahaha. Please create one then. It pays better than YouTube
@@KavyaPristha oh, is it? I did not know about this.
import pyarrow as pa
import pyarrow.parquet as pq
parquet_file = pq.ParquetFile(r'C:\Users\DELL\Desktop\part-r-00000-1a9822ba-b8fb-4d8e-844a-ea30d0801b9e.gz.parquet')
parquet_file.metadata
parquet_file.metadata.row_group(0)
parquet_file.metadata.row_group(0).column(0)
parquet_file.metadata.row_group(0).column(0).statistics
Not able to see any output with this file. Not sure why.
Is no error coming either?
@@manish_kumar_1 sorted
really awesome explanation
Thanks Sir