As a person with just 2 years of experience, my mind was blown watching this. I'm the only person writing code in my department, so I don't have any seniors to learn from, but I'm leading a data engineering project that deals with terabytes of data. Each request is several times larger than the server's RAM, and multiple such requests need to be processed in parallel to finish on time. On top of that, we have the tiniest possible budget to aggregate 25 to 30 columns and billions of rows every day, and we still need to cut costs further. This was super helpful.
Please consider doing an updated version of this video, perhaps using Delta Tables as a storage format!
Thanks for the callout, George! Just added it to our content planning calendar.
Really good stuff! A lot of good ideas.
Awesome!!! Please more!
Great video, thanks!
Thank you, Pete!
Thanks for this demo. Can you comment on what role polars may play in this?
Polars is akin to pandas or Spark DataFrames: a way to organize your tables of data, if I'm not mistaken.
It's a pandas alternative.
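Following up on this thread: Polars also has a lazy query engine with streaming execution, which is relevant to the larger-than-RAM situation in the top comment. Here's a minimal sketch, assuming a hypothetical Parquet file data/events.parquet with event_date and amount columns; the exact streaming option varies by Polars version.

```python
import polars as pl

# Lazily scan a Parquet file; nothing is loaded into memory up front.
# "data/events.parquet" and the column names are hypothetical.
lazy = pl.scan_parquet("data/events.parquet")

# Describe the aggregation as a query plan; Polars optimizes it before running.
daily_totals = (
    lazy
    .group_by("event_date")
    .agg(
        pl.col("amount").sum().alias("total_amount"),
        pl.len().alias("row_count"),
    )
)

# streaming=True asks Polars to execute in chunks, so inputs larger than RAM
# can still be aggregated on one machine (newer releases expose this as
# collect(engine="streaming")).
result = daily_totals.collect(streaming=True)
print(result)
```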
Awesome!