Take my courses at mlnow.ai/!
Thanks so much Greg, great job!
Paying thousands for a master's at university, and people like you consistently pump out tutorials of way better quality. It's madness.
Yup that's how it goes! Haha I'm really glad to have helped 😃
Many thanks Greg for opening up a new frontier!
I had no idea Google Colab was so generous and allowed installation and practicing of Spark.
Your tutorial packs an astonishing amount of information, that too in an engaging way, in a very short timeframe.
You are now my Guru for Spark.
Yup, it does! 😃
Don’t know if you’ll see this but I got into data engineering thru my company. They provided me the opportunity to become a software engineer, I was previously a cable installer/field tech. Although they provided this opportunity, I’ve still had to do much of my learning on my own. Your channel is amazing. Videos like these make all the difference. I really appreciate you making content where you’re walking thru the code. Once I get this under my belt I plan on creating content as well. Thank you. 🙏🏾
Oh that's so great to hear!! Thank you for the kind words and I wish you all the best Darrien!!
🙌🙌
I appreciate you and your videos so much. In my data science classes we're expected to teach ourselves PySpark, DataFrames, Pandas and a bunch of other technologies, and you've made the experience much more manageable.
Well I'm super happy to hear that Jacob, thanks for the kind words!
These introductory videos are pure gold; thanks for sharing.
Thank you greatly for the kind words, and for your support! It means a lot. 😊
Thanks
Wow, that was very nice of you Joseph! Thank you :)
In just 17 minutes I've learnt so much. Thanks!
Perfect, really glad to hear it :)
This was really amazing. Waiting for more uploads on pyspark.
Awesome! Did you catch the other 15 minute long one?
Yep, I have. Followed along both the videos.
@@gauravraichandani7722 okay awesome!
This video deserves 1M views
Haha that would definitely be preferred, thanks so much for the kind words I really appreciate it!
Bro you have explained it so well.. keep going
Thanks, great to hear!
Thank you so much. This pretty much covers everything you need to get started with PySpark. I wish you had included merging as well.
Excellent presentation and to the point !!!
I was waiting for this one!
I've wanted to make this for a long time since the PySpark RDD video did so well. Enjoy!
Thanks!
From Greg to Greg, thank you so much!
Great video. Simple yet effective to comprehend.
I'm very glad to hear that Faizal, and I greatly appreciate your kind words!
This is amazing!! Thanks Greg.
Thanks for sharing your knowledge. Great video.
You're very welcome and glad to hear it!
Amazing content...please prepare more like these.. 👍🏻
Awesome. Please keep up the good work. Please make more videos in spark. Thank you
Awesome, thank you!
Could you please suggest any good video tutorials for PySpark for a newbie?
@@noushinbehboudi5694 Isn't that this one?
@@GregHogg I started pyspark with your videos. But I only found 2 videos in your channel. Are you going to upload more?
@@noushinbehboudi5694 Makes sense. Not for a while unfortunately, I would recommend doing the Databricks specialization on Coursera :)
This is so amazing 😍😍
Thanks so much!!
Thank you so much !!! Always great contents
You're super welcome. Really glad to hear that.
This is amazing! Thanks Greg : ]
Awesome! You're very welcome 😄
thanks for the vids bro. really curious what you think of lakesail's pysail. built on Rust and supposedly 4x the speed and 90% less hardware cost than Spark. pretty recent project but would love some perspective on it
Thank you for the video.
I have a problem -
When I convert a column from string to int and then run printSchema, it still shows string and not int.
Is there a better way to convert a string column to int in PySpark and make it a permanent change?
I use data uploaded locally, i.e. from my computer.
Does this happen only with locally uploaded files? Will the conversion work smoothly when operating on online databases, i.e. through servers?
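For what it's worth: DataFrames are immutable, so cast() returns a new DataFrame that you have to assign back; the change is never in place. A minimal sketch, where the "Age" column and the local CSV path are placeholders:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.getOrCreate()

# "train.csv" and the "Age" column are assumptions for illustration
df = spark.read.csv("train.csv", header=True)

# cast() does not modify df in place; reassigning keeps the change
df = df.withColumn("Age", col("Age").cast("int"))
df.printSchema()  # Age should now show up as integer
```

This behaves the same whether the file comes from your machine or a remote source, so it isn't specific to locally uploaded files.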
You're awesome man!
No you!
Can you expand more on the sql bit along with joins?
Not in the comments (not properly, anyway); a join merges two tables by matching rows on a common column. In PySpark, a join is essentially the same thing as it is in normal SQL. You'd have to learn what a join is first :)
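A minimal sketch of what that looks like in PySpark, with made-up customers/orders data just to show the mechanics:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical tables purely for illustration
customers = spark.createDataFrame(
    [(1, "Alice"), (2, "Bob")], ["customer_id", "name"]
)
orders = spark.createDataFrame(
    [(1, 9.99), (1, 4.50), (2, 12.00)], ["customer_id", "amount"]
)

# Inner join on the shared column - same idea as SQL's JOIN ... ON
customers.join(orders, on="customer_id", how="inner").show()
```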
Hi, Greg - just came across your page - great tutorials. I'm getting the following error:
106
107 if not os.path.isfile(conn_info_file):
--> 108 raise Exception("Java gateway process exited before sending its port number")
109
110 with open(conn_info_file, "rb") as info:
Any advice? I tried with the latest Hadoop distro without success.
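No way to be sure from the snippet alone, but that particular exception usually means PySpark could not start a Java process, most often because JAVA_HOME is missing or points at the wrong place. A hedged sketch (the JDK path below is an assumption and will differ per machine):

```python
import os

# Point JAVA_HOME at an installed JDK before creating the session;
# this path is only an example
os.environ["JAVA_HOME"] = "/usr/lib/jvm/java-8-openjdk-amd64"

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
```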
how can we create a python DataFrame from an already existing table?
You'll need to read in the file using one of Spark's read functions
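A short sketch of both routes, assuming a CSV on disk and a table Spark already knows about (the file path and table name are placeholders):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# From a file on disk
df_from_file = spark.read.csv("train.csv", header=True, inferSchema=True)

# From a table already registered with Spark (e.g. a temp view or catalog table)
df_from_table = spark.table("titanic")   # equivalent to spark.sql("SELECT * FROM titanic")
```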
is this the same as Apache Spark?
I would like to hear your opinion on Ponder. Considering that you can now work with Ponder similarly to how you work with Spark, do you believe it is still necessary to learn PySpark? I'm interested in your perspective on this matter, and if you are aware of any downsides or differences between Ponder and Spark.
Thank you so much, sir. I have a problem converting a Spark DataFrame to a pandas DataFrame, because I have a large amount of data... How can I do it?
Isn't there a .toPandas() function?
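There is - .toPandas() - but it collects every row onto the driver, which is exactly what hurts with a large dataset. A sketch of shrinking the data first:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.range(10_000_000)   # stand-in for a large DataFrame

# toPandas() pulls everything to the driver, so reduce the data first
sample_pdf = df.sample(fraction=0.01).toPandas()   # a 1% random sample
head_pdf = df.limit(1_000).toPandas()              # or just the first rows
```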
I downloaded the train.csv file to my laptop's local hard drive, and tried to read it with titanic_df = spark.read.csv("c:\UserFiles\My Data\train.csv", header=True, inferSchema=True), but got an error message. Do you know what I did wrong?
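The error text isn't shown, but one common culprit on Windows is the backslashes being treated as escape sequences (\U in \UserFiles is invalid in a normal Python string). A raw string or forward slashes usually sidesteps that:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Raw string keeps the backslashes literal
titanic_df = spark.read.csv(r"c:\UserFiles\My Data\train.csv",
                            header=True, inferSchema=True)

# Forward slashes work on Windows too
titanic_df = spark.read.csv("c:/UserFiles/My Data/train.csv",
                            header=True, inferSchema=True)
```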
Hi, how do I create a database with PySpark?
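For reference, a minimal sketch of creating a database through Spark SQL (the database and table names are made up for illustration):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Create a database and a table inside it; names are just examples
spark.sql("CREATE DATABASE IF NOT EXISTS demo_db")
spark.sql("CREATE TABLE IF NOT EXISTS demo_db.people (name STRING, age INT) USING parquet")
spark.sql("SHOW DATABASES").show()
```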
hey Hogg, when I try to extract the sum of sales by grouping by state from the DataFrame, it gives unnecessary floating values. If the sum is 150.0, it gives something like 150.856743. Can you explain this?
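Hard to say without seeing the data, but if the goal is just tidier output, rounding the aggregate is a common fix. A sketch with hypothetical data and column names:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Made-up data and column names, just to show the pattern
df = spark.createDataFrame(
    [("NY", 50.25), ("NY", 99.75), ("CA", 10.10)], ["State", "Sales"]
)

df.groupBy("State") \
  .agg(F.round(F.sum("Sales"), 2).alias("total_sales")) \
  .show()
```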
Small annoyance, but does anyone know why, when I run something like spark.sql('select * from Movies'), for example, it gives me the datatypes instead of displaying the actual table data?
Empty table maybe?
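For what it's worth, spark.sql() returns a DataFrame, and printing a DataFrame only shows its column types; you need an action like .show() to render the rows. A small self-contained sketch (the Movies view here is a stand-in):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Register a tiny stand-in "Movies" view so the query has something to hit
spark.createDataFrame([("Alien", 1979)], ["title", "year"]) \
     .createOrReplaceTempView("Movies")

result = spark.sql("SELECT * FROM Movies")
print(result)    # shows the schema, e.g. DataFrame[title: string, year: bigint]
result.show()    # actually displays the table data
```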
Are there more than the 2 videos on PySpark? Thanks and great work
That's all I've got, sorry!
Awesome
so it's like a worse SQL?
I am getting an error message ' E: Unable to locate package open-jdk-8-jdk-headless'. What could be the reason plz?
I think pip install PySpark is enough to install
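It may also just be the package name: on Debian/Ubuntu (which Colab runs) the JDK package is usually spelled openjdk-8-jdk-headless, with no hyphen after "open". A hedged sketch of a Colab cell:

```python
# Colab cell - note the spelling of the package name
!apt-get install -y openjdk-8-jdk-headless -qq > /dev/null
!pip install pyspark
```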
bro what is the difference between .limit(3) and .show(3)? I tried it on Databricks using Python on Spark 3.0.1. The show command displayed the CSV DataFrame rows, but the limit command didn't display anything.
I don't know, sorry.
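For anyone else wondering: .limit(n) is a transformation that returns a new, smaller DataFrame and prints nothing by itself, while .show(n) is an action that actually displays rows. A small sketch:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.range(10)    # stand-in DataFrame

df.show(3)              # action: prints the first 3 rows right away
small = df.limit(3)     # transformation: returns a new DataFrame, prints nothing
small.show()            # an action is still needed to see the limited rows
```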
i need to learn "the rest" of pyspark sql **fast** (& hardly know any sql at all). suggestions??? what are some good resources???
Honestly, the documentation is great.
what is the difference between filter and where?
Nothing, they are the same :)
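A quick illustration that both calls behave identically (the data and column names are made up):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([("Alice", 34), ("Bob", 25)], ["Name", "Age"])

# where() is simply an alias for filter(); both return the same rows
df.filter(df.Age > 30).show()
df.where(df.Age > 30).show()
```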
If we can use spark.sql, then we don't need DataFrame functions like filter, agg, etc.?
It's essentially a different way of doing exactly the same thing. Sometimes I mix and match depending on how comfortable I am with what I'm trying to do
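A sketch of the same aggregation done both ways, with hypothetical data, to show they're interchangeable:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([("NY", 10.0), ("NY", 5.0), ("CA", 7.5)], ["State", "Sales"])

# DataFrame API
df.groupBy("State").agg(F.sum("Sales").alias("total")).show()

# The same query through spark.sql
df.createOrReplaceTempView("sales")
spark.sql("SELECT State, SUM(Sales) AS total FROM sales GROUP BY State").show()
```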
Thanks Greg, it was clean and straightforward - I like it a lot. Could you suggest a course to learn Spark? In our company we are trying to build a data lake on Hadoop using Hive. We have a lot of complex stored procedures on an RDBMS, and I will be migrating all that logic into the data lake. Spark would be a great tool to accomplish this. I would really appreciate it if you could suggest some online courses.
No problem! Check out some of the big data courses on Coursera.
For installing pyspark, why didn't you just do `pip install pyspark`?
I'm trying to use the pandas API that was introduced in 3.2 with this method, but even if I wget and unzip the Spark 3.2 tar file I can't import the module.
Cool tutorial though!
That's a great question. I actually didn't know it was this easy in Colab at the time. Thanks!
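For the pandas API specifically, pip-installing PySpark 3.2+ and importing pyspark.pandas is usually the simplest route (pandas, pyarrow, and a Java runtime still need to be available). A sketch:

```python
# after: pip install "pyspark>=3.2"
import pyspark.pandas as ps

# pandas-on-Spark DataFrame with the familiar pandas-style API
psdf = ps.DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})
print(psdf.describe())
```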
Can anyone give the command to replace null values in the age column with the average age for each gender?
This is a great exercise 😁
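One way to do that exercise, using a window over each gender; the Age and Sex column names and the tiny dataset are assumptions:

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()

df = spark.createDataFrame(
    [("male", 22.0), ("male", None), ("female", 38.0), ("female", None)],
    ["Sex", "Age"],
)

# Average age within each gender, computed over a window partitioned by Sex
avg_age = F.avg("Age").over(Window.partitionBy("Sex"))

# coalesce keeps the original Age where present and falls back to the gender average
df.withColumn("Age", F.coalesce("Age", avg_age)).show()
```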
just wow.
😄
Awesome ! (..)
Thank you!!
4:54 funny voice crack LOL
You're right, that is pretty funny 😂
@@GregHogg the video was pretty amazing. Thanks.
@@yashsvidixit7169 Really glad to hear that, and thanks a bunch!