I learned that Databricks uses the data lakehouse architecture and the Delta Lake format. In that case, if I create or add files to DBFS, will they be converted into Delta Lake format or stay as they are? And is DBFS the delta lakehouse?
Thank you. Could you please provide a link for the notebooks?
Great content
Thanks
nice one
Thanks
good to learn. Thanks!
Glad it was helpful!
You are a gem. Thanks a lot.
Thank you 👍🏻
Quite helpful. Thank you
Glad it was helpful!
The videos are very good and have excellent explanations. The best videos on Databricks on YouTube. Can you please do videos on Python?
Thanks Jas!
Sure, will create videos on Python
Superb! Could you please also provide the documents? It would be very helpful. Thanks 🙂
Thanks mate for the detailed info. I started with the first video and then continued watching the complete series. Could you please attach the lab work you explain in the videos? Much appreciated for your great work!
Superb explanation Raja 👌 👏 👍
Thank you Sravan!
This is really helpful, thanks
Thank you
Hi sir, where can I get the notebooks for the classes?
Could you please share?
great
Thanks
Very underrated channel, keep up the good work. Besides, I think you should create a Udemy course about Databricks.
Thanks Abdullah! Your comments make me more motivated 👍🏻
Thanks for sharing valuable information 👍
How can I create a button, assign code to it, and have the activity run when the button is clicked?
Thank You
Where can I find the dataset?
Thank you Raja for this awesome video. Could you please create a video explaining how to read key-value pairs from a .properties file to avoid hardcoding in Azure Databricks, plus a log4j implementation that creates a daily abc_ddmmyy.log file for the job, using Scala?
Sure Sujit, will create a video on this requirement
Can you provide a link for these notebook files please?
Thank you Raja Bhai
Thank you, Uncle Dan!
I have a question, please help me.
I have a function defined in Notebook1 as:
def Multiply(a, b):
    Result = a * b
    print(Result)
Now in Notebook2 I am trying to run %run ./Notebook1.
Here I am not sure how to call the function Multiply defined in Notebook1 and pass the 2 parameters a & b in Notebook2. Can this be done?
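For what it's worth, here is a minimal sketch of that pattern (assuming both notebooks sit in the same folder, so the relative path ./Notebook1 resolves; in Databricks the %run magic must be alone in its own cell):

```python
# Cell in Notebook1 -- defines the function
def Multiply(a, b):
    Result = a * b
    print(Result)
    return Result  # returning the value makes it reusable downstream

# Cell in Notebook2 -- first run the magic in its own cell:
#   %run ./Notebook1
# after that, everything defined in Notebook1 is available directly:
Multiply(3, 4)  # prints 12
```

So yes, it can be done: %run executes Notebook1 in the current session, and you then call Multiply like any locally defined function.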
@Raja please respond to the above query
Superb explanation! But please try to provide the files.
When we are writing Spark code in Databricks, there is no need to call from pyspark.sql import SparkSession and SparkSession.builder.appName(...).master(...).getOrCreate(), right?
Yes, it is optional in Databricks but mandatory when working on open-source Spark projects
Your videos are just amazing❤I wish I could have found this channel earlier🥹
Thanks Ashutosh!