Simple but effective process, and a great explanation as well. Great job
Glad you liked it!
Simply Awesome !! We are learning a lot from your videos.
Thanks Omprakash
Simply super, bro. Awaiting more videos from you on DB-related activities in Databricks
Hi Avinash, thank you. Sure, will post more videos on DB-related activities
What about just being able to access those same tables via an existing Databricks catalog in the Hive metastore structure, for example? Is there a way to do that?
Like in Unix, can I save all these details in one file and call it at the beginning of the scripts?
Yes, we can use a YAML or JSON configuration file to save the details, and at run time Spark can read the configuration file and process accordingly
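For instance, a minimal sketch of reading a JSON configuration file at run time; the path and key names are illustrative:

import json

# Hypothetical config location; adjust to wherever the file lives (DBFS, mounted storage, etc.)
config_path = "/dbfs/config/jdbc_config.json"

# Expected contents, for example:
# {"jdbcHostname": "...", "jdbcDatabase": "...", "jdbcUsername": "...", "jdbcPassword": "..."}
with open(config_path) as f:
    cfg = json.load(f)

# Build the JDBC URL from the configuration instead of hard-coding it
jdbcUrl = (
    f"jdbc:sqlserver://{cfg['jdbcHostname']}:1433;"
    f"database={cfg['jdbcDatabase']}"
)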
How do I write this product table data into blob storage in Parquet format in a Databricks notebook? Please help
We can use the DataFrame writer: df.write.format("parquet").save(location)
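A minimal sketch, assuming the product table is already loaded into df and the blob container is mounted at an illustrative mount point:

# Assumes the blob storage container is mounted, e.g. at /mnt/blobstorage
output_path = "/mnt/blobstorage/product_parquet"

# Write the DataFrame as Parquet; "overwrite" replaces any existing files at the path
df.write.format("parquet").mode("overwrite").save(output_path)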
I want to run a query against SQL and load the result into one variable. Can we do that? Like select max(id) from a SQL table. I am using this id for comparison in the next steps
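A minimal sketch of one way to do this, assuming the table is already loaded into a DataFrame df with an id column (names are illustrative): aggregate, then collect the single value back to the driver.

from pyspark.sql import functions as F

# Aggregate to a one-row DataFrame, then pull the value into a plain Python variable
max_id = df.agg(F.max("id")).collect()[0][0]

# max_id can now be used for comparisons in later steps
print(max_id)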
Awesome, Very nice explanation...
Thank you
Great video! Just one question: I saw you defined jdbcDriver, but I didn't see it used afterwards in jdbcUrl. What is it for?
Yes, I have also observed it. Later, in other videos, I found out that we have to add it as an option when reading data into the DataFrame, like .option("driver", jdbcDriver)
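For reference, a minimal sketch of the JDBC read with the driver passed explicitly; jdbcUrl, jdbcUsername and jdbcPassword are assumed to be defined as in the video, and the table name is illustrative:

jdbcDriver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"

df = (
    spark.read.format("jdbc")
    .option("url", jdbcUrl)                  # connection string built earlier
    .option("dbtable", "SalesLT.Product")    # illustrative table name
    .option("user", jdbcUsername)
    .option("password", jdbcPassword)
    .option("driver", jdbcDriver)            # avoids "No suitable driver"
    .load()
)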
Can we specify the type of authentication while connecting? What if we only have Azure Active Directory Password authentication? How do we specify that?
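One common approach (an assumption, not shown in the video) is the authentication connection property of the Microsoft SQL Server JDBC driver; the hostname, database, user and password values here are placeholders, and the driver's Azure AD dependencies must be available on the cluster:

df = (
    spark.read.format("jdbc")
    .option("url", f"jdbc:sqlserver://{jdbcHostname}:1433;database={jdbcDatabase}")
    .option("dbtable", "SalesLT.Product")                    # illustrative table name
    .option("user", "someone@yourtenant.onmicrosoft.com")    # AAD user principal name
    .option("password", aadPassword)
    .option("authentication", "ActiveDirectoryPassword")     # AAD password mode
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)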
Nice explanation bro 👍
Hi Raja, I am following this tutorial step by step, but I got an error while running the 2nd cell, getting the product table. The error is "java.sql.SQLException: No suitable driver". Can you please help in this case?
Now I got it; something was wrong in preparing the connection. I can connect and get the data from Azure SQL Server. Thanks Raja.
Glad to hear you fixed the issue 👍🏻
Could you help me establish the connection string using Azure Active Directory authentication mode?
Thanks worked perfectly.
Great
Sir, can we also create, view, alter, and run stored procedures from Databricks?
Hi Arnab, stored procedures can't be created in Databricks. Views can be created and altered as well
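For example, a temporary view over the JDBC-loaded DataFrame can be created and replaced with standard Spark SQL; the column name is illustrative:

# Register the JDBC DataFrame, then define a view on top of it
df.createOrReplaceTempView("product_raw")

spark.sql("""
    CREATE OR REPLACE TEMP VIEW expensive_products AS
    SELECT *
    FROM product_raw
    WHERE ListPrice > 100
""")

spark.sql("SELECT * FROM expensive_products").show(5)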
I really appreciate this video thank you🙏
Thanks Prathap! Glad you found it useful
Can I join for a paid project? Or else, how can I contact you?
Hello sir, while importing data, how do we come to know which data is modified and which is the latest? I mean, how do we handle any updated data? Please reply
Super nice video
Thank you
What if the data is huge, like 100 GB? Is it still recommended?
A JDBC connection has performance issues while handling huge amounts of data, but there are options to improve the performance which can be applied depending on the use case
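For instance, a sketch of a partitioned JDBC read with a larger fetch size; the table name, partition column and bounds are illustrative and should match the actual data:

df = (
    spark.read.format("jdbc")
    .option("url", jdbcUrl)                    # connection details as before
    .option("dbtable", "SalesLT.Product")      # illustrative table name
    .option("user", jdbcUsername)
    .option("password", jdbcPassword)
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .option("partitionColumn", "ProductID")    # numeric column to split the read on
    .option("lowerBound", "1")                 # min value of the partition column
    .option("upperBound", "100000")            # max value of the partition column
    .option("numPartitions", "8")              # parallel JDBC connections
    .option("fetchsize", "10000")              # rows fetched per network round trip
    .load()
)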
How do I get that IP address? For me it was not visible
You didn't use the jdbcDriver. What is the purpose of having jdbcDriver?
Why do you say I didn't use the JDBC driver?
Look at 7:30 in the video
@rajasdataengineering7585 I meant that you didn't pass the jdbcDriver value into the JDBC URL
How do I get that IP address? I did not find it while logging in. Please can you say?
You can get it from the command prompt using the ipconfig command
I want to connect to my localhost but the connection is getting refused. Can you please make a video on it?
Sir, how can we connect using secrets from Key Vault?
We need to create a secret scope in Databricks first to set up the integration between Key Vault and Databricks
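Once the Key Vault-backed secret scope exists, the secret can be read at run time; the scope and key names here are placeholders:

# Read the password at run time instead of hard-coding it;
# "kv-scope" and "sql-password" are placeholder names for the scope and secret
jdbcPassword = dbutils.secrets.get(scope="kv-scope", key="sql-password")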
@rajasdataengineering7585 Thank you, will check and do 💪
How do I get multiple tables from Azure SQL into a Databricks notebook?
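A sketch of one simple approach (not from the video): loop over the table names and load each into its own DataFrame; jdbcUrl, jdbcUsername and jdbcPassword are assumed to be defined already, and the table list is illustrative:

tables = ["SalesLT.Product", "SalesLT.Customer", "SalesLT.SalesOrderHeader"]

dfs = {}
for table in tables:
    # One JDBC read per table, reusing the same connection details
    dfs[table] = (
        spark.read.format("jdbc")
        .option("url", jdbcUrl)
        .option("dbtable", table)
        .option("user", jdbcUsername)
        .option("password", jdbcPassword)
        .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
        .load()
    )

dfs["SalesLT.Product"].show(5)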
Sir, is there any way to hide the password from being exposed in the code?
Yes Arnab, we can use Azure Key Vault
Also, we can use a Databricks secret scope
Yes, we can use a Databricks secret scope
great video
Glad you enjoyed it
Simple and effective
Thank you so much ❤Sir....
Most welcome! Hope you find it useful
@@rajasdataengineering7585 Yes 💯
@rajasdataengineering7585 Sir, getting "No suitable driver" on running the df
Hard-coding the password in the code is not recommended. Can we get the password from Azure Key Vault? Can you please let us know the steps for that?
We need to integrate Azure Key Vault with Databricks by creating a secret scope
Simple and superb
Can you make a video on how to create an account in Databricks Community Edition for free?