Dude, truly awesome content. You are making such videos free on YouTube, which will even make the paid trainers feel ashamed. God bless you.
Thanks Adam. Excellent explanation, crystal clear. These tutorials help a lot of people to enhance their skills and understanding.
My pleasure!
Thank you very much Adam for explaining the scope and how to manage it through Azure Key Vault in such a simple way.
Awesome video, Adam, clear, concise and with a demo! Really helped to absorb the information in an effective and usable way and progress with my work. Thanks! :)
Cheers Anna, great to hear that! :)
Great video! Even after 3 years it is still one of the best on this topic.
The best video for the subject. Great job Adam.
Excellent Adam. A great follow-on from your Excel Databricks tutorial, which I have now evolved using the tutorial above.
Excellent! Thanks!
Easy-to-understand explanation! Looking forward to more tutorials :)
Just a quick question Adam: how do I ingest a (batch) JSON file from Blob Storage into Databricks? I've already mounted it.
Try the spark.read.json function :) Thanks for stopping by! :)
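In case a concrete starting point helps, here is a minimal PySpark sketch for a Databricks notebook; the mount point and file name are hypothetical placeholders:

```python
# Minimal sketch - /mnt/blobstorage and the file name are hypothetical;
# replace them with your own mounted path.
df = spark.read.json("/mnt/blobstorage/input/events.json")

# If the file contains multi-line JSON documents, enable the multiLine option.
df = (spark.read
      .option("multiLine", "true")
      .json("/mnt/blobstorage/input/events.json"))

df.printSchema()
display(df)
```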
Short and simple, nice presentation.
Thank you 🙂
This is so informative. I struggled a lot to understand the scopes and secrets in Databricks at work. Now, I know how it was done. Thank you very much. Can I request you to please also create a course for the AZ-104 Azure Admin Certification exam?
Glad it was helpful! I surely will consider AZ-104 after I'm done with AZ-900.
Quick question - is it good practice to link multiple Databricks scopes to a single Key Vault? I was thinking of creating a single Key Vault in Azure but multiple scopes in Databricks for different purposes. Can I use one Key Vault as the backend for all those scopes?
Thank you very much, finally I found some great content and explanation!!
Glad it helped!
Great! Thanks Adam. I have a question: how can we retrieve secret scope values like certificates and pass them to 3rd party tools like Presto? Is there a way to save these values, as they are redacted? Thanks.
Great video Adam! A very clear explanation of a very useful service. Can you do a video on copying data from SQL to Azure Blob using a dynamic query?
Hey, thanks for watching. For video do you mean by using data factory or databricks?
Excellent content. Really well explained. Great job Adam.
Hi Adam... Can we use this technique to store secrets inside the DBFS file system on a cluster?
Excellent as usual! A big thanks.
Can we create a secret scope that will store Key Vault secrets from the CLI?
Great content Adam! Keep the Azure tutorials coming :)
Hi Adam, is it possible to create a Key Vault-backed secret scope via the API? Doing it manually is not really workable in a CI/CD pipeline.
Hey, unfortunately I don't think there is an API available externally for this.
Looks like they are targeting Q3 2020: github.com/databricks/databricks-cli/issues/262
Thank you for the clear and concise tutorials.
Glad you like them!
@@AdamMarczakYT I watch your videos regularly and they have been immensely educative. Is it possible to do one that demonstrates a streaming scenario integrating Kafka and Azure Databricks?
As always, another great, knowledgeable video... You are always to the point, with clarity in your sessions.
I appreciate that!
Isn't this more relevant when accessing a database? For accessing Blob or ADLS storage I will most likely use a service principal. Thanks a lot in advance for your answer.
And how are you planning to pass the service principal Client ID and Secret :)? Managed Identities are not yet supported for Databricks. But in general, this is relevant for any information that should not be placed directly in the code.
@Adam Marczak - Azure for Everyone Got it, thanks.
@@AdamMarczakYT Can you please do a video on the security considerations and best practices when deploying Databricks in production?
Awesome! Simple & Powerful presentation! thanks for making this video.
Thanks man!
Where is the Databricks application that gets permissions on the Key Vault coming from, @Adam?
Great! Thanks for sharing. When trying to establish a new key, I noticed under Secrets there is a warning: "The operation "List" is not enabled in this key vault's access policy." How do I fix it, please?
Seems like you are missing the 'list' permission in Key Vault. Your principal should have Get and List assigned in the Key Vault access policies; this is shown during the video in case you have trouble finding it. See the sketch below.
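For anyone fixing this from the command line, a minimal Azure CLI sketch might look like this (the vault name and object ID are hypothetical placeholders):

```
# Grant Get and List secret permissions on the vault.
# Vault name and object ID are hypothetical placeholders.
az keyvault set-policy \
  --name my-keyvault \
  --object-id 00000000-0000-0000-0000-000000000000 \
  --secret-permissions get list
```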
Great video as always! So the scope of these secrets would be at the workspace level, correct? Is there a way to limit it to a particular notebook, to limit the exposure of a secret? Thank you
Thanks! You are active as always, glad to see you're back. You can control ACLs for scopes using the Databricks CLI, as in the sketch below. It's not as granular and won't solve all scenarios, but it's helpful. Too bad it requires the premium plan of Databricks.
docs.databricks.com/security/secrets/secret-acl.html
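For reference, a minimal sketch of those ACL commands with the Databricks CLI (scope name and principal are hypothetical placeholders):

```
# Grant a user READ permission on a secret scope (Premium plan required).
databricks secrets put-acl --scope my-scope --principal user@example.com --permission READ

# Inspect the ACLs currently set on the scope.
databricks secrets list-acls --scope my-scope
```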
@@AdamMarczakYT Yeah, I did have a look earlier. For example, if a person has access at the workspace level, can you stop that user from accessing a particular notebook, dataset, or secret scope within it? I believe it would inherit the access from the parent. On the other hand, if we grant a user access to a notebook but not the workspace, then from which pane would they access the notebook... in their own workspace? It's a bit unclear how the access levels are defined in Databricks, like Workspace -> Notebook -> Datasets... and so on.
Unfortunately there is no linkage like this, because those items are not in child-parent relationships. You manage secret scopes separately from managing notebooks/datasets etc.
@@AdamMarczakYT: OK, good to know. On a related access query: can someone with workspace-level admin access be prevented from accessing the Table API? I understand it's a premium feature, so I'm not able to check myself.
Thanks, Adam. You rock, as always.
Thanks again! My pleasure!
Amazing tutorial, congrats.
Glad you think so Fabio!
Hi, I am using the community edition and I am trying to set up authentication with the CLI. The problem is I can't find any token option in my workspace; I only have a password and I don't know how to use it to authenticate with the CLI.
I'm not sure if you can do this on community edition, but I never used it so I won't be of much help.
Can you make a video on connecting Azure SQL DW and Azure SQL Database to Databricks? Or if you have a document, please share...
That's the plan for future tutorials. For now, try these Microsoft tutorials:
Azure SQL Using JDBC docs.microsoft.com/en-us/azure/databricks/data/data-sources/sql-databases?WT.mc_id=AZ-MVP-5003556
Azure SQL Using Apache Spark connector docs.microsoft.com/en-us/azure/databricks/data/data-sources/sql-databases-azure?WT.mc_id=AZ-MVP-5003556
Azure SQL DW (synapse) docs.microsoft.com/en-us/azure/databricks/data/data-sources/azure/synapse-analytics?WT.mc_id=AZ-MVP-5003556
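Until then, a minimal PySpark sketch for reading from Azure SQL over JDBC, with the password pulled from a secret scope, might look like this (server, database, table, user, scope, and key names are all hypothetical placeholders):

```python
# All names below (server, database, table, user, scope, key) are
# hypothetical placeholders - replace them with your own.
jdbc_url = "jdbc:sqlserver://myserver.database.windows.net:1433;database=mydb"

df = (spark.read
      .format("jdbc")
      .option("url", jdbc_url)
      .option("dbtable", "dbo.SalesOrders")
      .option("user", "sqladmin")
      .option("password", dbutils.secrets.get(scope="my-scope", key="sql-password"))
      .load())

display(df)
```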
Hi Adam, is there a way to put the key directly in the command instead of a file?
You can, but you shouldn't expose your keys in notebooks, it's a big security risk! You should always use secret scopes!
@@AdamMarczakYT Not in a notebook, in the CLI?
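If the question is about the CLI rather than notebooks: the legacy Databricks CLI can take the value inline via --string-value. A minimal sketch (scope, key, and value are hypothetical placeholders, and note the value will land in your shell history):

```
# Put a secret value inline instead of via an editor/file.
# Scope, key and value are hypothetical; beware of shell history.
databricks secrets put --scope my-scope --key my-key --string-value "s3cr3t-value"
```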
You are the best Adam. Many thanks!!
My pleasure!
Hi Adam - I am unable to create a scope using AKV. Is it because of the premium Databricks plan?
As far as I remember, AKV-backed scopes are not a premium feature; only granular permissions are. You should be able to set it up.
Hi Adam, I was able to create an AKV-backed scope and I can see the Azure Databricks permission in the AKV access policies with Get, List and other secret privileges, but I was getting a bad request failure when picking up the secrets. Could it be because of firewall or network issues? Can you please let me know if I need to check any other permission (of course I granted the SP Get and List on the same AKV)?
I could not establish the connection between Databricks and AKV after creating the AKV-backed secret scope. I need help here.
I could create the scope. Thank you!
Clear and very instructive. Well done!
Thanks Vinny!
Very useful video. Thanks a lot!
Glad to hear that!
Can you create a video using secrets and scopes to mount ADLS on Databricks?
I've got an ADLS introduction video where I show how to mount it in Databricks. Just replace the hardcoded value with a secret scope lookup and you are good to go :) A rough sketch is below.
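For completeness, a minimal sketch of mounting ADLS Gen2 with a service principal whose credentials come from a secret scope; every name below (scope, keys, tenant, storage account, container, mount point) is a hypothetical placeholder:

```python
# Every name below is a hypothetical placeholder - replace with your own.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id":
        dbutils.secrets.get(scope="my-scope", key="sp-client-id"),
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get(scope="my-scope", key="sp-client-secret"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://mycontainer@mystorageaccount.dfs.core.windows.net/",
    mount_point="/mnt/datalake",
    extra_configs=configs)
```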
Thanks for this video! It helped me get unstuck :)
When I paste the token, it does not work. Any help?
Awesome video Adam
Thanks!
Thank you very much Adam....
My pleasure!
Do more on Databricks
I might :) Thanks!
@@AdamMarczakYT You said might, which means definitely yes :)
Hi Adam - I am facing the below error while trying to add secrets using the CLI:
c:\Users\ databricks secrets put --scope myblobkv --key blobkey
Error: b'{"error_code":"INTERNAL_ERROR","message":""}'
Could you please help me with this?
Unfortunately I can't debug issues over YouTube comments, and I haven't seen this issue before. Try all the commands again step by step as in my video; if you still encounter the same issue then I'd try the Databricks forums. This seems to be a generic issue.
If the secret scope is managed with Azure Key Vault as the backend, you will not be able to add or manage keys with the access token provided from the Databricks user settings. If you followed this video, Adam accessed the Databricks workspace from the CLI using an access token from the user settings in the Databricks UI (refer to the command: databricks configure --token).
You need an Azure AD user token to create an Azure Key Vault-backed secret scope with the Databricks CLI; you can't use a Databricks personal token to manage Key Vault-backed scopes through the CLI. See the sketch below.
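A minimal sketch of that Azure AD token flow with the Azure CLI (the workspace URL is a hypothetical placeholder; the GUID is the well-known Azure Databricks application ID):

```
# Get an Azure AD token for the Azure Databricks resource and feed it
# to the CLI. The workspace URL below is a hypothetical placeholder.
export DATABRICKS_AAD_TOKEN=$(az account get-access-token \
  --resource 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d \
  --query accessToken --output tsv)

databricks configure --aad-token --host https://adb-1234567890123456.7.azuredatabricks.net
```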
@@AdamMarczakYT Thank you Adam for such a good video. Keep posting more on Databricks for us.
You need an Azure AD user token to create an Azure Key Vault-backed secret scope with the Databricks CLI; you cannot use an Azure Databricks personal access token or an Azure AD application token that belongs to a service principal.
How do I write this code? Sarah and John like to pass notes to each other in class. To ensure their secrets don't get exposed, they have developed a secret language to communicate. It is your job to decipher the code and write a program in Python that can decode their secrets.
Excellent. Keep going!
Thank you, I will
Great!
Thanks!
Can we have a nice detailed video on Cosmos DB & Azure Synapse Analytics?
I'm currently running a Cosmos DB project at work, so some detailed lessons-learned videos are definitely coming in the future; so far I've got an introduction video on my channel. For Synapse, I'm waiting for the new Synapse UI to come out before making a video.
@@AdamMarczakYT Can I ask you for one suggestion? I want to go for Synapse & Cosmos DB expertise/certification. Can you guide me, please?
I personally learn from the Microsoft docs and blogs. Can't recommend external training websites because I haven't seen them.
@@AdamMarczakYT Thanks. Will wait for your next video on the above topics.
Hope this can help you: the YouTube video "Best practices for Azure Cosmos DB: Data modeling, Partitioning and RUs - BRK3054" is very good.
th-cam.com/video/bQBeTeYUrR8/w-d-xo.html
Some portions of the video were blurred, but the content is good.
Maybe it was your connection; I checked the entire video and it's a 4K recording. In case you experience issues, let me know at which part! Thanks for watching!
Lovely Adam
Thank you :)
Please upload a video on an Azure DevOps CI/CD pipeline for Azure Data Factory.
I just might. In the meantime I highly recommend this article and the Azure Friday video, as they already cover the topic: docs.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment
Nice explanation. Can you please make a video on Azure Synapse?
I plan to! But currently Synapse has barely any new features over the old SQL DW, so I might wait until they release some.
@@AdamMarczakYT: Agreed... but just an overview of the Synapse architecture, and how it will replace the data warehouse?
Will do, as soon as they release the unified workspace experience; otherwise the entire video would use a user interface that will change completely very soon.
wow
thanks :)