Thank you. This helped me understand how to make it available to the services/people.
Fantastic!
Spent days searching for this on Google; this is really helpful.
Great! Glad it helped.
It's really a good use case. We had a similar use case and achieved the same after referring to your videos. Thanks a lot.
Thanks, I really like this video too. It is a very important use case, and one almost every company I worked at needed. It made my day that you are able to use it!
Thank you for the info. I normally do this through PowerShell, but this is a very quick and simple way for one-off tasks. The Storage Explorer is pretty neat!
You're welcome!
Great video, really helped me understand SAS for the exam, thank you
You're very welcome!
Explained perfectly with examples. Awesome
Thanks and glad it helped.
Thank you for the demo. It is a great demo.
My pleasure!
Hi @CoopmanGreg, I followed your instructions. However, when I shared the SAS URLs with users, they were not able to access them.
AuthenticationFailed
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. RequestId:6593c8cd-c01e-006e-0d17-4232df000000 Time:2022-03-27T20:19:22.1895723Z
Signature did not match. String to sign used was /blob/dpstorageaccnt/$root dpstoragecontainer1-read 2020-10-02 c
How do I fix it? Thank you.
@dianeng1030 It might be a firewall problem. If you have "Selected networks" on, you might have to add the users' IP addresses to the "Add IP ranges to allow access from the internet or your on-premises networks" section, or change the setting to allow all networks to connect. You can find these settings under Networking for your storage account. Be careful: check with your company's IT networking dept and make sure you can do this safely. If that does not work, double-check how I did it (see th-cam.com/video/R_YrhDD47vk/w-d-xo.html ); maybe you made a mistake there.
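If you want to sanity-check the ranges before touching the portal, here is a minimal sketch using only the Python standard library. The ranges and addresses below are made-up examples, not rules from any real storage account:

```python
import ipaddress

def ip_allowed(client_ip, allowed_ranges):
    """Return True if client_ip falls inside any of the CIDR ranges
    you plan to enter in the storage account's firewall IP list."""
    ip = ipaddress.ip_address(client_ip)
    return any(ip in ipaddress.ip_network(r) for r in allowed_ranges)

# Hypothetical ranges, standing in for the "Add IP ranges" entries:
ranges = ["203.0.113.0/24", "198.51.100.7/32"]
print(ip_allowed("203.0.113.42", ranges))  # True: inside the /24
print(ip_allowed("192.0.2.1", ranges))     # False: not listed
```

Run it against the IP each user reports (e.g. from a "what is my IP" service) before adding rules, so you only open the ranges you actually need.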
Love that you shared this!
Glad you liked it!
I sure learned something; a great, practical example. Thx!
Glad to hear it!
Thanks for making this video. Nicely explained and very useful.😀
Glad it was helpful!
Thank you so much, Greg. This was so informative. It helped me. :)
Thanks, glad it helped.
Very useful. Thanks!
You're welcome!
Nice and clear voice
Thanks Ashish!
Really very good. I need some help: how can I ensure that only B2C user logins can download files? Basically, how can I integrate Azure B2C with an Azure Blob Storage container?
I have a use case, can you help? All client containers are in one storage account. One client should not be able to see the other containers: not even listed, and no content. Even though you mentioned it, you have not explained how the method in the video restricts this.
Sorry for the late reply. In the video I am only setting up a SAS for London, as you can see when I right-click on the London container at 18:52 and then go into setting up the SAS URLs. If I were doing this in real life, I would then proceed to select each separate city (like Fort Lauderdale) and set up a new SAS for it (which would create different URLs). I only send the SAS URLs to the respective tennis centers, and they would not be able to guess the others, nor should they.
Thanks for the very helpful tutorial. Do you have a link to any tutorial, documentation, etc, that does what you demonstrate in this video for GEN2 storage? You mentioned Powershell and I'd like to get this working. I have external partners dropping files into our storage account. If I do figure this out all the way through, then I may have to give each partner a separate ADLS. Thanks.
I understand. Maybe this link will help, sauget-ch.fr/en/2020/11/create-a-shared-access-signature-on-an-adls-gen2/
Hi Greg, with just upload (R, C, L) permissions I am able to download the file, but it should not be possible. Any idea why this is happening?
Sorry, it worked ok for me in the demos.
Greg, this was very well done and very helpful, thank you very much. A question for you from a newbie: how do partners that have not used Azure before upload their information? Do they have to download Azure Storage Explorer?
Thanks, Mike, for the kind words. As to your question: yes, they can download and use Azure Storage Explorer, which would be the manual way of doing the uploads. But the AZCOPY utility can also be run from their local computer (see my video, th-cam.com/video/I0s1H9VAcPA/w-d-xo.html ). They can place the AZCOPY command into a DOS batch file and run it on a schedule using Windows Task Scheduler.
The code in the batch file would look something like this (it uploads all the files within the folder):
azcopy sync "d:\testupload" "https://localwork.blob.core.windows.net/abunchofstuff/?sv=2019-10-10&st=2021-06-07T15%%3A58%%3A00Z&se=2023-06-09T15%%3A58%%3A00Z&sr=c&sp=racwdxl&sig=3mKibZBZqgCJO6vfY%%2FcxoqudAY1Oi%%2FvHUNipSM418Y%%3D"
(The doubled %% signs escape the literal % characters inside a batch file, and azcopy needs the full https:// URL.)
Hope this helps.
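If anyone prefers scripting this instead of a .bat file, here is a minimal Python sketch. The folder and SAS URL below are placeholders, not a real account; it builds the azcopy command as an argument list, which sidesteps the %% escaping entirely, since that doubling is only a batch-file quirk:

```python
import subprocess

def build_azcopy_sync(source_dir, container_sas_url):
    """Build the azcopy sync command as an argument list.
    Passing a list (not a shell string) means the SAS token keeps
    its single % escapes; %% doubling is only needed in .bat files."""
    return ["azcopy", "sync", source_dir, container_sas_url]

# Placeholder folder and SAS URL for illustration:
cmd = build_azcopy_sync(
    r"d:\testupload",
    "https://youraccount.blob.core.windows.net/yourcontainer/?sv=2019-10-10&sig=FAKE",
)
print(" ".join(cmd[:2]))  # prints: azcopy sync
# Actually running it requires azcopy on the PATH:
# subprocess.run(cmd, check=True)
```

A script like this can be scheduled with Windows Task Scheduler the same way a batch file can.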
Can you put up the same video for ADLS Gen2, step by step, and then access it through Databricks, restricting user access in a Databricks notebook using Python? Thanks in advance.
I made a video on Databricks that uses ADLS Gen2 blob storage; see th-cam.com/video/xsesrHIW9FQ/w-d-xo.html . Outside of that video, I do not plan on making any further Databricks videos at this time. But thanks for your suggestion.
My concern is that it worked perfectly within the Azure Storage Explorer emulator. When I gave the SAS to an actual person and they used it, I got an authentication failed error as follows:
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
Signature did not match. String to sign used was cl 2023-03-10T03:20:48Z 2023-03-17T02:20:48Z...
I removed sensitive info. Any idea how to address this?
Did you get the error when you tested it as yourself?
I know that error and have experienced it myself many times in the past. It was not a hard fix, but right now I cannot remember what I did to get around it, and I am not able to test it right now either. Maybe the IP address she is using is not whitelisted on the storage account. I will try to get back to you if you don't get the answer first, but it might be a while. Sorry.
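One quick check before blaming the network: parse the SAS URL the user actually received and confirm the token survived the trip, since email clients and chat apps sometimes re-encode or truncate it, which produces exactly this "Signature did not match" error. A minimal sketch using only the Python standard library; the URL below is invented, not a real account or signature:

```python
from urllib.parse import urlsplit, parse_qs
from datetime import datetime, timezone

def check_sas_url(sas_url):
    """Report the SAS token's query fields plus two sanity checks:
    the signature is present, and the token has not expired."""
    fields = {k: v[0] for k, v in parse_qs(urlsplit(sas_url).query).items()}
    report = dict(fields)
    report["has_signature"] = "sig" in fields
    if "se" in fields:  # 'se' = expiry time, ISO 8601 in UTC
        expiry = datetime.fromisoformat(fields["se"].replace("Z", "+00:00"))
        report["expired"] = expiry < datetime.now(timezone.utc)
    return report

# Invented SAS URL for illustration:
url = ("https://example.blob.core.windows.net/london"
       "?sv=2020-10-02&se=2030-01-01T00:00:00Z&sr=c&sp=rl&sig=FAKESIG")
report = check_sas_url(url)
print(report["sp"], report["has_signature"], report["expired"])
# prints: rl True False
```

Run it on the exact string the user pasted back to you; if "sig" is missing, the permissions ("sp") look wrong, or "expired" is True, the problem is the token itself rather than the firewall.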
@CoopmanGreg Thank you for your response, Greg. This is making me learn IAM and RBAC. I just assigned myself role access in AD 2 minutes ago. I am trying to access it as we speak.
@CoopmanGreg No worries, Greg. Thank you. I will whitelist my IP address and see what happens.
Nope, I couldn't get it to work. Any direction you can provide when you have time will really be helpful.
Great video! Thanks. How do you do that in PowerShell, or through programming like Python?
Thanks for the feedback. Sorry, I have not done it in any script, but if I do, I will try to make it into a video.
What are the best alternatives for publishing SQL data without Excel and a SAS token?
Thanks!
Hi Greg,
Very good and informative video...
I followed your steps and done everything as you explained...
The only problem I am facing is: when I share the 3 URLs with another person, they are not able to access them. How do I fix that? What am I missing, or which rights do I need to assign them, and from where?
Thanks. What response and/or error message are they getting in their browser windows? And does it work for you when you try the URLs in question?
I just responded to someone else with this problem, "Leon L", in this video's comments section. I am not sure if this solves the problem, but check it out under "Leon L". In general, you should get the customers' IP addresses they will be using the URLs from, and then enter them on the IP access list for the respective storage account. For more details, see the thread I mentioned.