Nice, crystal clear explanation. Keep posting the same kind of simplified videos.
Once again, your videos are saving the day, and saving a lot of work for me. Thank you!
Please provide more videos on actual data ingestion using Databricks, just like your ADF videos.
thank you for this super simplified explanation 👍
Thank you ☺️
Very well explained Sir ❤
Thanks for the video. When you copied from one folder to the data folder, it created a data file. Could you please explain what that data file is that appeared when you did a refresh, and is there any way to avoid creating it?
I have a doubt. You have successfully mounted a folder inside the container. But can you please tell me how to mount the input folder as well, so that the files inside it are also mounted to DBFS? I tried adding it in the wasb path with a dot, but the code takes a long time to run. Could you please help me, sir?
Neat presentation. Thank you.
Hi Maheer, I am watching your Databricks video series, it is just amazing. Could you please let me know how many videos are left to upload?
Thank you brother ❤
I have 2 questions:
1. Is there any other way to access Azure blob storage?
2. Is there any performance issue with using mount functionality?
Thank you so much for making this wonderful video. I have a small question. The mount_point that you selected as 'mnt/blobstorage', can it be anything we wish? Can we give it any other name, say 'mnt/temp_storage'? Is there any format that we need to follow?
Yes, you can give it any name.
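To illustrate the reply above, here is a minimal sketch of choosing a custom mount point name. The storage account, container, and mount point below are made-up placeholders, not values from the video, and the actual `dbutils.fs.mount` call (Databricks-only) is shown as a comment:

```python
# Hypothetical names for illustration only.
storage_account = "mystorageacct"
container = "input"
mount_point = "/mnt/temp_storage"  # the name after /mnt is entirely your choice

# The wasbs source URL the mount would point at:
source = f"wasbs://{container}@{storage_account}.blob.core.windows.net"

# Inside a Databricks notebook you would then run (sketch, not runnable here):
# dbutils.fs.mount(
#     source=source,
#     mount_point=mount_point,
#     extra_configs={f"fs.azure.account.key.{storage_account}.blob.core.windows.net": "<access-key>"},
# )
print(source)
```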
Please share the code snippet for ease of use, and thanks for your video.
How can I get that employee.csv file into a dataframe now?
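One way to do this (a hedged sketch: it assumes a Databricks notebook where `spark` is predefined and the `/mnt/blobstorage` mount from the video exists; the file path is illustrative):

```python
# Hypothetical path: employee.csv sitting under the mount point from the video.
mount_point = "/mnt/blobstorage"
path = f"{mount_point}/employee.csv"

# In a Databricks notebook you would then read it into a DataFrame (sketch):
# df = spark.read.option("header", True).option("inferSchema", True).csv(path)
# df.show()
print(path)
```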
Hi, I have one doubt: in your blob storage account, how did an input folder get created? As per my knowledge, blob storage doesn't have a hierarchical structure, right?
We can create folders in blob storage too, but the thing is, when you delete the outer folder, the inner folders and files get deleted as well. It's just a path, not an actual hierarchical file system.
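The flat-namespace behaviour described in the reply above can be illustrated in plain Python (a sketch with made-up blob names, not Azure SDK code):

```python
# Blob storage (without hierarchical namespace) is flat: "folders" are just
# shared prefixes inside blob names. These blob names are made up.
blobs = ["input/employee.csv", "input/reports/jan.csv", "output/result.csv"]

# "Deleting the folder input/" really means deleting every blob whose name
# starts with that prefix -- the inner "folders" and files vanish with it.
remaining = [b for b in blobs if not b.startswith("input/")]
print(remaining)  # -> ['output/result.csv']
```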
How to mount a file share and not a container? I am trying to mount a file share, but it's not working.
Does that mount folder need to start with /mnt, or can I give something like /stg?
How can I copy from one mount point to another mount point using a copy command, with both mount points in different Azure subscriptions and different storage accounts? Please help.
There are multiple ways you can accomplish that:
1. Use the %sh cp command.
2. Use the dbutils copy command.
3. Use the azcopy utility binary.
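Option 2 from the list above can be sketched like this (both mount point names are hypothetical, and `dbutils` exists only inside Databricks notebooks; each mount can point at a storage account in a different subscription, since a mount only stores a URL plus credentials):

```python
# Hypothetical mounts backed by two different storage accounts.
src = "/mnt/source_storage/input/"    # mount backed by storage account A
dst = "/mnt/target_storage/archive/"  # mount backed by storage account B

# In a Databricks notebook (sketch, not runnable here):
# dbutils.fs.cp(src, dst, recurse=True)
#
# Option 1 works on the local /dbfs view of the same paths, e.g. in a %sh cell:
#   cp -r /dbfs/mnt/source_storage/input /dbfs/mnt/target_storage/archive
print(src, dst)
```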
How are we making sure data in transit is secured?
super explanation to the point thank u
Welcome 😀
How to mount using a Databricks managed identity?
If you get an error, try unmounting the mount point and try again.
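That unmount-and-retry pattern can be sketched like this (the mount point name is a placeholder, and `dbutils` is available only in Databricks notebooks, so the calls are shown as comments):

```python
# Hypothetical mount point name.
mount_point = "/mnt/blobstorage"

# Sketch of the retry pattern inside a Databricks notebook:
# try:
#     dbutils.fs.unmount(mount_point)  # ignore "not mounted" errors on a first run
# except Exception:
#     pass
# dbutils.fs.mount(source=..., mount_point=mount_point, extra_configs=...)
print(mount_point)
```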
Hi Maheer, I have a question: if the blob storage contains huge data, can we still mount it to DBFS? If so, what is the storage limit for DBFS?
The mount point in DBFS is just a reference to the actual storage. No data is actually transferred to DBFS when a mount is created.
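You can see this for yourself: inside Databricks, `dbutils.fs.mounts()` lists each mount as a DBFS path paired with the remote source URL it points at, confirming that no data lives in DBFS itself. A sketch (the loop is Databricks-only, and the example names below are hypothetical):

```python
# In a Databricks notebook (sketch, not runnable here):
# for m in dbutils.fs.mounts():
#     print(m.mountPoint, "->", m.source)
#
# Example output shape, with made-up names:
mount_point = "/mnt/blobstorage"
source = "wasbs://input@mystorageacct.blob.core.windows.net"
print(mount_point, "->", source)
```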
Please show how to mount a file share, and not a container.
Thank you so much ❤
nice video. thanks
🔥
Hi @WafaStudies sir, I would like some quick info: how do we mount a Windows shared drive path to Databricks? I need to copy a directory from the shared path to DBFS and process the data from there. Please help me fix my problem, or let me know your number and I will reach out to you. This is a bit urgent.