13min in, already obvious this is a fantastic video. Thanks for doing this!
Thank you @BaGua79. We are happy you enjoyed it. Please help us promote it by sharing it with your colleagues and friends using your social accounts. Have a great day.
Very informative session, sir.
Thank you @Arsalan. We are happy you enjoyed it. Please help us promote it by sharing it with your colleagues and friends using your social accounts. Have a great day.
Great video. Thanks for sharing!
Thank you @Ken. We are happy you enjoyed it. Please help us promote it by sharing it with your colleagues and friends using your social accounts. Have a great day.
I enjoyed watching this video, but please give us 1080p or 4K going forward. Thanks!
Very helpful and clear. Thanks for sharing this.
Glad you liked it. Help us promote the video by sharing it on your social networks and tagging us. Follow us on LinkedIn and X to receive our updates about Cloud and Tech.
How do you take care of SCD 1, SCD 2, etc.? Coming from a traditional SQL Server background, this all sounds f'ed up to me. They talk about external tables, then views, then CSV files 😯 WTF is going on?
Thank you for this amazing video!!
Thank you for your support, Ahmed. We appreciate it. If you have a few minutes, please share the session on your socials so your friends and contacts can watch it as well. And you can tag the Cloud Lunch and Learn group. Have a great day!
@CloudLunchLearn I did already. Thank you 😊
Great session - really like your "show it for real" demo style. So the use of external tables in this session is primarily to easily transform CSV to Parquet? Also, for adding new partitions to the view, does it rely on the Parquet files remaining after a DROP EXTERNAL TABLE has been issued?
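As I understood the demo, yes on both counts: CETAS (CREATE EXTERNAL TABLE AS SELECT) is what converts the CSVs to Parquet, and DROP EXTERNAL TABLE removes only the metadata, not the Parquet files it wrote. A minimal sketch of the pattern - the data source, file format, and paths here are made up for illustration, not from the session:

-- CETAS: read the raw CSVs and materialise them as Parquet in the lake
CREATE EXTERNAL TABLE silver.SalesParquet
WITH (
    LOCATION    = 'silver/sales/2024/',
    DATA_SOURCE = MyDataLake,       -- hypothetical external data source
    FILE_FORMAT = ParquetFormat     -- hypothetical Parquet file format
)
AS
SELECT *
FROM OPENROWSET(
    BULK 'raw/sales/2024/*.csv',
    DATA_SOURCE = 'MyDataLake',
    FORMAT = 'CSV',
    PARSER_VERSION = '2.0',
    HEADER_ROW = TRUE
) AS src;

-- Dropping the external table removes only the metadata;
-- the Parquet files stay in storage, so a view can still read them
DROP EXTERNAL TABLE silver.SalesParquet;

CREATE OR ALTER VIEW silver.vSales AS
SELECT s.*, s.filepath(1) AS partition_folder
FROM OPENROWSET(
    BULK 'silver/sales/*/',
    DATA_SOURCE = 'MyDataLake',
    FORMAT = 'PARQUET'
) AS s;

Re-running CETAS for a new period writes a new folder, and the wildcard view picks it up with no DDL changes.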
Thank you @George. We are happy you enjoyed it. Please help us promote it, by sharing it with your colleagues and friends using your social accounts. Have a great day.
Great, thanks for sharing! I was trying to implement a Data Vault logical DWH, but there is no HASHBYTES :| I hope this feature will be supported in the future.
Hi, the HASHBYTES function is now supported in Serverless SQL Pools
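Can confirm - for anyone wanting to try it for Data Vault hash keys, something like this runs now (the data source and column names below are just an example, not from the video):

SELECT
    CONVERT(CHAR(64),
            HASHBYTES('SHA2_256',
                      CONCAT_WS('||', UPPER(TRIM(CustomerID)), UPPER(TRIM(Country)))),
            2) AS HubCustomerHashKey,  -- style 2 = hex string without the 0x prefix
    CustomerID,
    Country
FROM OPENROWSET(
    BULK 'raw/customers/*.parquet',
    DATA_SOURCE = 'MyDataLake',        -- hypothetical external data source
    FORMAT = 'PARQUET'
) AS c;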
Thank you @Rodrigo. We are happy you enjoyed it. Please help us promote it, by sharing it with your colleagues and friends using your social accounts. Have a great day.
Super nice!! Is there any way to get the scripts and source files? Thank you!!
Nice video; a shame about the resolution.
Thank you @Joshua. We are happy you enjoyed it. Please help us promote it, by sharing it with your colleagues and friends using your social accounts. Have a great day.
Can you please give us the CSV files?
Would looping through a series of dates be synchronous? Is there a way to do that asynchronously?
This could be done using Pipelines/Data Factory to iterate and trigger the SP asynchronously. The external table name would need to be unique (dynamic SQL to generate the table creation syntax).
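A rough sketch of what that dynamic SQL could look like inside the proc - the proc, table, and path names are invented, and it assumes the data source and file format already exist:

CREATE OR ALTER PROCEDURE dbo.ExportDay
    @RunDate DATE
AS
BEGIN
    -- yyyymmdd suffix makes the external table name and folder unique per run
    DECLARE @suffix VARCHAR(8) = CONVERT(VARCHAR(8), @RunDate, 112);
    DECLARE @sql NVARCHAR(MAX) = N'
        CREATE EXTERNAL TABLE silver.Sales_' + @suffix + N'
        WITH (
            LOCATION    = ''silver/sales/' + @suffix + N'/'',
            DATA_SOURCE = MyDataLake,
            FILE_FORMAT = ParquetFormat
        )
        AS
        SELECT * FROM silver.vRawSales
        WHERE SaleDate = ''' + @suffix + N''';';
    EXEC sp_executesql @sql;
END;

A ForEach activity in the pipeline (with Sequential unchecked) can then fire the proc once per date in parallel.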
I'm surprised I can't see anything.
Your intro is too long