This is unreal content! I have been spending way too many hours using Power Automate, but this is the cleaner way to go. Thanks Hari!
I'm a senior data engineer from Belgium, and this is one of the best videos I've ever seen about Fabric: 38 minutes of rich, high-quality content. Congrats Hari! The only small issue I faced was saving the XMLA in Tabular Editor; I'll look for alternatives.
Thank you
Hey Hari, This is great, thanks.
Can you please let me know how to handle it for multiple workspaces?
Hi Hari, Any plans for multiple workspaces video?
Hi Hari, is there any process for collating multiple workspaces' data into one dataflow?
Hi Hari, based on your video I created a usage metrics report POC for one workspace and presented it to my leaders, and it was really cool. Any plans for a multiple-workspaces video?
@MrV2707 I am facing one issue: I'm not able to keep the history records; it always shows the data for only the latest 30 days. Could you please share how you configured it to maintain history records?
Hi Hari, great video! I'm not able to get past 11:22 in the video; I get the following message when running this XMLA query. Any ideas what could be wrong? I've followed it step by step before that.
I'm using SQL Server Management Studio 20.1.
I have tried three different workspaces and different Usage Metrics Reports where I am the admin/creator... same error message on all three.
Result:
Executing the query ...
Lists of BinaryXml value tokens not supported.
Run complete
Query:
{
  "export": {
    "layout": "delta",
    "type": "full",
    "objects": [
      {
        "database": "Name of our database"
      }
    ]
  }
}
Answering myself here: I don't get the same log as you in SQL Server Management Studio, but after syncing OneLake another time the Parquet files etc. are available. Maybe it works; I must continue with the video :)
Hey Hari, this was great, thanks.
Do you think it can also work for getting data from multiple workspaces?
If I connect them to OneLake and then do this process twice (but don't replace the dimensions), it could work, no?
Hey Hari, this is a great video with so many important concepts. I was trying to follow this video and apply it to capture historical data from the Fabric Capacity Metrics offered by Microsoft. It didn't work: I couldn't export the OneLake Delta Tables in OneLake - Microsoft (Preview) on my personal computer. The issue is that the TMSL command ran successfully but I don't see the tables in the OneLake drive. Can you create one video on that topic?
Try refreshing your semantic model once.
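For reference, a refresh can also be triggered over the same XMLA endpoint used for the export, with a TMSL refresh command like the sketch below (the database name is a placeholder; substitute your semantic model's name):

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      { "database": "Name of our database" }
    ]
  }
}
```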
@@HariBI Hi Hari, I'm not able to capture historical data; it always shows only the latest 30 days of data.
I have created a separate date-time table with 6 months of future dates. Could you please do one more video on how to handle 6 months of history data?
Hello Hari,
Wonderful content. I have one question: will it save data automatically in SQL?
If you schedule it, then yes.
Is any API to be used there, or can we directly set up a schedule?
You can schedule the flow directly
One more favor: can you explain a bit where to set this up?
Hi Hari, I can't seem to get the historical data. I ingested 30 days the first time, so my warehouse shows dates 19 Sep to 20 Oct. Then I changed the parameter to 1 day and refresh my dataflow daily. However, I notice the dates are still only 30 days. For example, today I'm getting 22 Sep to 23 Oct data... How do I save the history?
If you choose Append, then you should see it.
Hi @@HariBI, I'm facing an error after changing the Dates table under the Others folder to Append. In my report, it says a duplicate value is not allowed for a many-to-one relationship.
Error: Column 'Date' in table 'Dates' contains a duplicate value '45561', and this is not allowed for columns on the one side of a many-to-one relationship or for columns that are used as the primary key of a table.
Dates is Append
Refresh Stats is Replace
Tables under Facts are Append
Tables under Dimensions are Replace
YesterdayDate is not ingested to Data Warehouse
@@HariBI By the way, Append doesn't work either :/ it's not 28 Sep to 29 Oct.
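The Append-versus-Replace behaviour discussed in this thread can be sketched in Python (pandas stands in for the warehouse; the table contents are made up): Replace overwrites the destination on every refresh, so only the rolling ~30-day window the source exposes survives, while Append accumulates the daily increments into history. Dimension tables like Dates should stay on Replace (or be deduplicated), since appending them re-inserts the same date keys and triggers the duplicate-value relationship error quoted above.

```python
import pandas as pd

def refresh(existing: pd.DataFrame, incoming: pd.DataFrame, mode: str) -> pd.DataFrame:
    """Simulate a dataflow refresh: 'replace' overwrites the destination
    table, 'append' adds the new rows to what is already stored."""
    if mode == "replace":
        return incoming.copy()
    return pd.concat([existing, incoming], ignore_index=True)

# Day 1 backfill: the usage metrics source only ever exposes ~30 days.
day1 = pd.DataFrame({"Date": pd.date_range("2024-09-19", "2024-10-20"), "Views": 1})
# Later daily load with the parameter set to 1 day: just yesterday's rows.
day2 = pd.DataFrame({"Date": [pd.Timestamp("2024-10-23")], "Views": 1})

fact = day1.copy()                        # initial 30-day backfill
fact = refresh(fact, day2, "append")      # Append keeps the backfill and grows
replaced = refresh(fact, day2, "replace") # Replace would keep only the new load

print(len(fact), len(replaced))
```

With Append, the fact table still spans 19 Sep through the latest load; with Replace, only the most recent window remains, which matches the 30-days-only symptom described above.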
Could you please tell how you created the Power Query template for the report model? Also, the date table does not have a hierarchy, and when I tried to add a hierarchy it wasn't added.
@HariBi Could you please explain how we can load this historical data to SQL Server as an incremental load? You showed how to create the data model but not how to load the data to SQL Server.
Hi Hari, thanks for the video. I tried this and I'm getting an error in SSMS; I used the connection string from the workspace settings. Any inputs?
What is the error that you are getting?
Hi Hari, thanks for this. Is there any way to get this data for non-Fabric but Premium workspaces?
Yes, Premium capacity workspaces support Fabric workloads, so you can do this for your Premium workspaces. It is not supported for PPU workspaces.
Yes, you can use Power Automate for it.
Can you please explain in detail how to use Power Automate?
@@emersonviniciussoares2827: can you explain how to use Power Automate for this?
How can I load audit logs to SQL without using the Power BI service?
You need to call the Activity Events APIs and store the results in SQL.
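A minimal sketch of the request this refers to, the Power BI admin Get Activity Events endpoint. The URL builder below is runnable; the authentication and SQL-insert steps are only outlined in comments, since they depend on your tenant setup and are assumptions, not part of the video.

```python
from datetime import date, datetime, time

BASE = "https://api.powerbi.com/v1.0/myorg/admin/activityevents"

def activity_events_url(day: date) -> str:
    """Build the Get Activity Events request for a single day; the API
    accepts quoted ISO-8601 start/end timestamps on the same UTC day."""
    start = datetime.combine(day, time.min).isoformat()
    end = datetime.combine(day, time(23, 59, 59)).isoformat()
    return f"{BASE}?startDateTime='{start}'&endDateTime='{end}'"

url = activity_events_url(date(2024, 10, 23))
print(url)

# To ingest: GET this URL with an Azure AD bearer token (admin scope),
# read "activityEventEntities" from each response, follow
# "continuationUri" until it is null, then bulk-insert the rows into a
# SQL Server staging table (e.g. with pyodbc executemany).
```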
@@HariBI Could you please make a video on fetching the usage log from Office 365 to SQL? 🥺 Please
My report usage metrics shows 2-year-old data. What should I do to get the current data?
Please check whether the usage metrics dataset is refreshing daily.
@@HariBI It's getting refreshed daily; I checked. My senior manager wants this data to be up to date. Please help.
@@malaydubey3584 It should work. Try creating one more usage report by cloning it from the three-dots menu and check.
@@HariBI If the refresh is failing in one of the workspaces, from where can I reschedule the refresh?
@HariBI How can I delete the usage data source in the workspace? I got to know that it might work after deleting and recreating it.
Hi Hari, I'd like to personally talk to you regarding my future career options. Is there a way I can reach out to you? Kindly let me know.
Thank you!
Venu Gopal.
Can you reach me at dataonmyview@gmail.com?