You can find the first part of this video here:
th-cam.com/video/8KyF-wMP0O4/w-d-xo.html
Is it possible to take snapshots of the graphs from Power BI? It would need to trigger an email, and the snapshots would need to change dynamically every time it triggers. Can all of this be done using Power Automate?
Not sure what you mean by this. You could set up paginated reports, which sounds like the closest fit for sending out charts via email and things like that.
Historical data can quickly become an enormous amount of data. If you have not just 20 or 50 files of historical data (each file with 8 columns and 100 rows) but 2,000 files with 30 columns and 25,000 rows each, you can run into a performance problem (e.g. when you monitor machines in a plant). Could you make a video showing how to compress this data in Power BI by deleting all rows that did not change? The challenge is to create a report that accounts for the fact that, on some days, it has to take the value of the last change.
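The compression idea above can be sketched in plain Python: keep only the rows where a value actually changed, and when reading a date with no stored row, carry the last change forward. This is a minimal sketch; the field names ("machine", "date", "value") and the sample figures are hypothetical, not from any real dataset.

```python
def compress(rows):
    """Keep only rows where a machine's value differs from its previously
    stored value. `rows` must be sorted by (machine, date)."""
    last = {}   # machine -> last stored value
    kept = []
    for row in rows:
        if last.get(row["machine"]) != row["value"]:
            kept.append(row)
            last[row["machine"]] = row["value"]
    return kept

def value_on(kept, machine, date):
    """Return the last value recorded on or before `date` (last change
    carried forward), or None if nothing was recorded yet."""
    candidates = [r for r in kept
                  if r["machine"] == machine and r["date"] <= date]
    return candidates[-1]["value"] if candidates else None

rows = [
    {"machine": "M1", "date": "2024-01-01", "value": 10},
    {"machine": "M1", "date": "2024-01-02", "value": 10},  # unchanged -> dropped
    {"machine": "M1", "date": "2024-01-03", "value": 12},
]
kept = compress(rows)
print(len(kept))                           # 2
print(value_on(kept, "M1", "2024-01-02"))  # 10 (carried forward)
```

The same two steps map onto the Power BI side: the compressed table is what you would store, and the carry-forward lookup is what the report measure has to reproduce.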
I would probably explore a solution in Power Automate for that.
When a new file is exported to SharePoint, compare it with the previous version: if the data is the same, archive the previous version (I'm not a huge fan of deleting data); if the data is different, keep it. You might still end up with lots of files, but you wouldn't keep two files with the same values.
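The "archive if identical" check above boils down to comparing file contents, e.g. via a content hash. A minimal local sketch, assuming the exports are available as files; in Power Automate you would do the equivalent comparison in the flow. All file names here are invented for the demo.

```python
import hashlib
import tempfile
from pathlib import Path

def file_hash(path):
    """SHA-256 of the file's bytes; identical exports hash identically."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def should_archive_previous(new_file, previous_file):
    """True when the new export is byte-identical to the previous one,
    so the previous copy can be archived instead of kept twice."""
    return file_hash(new_file) == file_hash(previous_file)

# Demo with two throwaway files standing in for SharePoint exports.
with tempfile.TemporaryDirectory() as d:
    old = Path(d) / "snapshot_old.csv"
    new = Path(d) / "snapshot_new.csv"
    old.write_text("a,b\n1,2\n")
    new.write_text("a,b\n1,2\n")               # same content
    print(should_archive_previous(new, old))   # True
    new.write_text("a,b\n1,3\n")               # data changed
    print(should_archive_previous(new, old))   # False
```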
How do you get the query for Power Automate without dragging each column of the table into a Power BI visual and using the refresh query? There must be a better way. Thanks!
You can also write the DAX query by hand - I just found it easier and much quicker to do it this way.
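A hand-written query of that kind is typically just an EVALUATE statement returning the columns you want. A minimal sketch, with entirely hypothetical table and column names ('Sales', [OrderDate], etc.) that you would replace with ones from your own model:

```dax
// EVALUATE returns a table, which is the shape a
// dataset-query action in Power Automate expects.
EVALUATE
SELECTCOLUMNS(
    'Sales',
    "Date",    'Sales'[OrderDate],
    "Product", 'Sales'[Product],
    "Amount",  'Sales'[Amount]
)
ORDER BY [Date]
```

Writing it by hand like this avoids building a throwaway visual just to capture the generated query.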
Do you have best practices / tips / videos you could make or share on data historization with Power BI? I'm thinking of bigger datasets here, so the Power Automate snapshot is probably not a good solution. Best regards
We actually decided to go with daily snapshot creation, but limited to 2-3 days per week. Also, we are not keen on grabbing transactional details and validating each and every transaction; if you want to go down that path, you really need a more robust solution (i.e. a proper database structure).
Please bear in mind that a high-level summary like this is enough to flag whether there was any change. After that, you always have the option to drill into the details and identify the issue.
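The "summary is enough to flag a change" point can be sketched as a simple comparison of aggregated totals between two snapshots; only the flagged categories would then warrant drilling into the details. The category names and figures below are invented for illustration.

```python
def changed_categories(previous, current):
    """Return the categories whose total differs between two snapshots,
    including categories present in only one of them."""
    keys = set(previous) | set(current)
    return sorted(k for k in keys if previous.get(k) != current.get(k))

prev = {"Sales": 120_000, "Returns": 4_500, "Freight": 2_100}
curr = {"Sales": 120_000, "Returns": 4_700, "Freight": 2_100}
print(changed_categories(prev, curr))   # ['Returns']
```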