If we uncheck "Enable load" in Power Query, those tables will not affect performance, am I right? Thanks as always, Ruth. Much appreciated!
You are absolutely right, they will not be loaded into the Power Pivot model.
Thanks for pointing that out, I will pin your comment for more visibility :)
/Ruth
Thank you, Ruth. As always, this was very informative and a good reminder.
Easy rules that are hard to follow!
/Ruth
Thanks again, Ruth! I have a question about cardinality... If I need to create a URL column based on some lengthy but common stem text merged with a unique identifier in another field, I can see how that longer (but still unique) column would bloat the data model. So in that situation, would you create a DAX measure that generates the full URL? My concern about doing that is that if there are, say, 100,000 rows and we're showing them in a table, every time the table contents change on screen (perhaps as a result of a filter?), the engine will have to recalculate all of the measures before it can redraw the table, and that could be very slow... WDYT? Thanks!
Splitting the columns is the way to go. Here the SQLBI guys give some tips depending on the column type:
www.sqlbi.com/articles/optimizing-high-cardinality-columns-in-vertipaq/
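As a rough sketch of how the split could look in DAX (the Items table, the ItemID column and the URL stem below are made-up names for illustration): store only the short unique ID as a column, and rebuild the full URL at query time with a measure:

Item URL =
-- made-up stem; only the compact ID is stored in the model
"https://example.com/items/" & SELECTEDVALUE ( Items[ItemID] )

And regarding the recalculation worry: a table visual only asks the engine for the rows it actually renders (it fetches them in pages as you scroll), so the measure is not evaluated for all 100,000 rows up front. A simple concatenation like this is usually far cheaper than storing 100,000 long unique strings.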
/Ruth
Thanks for the video, Ruth.
Though the measures vs. columns part is not entirely clear to me. In my experience, models built around columns instead of measures perform more smoothly, since every time you filter, the measures are recalculated, which takes more time. Also wondering at what model size memory becomes a bottleneck, e.g. is a 300 MB model too big? Would be great if you commented on this, Ruth.
Thank you.
Hi Sergey,
Analyze your model using this:
www.powerpivotblog.nl/what-is-eating-up-my-memory-powerpivot-excel-edition/
or this (the same, in video format):
th-cam.com/video/IRPEAIgZeY8/w-d-xo.html
And check how much memory is consumed by calculated columns versus measures. It is the best way to know!
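As a rough illustration of the trade-off (Sales, Quantity and Unit Price are made-up names here):

-- Calculated column: computed once at refresh, then stored and compressed in the model
Line Amount = Sales[Quantity] * Sales[Unit Price]

-- Measure: takes no memory, but is recomputed at query time whenever you filter
Total Amount = SUMX ( Sales, Sales[Quantity] * Sales[Unit Price] )

High-cardinality calculated columns eat RAM; measures trade that RAM for CPU at query time. The memory analysis above tells you which side hurts more in your model.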
/Ruth
Thanks for more of these tips.
You're welcome, and have a great weekend!
/Ruth
Good info, Ruth, thanks. It would be interesting to see some statistics/results from implementing these techniques, and the difference in file size, etc. I have seen surprisingly small PBIX files, and once in a while one that is not very small.
That is a great idea! Will do :)
/Ruth
Thank you for the great tips. In your experience, what should be the max size of a model in Power BI? It is so convenient to have everything inside... I have already collected 200 MB and the performance is still okay.
I have not looked at the size of the models but more at the contents, so I couldn't tell you... maybe somebody here has?
Happy Friday :)
/Ruth
Thank you for the great tips!
You're welcome, and happy Friday!!
/Ruth
Great video as always!! 10x
Tip 1: what do you prefer, many PBIX files with one file per analysis, or one big file that covers all (or most) of the analyses?
Think about the audience: who is going to see the Power BI report?
If it is only marketing, then you don't need the purchasing analysis there, but if it is relevant, you can link to it in the Power BI service.
Better highly focused reports with links to related reports than a huge Power BI file that contains everything and is unresponsive.
Happy Friday Nir!
/Ruth
Curbal
Agree, but...
If we split the report, in many scenarios there are several tables that are relevant to all reports (dates, locations, etc.), and if you make a change to one of them in one report, you need to make it many times.
In addition, you refresh the same table several times (once per report) instead of one time.
The downside of one big file is indeed that it can be slow (still hasn't happened to me ;-))
Have a wonderful weekend.
Nir.
There is no perfect scenario, but targeted reports usually perform better, not only technically but also in usability.
But then again, no solution is perfect... it always depends on something...
Have a great weekend!
/Ruth
@CurbalEN Hopefully dataflows will start to address some of this. It is a tough balance right now, and there definitely is no right answer. :o)
Mmm, I am curious... how would dataflows tackle this? You mean, create more focused data sources instead of giving access to the entire database?
/Ruth
Can you please make a video on maps? Like importing a customized map and then how to work with it. Regards
Have you seen my maps playlist?
m.th-cam.com/play/PLDz00l_jz6zyQeP3lOa4BeUr3TX4_czkL.html
/Ruth
This is great for me, I am starting out with data models and DAX.
Great to hear :)
/Ruth
Thank you
Glad you liked it :)
/Ruth
Easy rules, thanks for the advice
Easy tips are the best tips!
Happy Friday :)
/Ruth