I love this approach. I've just finished a Power BI course and I found that DAX can be scary sometimes 😊. I needed to see this video and I'm glad I found it. It will help me a lot with my further adventures with Power BI and Power Query.
That's great Barbara. DAX is a difficult concept to understand and apply correctly. We all struggle with it. Power Query can often make it simpler.
Came to the same conclusion a while back but wasn't sure. Now I am. Thanks
You’re welcome. Yep, PQ comes to the rescue regularly
As always I have to practice by myself to fully understand the many subtleties that you present to us. your videos are both factual lessons and sources of ideas to progress playfully. Thank you
Kind words, thank you
I learned a few new things as you moved through the steps. Great tutorial! Thank you!!!
Cheers
I usually turn my Google Sheets into PQ for simplicity and use PPv for linking files/tables and good presentation
Thanks a lot for your continuous efforts and making PQ more efficient.
You're welcome
Very clear explanation, which leads to a conclusion: since Power Query is easy to use, prepare your data in Power Query first so that you need less formula authoring in DAX.
Absolutely Carlos. Sometimes there’s no avoiding complex DAX but when I can I do
What a relief for the project I’m conducting ! 🎊 🎉
hope it helps!
Thx Wyn! I work with very large data sets and before building a PBI I usually do all the ‘cleanup’ (including adding helper columns) in an excel power query file first. This allows me to do a thorough analysis of the data before pulling it into power bi and then I use it as an auditing tool to validate the PBI results. I feel like less can go wrong in DAX when I use helper columns in power query, so I am a proponent of using them.
You’re welcome Irene. Simplicity and “debug-ability” are important for me
As usual, awesome video, thanks for sharing...
Combination of power query and dax can make wonders
Cheers
Really learnt a lot from this video Wyn, thanks for sharing 🙏🏿
You're welcome, thanks for taking the time to leave some kind feedback
Excellent video Wyn and I absolutely agree that using PQ to format data where possible is preferable to using DAX. I haven't really used the Group By function yet but will definitely give it a try after this...Ken
Cheers Ken
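For anyone who hasn't tried Group By yet, here is a minimal Power Query (M) sketch of the idea; the table and column names (Visits, CustomerID, VisitDate) are hypothetical, not from the video:

```
let
    // Hypothetical source table with one row per visit
    Source = Visits,
    // Group By: collapse to one row per customer, keeping the earliest visit date
    Grouped = Table.Group(
        Source,
        {"CustomerID"},
        {{"First Visit", each List.Min([VisitDate]), type date}}
    )
in
    Grouped
```

Merging that grouped table back onto the visits table is the usual next step, and it gives you a helper column you can test against in a simple DAX measure.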
I enjoy your videos a lot, Wyn. Thanks for sharing; great solution with PQ.
You’re welcome
Wyn, I enjoyed this very much. Power Query 1st is my typical methodology as I prefer simple to complex. It may also be a preference since my DAX is weaker than my M.
Thanks Maurice, yes Power Query is so much easier to understand. Sometimes there's no avoiding complex DAX, or Power Query can be misused, but there's a nice balance to be found in many scenarios.
You da man Wyn! Awesome stuff :)
😀 Cheers!
Anything that simplifies DAX is a plus! You could also have added the helper columns as calculated columns, but creating them with Power Query is probably easier, and if they are part of a Power BI dataset they will be available for use in other reports.
Definitely recommended best practice to add columns in Power Query ( or the source ) rather than using DAX calculated columns. “As far upstream as possible, as far downstream as necessary” to quote Matthew Roche. Easier to debug and better performance.
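In that spirit, a hedged sketch of adding a helper column upstream in Power Query rather than as a DAX calculated column; the Sales table and Revenue/Cost columns are assumptions for illustration:

```
let
    // Hypothetical source table
    Source = Sales,
    // Helper column computed once at refresh time,
    // so no DAX calculated column is needed downstream
    AddedMargin = Table.AddColumn(
        Source,
        "Margin",
        each [Revenue] - [Cost],
        type number
    )
in
    AddedMargin
```

Because the column is materialised during refresh, measures that reference it stay simple and there is no per-row DAX evaluation to debug.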
Great tips and a very efficient way.
Thanks
Very good lesson. The only thing that I would like to see is how to use this approach in a real dashboard: when a filter or slicer is applied, does it behave accordingly with each click, and what about the relationships between tables?
Thanks. If you want to know more about relationships and data model then this might help th-cam.com/video/RV47yX70NN8/w-d-xo.htmlsi=Vmu0SB6Xb9LPw-73
@AccessAnalytic I watched the entire video but I didn't see what I wanted. If the previous table is an auxiliary table, how should I use it in a real dashboard, and should it be part of the relationships or not?
@sirasnet6499 - I’m not understanding sorry
Thank you for sharing !
My pleasure!
Awesome video sir, thank you
You’re welcome, thanks
I really like this. Is it possible to extend this further for returning customers, temporarily lost customers, etc.?
Not something I've delved into, but technically a returning customer is one where the new customer flag is not 1.
Not sure of the definition of temporarily lost.
Thanks for sharing.
Awesome.
You’re welcome
Agreed 100%. If your data model is clean and has the correct granularity, your DAX measures are much easier and simpler to create. I always try to work on the data modelling side, making as many transformations as possible in PQ. Now, a quick question: on the "100 club visits", how do you force the correct grand total? Right now, it is showing as 3 where it should be a total of 7. Thanks, Wyn... Great posting
Cheers. Depends what the total should show. Number of monthly visits? Then in Excel you'd need a combination of IF, HASONEVALUE and a SUMX
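A possible shape for that pattern, sketched in DAX; the measure, table, and column names ([100 Club Visits], 'Date'[Month]) are assumptions rather than taken from the video:

```
100 Club Visits Fixed Total :=
IF (
    HASONEVALUE ( 'Date'[Month] ),
    [100 Club Visits],                                     -- an ordinary monthly cell
    SUMX ( VALUES ( 'Date'[Month] ), [100 Club Visits] )   -- total row: add up the monthly values
)
```

HASONEVALUE detects whether the cell is at month level; at the total, SUMX iterates the visible months and sums the base measure, so the grand total matches the sum of the rows.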
I think the key thing here is that the measure isn't showing the number of visits by 100 club members in the selected time period; it's showing us the number of 100 club members that visited in that time period.
Therefore the total is correct, as it's giving us the number of 100 club members that visited in the year.
However, my pedantry would probably say we should name the measure something slightly different. :)
EDIT: I've just re-watched the video and he uses an accurate and clearer name for the measure in the example at the beginning. See 40 seconds in.
By the way, I totally agree with the example of Roche's Maxim here. I'm a big fan of it and of your PQ videos too!
Thanks Paul. Oh yeah, Matthew Roche, the purple hair guy. I like his videos too! I will also re-watch Wyn's video
Cheers Paul
😁
Excellent idea
Thanks Kebin
😊 I do this in SAS EDW often, using the First. or Last. function 😊
Fantastic approach, Wyn; DAX can be daunting. Reading your Power BI book right now
Excellent! Thanks Nìall
Revolutionary!!
Thanks Brandon
And this approach will be much faster than the DAX approach. I'd say start with DAX if it's fast enough, then move to ETL/Power Query to speed things up if needed as data size increases.
I go with accepting a longer refresh in exchange for ease of debugging/editing in future and simpler DAX. Better end user and future report owner experience.
Power Query is always lovable for the doers. Please let us know how to correct the grand totals?
The total is an interesting one. Question is should it add up the monthly figures or is it a total for year?
@AccessAnalytic I mean the result should follow the selection
👌👍
👍🏼
Thank you sir 🙏
You’re welcome.
Sorry, but I can't agree with this way of simplifying the DAX code by cooking the raw data. It can work with a small dataset, but with a big dataset you will get into trouble, and the second issue is that you lose the dynamic nature of the calculation when you use it in a measure. I think Power Query should be used to clean and restructure data, not to aggregate or group it; it's better to keep the raw data in the correct structure.
Hi, note I'm still using DAX to keep it dynamic, and I'm not storing the aggregated data in Power Query. Although aggregating in PQ might be the right approach if your data is unnecessarily granular. I'm also not saying this is the approach for all problems, but it can make things a lot quicker in a big data set too.
Might be worth reading this thread...
twitter.com/Milhouse/status/1579138159266652161