This series is terrific. Intense M coding, especially the final function joining the table with the buffered table for the date lookback. It will be very instructive to see the DAX implementation in a later episode. Always grateful, Mike!!
I recall someone else's words when trying to pick up Power Query/DAX/M -- 'I have to bang my head against walls so many times to overcome the initial hurdle' -- I'm doing the same here. Amazed by the power and beauty behind all this, but not yet able to comprehend it all. Climbing up my learning curve! ;) Thanks so much for all these great videos!
The whole MSPTDA course is dense and filled with a lot of material. But if you study hard and put in the work, and then practice, practice, practice - you will be greatly Rewarded : )
Mike, "long time listener (a few months, anyway), first time caller here". First off, thanks so much for offering this series. I have learned a ton. After listening to MSPTDA-10 repeatedly, I finally figured out how to implement the ideas here on a problem for work. I work in sales and have a customer that provides me monthly updates with their weekly use rates of each product for one year out from the date of the update, which I can then compare against my planned inventory to make sure that their needs will be met. I'm just now wrapping that up. But this has led to another idea that I am not sure how to implement. Given their use rate, their safety stock inventories, and how they determine the size of their next order, I can look ahead and determine the next date and order size...which is good. But what I really want is to do that calculation for the entire year's outlook. So I need to determine the first "next order", then recursively adjust the table and determine the second "next order", and so forth until the end of the rolling 1-year period for the current set of weekly updates. I am not sure how to do this recursive step, or whether it would be better to do this in Power Query, or whether the Data Model/DAX (which I still know very little about) would be better. I don't have an enormous amount of data to deal with, just a few thousand rows at most. Any suggestions would be most welcome!
The List.Generate() function does the trick. Though I find that what I really want is a nested loop to generate a list of lists, so as to get the projected inventory for each part number, for all part numbers. The list for each individual part number I've got down and saved as a function; generating a list where each record is the result of calling that function is proving to be beyond me. I'm resorting to an inelegant workaround.
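For the list-of-lists problem described above, one hedged sketch (the per-part function is stubbed here as fnProject, a stand-in for the commenter's saved function): the outer loop doesn't need List.Generate at all; List.Transform over the part numbers can build one record per part, each carrying that part's projected list.

```
let
    Parts = {"A", "B", "C"},
    // stand-in for the saved per-part projection function (hypothetical)
    fnProject = (part as text) as list =>
        List.Transform({1..3}, each part & "-" & Text.From(_)),
    // one record per part number; each record holds that part's projected list
    Projections = List.Transform(
        Parts,
        each [Part = _, Projection = fnProject(_)]
    )
in
    Projections
```

If a table is wanted instead of a list of records, Table.FromRecords(Projections) converts it, and the Projection column can then be expanded.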
WOW! Each step from Step 1 to Step 5 involved some modification to the M Code - this was definitely a complex solution. I will be re-watching to better understand the concepts of Inside Table, Outside Table, and Buffering. That was a jam-packed video!
We can use this custom function approach for a left join with many conditions, not only equality. Thank you, Mike, for your time and effort in sharing useful knowledge!!!
Hi Mike, I remember watching this video a couple of months back, and I needed to use a variable in a formula to store a buffered table. I found the video and created this: = Table.AddColumn(BufferedTable, "MinIndexForResource", (x) => List.Min(Table.SelectRows(BufferedTable, each [Employee Num] = x[Employee Num] and [Project Number] = x[Project Number] and [Assignment Start Date] = x[Assignment Start Date])[Index]), Int64.Type) - the refresh time of the original query, where I used a function and passed a non-buffered table into it, was 20 minutes plus (hopeless). This method reduced that to just over a minute, just amazing. Thank you so much. Best wishes. David
I love to hear that, David! I am happy that the videos and materials that I post can help you in your work! Thanks for helping me with your comments and thumbs-ups : )
This video is amazing and it took me a while to figure out what is actually happening here. Thanks for sharing this with the community and investing your time. I am forever in your debt.
Thanks for another great vid. I finally booked my MOS Excel Expert exam; at 52 I should have done it years ago. I hope to get my Office Master certificate within a year or so. Thanks for all the education and your enthusiasm.
Ta, I've done something similar to this using OFFSET and COUNTA combined, which will always count back over a set period and return a formula result for that period.
It took a while to grasp that as soon as you add the custom column, an external table is introduced with the same content as the buffered table, and then Table.SelectRows, the conditional criteria, and the custom function x do the work. After getting this it was a smooth ride. The external table and internal table do the trick. Great, Mike. Amazing trick. Hope my understanding is correct.
Yes, the conceptual trick to this is that there was an internal table that is locked on the full table, but there was also the ability to look at each row in the external table, sort of like a locked range on the full table and then a row-by-row relative reference. Thank you for your support, Gentle Raj.
Mike, even paid courses out there don't have the quality and thoroughness your videos do; A huge Thanks for all the effort you're putting to make all these videos for the rest of us in the community.
That is true. Most of the paid courses are heavy on marketing and promises of easy fixes. Most of TH-cam is filled with that sort of stuff too. My quandary is: how come my stuff, which has the hows and whys and is free, does not get more traction? It might be that I am not good at marketing, or that many people do not want the whys, just the quick-fix hows. Nevertheless, for you and me and the others who do want the hows and whys, that is what the excelisfun channel at TH-cam is here for! Thanks for your support on each video that you learn from, Jacob, with a comment, thumbs up, and of course your Sub : )
@@excelisfun Hey, just thinking out loud: how about going through a quick overview covering quick fixes, and then diving deep into the details? Those who're just looking for the quick fixes can exit after the initial bit. That way, you're not restricting the content only to those of us who need the detailed version, but also serving those who want a quick bite of info on the go?
@@JacobVJoseph In the last two years I have tried a bit of that. With 3000+ videos, I do have some variety. But even though what I do does not have the highest demand, I love doing it with the hows and whys, because for me it is more effective in creating solutions, more efficient, and way more fun that way. Thanks for your out-loud thinking, Jacob!!
AND... it just keeps getting more and more amazing the deeper you go into M. Then, the combo of Excel, Power Query & M, and then Power Pivot and DAX -- Excel is now the mega-ultimate tool ever : )
Oh, this one is really complex and takes time to digest and make use of. However, on top of the earlier M code course, learning more through a real example and getting a "feel" and "taste" for the grammar and design of M code is painful but also enjoyable. Great course!!!
I feel like this might be part of the solution to a project I've hit a wall on that I had to put on hold. Hopefully, I'll be able to work on it again soon. Your videos are a godsend!
Homework is posted below video. Bill Szysz silent movie that shows how to write Custom Function seen in this video with mostly the user interface (silent movie is easy to follow): drive.google.com/open?id=1v7heN-GwV7OHN1vGTLHOxTujI8QVRmVJ
OMG Bill, always awesome, indeed. You just build the formula step by step with one case and then substitute in the arguments using the columns afterward. NICE!!
I’m a big fan of these videos and have learned a lot. There is a major problem with the way you do this, which occurs if the data is larger. Simply put, you have five queries when what you need is four tables and a list in one query. Each query is evaluated independently, so when you reference a query it runs again, reading in the information again. When you merge two separate queries, both run: in the merge and in the separate queries. If you merge two tables inside the same query, the data is read only once. Otherwise your refresh time gets long very fast. supplychaindatascientistcom.wordpress.com/2020/12/10/why-is-my-power-bi-query-so-slow-a-query-is-not-a-table/
Hi Mike. Thanks for bringing this topic for all of us on your channel officially. The idea of using custom function becomes even clearer. Will be looking forward to the next videos. And as always, big thumb-up!
Yes, there will be some more in this series. I have a few videos already on Power BI, but in the meantime, here are two other cool TH-cam Teammate channels: th-cam.com/channels/RNmSv7mAPYiC0Y40TJijAw.html th-cam.com/channels/J7UhloHSA4wAqPzyi6TOkw.html docs.microsoft.com/en-us/power-bi/videos
That was brilliant; I never realized how simple it is to apply a function to a table just by removing the superfluous 'each' when using 'add custom column'. It's puzzled me for a while, but then a lot of things do.
Table.Buffer() was a great help to me, since Excel and Power Query multiple-column sorting do not give the same results, and what you see as the result in Power Query does not always equal what is saved in the cache. Wanting to delete duplicates and keep only the top entry of each group, it was impossible to do without Table.Buffer(). A great tool! There is probably a better way to retain the minimum of each group, but it worked. I guess it can be done using the add-column-with-custom-function technique depicted in this video. Got to figure this one out.
Mark Haggett Table.Buffer is great; I just applied it to a 20-minute query (multiple files, multiple tables, thousands of records... don’t ask), and turned it into a 5-minute query
That was pretty impressive! I enjoyed the session and got lost a little (I understand the process but don't know if I am ready to repeat it), but it was really cool! It's amazing how many tools are available with M code. Off to MSPTDA 11! Thanks again!
@@tomr9969 You are welcome again! Thanks for your consistent support. I just wish more viewers were as respectful as you are with consistent support : )
Wow! The similarities between PQ and PP (DAX) are greater than I imagined. The fact that you can do that ‘EARLIER’ trick in PQ is amazing. Transcending table boundaries is a powerful technique that creates a lot of possibilities, and you just showed one. And all that M-code programming all over the place: very powerful. BTW: very good of you to start referencing other queries instead of copying them: more efficient. Q: in your explanations on several occasions you used the term “steps” where -in my opinion- you should have said “queries”. Steps are parts of a query. I find it clearer not to mix them. Finally: next video, and BOOM, another 2K of subs! Just 10K more and you’ve reached the 500K milestone. Things are really growing fast! Maybe you should contemplate reaching 1M subs before this series is well and truly over! :-) See you on the next one and the previous one, because we’ll be watching them again!
Each step in a let statement is an expression, and the let statement itself is an expression, so in that sense each line is a query... Maybe it is better to say that each step is a sub-query? I decided to call them steps because the "Applied Steps" list uses the word step. There are multiple synonyms for each line in a let statement...
Over the last year or two I have tried to promote the channel more, with 1 m in mind, but it just does not seem that I have the marketing skills to accomplish this. It is particularly odd because I have intentionally provided all my classes (Excel Basics, MSPTDA, Highline Excel Class, Statistics and so on) for free. Especially since I do not believe that there is a better Excel Basics Series (a class that gets you up to a professional level) available for free, anywhere. I have been working with TH-cam and Microsoft and other potential marketing partners for the last year or two, and nothing moves the needle. Each month, it is the same: about 1 m views a month and about 6000 new subs a month -- that has held steady for years. As some have correctly pointed out, humans don't trust free stuff. But I have been determined to stay here at TH-cam and provide quality resources free of charge. I thought that both the MSPTDA and Excel Basics series would take off more than they have, but nothing yet seems to move the needle. I keep hoping that people will share the links and tell all their friends and colleagues, and someday there will be a viral hit that can get this channel more exposure...
ExcelIsFun On Steps and Queries: I would like to suggest using the terms as suggested by the (sub)pane names in PQ: - on the right we have the “steps (sub)pane”, so those things I would call steps. And yes, they may themselves consist of multiple let-steps and expressions. The steps inside a let statement I would call “lines”. - on the left we have the “queries pane”, and the various entries there I would call queries. Those are the things you refer to when you use the Reference button. In conclusion: I would use distinct/unique terms for each and not “overload” any term, e.g. “step”. In action-packed, fast-paced videos like yours that creates a little more clarity IMO.
I thought that is what I did? Lines in the let statement are steps because the pane lists them as steps, and the queries on the left I referred to as queries.
Mike, your series are an inspiration to me and I keep coming back every so often to learn new stuff. Now that I am a bit more experienced in Power BI, I am a bit puzzled by your strategy for arriving at the solution. I reconstructed the solution by first calculating the MAT per row in the source table and secondly grouping by EOM/Product and aggregating MAX(MAT). This solution requires fewer steps and is flexible (exchanging EOM for BOM is quite easy).
LOVE your solutions. Yes, when I filmed this 2.5 years ago, I was less experienced than I am now - just like you. And there are a number of better ways to do this. Thanks for the post and knowledge, Jan Willem : )
Can you post your M Code? I am wondering if you created one or two queries? And how did you do the MAT in the source table? Did you do it for every row in the source table?
@@excelisfun I used a custom function. I this reply you will find the code for the table and in the next reply the code of the custom function let Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("jZrNjmQnDIXfpdZT0sXABbbJE8y6NYuWMrtRIkXJ+6e6fvA5x1DKojf9yeA/jM2tj4+LXdPVjtQv3y759vf759+//vrn8/Lj28elOjJBDVAVlkAufcHffv769fPPBzuBZWZ2CPv+7+cfD5KAdCJNZFCRfLXJDKUKgIOVGI6CXYezIWYlR0XFQA1rCgsL4poV5BY+dhOGwg5QAwAGJtXGwAxL6LHzmnepgOiILpswUwhuhk+STgrpVuamHhPSPYMenD0diAQOmaaPXQuGZq6X/f93LThmZbdVas40KIByUh+2jR63UBYKs1sMKp5icJatkMFOSR0FTE9gwdRwNx0AKMLpWun4obXFmUl9QKSHrAKLtcpZ5yU7aBJT11mAYMLi9NVN/sKS96PipMpmnKLOmiTO9aTzyp6cjKvpCUDd34AFX5279D1AalE1z2XiJAMprg0ZzAon5dw4F+zNerP8T5b0sKCrtDqAP6yFkJ2UcnwrNUydaUD3/y98OJlmAIhpNTIQM7nnbrXUWawRbZOnxUnS3SpIqUMGyKlD6rXTPQeJyoBvnU4pB27s6CqUAWSciglWO9X52SHtlHC9EBVfUH1RQCzcOUUECTaHWgBv1dYFJcWRGeVcuo5tfjvSCgcoVJ0MK8bMH2s1TgdZPdJA6B3TzbozLVeoh5oGtoXieAvObk3r4hNw8VffuDkWwDIdtAxE+kJEi4792GVIwt1ilhfUk+6Lr5792OXWLdMBXvi2gxXVlQjbhXsXENOqVVUVyi/ytVyjBLV2Rej6XJMPSJrtFaBJnE5goUVvKCguHSio7j4Qas4n2pNbwoI7SsG/udy2NhaAiUJ1AjEL1rtUSMUDJaXPSLQh1QlDOT2BCVh/w1JSv+wV/Rovd7qMN9YzlPvpHobxAtRhTRSngAJy4rDTUZwFh0Bxy4upx7IjvYbyAapIfj7H0vEqaCg3gGk9eA3J41V3cVC0jZIJdgv1rMlu3BdNFiJ3vJEzkOMhqIP68QDlF5OAAwkvAycwSaAJEs+D5iQ6GPcqwWZfkmfxDFKhQ6ggFgaTQp7iollIzblbc6AHERC3Dl0WozpadlFOh7NYnWDNMGgS5GsS9tPahNtpplqVFbmtKOssMLQutBV1IgpnBZCCR+rumCVHmnAThGbpdqIrHQnsJ+rGUa/ZOap+8+5OyNDeUMYcLdq5CdXvDZQnPTL4SWttBkdpJJHFVvukIG9YuBP6GzlgfGnfiiZLsftdSrPgMcHFTKwOYuJMFAuSOdRcRLQYx1wRfjboTnSixRVj+jgL8xjsxo+KqAbXzNOJXrgna4iPaDuZfLxxRk5voD2mvnkA6eKZSAtBAyn1xmtOXpbUIttJY9ox2GGGjv4FCX6DeA5h85Dx1TKW/u0gw6fhABIvicl4OksJpDR5szPeaoCQ3okGK3L/g0RevMBaPa3WRGrDYsKDq+JDJHuR+r7H/LQ86gi1zUHG92lDEm4XA5pCbSFZbU0GimoxIBjmq+cMtei9CqIgR5RfYBogHXfTQLnYjQJc2JEooeiIJyNtNozj0ZFw4a2IzmiD7aI/B7p73iyKLyga32YBUjAMVQ1jSMY9xfrHQHQPg5ZMRFppq7P4TjmR3rYNVuRKkURoTYrGYBJVfThbfVd9AD3toLkmkHXZTL6sPoT0FJzOTD3YgfHT7OGEXZFBJDyYl42U5TcKWgMYusX+br/7JLQ2DZkWqgxM36scJWniJ5Aj0ZyEs9RhJ23TDJjk1G6rBHuFqoUszE+o5OJTZ96kAegf25Lh
MD62+4rKYDctn6hluF3uk9fidJqDwRGb/xfXA+Em5zmgLvVO2dni9xVlm9/P2dVTkWtE2SaPgTaZn8/qXk9i5BA7xe448s6kA+/W1//1/cJJVf9WykR6oJxo8RXUxeQOnCDcKwk3W4+obrBcc65KmCnvE9ky1s1RVMbewQqLanLBqvK+0UEq9iLOwngLNoRbBphpd3uSOxcfXxc32gApbrTuk9Lyjh6OwhNSI81lGHKov22ZJL4uVYdN/NvI83J0fcl3UIwGA+R7wCE6cvXbSd2HsnVWnc5Ms7g7iz8QAMHQ5GXYMHxqPEQb6tYGVTOYKiaQ3q870Z/uDFgtZPfhMDgS2YUbLwehHhhooitWEeSef+Ch4JeY+1izdvKBcPFZ5VinuRWUE5c9v0R5S7RhsSdO2ywiGM5VkVWpNV73vwk10dL0nC/WTd1AGH7UdwANFf31fWuVg69Jahn7hBrFY9RRlLPNAIXfv2RdNXziWjShhCAtfvwH", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Date = _t, Sales = _t, Product = _t]), #"Changed Type" = Table.TransformColumnTypes(Source,{{"Date", type date}, {"Sales", Int64.Type}, {"Product", type text}}), BufferedTable = Table.Buffer( #"Changed Type" ), #"Added Custom" = Table.AddColumn(BufferedTable, "MAT", each fnMAT(BufferedTable, [Date],[Product])), #"Inserted End of Month" = Table.AddColumn(#"Added Custom", "End of Month", each Date.EndOfMonth([Date]), type date), #"Grouped Rows" = Table.Group(#"Inserted End of Month", {"End of Month", "Product"}, {{"MAT", each List.Max([MAT]), type number}}) in #"Grouped Rows"
@@excelisfun The code of the custom function:
(InputTable, InputDate, InputBrand) =>
let
    step1 = Table.SelectRows(InputTable, each ([Product] = InputBrand) and [Date] > Date.AddMonths(InputDate, -12) and [Date] <= InputDate)
in
    List.Sum(step1[Sales])
@@janwillemvanholst Thank you for your code. But no matter how many times I have tried your code or dissected the logic, I cannot get the same MAT results as in the video. I tried some new M Code listed below and verified the laborious method shown in the video with Excel formulas. Did you get the same MAT results with your code on the data set in this video?
Yes, almost all of it can be done with the UI. Here is a video Bill Szysz made that shows how to write the Custom Function seen in this video with mostly the user interface (the silent movie is easy to follow): drive.google.com/open?id=1v7heN-GwV7OHN1vGTLHOxTujI8QVRmVJ
Yes, that is a good idea. Check out: Bill Szysz silent movie that shows how to write Custom Function seen in this video with mostly the user interface (silent movie is easy to follow): drive.google.com/open?id=1v7heN-GwV7OHN1vGTLHOxTujI8QVRmVJ
Great video. Thanks. 22:05 Besides the performance, you needed Table.Buffer in order to maintain your sort. If you hadn't used it, you would have lost your sort.
27:45 "in Excel, we cannot delete column "A" when another function column "B" is attached to it". If function column "B" is selected and its frame is right-click-dragged slightly to the right and back to its original place, a menu appears with the option "Copy here as Value". After doing this, column "A" can be deleted. This only works if no filters are applied to the table.
Hi. Thank you for the video. I don't get the part where the buffered table is compared to the external table. Aren't these exactly the same tables? Why do they have to be compared?
One struggle and weakness is that the M editor doesn't pop up the suggested arguments for us to reference, so we frequently get them wrong. (In a normal Excel formula, =SUM(number1, number2, ...), the suggestions pop up.) I know I can type a Power Query function's name to see its documentation, but I don't want to keep doing that, going back and forth in a loop every time the syntax has an expression error. So how can this be improved?
I do not have DAX videos on that topic, but I have a bunch of forecast videos in my Analytics class which shows Excel worksheet formulas: th-cam.com/play/PLrRPvpgDmw0mSJCZaqQPFj0eto4qnzkCZ.html Sections 10 and 11.
Is there an example of nested context List.Generate that someone can share? Trying to create a list of 12 dates starting with a date in a column (ex. {1/1/2020, 2/1/2020...12,1,2020}).
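For the 12-date list asked about above, one hedged sketch (a fixed start date is assumed; swap in the column value per row): List.Generate is not strictly needed, since transforming the offsets 0..11 does the job, but the List.Generate form is shown for comparison:

```
let
    StartDate = #date(2020, 1, 1),
    // 12 month-start dates: Jan 2020 through Dec 2020
    ViaTransform = List.Transform({0..11}, each Date.AddMonths(StartDate, _)),
    // equivalent List.Generate form: seed, continue-while condition, next value
    ViaGenerate = List.Generate(
        () => StartDate,
        each _ <= Date.AddMonths(StartDate, 11),
        each Date.AddMonths(_, 1)
    )
in
    ViaGenerate
```

Per row in a table, something like Table.AddColumn(Source, "Next12Months", each List.Transform({0..11}, (n) => Date.AddMonths([Date], n))) keeps the inner lambda's parameter named n so that [Date] still refers to the current row.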
Hi Mike, is there a way to replace values in Power Query using a wildcard character like we do in Excel? For example, in the text above we could write power* and it would replace the entire text after "power".
To replace values you can use Replace Values in the Transform group on the Home Ribbon tab, or use the Text.Replace Power Query function. I do not know of a wildcard in Power Query. If you need to test whether a text string is contained in another text string, there is this function: Text.Contains.
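As a hedged sketch of emulating a power* wildcard without built-in wildcard support (the column name Column1 and the sample values are assumptions for illustration), Text.StartsWith can gate a conditional replacement:

```
let
    Source = #table({"Column1"}, {{"Power Query"}, {"PowerPivot"}, {"Excel"}}),
    // any value beginning with "Power" is collapsed to just "Power",
    // mimicking an Excel-style Power* replacement; other values pass through
    Replaced = Table.TransformColumns(
        Source,
        {{"Column1", each if Text.StartsWith(_, "Power") then "Power" else _, type text}}
    )
in
    Replaced
```

Text.Contains or Text.EndsWith could be swapped in for *power* or *power patterns under the same approach.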
The MAT from this video only worked fine for the first 12 months, after which the numbers didn't add up anymore for each product. That was because the example used -11 in the Date.AddMonths function. When Mike added 2 more years of data (2019 and 2020), the number -11 should have been changed to -55. One could argue we only need an annual running total, not one going back forever. If that is the case, the video couldn't cope with this requirement. It needs a reset after every 12 months.
I live in NZ, which has a different date format than the US, so PBI only recognizes my dates as text (ugh!!) - which means this formula won't work. Frustrating. Any solutions/workarounds, Mike?
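One possible workaround, sketched under the assumption the dates arrive as day-first text: pass a culture to the type conversion, which is what the UI's Change Type -> Using Locale does behind the scenes:

```
let
    Source = #table({"DateText"}, {{"31/01/2020"}, {"01/02/2020"}}),
    // the optional third argument is the culture; "en-NZ" parses
    // day/month/year text correctly regardless of the machine's locale
    Typed = Table.TransformColumnTypes(Source, {{"DateText", type date}}, "en-NZ")
in
    Typed
```

Once the column is a true date type, date math like Date.AddMonths in the MAT formula should work as in the video.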
To reduce calculation time across the multiple queries, you can speed things up by putting all the code together in one query. This tip comes from yearofmars: supplychaindatascientistcom.wordpress.com/2020/12/10/why-is-my-power-bi-query-so-slow-a-query-is-not-a-table/

Initial Table:
let
    Source = Folder.Files("C:\Users\User\Desktop\010-TextFiles\010-TextFiles\Start"),
    LowercasedText = Table.TransformColumns(Source, {{"Extension", Text.Lower, type text}}),
    FilteredRows = Table.SelectRows(LowercasedText, each [Extension] = ".txt"),
    FilteredHiddenFiles1 = Table.SelectRows(FilteredRows, each [Attributes]?[Hidden]? <> true),
    InvokeCustomFunction1 = Table.AddColumn(FilteredHiddenFiles1, "Transform File from Step01GetExternalTextFiles", each #"Transform File from Step01GetExternalTextFiles"([Content])),
    RemovedOtherColumns1 = Table.SelectColumns(InvokeCustomFunction1, {"Transform File from Step01GetExternalTextFiles"}),
    ExpandedTableColumn1 = Table.ExpandTableColumn(RemovedOtherColumns1, "Transform File from Step01GetExternalTextFiles", Table.ColumnNames(#"Transform File from Step01GetExternalTextFiles"(#"Sample File"))),
    ChangedType = Table.TransformColumnTypes(ExpandedTableColumn1, {{"Date", type date}, {"Sales", Int64.Type}, {"Product", type text}})
in
    ChangedType

MAT Table:
let
    Buffer1stTable = Table.Buffer(Step01GetExternalTextFiles),
    ProductList = List.Sort(List.Distinct(Buffer1stTable[Product])),
    MinDate = List.Min(Buffer1stTable[Date]),
    MaxDate = List.Max(Buffer1stTable[Date]),
    ListDates = List.Dates(MinDate, Number.From(MaxDate - MinDate) + 1, #duration(1, 0, 0, 0)),
    ConvertedToTable = Table.FromList(ListDates, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    CalculatedStartOfMonth = Table.TransformColumns(ConvertedToTable, {{"Column1", Date.StartOfMonth, type date}}),
    RemovedDuplicates = Table.Distinct(CalculatedStartOfMonth),
    RenamedColumns = Table.RenameColumns(RemovedDuplicates, {{"Column1", "Date"}}),
    AddedCustom = Table.AddColumn(RenamedColumns, "Product", each ProductList),
    ExpandedProduct = Table.ExpandListColumn(AddedCustom, "Product"),
    ChangedType2 = Table.TransformColumnTypes(ExpandedProduct, {{"Product", type text}}),
    StartGroupToGetMonthSales = Buffer1stTable,
    CalculatedStartOfMonth2 = Table.TransformColumns(StartGroupToGetMonthSales, {{"Date", Date.StartOfMonth, type date}}),
    GroupedRows = Table.Group(CalculatedStartOfMonth2, {"Date", "Product"}, {{"MonthlySales", each List.Sum([Sales]), type number}}),
    MergeToGetMonthSales = Table.NestedJoin(ChangedType2, {"Date", "Product"}, GroupedRows, {"Date", "Product"}, "New"),
    ExpandedNew = Table.ExpandTableColumn(MergeToGetMonthSales, "New", {"MonthlySales"}, {"MonthlySales"}),
    DateOneYearBackColumn = Table.AddColumn(FilteredRows2, "DateOneYearBack", each Date.AddMonths([Date], -11), type date),
    Buffer2ndTable = Table.Buffer(DateOneYearBackColumn),
    AddedMATCustom = Table.AddColumn(Buffer2ndTable, "MAT", (x) => List.Sum(Table.SelectRows(Buffer2ndTable, each [Date] >= x[DateOneYearBack] and [Date] <= x[Date])[MonthlySales]))
in
    AddedMATCustom
Did you watch the last video, MSPTDA #9? We defined Custom Functions in that video and this video was a follow up. Nevertheless, Custom Functions can be defined if you put the name of any argument/s in parentheses. In our case we called the single argument "x". "x" is just the name of an argument in the newly defined function. The function we defined was (x) => and was working on the Outer Table. As the Custom Column is copied down the column in the Outer Table the x tells each Field Name that it should look to the Outer Table, rather than the Inner Table. I hope that helped.
Thanks for replying. I understand the concept of Inner and Outer table here. What confuses me is we didn't pass any value to the variable/argument "x" (as far as I understand we declared a variable "x" without assigning "x=outer table"), so how does M code know "x" refers to the outer table? Again, thank you for your videos - I found them extremely helpful.
Because the Custom Function is defined inside the Table.AddColumn which is acting on the Outer Table. The Table.AddColumn Function copies the Custom Function Down the column and executes the Custom Function for each row. By putting x before each column name, we get access to those Outer Table columns.
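A minimal self-contained sketch of that Outer/Inner pattern (toy data, and a running total per product rather than the video's exact MAT calculation):

```
let
    Source = #table(
        type table [Product = text, Date = date, Sales = number],
        {{"A", #date(2020, 1, 1), 10},
         {"A", #date(2020, 2, 1), 20},
         {"B", #date(2020, 1, 1), 5}}
    ),
    BufferedTable = Table.Buffer(Source),
    // (x) => binds x to each row of the OUTER table (the one the column is added to);
    // the plain each inside Table.SelectRows binds to rows of the INNER table scan,
    // so x[Product] and x[Date] reach "outside" while [Product] and [Date] stay "inside"
    AddedRunningTotal = Table.AddColumn(
        BufferedTable,
        "RunningSales",
        (x) => List.Sum(
            Table.SelectRows(
                BufferedTable,
                each [Product] = x[Product] and [Date] <= x[Date]
            )[Sales]
        ),
        type number
    )
in
    AddedRunningTotal
```

Because Table.AddColumn invokes the (x) => function once per row, x is assigned each outer row's record in turn; no explicit "x = outer table" assignment is ever written.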
I do a lot of year over year analysis for sales by product SKU. Is this possible to do in Power Query? For example, I want 4 columns: (1) Product (2) Sales (3) 2018 Year (4) 2019 Year This can be done very easily with a pivot table. Just not sure how to do it with power query. Thank you!
When adding the custom column MAT in step 14, for some reason the M code IntelliSense made it very difficult to edit the code when jumping between the web browser window and the Power BI editor. It would also not accept the code, always claiming that a comma was missing even though it was typed out perfectly, with the "OK" button greyed out in the custom column editor. I had to disable IntelliSense in Options and restart Power BI, and then had no issue typing the code, which worked perfectly on the first try. Did anyone else face this issue?
Amazing video, and amazing formula. Anxious to use it. Also, I have a question about the conditions in which we use the Table.Buffer function, because when I apply a new step it seems that Power Query is performing the same calculations over and over again.
Use Table.Buffer when: 1) you want to access the table and then store a copy so you do not have to call back to the query over and over, 2) You have an SQL connection and you do not want Power Query to send the query back to the SQL optimizer in the SQL Database.
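A hedged sketch of point 1 (the query name Step01GetExternalTextFiles and the Sales/Date columns are assumptions borrowed from elsewhere in this thread): buffer the referenced query once, and every later step reads the in-memory copy instead of re-evaluating it.

```
let
    // pin one in-memory copy of the referenced query; later steps will not
    // trigger re-evaluation of it (note: buffering also prevents query
    // folding back to a SQL source, which is point 2)
    Buffered = Table.Buffer(Step01GetExternalTextFiles),
    PositiveSales = Table.SelectRows(Buffered, each [Sales] > 0),
    Sorted = Table.Sort(PositiveSales, {{"Date", Order.Ascending}})
in
    Sorted
```

The trade-off is memory: the whole table is held in RAM, so buffering very large tables can slow things down rather than speed them up.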
@@excelisfun Thank you for your answer. Do I understand correctly: if on step 1 we have a connected SQL database as a source, on step 2 we Table.Buffer the source, and on step 3 we apply changes to the buffered table, then using Table.Buffer on step 2 means the changes applied on step 3 do not hit the SQL optimizer in the SQL database? So Power Query queries the SQL database only once, whereas if we didn't buffer the table it would query it a second time at the step where we apply our changes?
@@orcawilly54 Yes. If you are connected to an SQL database, and you do not buffer, then you can right-click any step in the list of Power Query Applied Steps, and click on Native Query and it will show you the SQL Code that is being sent back.
Hello Sir, I don't know how to extract months from dates. Let's say I have a data table consisting of dates like 9/4/2018, and I need to extract just "September" from the dates. How can I do this in PQ? Regards
Thanks, this was great! I’d been trying to figure out how to do this for a week, and then this video magically appeared. I tried using this approach on a 50K-row data table using Power Query in Excel. It worked but took a LONG time to return the results, and after closing Power Query and loading to the data model it wouldn’t load. I waited 2 hours and then abandoned it. I triple-checked that I had the buffered table and custom function right, and it looks good. Any thoughts?
I am not sure. I have not tried it on a large set. The example I used was about 400 rows. I wonder if it is the 50,000 rows, or something else. If this transformation fails on 50,000 rows because it takes too long to load or refresh, then it is really not too useful. What data source are you using?
Thanks for the reply. I thought I knew just about everything about Excel until I stumbled on your Power Query series of videos. I’ve rearchitected some of my most important work with great results! I really appreciate you! I’m using other Excel workbooks that contain tables as a data source. These are supplied to me monthly. I save them into a designated folder, point a query to that folder, filter for specific names, and load/transform the data. This gets loaded to the data model and then drives 20+ reports that use cube functions to retrieve the data. It’s pretty slick, highly portable, and built mostly from lessons I learned from you.
Awesome! I am glad that the videos help. Power Query is really a HUGE game changer. How do you create the Cube Function reports? Do you create the formulas yourself, or do you click the button to convert the Data Model PivotTable to Cube Functions? What are the advantages for you of using the Cube Functions rather than the Data Model PivotTable? Do you have a different structure for your reports, like cutting and pasting the formulas in different ways, or is it formatting, or something else that is the reason you use the Cube Functions rather than PivotTables?
Initially I converted Data Model PivotTables to cube functions but now just type them in. I seem to have reached a limit on cube functions, however. Each time I make a change that triggers a data refresh, the cube functions seem to refresh individually rather than en masse. Hence, the more I have, the longer it takes. Maybe this is a data architecture issue but it feels more like a cube function limitation. I'm also using the CUBESET function to roll up Last Twelve Months (LTM) data. I wanted to test whether having the LTM data in the data model would make the Cube functions more efficient, hence this video was timely. I've used cube functions more for convenience than any inherent advantage. To teach myself PowerPivot and PowerQuery, I redesigned an existing workbook to make use of these tools. That workbook was essentially a report writer, summarizing large amounts of data using SUMIFS functionality. The data source was a cross-tabulated worksheet table with months across the columns and accounting data in the rows. Additional columns were included for summarizing the months into MTD, QTD, YTD, LTM, and other accumulations, and each row was tagged with multiple report labels that the SUMIFS functions looked to. All this worked great except that with thousands of rows, the workbook essentially choked. Once I had the data in a proper data set inside a data model, it was easy to replace all those SUMIFS formulas with CUBEVALUE functions, and to still retain the professional formatting and layout (which is enormously important). In this workbook, an end user can simply change a single date on a Control Sheet and all the reports update accordingly. Hence, historical and forecast reports can be created with a single simple date change. There are other global inputs on this control page that drive the reports as well. Some of these have been replaced with slicers, but in some cases I don't want to use slicers. 
If it weren't for the (seemingly) extra overhead of cube functions, this would be a great solution. I'll probably end up back in PowerPivot, however. I want to keep Control Sheet Date functionality, but haven't yet figured out how to use that to drive a PivotTable filter other than with slicers (perhaps the CALCULATE function?). I'll let you know if I figure out the 50K row issue. I have another workbook that is nearly 200K rows and 25 or so columns and also needs the LTM accumulation. For now I'm sticking with CUBESET functionality on that one.
MAT can be useful for keeping track of revenue movements over a rolling (in this case) 12 months. At any given month in your reporting, you can track a products sales history over the past year. Not the last calendar year, but from today's month going back 12 months. I've used this for customer valuation when paying commissions to salespeople. If a customer's rolling 12mo. revenue is above 100k, pay out 10% of margin generated from that customer. 50k: 6% payout, 25k: 3% payout. This allows you to be more responsive in your customer valuation than just once a year creating a customer valuation.
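The tiered payout described above can be sketched in a few lines of Python. The thresholds and rates are the ones quoted in the comment; the function name and the sample numbers are illustrative, not from any real commission system:

```python
def commission_rate(rolling_12mo_revenue):
    """Tiered payout rate based on a customer's rolling 12-month revenue."""
    if rolling_12mo_revenue >= 100_000:
        return 0.10   # pay out 10% of margin
    elif rolling_12mo_revenue >= 50_000:
        return 0.06   # 6% payout
    elif rolling_12mo_revenue >= 25_000:
        return 0.03   # 3% payout
    return 0.0        # below the lowest tier: no payout

# payout = rate * margin generated from that customer (made-up figures)
payout = commission_rate(120_000) * 40_000
```

Because the MAT window rolls forward every month, the rate can move a customer between tiers at any point in the year, which is the responsiveness the comment is describing.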
Hi Mike. A quick question on this video: in Step03 you changed the dates to the start of the month, calculated min/max and a date list to create a table, and then converted to the start of the month. This new manufactured table is essentially the same as the initial 'Calculated Start of Month' table a few steps previous. Was there a reason for this (other than for tutoring purposes :) )? Am loving your series here and all your BI videos in general. Thanks for the work you put in.
I am not sure how he comes up with it. I do know that Bill Szysz is THE smartest Power Query guy I know. I do not think it performs well on millions of rows. Better to do it with DAX.
I think it is better practice to use named parameters or unnamed parameters and not combine them. In this instance the nested function would look something like: = Table.AddColumn(BufferedTable, "MAT", (outer) => List.Sum(Table.SelectRows(BufferedTable, (inner) => inner[Date] >= outer[DateOneYearBack] and inner[Date] <= outer[Date])[Sales]))
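The outer/inner idea in that formula can be illustrated with plain Python dicts: the "outer" parameter is the current row getting the MAT column, while "inner" ranges over every row of the buffered table. The column names follow the video; the data here is made up:

```python
from datetime import date

# A toy stand-in for the buffered sales table
rows = [
    {"Date": date(2024, 1, 31), "Sales": 100},
    {"Date": date(2024, 6, 30), "Sales": 200},
    {"Date": date(2025, 1, 31), "Sales": 300},
]

def mat(outer_date, one_year_back):
    # "inner" plays the role of each row of the buffered table;
    # "outer" is the current row we are adding the MAT column to.
    return sum(inner["Sales"] for inner in rows
               if one_year_back <= inner["Date"] <= outer_date)

total = mat(date(2025, 1, 31), date(2024, 2, 1))
```

Using two distinct parameter names, like the `(outer)`/`(inner)` pair in the M version, is what avoids the ambiguity that mixing `each` with a named parameter creates.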
I am not sure what you are asking. Are you asking about the EARLIER DAX Function? If yes, this video is about the computer language M Code in Power Query. EARLIER is in the DAX Computer language.
Hi Mike, thanks for your reply. You mentioned in the (great, by the way) video that you could also do the MAT in DAX using EARLIER. My question was whether you could show this as well. Best regards
I did not say that MAT could be done with EARLIER. I said two separate things: 1) MAT can be done with CALCULATE and a few other functions. 2) EARLIER is what allows you to jump back to an earlier row context. I also said that we would do this later in this class, a few months away, though. Luckily, I already have a similar video that shows Moving Average with DAX. Here it is: th-cam.com/video/FgsTNJ5dCDg/w-d-xo.html Please always help support with your comments, Thumbs Up, Sub and tell all your friends : )
Did you watch MSPTDA #9, where I introduced Custom Functions? Also, Here is some even more efficient M code technique for MAT, an update to this video: th-cam.com/video/itMtqU-walo/w-d-xo.html
@@excelisfun Yes sir, I did watch the series from the start, and the series was really good until the M functions. I am starting to realize that my understanding of Excel functions is at a beginner's level. Please guide me to any good videos or materials to catch up.
Yes, but as you said in another comment, you did not watch this whole series. You MUST watch and study Video #9 in this series to understand this video, #10 : )
Yup, watched and solved all of them thoroughly. Still having some trouble understanding lookups though. Regardless, it is on me and I will probably have to watch this a few more times before completely getting it. Thanks regardless, Mike!
@@mahneelmusa7469 Well, I can tell you that it took me a long time to approach M Code and Custom Functions. M Code and DAX are totally new and different types of computer languages, which makes them harder to learn. Even though they are function based, like Excel Worksheet Formulas, each is unique and requires time and practice. Thank you, Mahneel, for watching and studying this series so far. You can continue watching this class, which is a combo of M Code, DAX, Power BI, Power Query and Data Modeling, or if you want to immediately practice some other M Code Custom Functions, here is a different playlist of M Code Custom Functions: th-cam.com/play/PLrRPvpgDmw0kuIuvn2rCQI14AthSZ43ca.html
@@excelisfun Thank you, Mike. Seriously can't thank you enough. You have enabled me to work and analyze data with Power Query. Though my skills are not quite that advanced but hopefully I'll get there eventually. Thanks again!
Here is some even more efficient M code technique for MAT, an update to this video: th-cam.com/video/itMtqU-walo/w-d-xo.html
Who needs a break when Mike is teaching so soooo soooo extraordinarily... BANG BANG BANG!!
What's the difference between M code and DAX?
List.Generate() function does the trick. Though I find that what I really want to do is a nested loop to generate a list of lists so as to get the projected inventory for each part number, for all part numbers. The list for each individual part number I've got down and saved as a function, generating a list where each record is the result of calling that function is proving to be beyond me. I'm resorting to an inelegant work-around.
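As a rough analog of what `List.Generate` does for the "next order, then the next" projection above, here is an iterative sketch in Python. All the numbers and the reorder rule are invented for illustration; the real logic would come from the customer's use rates and safety-stock policy:

```python
def project_orders(start_inventory, weekly_use, safety_stock, order_size, weeks=52):
    """Walk week by week; whenever inventory would fall below safety stock,
    record an order and add it to inventory. Returns [(week, order_size), ...]."""
    inventory = start_inventory
    orders = []
    for week in range(1, weeks + 1):
        inventory -= weekly_use
        if inventory < safety_stock:
            orders.append((week, order_size))
            inventory += order_size
    return orders

orders = project_orders(start_inventory=500, weekly_use=100,
                        safety_stock=150, order_size=400)
```

In `List.Generate` terms, the tuple (week, inventory, orders-so-far) is the state record; each generation step adjusts the state exactly the way this loop body does, which is why the per-part-number version is straightforward while a list-of-lists over all part numbers needs a second, outer layer of the same pattern.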
WOW! Each step from Step 1 to Step 5 involved some modification to the M Code - this was definitely a complex solution. I will be re-watching to better understand the concepts of Inside Table, Outside Table, and Buffering. That was a jam-packed video!
Yes, it is worth re-watching because this trick shows up in many other videos, some later in this class : )
This custom function approach can be used for a left join of tables with many conditions, not only equality.
Thank you Mike, for your time and efforts to share useful knowledge!!!
You are welcome!
Hi Mike, I remembered looking at this video a couple of months back and I needed to use a variable in a formula to store a Buffered table. I found the video and created this - = Table.AddColumn(BufferedTable, "MinIndexForResource", (x) => List.Min(Table.SelectRows(BufferedTable, each [Employee Num] = x[Employee Num] and [Project Number] = x[Project Number] and [Assignment Start Date] = x[Assignment Start Date])[Index]), Int64.Type) - the refresh time of the original query where I used a function passing in to it a non buffered table was 20 mins plus (hopeless). This method reduced that time to just over a minute, just amazing. Thank you so much. Best wishes. David
I love hear that, David! I am happy that the videos and materials that I post can help you in your work! Thanks for helping me with your comments and thumbs ups : )
This video is amazing and it took me a while to figure out what is actually happening here. Thanks for sharing this with the community and investing your time. I am forever in your debt.
You are welcome, Abdul!
Thanks for another great vid, finally booked my MOS Excel Expert exam, at 52 I should have done it years ago, hope to get my Office Master certificate within a year or so. Thanks for all the education and your enthusiasm.
You are welcome, Dave! Good luck with the exam : )
Ta, I've done something similar to this using Offset and Counta combined which will always count back a certain period and return a formula result for a certain period.
It took a while to grasp that as soon as you add a custom column, there is an external table with the same content as the buffered table, and then Table.SelectRows, the conditional criteria, and the custom function parameter x do the work. After getting this it was a smooth ride.
The external table and internal table do the trick. Great, Mike. Amazing trick.
Hope my understanding is correct.
Yes, the conceptual trick to this is that there was an internal table that is locked on the full table, but there was also the ability to look at each row in the external table, sort of like a locked range on the full table and then a row-by-row Relative Reference. Thank you for your support, Gentle Raj.
Mike, even paid courses out there don't have the quality and thoroughness your videos do;
A huge Thanks for all the effort you're putting to make all these videos for the rest of us in the community.
That is true. Most of the paid courses are heavy on marketing and promises of easy fixes. Most of TH-cam is filled with that sort of stuff too. My quandary is: how come my stuff, which has the hows and whys and is free, does not get more traction? It might be that I am not good at marketing, or that many people do not want the whys, just the quick-fix hows. Nevertheless, for you and me and the others who do want the hows and whys, that is what the excelisfun channel at TH-cam is here for! Thanks for your support on each video that you learn from, Jacob, with a comment, thumbs up and of course your Sub : )
@@excelisfun Hey, just thinking out loud: how about going through a quick overview covering the quick fixes, and then diving deep into the details?
Those who're just looking for the quick fixes can exit after the initial bit.
That way, you're not restricting the content only to those of us who need the detailed version, but also those who want a quick bite of info on the go?
Just thinking out loud regarding the less-traction problem you mentioned, that's all; I would go through your videos the way they are anyway.
@@JacobVJoseph In the last two years I have tried a bit of that. With 3000+ videos, I do have some variety. But even though what I do does not have the highest demand, I love doing it with the hows and whys, because for me it can be more effective in creating solutions, more efficient, and way more fun that way. Thanks for your out-loud thinking, Jacob!!
@@excelisfun got it! Keep it going 🙂👍🏾
Mike & Bill = M-Power-Combo !!! Absolutely amazing !!! It seems I underestimated the capability of PQ....
AND... it just will keep getting more and more amazing the deeper you go into M. Then, the combo of Excel, Power Query & M, and then Power Pivot and DAX -- Excel is now the mega-ultimate tool ever : )
Oh, this one is really complex and takes time to digest and make use of. However, on top of the M code course previously, learning more through a real example and getting to "feel" and "taste" the grammar and design of M code is real suffering, but also enjoyable. Great course!!!
You are right it is very hard, but also very wonderful when we finally learn it : )
I feel like this might be part of the solution to a project I've hit a wall on that I had to put on hold. Hopefully, I'll be able to work on it again soon. Your videos are a godsend!
Homework is posted below video.
Bill Szysz silent movie that shows how to write Custom Function seen in this video with mostly the user interface (silent movie is easy to follow): drive.google.com/open?id=1v7heN-GwV7OHN1vGTLHOxTujI8QVRmVJ
OMG Bill, always awesome, indeed. You just build the formula using step-by-step with one case and then substitute the arguments after using the columns. NICE!!
The homework says to output month and MAT sales, but you mention quarter granularity. I guess the output is quarter and MAT sales?
I’m a big fan of these videos and learned a lot. There is a major problem with the way you do this which occurs if the data is larger.
Simply put, you have five queries when what you need is four tables and a list in one query.
Each query is evaluated independently, so when you reference a query it runs again, reading in the information again. When you merge two separate queries, both run, in the merge and in the separate queries. If you merge two tables inside the same query, the data is read only once. Otherwise your refresh time gets long very fast.
supplychaindatascientistcom.wordpress.com/2020/12/10/why-is-my-power-bi-query-so-slow-a-query-is-not-a-table/
@@yearofmars Thank you very much for this improved method!
@@yearofmars Wow, you wrote a whole article - very great! Thanks for helping the Team to get better, yearofmars!!!
Hi Mike. Thanks for bringing this topic for all of us on your channel officially. The idea of using custom function becomes even clearer. Will be looking forward to the next videos. And as always, big thumb-up!
You are welcome for bringing it up! Thanks for the idea and thanks for your support!
This section was more difficult compared to the other videos, but I still enjoy it. Thank you Mike for another great video!
Yes, this is like 10 videos worth of content in one video... You are welcome as always, Doris : )
Could you, please, make more videos about Power BI?
Yes, there will be some more in this series. I have a few videos already on Power BI, but in the meantime, here are two other cool TH-cam Teamate channels:
th-cam.com/channels/RNmSv7mAPYiC0Y40TJijAw.html
th-cam.com/channels/J7UhloHSA4wAqPzyi6TOkw.html
docs.microsoft.com/en-us/power-bi/videos
That was brilliant, and I never realized how simple it is to turn a function into a table just by removing the superfluous 'each' when using 'add custom column'. It's puzzled me for a while, but then a lot of things do.
Table.Buffer() was a great help to me, since Excel and Power Query multiple-column sorting do not give the same results, plus what is seen as the result from Power Query does not always equal what is saved in the cache. Wanting to delete duplicates and only keep the top entry of each group, it was impossible to execute without Table.Buffer(). A great tool! There is probably a better way to retain the minimum of each group, but it worked. I guess it can be done using the add column with custom function depicted in this video. Got to figure this one out.
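The "keep only the top entry of each group" step described above can be sketched in Python: sort, then keep the first row seen per key, which is the stable equivalent of sort-then-remove-duplicates in Power Query once the sort is buffered. The column names and data are illustrative:

```python
def first_per_group(rows, key, sort_by):
    """Sort rows, then keep only the first row for each group key --
    retaining the minimum `sort_by` value per group."""
    seen = set()
    kept = []
    for row in sorted(rows, key=lambda r: r[sort_by]):
        if row[key] not in seen:
            seen.add(row[key])
            kept.append(row)
    return kept

data = [
    {"Group": "A", "Value": 3},
    {"Group": "A", "Value": 1},
    {"Group": "B", "Value": 2},
]
top = first_per_group(data, key="Group", sort_by="Value")
```

The reason Table.Buffer matters in the M version is that without it the engine is free to re-evaluate (and effectively re-sort) the source, so "remove duplicates" may not see the rows in the order you sorted them.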
Great video in an excellent series, Mike. Table.Buffer + CustomFunction = Mind.Blown.
Mark Haggett Table.Buffer is great; I just applied on a 20 minute query (multiple files, multiple tables, thousands of records... don’t ask), and turned it to a 5 minute query
Hi Mike,
Thanks to you and Bill, that formula was mind-blowing...power query offer so many capabilities, thanks for showing us the ways. ;-)
You are welcome! We are lucky to be on a good team with Excelers like Bill Szysz!!!!
That was pretty impressive! I enjoyed the session and got lost a little (I understand the process but don't know if I am ready to repeat it) but really cool! It's amazing how many tools are available with M code. Off to MSPTDA11 ! Thanks again!
Yes, it does take a bit to learn the advanced M Code. But after a while (repeat, repeat, repeat), you get the hang of it!!!
@@excelisfun Yes, absolutely. The more you see it in action, follow along, type it in, and learn by doing - all adds up. Thanks again!
@@tomr9969 You are welcome again! Thanks for your consistent support. I just wish more viewers were as respectful as you are with consistent support : )
Intense tutorial. Will need some time to process and also to work on some simple problems. Great work and thank you for your efforts.
You are welcome, Nirushan!!!!
Wow! The similarities between PQ and PP (DAX) are greater than I imagined.
The fact that you can do that ‘EARLIER’ trick in PQ is amazing. Transcending table boundaries is a powerful technique that creates a lot of possibilities, and you just showed one. And all that M-code programming all over the place: very powerful.
BTW: very good of you to start referencing other queries instead of copying them: more efficient.
Q: in your explanations on several occasions you used the term “steps” where -in my opinion- you should have said “queries”. Steps are parts of a query. I find it clearer not to mix them.
Finally: next video, and BOOM, another 2K of subs! Just 10K more and you’ve reached the 500K milestone. Things are really growing fast!
Maybe you should contemplate reaching 1M subs before this series is well and truly over! :-)
See you on the next one and the previous one, because we’ll be watching them again!
Each step in a let statement is an expression, and the let statement itself is an expression, so in that sense each line is a query... Maybe it is better to say that each step is a sub-query? I decided to call them steps because the "Applied Steps" list uses the word step. There are multiple synonyms for each line in a let statement...
Over the last year or two I have tried to promote the channel more, with 1M in mind, but it just does not seem that I have the marketing skills to accomplish this. It is particularly odd because I have intentionally provided all my classes (Excel Basics, MSPTDA, Highline Excel Class, Statistics and so on) for free. Especially since I do not believe that there is a better Excel Basics series (a class that gets you up to a professional level) available for free, anywhere. I have been working with TH-cam and Microsoft and other potential marketing partners for the last year or two, and nothing moves the needle. Each month, it is the same: about 1M views a month and about 6,000 new subs a month -- that has held steady for years. As some have correctly pointed out, humans don't trust free stuff. But I have been determined to stay here at TH-cam and provide quality resources free of charge. I thought that both MSPTDA and Excel Basics would take off more than they have, but nothing yet seems to move the needle. I keep hoping that people will share the links and tell all their friends and colleagues and someday there will be a viral hit that can get this channel more exposure...
ExcelIsFun On Steps and Queries:
I would like to suggest using the terms as suggested by the (sub)pane names in PQ:
- on the right we have the “steps (sub)pane”, so those things I would call steps. And yes, they consist of possibly multiple let-steps and expressions themselves. The steps inside a let statement I would call a “line”.
- on the left we have the “queries panes”, and the various entries there I would call queries. Those are the things you refer to when you use the Reference button.
In conclusion: I would use different and distinct/unique terms for each and not “overload” any term, e.g. like “step”.
In action-packed fast-paced videos like yours that concept creates a little more clarity IMO.
I thought that is what I did? Lines in a let statement are steps because the pane lists them as steps, and the queries on the left I referred to as queries.
ExcelIsFun Let me double-check after working hours: maybe I misheard. (Be advised that there is a 9-hour time zone difference between you and me).
Amazing stuff! Thank you :)
You are welcome, Teammate!!! Thanks for your support : )
Mike, your series are an inspiration to me and I keep coming back every so many months to learn new stuff. Now that I am a bit more experienced in Power BI, I am a bit puzzled by your strategy to come to the solution. I reconstructed the solution by first calculating the MAT per row in the source table and then grouping by EOM/Product and aggregating MAX(MAT). This solution requires fewer steps and is flexible (exchanging EOM for BOM is quite easy).
LOVE your solutions. Yes, when I filmed this 2.5 years ago, I was less experienced than I am now - just like you. And there are a number of better ways to do this. Thanks for the post and knowledge, Jan Willem : )
Can you post your M Code? I am wondering if you created one or two queries? And how you did MAT in source table? Did you do it for every row in source table?
@@excelisfun I used a custom function. I this reply you will find the code for the table and in the next reply the code of the custom function
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("jZrNjmQnDIXfpdZT0sXABbbJE8y6NYuWMrtRIkXJ+6e6fvA5x1DKojf9yeA/jM2tj4+LXdPVjtQv3y759vf759+//vrn8/Lj28elOjJBDVAVlkAufcHffv769fPPBzuBZWZ2CPv+7+cfD5KAdCJNZFCRfLXJDKUKgIOVGI6CXYezIWYlR0XFQA1rCgsL4poV5BY+dhOGwg5QAwAGJtXGwAxL6LHzmnepgOiILpswUwhuhk+STgrpVuamHhPSPYMenD0diAQOmaaPXQuGZq6X/f93LThmZbdVas40KIByUh+2jR63UBYKs1sMKp5icJatkMFOSR0FTE9gwdRwNx0AKMLpWun4obXFmUl9QKSHrAKLtcpZ5yU7aBJT11mAYMLi9NVN/sKS96PipMpmnKLOmiTO9aTzyp6cjKvpCUDd34AFX5279D1AalE1z2XiJAMprg0ZzAon5dw4F+zNerP8T5b0sKCrtDqAP6yFkJ2UcnwrNUydaUD3/y98OJlmAIhpNTIQM7nnbrXUWawRbZOnxUnS3SpIqUMGyKlD6rXTPQeJyoBvnU4pB27s6CqUAWSciglWO9X52SHtlHC9EBVfUH1RQCzcOUUECTaHWgBv1dYFJcWRGeVcuo5tfjvSCgcoVJ0MK8bMH2s1TgdZPdJA6B3TzbozLVeoh5oGtoXieAvObk3r4hNw8VffuDkWwDIdtAxE+kJEi4792GVIwt1ilhfUk+6Lr5792OXWLdMBXvi2gxXVlQjbhXsXENOqVVUVyi/ytVyjBLV2Rej6XJMPSJrtFaBJnE5goUVvKCguHSio7j4Qas4n2pNbwoI7SsG/udy2NhaAiUJ1AjEL1rtUSMUDJaXPSLQh1QlDOT2BCVh/w1JSv+wV/Rovd7qMN9YzlPvpHobxAtRhTRSngAJy4rDTUZwFh0Bxy4upx7IjvYbyAapIfj7H0vEqaCg3gGk9eA3J41V3cVC0jZIJdgv1rMlu3BdNFiJ3vJEzkOMhqIP68QDlF5OAAwkvAycwSaAJEs+D5iQ6GPcqwWZfkmfxDFKhQ6ggFgaTQp7iollIzblbc6AHERC3Dl0WozpadlFOh7NYnWDNMGgS5GsS9tPahNtpplqVFbmtKOssMLQutBV1IgpnBZCCR+rumCVHmnAThGbpdqIrHQnsJ+rGUa/ZOap+8+5OyNDeUMYcLdq5CdXvDZQnPTL4SWttBkdpJJHFVvukIG9YuBP6GzlgfGnfiiZLsftdSrPgMcHFTKwOYuJMFAuSOdRcRLQYx1wRfjboTnSixRVj+jgL8xjsxo+KqAbXzNOJXrgna4iPaDuZfLxxRk5voD2mvnkA6eKZSAtBAyn1xmtOXpbUIttJY9ox2GGGjv4FCX6DeA5h85Dx1TKW/u0gw6fhABIvicl4OksJpDR5szPeaoCQ3okGK3L/g0RevMBaPa3WRGrDYsKDq+JDJHuR+r7H/LQ86gi1zUHG92lDEm4XA5pCbSFZbU0GimoxIBjmq+cMtei9CqIgR5RfYBogHXfTQLnYjQJc2JEooeiIJyNtNozj0ZFw4a2IzmiD7aI/B7p73iyKLyga32YBUjAMVQ1jSMY9xfrHQHQPg5ZMRFppq7P4TjmR3rYNVuRKkURoTYrGYBJVfThbfVd9AD3toLkmkHXZTL6sPoT0FJzOTD3YgfHT7OGEXZFBJDyYl42U5TcKWgMYusX+br/7JLQ2DZkWqgxM36scJWniJ5Aj0ZyEs9RhJ23TDJjk1G6rBHuFqoUszE+o5OJTZ96kAegf25LhMD62+4rKYDctn6hluF3uk9fidJqDwRGb/xfXA+Em5zmgLvVO2dni9xVlm9/P2dVTkWtE2SaPgTaZn8/qXk9i5BA7xe448s6kA+/W1//1/cJJVf9WykR6oJxo8RXUxeQOnCDcKwk3W4+obrBcc65
KmCnvE9ky1s1RVMbewQqLanLBqvK+0UEq9iLOwngLNoRbBphpd3uSOxcfXxc32gApbrTuk9Lyjh6OwhNSI81lGHKov22ZJL4uVYdN/NvI83J0fcl3UIwGA+R7wCE6cvXbSd2HsnVWnc5Ms7g7iz8QAMHQ5GXYMHxqPEQb6tYGVTOYKiaQ3q870Z/uDFgtZPfhMDgS2YUbLwehHhhooitWEeSef+Ch4JeY+1izdvKBcPFZ5VinuRWUE5c9v0R5S7RhsSdO2ywiGM5VkVWpNV73vwk10dL0nC/WTd1AGH7UdwANFf31fWuVg69Jahn7hBrFY9RRlLPNAIXfv2RdNXziWjShhCAtfvwH", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Date = _t, Sales = _t, Product = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Date", type date}, {"Sales", Int64.Type}, {"Product", type text}}),
BufferedTable = Table.Buffer( #"Changed Type" ),
#"Added Custom" = Table.AddColumn(BufferedTable, "MAT", each fnMAT(BufferedTable, [Date],[Product])),
#"Inserted End of Month" = Table.AddColumn(#"Added Custom", "End of Month", each Date.EndOfMonth([Date]), type date),
#"Grouped Rows" = Table.Group(#"Inserted End of Month", {"End of Month", "Product"}, {{"MAT", each List.Max([MAT]), type number}})
in
#"Grouped Rows"
@@excelisfun The code of the custom function:
(InputTable, InputDate, InputBrand) =>
let
step1 =
Table.SelectRows
(InputTable , each (
[Product] = InputBrand)
and [Date] > Date.AddMonths(InputDate, -12)
and [Date] <= InputDate)
in
List.Sum(step1[Sales])
@@janwillemvanholst Thank you for your code. But no matter how many times I have tried your code or dissected the logic, I cannot get the same MAT results as in the video. I tried some new M Code listed below and verified the laborious method shown in the video with Excel formulas. Did you get the same MAT results with your code on the data set in this video?
Another awesome video. I didn't know how to use List.Dates but now I know :) Thank you and appreciate your help.
You are welcome, knikl!!!
My head hasn't stopped spinning since MSPTDA 9. I think I will do as much as I can in the UI first before deciding on applying custom M code.
Yes, almost all of it can be done with the UI. Here is a video Bill Szysz made that shows how to write the Custom Function seen in this video with mostly the user interface (silent movie is easy to follow): drive.google.com/open?id=1v7heN-GwV7OHN1vGTLHOxTujI8QVRmVJ
That is a good idea, RJ Bush!!!
Yes, that is a good idea. Check out: Bill Szysz silent movie that shows how to write Custom Function seen in this video with mostly the user interface (silent movie is easy to follow): drive.google.com/open?id=1v7heN-GwV7OHN1vGTLHOxTujI8QVRmVJ
Your videos are always a great resource. Thanks
WOW! I think that's the method that I've been searching for. Thank you @ExcelisFun, you always get the answer!
Glad that I can help : ) : )
Very complex however very well explained and taught
Great video. Thanks. 22:05 Besides the performance, you needed Table.Buffer in order to maintain your sort. If you hadn't used it, you would have lost your sort.
Thanks for that hot tip, Jan!!! Table.Buffer will maintain the sort too : )
Got my daxisfun mug today. Still waiting on the other mug & the 2 shirts.
Thank you for the support, Jack!!!
@@excelisfun really enjoying the videos. This one was great as it helped me with a budgeting project for meal expenses.
Need to go through it again ... thank a lot❤️
Here is some even more efficient M code technique for MAT, an update to this video: th-cam.com/video/itMtqU-walo/w-d-xo.html
@@excelisfun Thanks a lot ❤️😘🙏
27:45 "in Excel, we cannot delete column "A" when another function column "B" is attached to it". If the function column "B" is selected and its frame is right-click dragged slightly to the right and back to its original place, a menu appears with the option "Copy here as Value". After doing this, column "A" can be deleted. This only works if no filters are applied to the table.
Great video again. The creating a list is something that just saved so much time for me. Thank you
Awesome, glad that the list creating part was so useful! Thanks for your support : )
Thank you Mike, great example how to use m code :)
You are welcome. Here is some even more efficient code, update to this video: th-cam.com/video/itMtqU-walo/w-d-xo.html
You are always updating my mind for the better. Thank you, Mr. Mike
You are welcome for the update - thank you also for your amazing support, Ismail!!!!
He came, he saw, he Exceled!
Here is an even better way to do this: th-cam.com/video/itMtqU-walo/w-d-xo.html
@@excelisfun It's like watching DaVinci create a new invention!
@@bamakaze I love DaVinci!!! He was a rad dude : )
Ohhhhhhh myyyyyy.... I doubt if I will ever use Power Query to this extent.... This wasn't easy :D
Yes, most of the time the user interface will get us what we want : ) Luckily : ) : )
Thank you very much for this amazing solution and continued investment in creating fun tutorials :-)
You are welcome for the continual investment in fun and efficiency!!!
Hi. Thank you for the video. I don't get the part where the buffered table is compared to the external table. Aren't these exactly the same tables? Why do they have to be compared?
The struggle, and a poor aspect, is that the M code editor doesn't pop up the suggested arguments for us to reference, so we frequently keep getting it wrong. (E.g., a normal Excel formula like =SUM(number1, number2, ...) pops up the suggestions.) Although I know I can type the Power Query function name to get more information about it, I don't want to always do that for reference, revert back again, and loop whenever the syntax has an expression error, and so on. So how can this be improved?
Epic video! Thank you very much.
You are welcome!
Hi Mike, Could you show how to use this to determine forward forecast ? Many thanks in advance.
I do not have DAX videos on that topic, but I have a bunch of forecast videos in my Analytics class which shows Excel worksheet formulas: th-cam.com/play/PLrRPvpgDmw0mSJCZaqQPFj0eto4qnzkCZ.html Sections 10 and 11.
Is there an example of nested-context List.Generate that someone can share? Trying to create a list of 12 dates starting with a date in a column (ex. {1/1/2020, 2/1/2020, ..., 12/1/2020}).
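Here is a plain-Python sketch of the goal in that question, generating 12 month-start dates from a seed date. (In Power Query the equivalent would typically use List.Generate, or List.Dates, together with Date.AddMonths; the function name below is illustrative.)

```python
from datetime import date

def twelve_months(start):
    """Return 12 first-of-month dates beginning at the month of `start`."""
    out = []
    year, month = start.year, start.month
    for _ in range(12):
        out.append(date(year, month, 1))
        month += 1
        if month > 12:          # roll over into the next year
            month, year = 1, year + 1
    return out

months = twelve_months(date(2020, 1, 15))
```

Running this per row of a date column gives one such 12-item list for each seed date, which is the nested (list-per-row) shape the question is after.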
Hi Mike,
Is there a way to replace values in Power Query using a wildcard character like we do in Excel? Example: like in the above text, we could write power* and it would replace the entire text after "power".
To replace values you can use Replace in the Transform group in the Home Ribbon Tab, or use the Text.Replace Power Query function. I do not know of a wildcard in Power Query. If you need to test whether a text string is contained in another text string, there is this function: Text.Contains.
The MAT from this video only worked fine for the first 12 months, after which the numbers didn't add up anymore for each product. That was because the function Date.AddMonths in the example used -11. When Mike added 2 more years of data (2019 and 2020), the number -11 should have been changed to -55. One could argue we only need an annual running total, not one going back forever. If that was the case, the video couldn't cope with this requirement. It needs a reset after every 12 months.
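The point about the window "resetting" can be illustrated in Python: a MAT for a given month always looks back exactly 12 months from that month, rather than using one fixed offset for the whole table. The monthly sales figures and helper names here are invented for illustration:

```python
from datetime import date

# Toy monthly sales: 10 per month in 2018, 20 per month in 2019
sales_by_month = {date(2018, m, 1): 10 for m in range(1, 13)}
sales_by_month.update({date(2019, m, 1): 20 for m in range(1, 13)})

def mat_for(month_start):
    """Sum the 12 months ending at month_start (inclusive)."""
    def shift(d, k):
        # Shift a first-of-month date by k months (k may be negative)
        total = d.year * 12 + (d.month - 1) + k
        return date(total // 12, total % 12 + 1, 1)
    window = {shift(month_start, -k) for k in range(12)}
    return sum(v for d, v in sales_by_month.items() if d in window)

dec_2018 = mat_for(date(2018, 12, 1))  # Jan-Dec 2018
jun_2019 = mat_for(date(2019, 6, 1))   # Jul 2018-Jun 2019
```

Because the window is anchored to each output month, extending the data set by more years changes nothing; it is a fixed global offset like -11 that stops working once the data grows.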
MAT was supposed to be only 12 months. Here is a video about running totals: th-cam.com/video/EP4L1FVcSUg/w-d-xo.html
Wonderful - video & notes & M code. Great!
You are welcome and glad it is wonderful, Sandeep!!! Thanks for your support : )
M shows the way in the video.. good information.. thank you for sharing & making excelisfun 😇 for all
You are welcome for the share! Thanks for your support : )
I live in NZ which has a different date format than the US so PBI only recognizes my dates in text (ugh!!) - which means this formula won't work. Frustrating. Any solutions/work arounds, Mike?
This video (#12 in this class) can help: th-cam.com/video/JGppsnepJFo/w-d-xo.html
To reduce the calculation time caused by the multiple queries, you can put all the code together in one query.
1) This tip comes from yearofmars: supplychaindatascientistcom.wordpress.com/2020/12/10/why-is-my-power-bi-query-so-slow-a-query-is-not-a-table/
Initial Table:
let
Source = Folder.Files("C:\Users\User\Desktop\010-TextFiles\010-TextFiles\Start"),
LowercasedText = Table.TransformColumns(Source,{{"Extension", Text.Lower, type text}}),
FilteredRows = Table.SelectRows(LowercasedText, each [Extension] = ".txt"),
FilteredHiddenFiles1 = Table.SelectRows(FilteredRows, each [Attributes]?[Hidden]? <> true),
InvokeCustomFunction1 = Table.AddColumn(FilteredHiddenFiles1, "Transform File from Step01GetExternalTextFiles", each #"Transform File from Step01GetExternalTextFiles"([Content])),
RemovedOtherColumns1 = Table.SelectColumns(InvokeCustomFunction1, {"Transform File from Step01GetExternalTextFiles"}),
ExpandedTableColumn1 = Table.ExpandTableColumn(RemovedOtherColumns1, "Transform File from Step01GetExternalTextFiles", Table.ColumnNames(#"Transform File from Step01GetExternalTextFiles"(#"Sample File"))),
ChangedType = Table.TransformColumnTypes(ExpandedTableColumn1,{{"Date", type date}, {"Sales", Int64.Type}, {"Product", type text}})
in
ChangedType
MAT Table:
let
Buffer1stTable = Table.Buffer(Step01GetExternalTextFiles),
ProductList = List.Sort(List.Distinct(Buffer1stTable[Product])),
MinDate = List.Min(Buffer1stTable[Date]),
MaxDate = List.Max(Buffer1stTable[Date]),
ListDates = List.Dates(MinDate,Number.From(MaxDate-MinDate)+1,#duration(1,0,0,0)),
ConvertedToTable = Table.FromList(ListDates, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
CalculatedStartOfMonth = Table.TransformColumns(ConvertedToTable,{{"Column1", Date.StartOfMonth, type date}}),
RemovedDuplicates = Table.Distinct(CalculatedStartOfMonth),
RenamedColumns = Table.RenameColumns(RemovedDuplicates,{{"Column1", "Date"}}),
AddedCustom = Table.AddColumn(RenamedColumns, "Product", each ProductList),
ExpandedProduct = Table.ExpandListColumn(AddedCustom, "Product"),
ChangedType2 = Table.TransformColumnTypes(ExpandedProduct,{{"Product", type text}}),
StartGroupToGetMonthSales = Buffer1stTable,
CalculatedStartOfMonth2 = Table.TransformColumns(StartGroupToGetMonthSales,{{"Date", Date.StartOfMonth, type date}}),
GroupedRows = Table.Group(CalculatedStartOfMonth2, {"Date", "Product"}, {{"MonthlySales", each List.Sum([Sales]), type number}}),
MergeToGetMonthSales = Table.NestedJoin(ChangedType2,{"Date","Product"},GroupedRows,{"Date","Product"},"New"),
ExpandedNew = Table.ExpandTableColumn(MergeToGetMonthSales, "New", {"MonthlySales"}, {"MonthlySales"}),
DateOneYearBackColumn = Table.AddColumn(ExpandedNew, "DateOneYearBack", each Date.AddMonths([Date],-11),type date),
Buffer2ndTable = Table.Buffer(DateOneYearBackColumn),
AddedMATCustom = Table.AddColumn(Buffer2ndTable, "MAT", (x) => List.Sum(Table.SelectRows(Buffer2ndTable, each [Date] >= x[DateOneYearBack] and [Date] <= x[Date] and [Product] = x[Product])[MonthlySales]))
in
AddedMATCustom
OMFG, I need to watch step 05 more times!
I need to learn Polish haha
Other great video, thanks!!!!!!
Glad you like it, linelson!!! Thank you for the support with your comment, Thumbs Up and Sub!
MAT works like SUMIFS.
I did not quite understand the concept of "x" in the custom function (outer table vs. inner table).
Custom Functions can be defined by putting the name of any argument(s) in parentheses. In our case we called the single argument "x". "x" is just the name of an argument in the newly defined function. The function we defined was (x) =>, and it was working on the Outer Table. As the Custom Column is copied down the column in the Outer Table, the x tells each Field Name that it should look to the Outer Table rather than the Inner Table. Further, the Custom Function is defined inside Table.AddColumn, which is acting on the Outer Table. Table.AddColumn copies the Custom Function down the column and executes it for each row. By putting x before each column name, we get access to those Outer Table columns.
Very good explanation!
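The idea can be sketched with a tiny made-up pair of tables (all names here are hypothetical): (x) names the current row of the Outer Table, while the bare field references inside "each" refer to the Inner Table:

```
let
    // hypothetical Outer table - one row per product
    Outer = Table.FromRecords({[Product = "A", Target = 10], [Product = "B", Target = 20]}),
    // hypothetical Inner table - detail rows to sum up
    Inner = Table.FromRecords({
        [Product = "A", Sales = 4], [Product = "A", Sales = 7], [Product = "B", Sales = 5]}),
    // (x) => names the current Outer row; inside the each, the bare [Product]
    // refers to the Inner row, while x[Product] reaches back to the Outer row
    Added = Table.AddColumn(Outer, "TotalSales",
        (x) => List.Sum(Table.SelectRows(Inner, each [Product] = x[Product])[Sales]))
in
    Added
```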
Mike this was amazing! Great video
Glad it was amazing for you, Chris!!!!!
I don't get it... How does Power Query know what "x" refers to? Can anyone please help?
Did you watch the last video, MSPTDA #9? We defined Custom Functions in that video and this video was a follow up. Nevertheless, Custom Functions can be defined if you put the name of any argument/s in parentheses. In our case we called the single argument "x". "x" is just the name of an argument in the newly defined function. The function we defined was (x) => and was working on the Outer Table. As the Custom Column is copied down the column in the Outer Table the x tells each Field Name that it should look to the Outer Table, rather than the Inner Table. I hope that helped.
Thank you for your support with your comments, Thumbs Ups and Subscription, Hao Yu : )
Thanks for replying. I understand the concept of Inner and Outer table here. What confuses me is we didn't pass any value to the variable/argument "x" (as far as I understand we declared a variable "x" without assigning "x=outer table"), so how does M code know "x" refers to the outer table? Again, thank you for your videos - I found them extremely helpful.
Because the Custom Function is defined inside the Table.AddColumn which is acting on the Outer Table. The Table.AddColumn Function copies the Custom Function Down the column and executes the Custom Function for each row. By putting x before each column name, we get access to those Outer Table columns.
I do a lot of year over year analysis for sales by product SKU. Is this possible to do in Power Query?
For example, I want 4 columns:
(1) Product
(2) Sales
(3) 2018 Year
(4) 2019 Year
This can be done very easily with a pivot table. Just not sure how to do it with power query. Thank you!
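This can also be done in Power Query. One possible sketch (with made-up sample data and column names) extracts the year and then uses Table.Pivot to spread the years into columns:

```
let
    // hypothetical sales table with a Date column
    Source = Table.FromRecords({
        [Product = "Quad", Date = #date(2018, 3, 1), Sales = 100],
        [Product = "Quad", Date = #date(2019, 3, 1), Sales = 120],
        [Product = "Carlota", Date = #date(2019, 5, 1), Sales = 80]}),
    // extract the year as text (Table.Pivot requires text pivot values)
    AddedYear = Table.AddColumn(Source, "Year", each Text.From(Date.Year([Date])), type text),
    Removed = Table.RemoveColumns(AddedYear, {"Date"}),
    // pivot the distinct years into columns, summing Sales per Product
    Pivoted = Table.Pivot(Removed, List.Sort(List.Distinct(Removed[Year])), "Year", "Sales", List.Sum)
in
    Pivoted
```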
OMG!! This is terrible..Thanks for showing this awesome trick..
Thanks Mike for this EXCELlent video
You are welcome for the EXCELlent M Fun!!!! Thanks for the support, Syed : )
Thanks for another great video! :)
You are welcome, Armond!!! Thank you for your support : )
For adding the custom column MAT in step 14, for some reason the M code Intellisense was making it very difficult to edit the code when jumping between the web browser window and the Power BI editor. It would also not accept the code, always claiming that a comma was missing even when it was typed out perfectly, with the "OK" button greyed out in the custom column editor. I had to disable Intellisense in Options, restart Power BI, and then had no issue typing the code, which worked perfectly on the first try.
Did anyone else face this issue?
If you shut down your computer at COB each day the "issue" won't be an issue anymore.
Thank you
You are welcome : ) : )
Amazing video, and amazing formula. Anxious to use it. I also have a question: under what conditions do we use the Table.Buffer function? When I apply a new step, it seems that Power Query is performing the same calculations over and over again.
Use Table.Buffer when: 1) you want to access the table and then store a copy so you do not have to call back to the query over and over, 2) You have an SQL connection and you do not want Power Query to send the query back to the SQL optimizer in the SQL Database.
@@excelisfun Thank you for your answer. Do I understand correctly that if on step 1 we have a connected SQL database as a source, on step 2 we Table.Buffer the source, and on step 3 we apply changes to the buffered table, then because of the Table.Buffer on step 2, calculating the changes applied on step 3 does not go back to the SQL optimizer in the SQL database? So Power Query makes a query to the SQL database only once, and if we didn't buffer the table, it would query it a second time on the step where we apply our changes?
@@orcawilly54 Yes. If you are connected to an SQL database, and you do not buffer, then you can right-click any step in the list of Power Query Applied Steps, and click on Native Query and it will show you the SQL Code that is being sent back.
@@excelisfun Thanks, also found your video MSPTDA 11.5 where you showed this feature.
@@orcawilly54 Cool : )
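A minimal sketch of the buffering pattern discussed above (the in-line data here is made up and stands in for an SQL or folder source):

```
let
    // hypothetical source - imagine this is really an SQL or folder query
    Source = Table.FromRecords({
        [Date = #date(2019, 1, 1), Sales = 5], [Date = #date(2019, 1, 2), Sales = 7]}),
    // Table.Buffer loads the table into memory once; later steps that scan it
    // repeatedly (like the row-by-row Table.SelectRows below) reuse the
    // in-memory copy instead of re-evaluating or re-querying the source.
    // Note: buffering also stops query folding back to an SQL source.
    Buffered = Table.Buffer(Source),
    // running total: for each row, sum all buffered rows up to that date
    Added = Table.AddColumn(Buffered, "RunningTotal",
        (x) => List.Sum(Table.SelectRows(Buffered, each [Date] <= x[Date])[Sales]))
in
    Added
```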
Hello Sir,
I am wondering how to extract months from dates. Let's say I have a data table consisting of dates like 9/4/2018, and I need to extract just September from the dates. How can I do this in PQ?
Regards
You can use the Date.MonthName function.
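To be precise, Date.MonthName returns the spelled-out month (e.g. "September"), while Date.Month returns the number 9. A small sketch with made-up dates:

```
let
    // hypothetical table of dates
    Source = Table.FromRecords({[Date = #date(2018, 9, 4)], [Date = #date(2018, 12, 25)]}),
    // Date.MonthName returns the full month name, e.g. "September";
    // Date.Month would return the month number (9) instead
    AddedMonthName = Table.AddColumn(Source, "MonthName", each Date.MonthName([Date]), type text)
in
    AddedMonthName
```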
You’re a data wizard!
Reaching the 500K subs milestone: just give it another couple of days...
You are so awesome, Geert!!!! : )
1 more K, 1 more K (subs).
Any day now! ;-)
Wow!!!! Thanks for helping the Team count, Geert!!!! : )
Thanks this was great! I’ve been trying to figure out how to do this for a week and then this video magically appeared.
I tried using this approach in a 50K row data table using Power Query in Excel. It worked but took a LONG time to return the results, and after closing Power Query and loading to the data model it wouldn’t load. I waited 2 hours and then abandoned. I triple checked that I had the buffered table and custom function right and it looks good.
Any thoughts?
I am not sure. I have not tried it out on a large set. The example I used was about 400 rows. I wonder if it is the 50,000 rows, or something else. If this transformation fails on 50,000 rows because it takes too long to load or refresh, then it is really not too useful. What data source are you using?
Thanks for the reply. I thought I knew just about everything about Excel until I stumbled on your Power Query series of videos. I’ve rearchitected some of my most important work with great results! I really appreciate you!
I’m using other Excel workbooks that contain tables as a data source. These are supplied to me monthly. I save them into a designated folder, point a query to that folder, filter for specific names, and load/transform the data. This gets loaded to the data model and then drives 20+ reports that use cube functions to retrieve the data. It’s pretty slick, highly portable, and built mostly from lessons I learned from you.
Awesome! I am glad that the videos help. Power Query is really a HUGE game changer. How do you create the Cube function reports? Do you create the formulas yourself, or do you click the button to convert the Data Model PivotTable to Cube Functions? What are the advantages for you of using the Cube Functions rather than the Data Model PivotTable? Do you have a different structure for your reports, like cutting and pasting the formulas in different ways, or is it formatting, or is there some other reason that you use the Cube Functions rather than PivotTables?
Let me know if you pin point the issue in the 50 K row report that is so slow in refreshing.
Initially I converted Data Model PivotTables to cube functions but now just type them in. I seem to have reached a limit on cube functions, however. Each time I make a change that triggers a data refresh, the cube functions seem to refresh individually rather than en masse. Hence, the more I have, the longer it takes. Maybe this is a data architecture issue but it feels more like a cube function limitation. I'm also using the CUBESET function to roll up Last Twelve Months (LTM) data. I wanted to test whether having the LTM data in the data model would make the Cube functions more efficient, hence this video was timely.
I've used cube functions more for convenience than any inherent advantage. To teach myself PowerPivot and PowerQuery, I redesigned an existing workbook to make use of these tools. That workbook was essentially a report writer, summarizing large amounts of data using SUMIFS functionality. The data source was a cross-tabulated worksheet table with months across the columns and accounting data in the rows. Additional columns were included for summarizing the months into MTD, QTD, YTD, LTM, and other accumulations, and each row was tagged with multiple report labels that the SUMIFS functions looked to. All this worked great except that with thousands of rows, the workbook essentially choked. Once I had the data in a proper data set inside a data model, it was easy to replace all those SUMIFS formulas with CUBEVALUE functions, and to still retain the professional formatting and layout (which is enormously important).
In this workbook, an end user can simply change a single date on a Control Sheet and all the reports update accordingly. Hence, historical and forecast reports can be created with a single simple date change. There are other global inputs on this control page that drive the reports as well. Some of these have been replaced with slicers, but in some cases I don't want to use slicers.
If it weren't for the (seemingly) extra overhead of cube functions, this would be a great solution. I'll probably end up back in PowerPivot, however. I want to keep Control Sheet Date functionality, but haven't yet figured out how to use that to drive a PivotTable filter other than with slicers (perhaps the CALCULATE function?).
I'll let you know if I figure out the 50K row issue. I have another workbook that is nearly 200K rows and 25 or so columns and also needs the LTM accumulation. For now I'm sticking with CUBESET functionality on that one.
I don't really understand the use of Moving Annual Total. Can someone please explain?
MAT can be useful for keeping track of revenue movements over a rolling (in this case) 12 months. At any given month in your reporting, you can track a products sales history over the past year. Not the last calendar year, but from today's month going back 12 months. I've used this for customer valuation when paying commissions to salespeople. If a customer's rolling 12mo. revenue is above 100k, pay out 10% of margin generated from that customer. 50k: 6% payout, 25k: 3% payout. This allows you to be more responsive in your customer valuation than just once a year creating a customer valuation.
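The tiered payout described above could be sketched as a conditional column over the rolling 12-month value (thresholds as given in the comment; table and column names are hypothetical):

```
let
    // hypothetical customers with their rolling 12-month (MAT) revenue
    Source = Table.FromRecords({
        [Customer = "A", MAT = 120000],
        [Customer = "B", MAT = 60000],
        [Customer = "C", MAT = 10000]}),
    // tiered payout rates from the comment: 100k -> 10%, 50k -> 6%, 25k -> 3%
    AddedRate = Table.AddColumn(Source, "PayoutRate", each
        if [MAT] >= 100000 then 0.10
        else if [MAT] >= 50000 then 0.06
        else if [MAT] >= 25000 then 0.03
        else 0, type number)
in
    AddedRate
```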
Hi Mike. A quick question on this video: in Step03 you changed the dates to the start of the month, calculated min/max and a date list to create a table, and then converted to the start of the month. This new manufactured table is essentially the same as the initial 'Calculated Start of Month' table a few steps previous. Was there a reason for this (other than for tutoring purposes :) )?
Am loving your series here and all your BI videos in general. Thanks for the work you put in.
One contains a complete calendar from MinDate to MaxDate; the other is incomplete, coming from the sales data (not every day had sales).
This is more difficult M, but still very interesting
Glad it is difficult and interesting, Excel.i Adam!!! Thanks for your support : )
great teacher!!
So, million $ question: how does Bill Szysz come up with this stuff? Second million $ question: how does this perform with a table with millions of rows?
I am not sure how he comes up with it. I do know that Bill Szysz is THE smartest Power Query guy I know. I do not think it performs well on millions of rows. Better to do it with DAX.
I think it is better practice to use named parameters or unnamed parameters and not combine them. In this instance the nested function would look something like:
= Table.AddColumn(BufferedTable, "MAT", (outer) => List.Sum(Table.SelectRows(BufferedTable, (inner) => inner[Date] >= outer[DateOneYearBack] and inner[Date] <= outer[Date] and inner[Product] = outer[Product])[MonthlySales]))
I agree : )
My Head is spinning, but amazing Mike and thanks!!! :)
Head spinning is not a bad thing - cus when it stops, then we are smarter : ) Thanks for the support, John!
Going to have to practice some easier examples for this one to make sense. Difficult. Very powerful, though.
Could you show it with earlier?
I am not sure what you are asking. Are you asking about the EARLIER DAX Function? If yes, this video is about the computer language M Code in Power Query. EARLIER is in the DAX Computer language.
Hi Mike, thanks for your reply. You mentioned in the (great, by the way) video that you could also do the MAT in DAX using EARLIER. My question was whether you could show this as well.
best regards
I did not say that MAT could be done with EARLIER. I said two separate things: 1) MAT can be done with CALCULATE and a few other functions. 2) EARLIER is what allows you to jump back to an earlier row context. I also said that we would do this later in this class, a few months away, though. Luckily, I already have a similar video that shows Moving Average with DAX. Here it is: th-cam.com/video/FgsTNJ5dCDg/w-d-xo.html
Please always help support with your comments, Thumbs Up, Sub and tell all your friends : )
OK sorry misunderstood that in my rush.
No problem. I hope that video link helps : ) And thanks for your support!
Great video Mike
Glad it is great for you, K B!!! Thanks for your support : )
This was a big leap, but I was left behind. I couldn't follow and couldn't understand the functions.
Did you watch MSPTDA #9, where I introduced Custom Functions? Also, Here is some even more efficient M code technique for MAT, an update to this video: th-cam.com/video/itMtqU-walo/w-d-xo.html
@@excelisfun Yes Sir, I did watch the series from the start, and the series was really good until the M functions. I am starting to realize that my understanding of Excel functions is at a beginner's level. Please guide me to any good videos or materials to catch up.
@@sajjadsmeinbox Here is my 2 minute video about all the classes that i have from beginner to advanced: th-cam.com/video/l1-1aVgFth4/w-d-xo.html
learnt a lot
Impressive.
Rad!!!!!
The 100th like is mine ;)
Thank you for the 100 likes, Teammate!!!!
I got the concept, but I find it too complicated.
I need to watch it more than two times, I guess.
Yes, but as you said in another comment, you did not watch this whole series. You MUST watch and study Video #9 in this series to understand this video, #10 : )
Sure. Thank you.
Many Thanks
Thanks Mike
You are welcome. Here is some even more efficient M code technique for MAT, an update to this video: th-cam.com/video/itMtqU-walo/w-d-xo.html
I do not understand what he is doing on this case.
Working on running totals in M functions rather than in DAX. As a rule of thumb, when solving an issue, the earlier/closer to the root, the better.
Great!!!
good
Maybe it is my lack of comprehension skills but I understood very little of this video.
Yes, sorry about that. This is #10 in a class. Did you watch the previous 10 videos?
Yup, watched and solved all of them thoroughly. Still having some trouble understanding lookups though. Regardless, it is on me and I will probably have to watch this a few more times before completely getting it. Thanks regardless, Mike!
@@mahneelmusa7469 Well, I can tell you that it took me a long time to approach M Code and Custom Functions. M Code and DAX are totally new and different types of computer languages, which makes them harder to learn. Even though they are function based, like Excel Worksheet Formulas, each is unique and requires time and practice. Thank you, Mahneel, for watching and studying this series so far. You can continue watching this class, which is a combo of M Code, DAX, Power BI, Power Query and Data Modeling, or if you want to immediately practice some other M Code Custom Functions, here is a different playlist of M Code Custom Functions: th-cam.com/play/PLrRPvpgDmw0kuIuvn2rCQI14AthSZ43ca.html
@@excelisfun Thank you, Mike. Seriously can't thank you enough. You have enabled me to work and analyze data with Power Query. Though my skills are not quite that advanced but hopefully I'll get there eventually. Thanks again!
@@mahneelmusa7469 You will get there!!!!