Wow! You have changed my understanding of Excel capabilities!!!
I used to use MS Access to do what you did with Power Query in seconds. I wonder if Access has much use still!
Brilliant video: super clear, concise, well produced and useful!
Thank you!
Thank you for the very kind comments. Access is indeed being slowly made obsolete by several other Microsoft technologies
OMG, I am sitting here wide-eyed and overwhelmed with joy... Thank you so much for this... I have been racking my brain on how to reduce and control our data in a professional and organized manner!... AMAZING!!!
Great to know
SUBSCRIBED AND FOLLOWING
You took the words right out of my mouth! 😁 Following too.
Thanks @@natah1284
Amazing!! Working up a prototype model in MS Access but wanted to report it in excel with pivots and slicers. Was ready to give up and hand it over to the BI developers but tried your method. 6m rows and extremely responsive. Also used your whole number trick and file size only 22mb. Thanks heaps!
Awesome, I really appreciate you letting me know you found it useful
I've seen this a few times before, but no one has ever drilled down into the performance ROI... thank you, this is going to help a lot!
You’re welcome
This is a powerful example from the Big Data era.
Amazing video! Thank you Wyn!
Glad you liked it Iván
I can't say how thankful I am for this video. I struggled for two days and everything I tried ended with "Not Responding". You saved me, thank youuuuuu🤩
Glad it helped
I’m currently working on 11k+ rows of data and experiencing challenges. Glad I came across your YT channel 😃
Good to know, thanks for the kind comments
The 3 tips in Excel video took me here. Thank you so much for the short but very informative tutorial video. I was trying to compare almost 9,000 rows with a couple hundred rows, and Excel kept saying “Not responding.” I used the transpose function, which is very helpful, without knowing it was Power Query.
You’re welcome
I'd forgotten about the decimal trick. That was awesome. So just by trimming the decimals to rounded numbers, you can reduce your file size. I am going to try that at work.
Excellent
Thank you so much Wyn for this video. I was able to take a 132 MB report and consolidate it down to just 9 MB! Now everyone in my company who does not have Excel 365 can view it as well as slice and dice.
Excellent to hear Phil. Thanks for letting me know this was useful.
@phil danley how did you consolidate it down to 9 MB??
The Power Pivot (Data Model) engine performs some amazing compression when columns contain non unique values
@@PEACOCKQ Wyn explained it better than I could. My answer would be "magic" LOL. I don't know how the compression works but it does.
Just amazing... Your decimal rounding to reduce space worked brilliantly for my data model. The data model pulls from SSAS. Thank you so much.
Glad to help 😀
Short and precise explanation. This is amazing and thank you for providing data set for practice.
Thank you Byregowda, kind of you to leave a comment
Very beautifully explained. Thank you for this informative lecture on Power Query and Power Pivot.
I appreciate you taking the time to let me know you found it useful
Thank you very much Access Analytic for sharing your knowledge in handling big data using Power Query. God bless
Thank you
Very nice, Wyn! Thank you.
Cheers Houston
Thanks, lesson completed. Greetings from Costa Rica.
No worries 🇨🇷
What a beautiful way of explaining how to handle big data. Loved this !!
Thank you Shiraj, glad you enjoyed it
This is fantastic. I feel like I have a new superpower. Great explanation video.
That’s great Phoebe! Thanks for letting me know.
Well done. Thank you for the instruction.
Thanks
Very well explained and easy to understand. Thank you.
Thank you
Really nice intro to Data Model and why I should probably start using it. My base item set is 150k entries, and there are various dimension tables which can be attached, the numbers get big quickly and generic Excel M.O. starts choking immediately when you throw index(match()) at all of that. Thank you!
P.S. it took me far longer to understand those manager names than I would like to admit. :p
😄. Glad it helps Lee
This is AMAZING!!!! Still in awe.
Thanks for the brilliant content.
You’re welcome
Excellent tutorial, thank you!
Cheers!
That is SUPER AWESOME! 😍 Thank you so much, Wyn!
No worries, you're welcome Dimitris
Power Query, Data model, very efficient. It became much easier to work with such heavy data. Thank you so much!👍
Thanks for leaving a comment Luciano
Great video. Thank you 🎉
You’re welcome
Ohmygosh, that, my friend, was simply amazing!! 👏👏👏👏
I can't explain to you how many files I have that are so large that they respond slowly and then eventually crash! I've been trying to save these large files as .xlsb, and it helps, but it isn't fixing the ultimate problem. I cannot wait to try out this technique to see how it affects my monthly reports. Thank you so much for taking the time to go through this exercise. I am officially a subscriber!
That’s great, glad to help flag what Excel is capable of
You have no clue the amount of sleep I have lost trying to figure out how to control this amount of data
Great job! Incredibly powerful!
Cheers
One of your best videos 😊
Thank you
This is helped me out TREMENDOUSLY! Thank you!
That’s great Joe. Thanks for letting me know
Very informative. I may be a little off topic, but what software do you use for making your videos? Thank you!
Camtasia and a green screen
Thank you, as many say, short and precise. Perfect
Cheers Mark
Pretty awesome you are right!! Thank you for your videos are so easy to follow!
Thank you so much
Tremendous. Thanks a lot for knowledge sharing.
Thank you so much Raghavendra
This was mega! Thanks for sharing your knowledge mate! Appreciate it.
No worries, thanks for taking the time to leave a kind comment
I am grateful to you
Thanks for taking the time to leave a kind comment
Great tutorial! Thank you
You're welcome Salvador
this looks super cool, thank you very much :)
You’re welcome
Wyn, your content is always helpful and timely. Can you comment here about how to best use the same dataset but doing the analysis in Power BI? Since you don't have a Table, what is the connector with PBI?
Thanks Bernie. I’m not quite sure I follow. You can do it the same way in Power BI by pulling in the folder of CSV files.
Awesome video!!!! How much RAM does your PC have to handle this? It seems very snappy and fast in the Pivot Tables. I have 32 GB of RAM and im wondering if that is good enough
Thanks 😀. I think that was done on 16GB. I use 32GB these days
You saved me a lot of time. Wonderful explanation.
You’re welcome Hussein
Awesome! Thanks for your sharing.
No worries
Good Video. Very helpful and now ready to try it.
Thanks for leaving a comment Ed. Good luck with it!
Wyn... this was amazing... I completely forgot the 3D maps... great video!
Thanks for taking the time to leave a kind comment, I appreciate it.
This is great! Thanks 😊
You’re welcome
Fantastic, thank you for sharing your knowledge
You’re welcome Ed, thanks for leaving a comment
Fantastic! Thank you! However I couldn't figure out where consolidation tables came from. Is there a video about that and a link to it???❓❓
You're welcome Atomic Blue Life
Hi, thanks for this. Normally in a traditional pivot, if we click a total cell, it will typically open another sheet containing the break-up of that total. How will this work with Power Query and the Data Model?
It can work to some extent; it depends on the scenario, so it may or may not work, and it may limit how many records display
Amazing stuff
Thank you
Thank you for producing this excellent video - very easy to follow and very informative
You’re welcome Tim
Thanks for this excellent video. Could you please tell me how you added the calendar, location and cost center tables in the Queries and Connections section at the beginning of the video? Thanks in advance
Hi Debarshi,
Those were in an Excel file and I used Get Data from Excel Workbook.
You saved my life 🙏
Glad to help :)
I love your video and was able to use every part of it, however, I do not understand how you incorporated the additional tables (cost centers, etc). I get the ability to build the references but how did you get them into the mix?
Hi Robert, watch this explanation of Data Models and let me know if that helps th-cam.com/video/RV47yX70NN8/w-d-xo.html
That was awesome.
Thanks :)
This is great and very helpful! Thank you for making this. May I know what PC specification that you used? I want to buy a laptop that can be used to analyze big data.
Glad it helps Ivena. I'm using an XPS 17 9700 with 32 GB RAM and an i7 processor
Super👍👍
😄
Awesome, awesome, Thank You Very Much :)
You're welcome Elfrid
Great !!! thank you
You’re welcome
Thanks for the video
You’re welcome
fantastic video thanks!
You're welcome Simon
Really awesome !!
Please, can you explain how to search for a string in big data? Thank you. You are a great teacher!
I don't understand. Could you provide more details please?
Is there a video on how you created the calendar, cost center, and location tables?
Here’s a video on the Calendar table: th-cam.com/video/LfKm3ATibpE/w-d-xo.html
The other 2 tables I just created manually in Excel and pulled in using Power Query
Awesome!
Thanks 😀
Awesome...
Cheers 😃
Eish!?!? Excellent presentation, goodness!
Thanks ( I think ) Willy 😀
I loved this video. Incredible work. I have a very large CSV file with 1.2 million rows and approx 300 columns. I want to extract only few particular columns from this data. How can it be done? Many thanks in advance.
Thanks Amit, that should be quite straightforward, use Get Data > From File > From Text/CSV connect to the file, then Ctrl Click on the columns to keep and then right-click REMOVE OTHER COLUMNS, then close and load to Data Model
Hi, thanks for your reply. Is it a good idea to select columns manually if I have around 300 columns? In that case I have to do a lot of scrolling. Any idea if I can extract those desired column header names from some other Excel or CSV file, and then keep only those columns in my original large CSV file? Thanks.
Hi @@amitchaudhary6, yes it's possible but involves writing some M code. However, the Choose Columns button makes it really easy to tick the columns you want to keep / untick the ones you want to remove.
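For anyone wanting to try the M route, here's a rough sketch only; the file path, the ColumnsToKeep query, and its Header column are all made-up placeholders, not from the video:

```
let
    Source = Csv.Document(File.Contents("C:\Data\BigFile.csv"),
        [Delimiter = ",", Encoding = 65001]),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    // ColumnsToKeep is assumed to be another query returning a one-column
    // table of header names, e.g. loaded from a small Excel or CSV file
    KeepList = ColumnsToKeep[Header],
    // MissingField.Ignore skips any listed column that isn't in the file
    Kept = Table.SelectColumns(Promoted, KeepList, MissingField.Ignore)
in
    Kept
```

Paste something like this into a blank query via the Advanced Editor and adjust the names to match your own queries.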
Thanks for your inputs. It is exactly what I was looking for. 👍
300 columns?!
Amazing!
So cool, thank you!
You're welcome Al
Great video! Do you have a video that shows how to edit the data in massive data sets of up to 10 million rows? I've tried in vain to do it with a million rows using IF formulae (i.e. if a record has "x", add the value "y" to a field) but Excel just gives up under the sheer weight of the formula...
Thanks. No video, but it really depends on how many columns you have, what the data source is, how much RAM you have, and whether you have 64-bit Excel. I just tested on 5 million rows from 10 CSV files in a folder, adding an IF column (if the value begins with A, grab column B, else 0). It ran in 30 seconds.
32 GB RAM, 64-bit Excel, 5 columns of data.
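The conditional-column step described in that test might look something like this in M (the query and column names here are assumptions for illustration):

```
let
    Source = CombinedCsvFiles,  // the query that combines the folder of CSVs
    // If ColumnA begins with "A", take ColumnB's value, otherwise 0
    Added = Table.AddColumn(Source, "Result",
        each if Text.StartsWith([ColumnA], "A") then [ColumnB] else 0)
in
    Added
```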
This was excellent! One question please. I have many CSV files like in this video, but their column structures are only slightly different from each other. Some of them have a few extra columns that I don't need. Is it possible to tell the query which columns to take (I can only include the columns that I know exist in all files)? Thank you!
As long as the first file in your folder has all of the columns you need then it won’t matter that other files have extra columns you don’t need.
Make sure you use the Choose Columns or Right-Click Remove other columns option
@@AccessAnalytic Thank you
This is excellent ASSUMING you want to analyze and break down all that data. If you actually need to see all that data and scroll or page through it, Excel just can't handle that. Correct me if I'm wrong.
It’s unlikely you’d want to scroll 1 million rows. You’d just apply a filter in Power Query and load to a table instead.
@AccessAnalytic That's fair. Most would not. In my case we needed the ability to scroll through all the data to validate and to analyze differences between columns in a table in one database vs another. So those columns were side by side in pairs. Filtering or using aggregate functions would not help with that.
A friend of mine years ago made a grid that would locally cache its data (over a certain size) and enable virtual scrolling. So clearly it's something Excel could be doing if MS thought it would be a valuable feature.
@johnnyvlee there are a lot of more accurate and better ways to compare data than eyeballing millions of rows. If it were me I’d investigate further to automate the process.
@johnnyvlee I’d look into Power Query Anti Join Merges to spit out exceptions
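A hedged sketch of that anti-join idea, where TableA, TableB, and the "ID" key column are placeholders for your own queries:

```
let
    // Left anti join keeps rows of TableA whose ID has no match in TableB
    Exceptions = Table.NestedJoin(TableA, {"ID"}, TableB, {"ID"},
        "Matches", JoinKind.LeftAnti),
    // The nested join column is empty for these rows, so drop it
    Cleaned = Table.RemoveColumns(Exceptions, {"Matches"})
in
    Cleaned
```

Running it the other way round (TableB against TableA) gives the exceptions in the opposite direction.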
superb...
Thank you Saleem
i just came across this video to understand the capabilities of power query, which has widened my horizon. can I get a download link to the other data for the connections you created before the one of 10 million rows? i want to use that to understand it perfectly if you don't mind.
I don't have the precise ones but I have extra data sets including the well known AdventureWorks one here under "Dummy Data Sets" accessanalytic.com.au/free-excel-stuff/free-excel-templates/
Please make more videos on Excel
Here's 28 others 😁
th-cam.com/play/PLlHDyf8d156Xnoph4CbOiMrqQKiJZ8mhn.html.
Thanks a ton! Got to see a practical Big Data example handled through Excel and it's amazing! 😊👍👌
Thanks Vijay
Awesome
Cheers
Thanks for your video! What if my data size is far more than that and I need to update it with new data every day? If I simply refresh the Power Query, it takes an hour to load. Is it possible for Power Query to only add the new data to the existing data, and not refresh all of it? Millions of thanks!
Hi Warren, there's no way to do that with Excel currently. What is your data source?
How can I export the table created by Power Query to one CSV file?
Check out this video. th-cam.com/video/op6f-3uUFYg/w-d-xo.html
If you’re using Power BI you can also use bravo.bi ( it’s free )
I’ve a video showing bravo.bi use here at 9:33 th-cam.com/video/g4oZ0pOpn-4/w-d-xo.html
Thank you for the video. Very informative. You have an extra zero on the far right in your first graphic "Ten Million rows of data and 100,000,0000".. I think it should be "100,000,000".
Thanks Tom, well spotted on the mistake. After I’d published it was too late to change 😬.
How do I get it to use the Exact formula so that it consolidates against a unique code from two sets of databases by also recognising lower case and upper case differences. For example AbC and ABC need to be treated as two unique codes.
I don’t know sorry. The DAX engine encodes ABC and abc the same
Are the calendar, location, and cost center tables available for download?
The calendar and various datasets are available here: accessanalytic.com.au/free-excel-stuff/free-excel-templates/
Also check out pbi.guide/resources/
@@AccessAnalytic thanks very much
How do you select data from a huge .csv file to only get rows for the current year (there is a field for that)? I've not discovered how I can do that with Data Model or Power Pivot. Additionally, if you select a second year, can we separate it into two different sheets?
In Power Query you can apply a filter to the date column and choose Is Current Year.
Yes, you could reference the main query two times, filter the 2 new queries by the 2 different years, and load them to 2 different sheets
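As a sketch, the step that UI choice generates looks roughly like this in M (MainQuery and the [OrderDate] column name are assumptions):

```
let
    Source = MainQuery,  // a reference to the main query
    // Keep only rows whose date falls in the current calendar year
    ThisYear = Table.SelectRows(Source,
        each Date.IsInCurrentYear([OrderDate]))
in
    ThisYear
```

For a fixed year instead, the condition would be something like `each Date.Year([OrderDate]) = 2023`.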
Thanks....
You’re welcome
Awesome! I would like to ask you a question: if I create the model and everything in Excel 64-bit, will my coworkers with Excel 32-bit have any problems? Right now with Excel 32-bit and a model, I'm having a problem where it consumes more than 2 GB of RAM, with no more than 50k rows in total across all the tables. Thanks
I wouldn’t rely on it working well on 32-bit, but it could be OK if they are just slicing and dicing.
Newer versions of 32-bit Excel can now utilise 4 GB of RAM. 64-bit can utilise all the spare RAM on your machine
@@AccessAnalytic thank you very much for answering so quickly. It seems I can ask the IT department to change my version to 64-bit, but I was a little worried about that and about the macros I developed. Like you wrote, people will only slice and view the data. It's important that this particular tool works on both architectures. Again, thanks.
The only real way of knowing is to test it out
@@AccessAnalytic I will do it and I’ll let you know. Thanks
Hi Teacher, do all CSV files contain headers, or should only the first CSV file contain headers?
All contain the same headers
If you want to export the 10 million rows as a CSV after transforming the data, is there a way to do it?
Yep Export Power Query Tables to CSV using DAX Studio. Even 5 Million records!
th-cam.com/video/op6f-3uUFYg/w-d-xo.html
Great, how do I make the date table?
Here you go: What is a Date Table / Calendar table in Power BI / Excel
th-cam.com/video/LfKm3ATibpE/w-d-xo.html
Brilliant
Glad you liked it
So I have a table in MS Access that is close to 4 million records, that I would love exported into excel. Unfortunately excel would only allow a little over 1 million records. Would this work in that instance?
You could certainly pull the data into the Excel data model and then create Pivot tables to analyse it
I only have the consolidation file... how can other files be inserted in this diagram view?
Check out this video th-cam.com/video/RV47yX70NN8/w-d-xo.html Essentially click the Get Data button, connect to your other files and Load to... Connection Only... Data Model
Within this, is there a way to change the interval? As in, can I make it so it only pulls every 4 hours, weekly, daily, etc.? Also, will the data update automatically if it's through a folder? Meaning, if I add another Excel file to the same path, will Power Pivot recognize there is more data? Do you have a video on how to make the fields?
There's no simple way to schedule a refresh, for that something like Power BI could be a better solution. There are techniques with VBA or Power Automate or a third party tool called Power Update. ALL files in a folder will be reloaded into Power Pivot each time the refresh runs. When you say make the fields, can you explain a little more what you mean please
@@AccessAnalytic hmm, let me just explain what I need to do. I was able to get Power Pivot to work, but all I want to do is take 2 columns and make a graph without summing all the data. I'm using this for engineering, so I just want to see how a pump in a system changes every hour each day. The only math I would need is taking 2 columns and dividing one by the other to get velocity. I also want to be able to change the time interval (daily, hourly, every 4 hours). How would I go about doing all these things? Thanks for the quick response, I appreciate it
Sounds like you might need a Time Table : th-cam.com/video/-q7v56p192M/w-d-xo.html
And a Calendar table:
th-cam.com/video/LfKm3ATibpE/w-d-xo.html
@@AccessAnalytic thank you so much, you are such a helpful person. Any chance you can upload the time table as an Excel file? I do not have Power BI.
@@cela9482 - done 😁 accessanalytic.com.au/free-excel-stuff/free-excel-templates/
Need help: all my CSV data is in the same format. In the row header there is a unique ID, in the column header there are dates (7 days of dates), and in the fields there are different KPIs.
After loading all the data I need to add an Average and a COUNTIF > x function at the end.
I tried the average first: take the sum of the 7 dates and divide by 7. I got the result, but when I use Power Pivot with this average data, an error occurs: only count works, and sum is not working on this data.
I’d recommend posting screenshots and a sample file here aka.ms/excelcommunity
Does it have to be CSV? I have a large Excel sheet that I need to do this for.
It can be Excel. Refresh will take a little longer
Is this possible in the older version of Excel?
Excel 2016 onwards
Hi, thanks. How can I export this data file to Power BI or any other tools?
You could publish the entire file to Power BI,
or put the Power Query code into Power BI Desktop and export using DAX Studio,
or copy the M code to a Dataflow.
There’s no direct way to export Power Query code currently
Is there an easy way to extract data in an Excel data model into a CSV? Thank you!
How about this? th-cam.com/video/op6f-3uUFYg/w-d-xo.html
Capture the decimal in another integer field to save room
Yeah might be worth it in some scenarios
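One way that split might be sketched in Power Query: store the whole part and the cents as two integer columns, which the Data Model compresses far better than one decimal column. SalesQuery and the Amount column are placeholder names, not from the video:

```
let
    Source = SalesQuery,  // placeholder for your fact table query
    // Whole-number part as a 64-bit integer column
    Whole = Table.AddColumn(Source, "AmountWhole",
        each Int64.From(Number.RoundDown([Amount])), Int64.Type),
    // Fractional part kept as whole cents in a second integer column
    Cents = Table.AddColumn(Whole, "AmountCents",
        each Int64.From(Number.Round(
            ([Amount] - Number.RoundDown([Amount])) * 100)), Int64.Type),
    // Drop the original decimal column so only integers load to the model
    Trimmed = Table.RemoveColumns(Cents, {"Amount"})
in
    Trimmed
```

A DAX measure can then reassemble the value, e.g. summing AmountWhole plus AmountCents divided by 100.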
May I ask your computer specs (CPU, RAM) for handling Excel files above 250 MB?
Currently using Dell XPS 32GB RAM i7-10875H CPU @ 2.30GHz (that video was done with Surface Book 16 GB RAM)
Hey! I need your help. I am trying to convert a large XML file into Excel, but whenever I try to convert it I only get incomplete data in Excel form, e.g. emails are there but the name is missing, or vice versa. Could you please help me out?
Thanks in advance
I’d recommend posting the issue with screenshots and sample data here aka.ms/excelcommunity