Question: Can I add more data to tables in a datamart? For example, I need to take regular snapshots of real-time data in order to extrapolate trends over time. Is there a way to insert the next snapshot into the same table where the previous snapshot lies?
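The snapshot pattern described in this question is usually handled outside the datamart, since a refresh generally rebuilds imported tables. Purely as an illustration (all names are made up), an append-only snapshot table keyed by capture date could be sketched like this:

```python
from datetime import date

def append_snapshot(history, snapshot_rows, snapshot_date):
    """Append one snapshot, tagging each row with its capture date.
    Rejecting duplicate dates keeps an accidental re-run from skewing trends."""
    if any(r["snapshot_date"] == snapshot_date for r in history):
        raise ValueError(f"snapshot for {snapshot_date} already captured")
    return history + [{**r, "snapshot_date": snapshot_date} for r in snapshot_rows]

history = []
history = append_snapshot(history, [{"sku": "A", "on_hand": 10}], date(2024, 1, 1))
history = append_snapshot(history, [{"sku": "A", "on_hand": 7}], date(2024, 1, 2))
print(len(history))  # 2 rows, one per snapshot date
```

Trends can then be read off by filtering on `snapshot_date`; this is just a sketch of the idea, not anything the datamart does natively.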
I tried using it to connect to another dataflow (I used both the Power BI connectors as well as Power Platform), but as of now it doesn't support linked entities/tables. Hope to see that soon. So should I start using a datamart instead of a dataflow??
Calling these data marts is a little disingenuous. Is it possible to persist data in these tables? Meaning: do an upsert, keep records that were deleted in the source, create a slowly changing dimension, persist fact records regardless of what is happening in the sources. If a source system goes away, the data should persist in the data mart. Maybe these are features that will come in the future. I like the spirit here, and I like the idea of separating data models from Power BI reports. This feels closer to low-code SSAS Tabular. I saw they only allow reads for now, but maybe at some point you can do DML statements via a low-code interface.
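For what it's worth, the type-2 slowly-changing-dimension behaviour described here is the kind of logic a warehouse layer normally handles. A minimal, illustrative sketch in Python (not anything the datamart does natively; all names are made up):

```python
from datetime import date

def scd2_merge(dim, source, key, today):
    """Minimal type-2 merge: close changed current rows and insert new
    versions; rows deleted from the source are left untouched (they persist)."""
    out = [dict(r) for r in dim]                      # don't mutate the input
    current = {r[key]: r for r in out if r["valid_to"] is None}
    for src in source:
        cur = current.get(src[key])
        changed = cur is None or any(cur[k] != v for k, v in src.items())
        if cur is not None and changed:
            cur["valid_to"] = today                   # close the old version
        if changed:
            out.append({**src, "valid_from": today, "valid_to": None})
    return out

dim = [{"id": 1, "city": "Oslo", "valid_from": date(2020, 1, 1), "valid_to": None}]
new = scd2_merge(dim, [{"id": 1, "city": "Bergen"}], "id", date(2024, 1, 1))
print(new)  # old Oslo row closed, new Bergen row current
```

The same merge is typically expressed as a SQL MERGE or an upsert in a real warehouse; the point is only that the history-keeping lives outside the source system.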
If I understand correctly, we can easily set up a curated DB through datamarts, so that anyone I share it with can access it regardless of whether they can actually access the sources I populate the datamart from?
Yes - you can create a curated DB, apply row-level security (RLS) as needed, and share it with other recipients (regardless of their access to the original source).
@watchthewaj That's awesome! You can lock down your server access, and for data confidentiality you can do the ETL within the datamart, so people can access just what's needed without touching the source directly. That's awesome.
Question: is there still a place for dataflows? What is the best practice? Setting up a datamart to import data from an on prem sql server? Or keep using dataflows and then plugging datamart to it? Which is more efficient?
Wondering the same thing. If the end result of the datamart is, say, a dataset and not the individual dataflow queries, then I still prefer dataflows for the ability to reuse a dataflow across multiple datasets... Lots of exploration to do!
That was my immediate question. I would love a dataflow vs. datamart use-case video, guys!
You can check this video, might help with the concepts: th-cam.com/video/CAZlaiq24S0/w-d-xo.html
With a dataflow, you can't connect from Excel or SSMS. That is not the case for a datamart.
Dataflows still have their own place, since you can give them different refresh schedules and reuse them in different datamarts or Power BI datasets.
This will help clarify this aspect. There is still a place for both.
docs.microsoft.com/en-us/power-bi/transform-model/datamarts/datamarts-overview
Does this mean we don't need to create pbix files for datasets? To me it sounds like most of the development we do in PBI Desktop can be eliminated, especially the "download/publish pbix" cycle.
I'm super excited because my company uses both PowerBI and Tableau. I can do my ETL in Power BI, have some PowerBI reports, and reuse the same ETL in Tableau visualizations in those departments that are on Tableau! We don't have anyone in IT to do ETL, so I'm doing it on the side as a business user.
You have been able to connect regular Power BI datasets to Tableau since 2019, too.
Love it!
This is the most complete and concise introduction to datamarts that I've seen to date. Very well done!
I find it cheap to sell an online version of PBI(X) as something new. It is not. And it is definitely not anything near a data warehouse; all the pain of administering your many databases is still there. It is only the "view" of modeling some of your tables in a relational way. Is it not? So it would be no more than the long-awaited "MS Access online". Was that the project name? :-)
Can you connect datamart datasets to Tabular Editor?
Question: Wondering about use cases for PBI datamarts. The Cube Guys gave an example of a datamart being set up for a business area for that department's own use. What about taking a datamart approach to centralize core datasets to support all enterprise PBI reporting? What would be the pros/cons in that use case vs. using dataflows?
Same reason I’m here. Please do a video on this, cube guys!
Great for democratizing data, but how do we avoid this becoming another data silo?
This has been a huge dilemma for me in recent years! Democratization is great but there’s a BIG DANGER looming with regards to data governance!
Here's a use case: a friend of mine was working at a big furniture store. When COVID hit, deliveries became a huge challenge and customers complained a lot; there were a lot of cancellations too. The company decided to charge customers on the delivery date (instead of the purchase date); hence it was decided by the data governance committee to modify the MONTHLY_SALES measure to use delivery date instead of purchase date. Despite the fact that this change took place in the primary enterprise DW, which is the official source of all the smaller datamarts across the company, Marketing and HR_sales_commission DID NOT update their local personalized MONTHLY_SALES measure. It was a mess!
So I'd say, if you decide to go the self-service route, make sure to invest money and resources in a good CoE (Center of Expertise) that will create a strong HORIZONTAL community of users across all departments. Anyway, that's my solution to mitigate the danger.
Wow, this is really cool. Connecting with SSMS, etc. and SQL is a real game changer.
Thank you Adam & Patrick for diving into this in a way that is straight to the point :)
I am excited for datamarts in Power BI as I can see this being a boon for our newer international offices where they don't have all of the bells and whistles we have in our US offices!
This looks almost too good to be true :) One question though: can you use a datamart as a source for building another dataset? E.g. to combine data from different datamarts (with datamarts becoming a kind of reusable ETL, just like dataflows :))
Context: We currently use Azure Analysis Services to power our dashboards. It's getting extremely costly for us to maintain the AAS, memory-wise.
Question: Can we shift our architecture to datamarts, provided that we need a daily refresh of those datasets and our current model size is around 120 GB?
Hi Neel, can you say whether this worked for you?
While this seems cool, I’m not sure if the common business user would see much more benefit from connecting to this versus a shared power bi dataset. Main difference I can see is the SQL server access, anything else?
@Nick D And a Premium-only feature. I hate how a lot of these new features are creeping in as Premium-only, which simply makes no financial sense for my company.
In the past, dataflows were saved as flat files. Querying those flat files was slow, especially for a large file, since Power Query had to load the entire file. To address this, MSFT created the Enhanced Compute Engine, but it was buggy and didn't seem to work that well. To me the big value-add is that the output of PQ gets saved to a standalone, queryable SQL instance. Folding and query optimizations should now happen natively, so I expect good performance benefits from this, especially when loading the datamart tables and doing further transformations in a new PBIX.
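As a small illustration of that queryable endpoint, here is a hedged sketch of reaching it from Python with pyodbc. The server and database names are made-up placeholders, and Azure AD interactive auth is an assumption; the real connection string comes from the datamart's settings in the Power BI service.

```python
def datamart_connection_string(server: str, database: str) -> str:
    """Build an ODBC connection string for a datamart's read-only SQL endpoint.
    Azure AD interactive auth is assumed; adjust to your tenant's setup."""
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server};Database={database};"
        "Authentication=ActiveDirectoryInteractive;Encrypt=yes;"
    )

# Hypothetical endpoint and database name, for illustration only.
conn_str = datamart_connection_string(
    "abc123.datamart.pbidedicated.windows.net", "SalesDatamart"
)
print(conn_str)

# With pyodbc installed and real credentials, querying might look like:
# import pyodbc
# with pyodbc.connect(conn_str) as conn:
#     for row in conn.execute("SELECT TOP 5 * FROM model.DimDate"):
#         print(row)
```

Anything that speaks ODBC/TDS (SSMS, Excel, a script like this) can point at the same string, which is what makes the SQL endpoint the interesting part.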
Thanks!
The Power BI DM assumes matching keys and granularities so it's perfect to use on top of a DWH that has done all the integration work.
Question: Can a user save their SQL query result as a new table in the datamart? Like using the tables there to query, and then doing a CREATE VIEW or something to save what they queried as a new table, stored in the same location as the source tables?
Excellent new feature! Thanks for the fast update!
Amazing feature!! How do you decide when to build a DW (+ SSAS?) and when to use a datamart straight in the PBI Service? Are BI/DW dev and data engineering skills no longer needed, or maybe just less needed? Can a business user with good skills in PQ and data modeling create a super powerful datamart without a BI team?
I don't really see the difference, from a user perspective, between this and a bigger dataset I built in PBI Desktop and published. Maybe the only difference is that you can set up different refresh schedules for the different data sources and you don't need PBI Desktop - but then it should be called PBI Online or something. Maybe an IT head is excited that you can query SQL, but I don't see why that is better than making some report page in a PBI report.
My thoughts exactly. Seems like they just moved the dataset to the cloud, which means any business user would still have to deal with the complexity of Power BI Desktop, just in different steps. There's still power query, row level security, DAX, modeling, SQL statements in PQ have been kinda made into their own thing.
Then since you're already in Premium, of course you could connect with SSMS since you got xmla endpoints enabled.
But you've got limitations: no calculated columns, no calculated tables, it requires Power BI Premium, it will likely chew away at your included Premium capacity, and there's no write XMLA.
It's likely just going to push the dataset sprawl into datamarts instead of datasets, and far more costly at that. A PBIX containing a dataset, plus dataflows for each data source, would likely be far more manageable and reusable than a datamart.
Feels pretty underwhelming to be honest. Like a bundle of existing components that are marketed as a revolutionary product still looking for a need to address, rather than a product designed to meet a certain need. I just don't see the value proposition here.
Hello Patrick, I really appreciate your work, including your greetings 😃. I am learning Power BI and your videos help me a lot.
Hello! Great explanation. I have a doubt: why are you saying that calculated tables are not needed anymore because we have SQL queries? The queries I create can't be used in the model; they are just for analyzing the data. What am I not understanding?
Question:
What about stored procedures, triggers, views?
So in our good old SQL Server we got some stored procedures and triggers that help with automation.
Can we create stored procedures in a Power BI datamart?
This will be an amazing experience! The most awesome thing was... copy that SQL endpoint... and query it in SSMS... oooh!! I am sure it has a promising path in the future.
This new feature at 7:00 just blew me away !
Thanks for sharing yours perspective. It's so easy to understand the concepts with you guys. And I'm very excited to test this Christmas present.
Question: Is there any way to do version control of a datamart? If somebody deletes it, how do you recover it?
Hey guys! Love your videos! Great explanation here. Question for you, I stood up a Datamart for a client and was wondering how each query executes. With individual flows, I can schedule each to run concurrently. If one fails, the rest still run. Do Datamarts function in a similar fashion? Similarly, if I do not want to refresh specific queries (think master data connected to csv on SharePoint) when the model refreshes, is this possible?
This amazing! Can't believe it's here!
Can you do a video on incremental refresh within the datamart?
so inspiring and great functionalities! thanks for sharing Adam and Patrick.
Is there a parameters option to change the data source connections, like in Desktop?
Any idea when this'll be out of preview? I love the Azure SQL integration!
Can you create temp tables in the datamart using SQL?
Hi, is it possible to feed a datamart in workspace1 with a dataset from workspace2? Or do the sources the dataset is created from need to be loaded into the datamart?
Okay, how do you access PBI datasets inside PBI datamarts? I could not find a way.
How does this work in the Power BI deployment pipeline?
Been waiting years for this. Great stuff.
This is really cool technology and I can't wait to try it.
Question: I am on a Premium Per User license and have created a Premium workspace. Can I make use of the dataset in datamarts to publish in Power BI, but have it consumed by users with a Pro license?
I love the banter. You guys need to meet up more often! 3:27 😂
so excited about this new great feature ! Thank you for the explanation ^^
Thank you! Hasn't this already been there through datasets and live connections, or through SSAS/Tabular Editor?
Hi fellas, do you know if a datamart can retrieve data from different Power BI tenants? Or do we have to copy them manually to the datamart on the main tenant?
Can you connect datamart datasets to Tabular Editor?
Is there any place I can find info/guidance on how to successfully use the Datamart connection string in various 3rd party tools? It works automagically in PBID & Excel (using the SQL Server connector) but, at least for me, not so much in any other 3rd party tools.
Be aware that the Power BI datamart backend is a managed Azure SQL database, but with a database size limited to 100 GB. For a database larger than 100 GB, Azure SQL is still your option. For big data at PB scale, it's still Azure Synapse.
So is this creating a local SQL Server DB? Can you create temporary tables in it? Can you create stored procs?
What about the Azure SQL capacity pricing? It can't be offered for free, can it? Can you use an existing Azure SQL database?
Hi
I have a question: with the datamart SQL editor, am I able to write a query in SQL and use that query to build a dashboard?
How do you copy the datamart to different environments like Dev, UAT, Prod, etc.?
What's the difference between this and creating a dataset?
Looks nice.
We only want to make this functionality available to a select group of users instead of everyone within the organization. How do I set this up in the tenant?
Very cool! But I wonder how it affects the capacity, and the performance of other reports within that capacity?
Can we connect to the Power BI datamart server as a SQL connection in another Power BI Desktop?
I watched the presentation at Microsoft Build 2022 thinking, "I hope Guy in a Cube talks about this tomorrow."
But you did not wait for tomorrow, huh?
Hi Pat, quick one. If I can create a datamart in PBI, store data in PBI, and use it in SSMS or Excel etc., how will I incrementally refresh the data and create history if required?
This is bananas! Will we be able to *easily* copy over existing Power BI desktop queries into the Datamart? Or even import an entire model (including all measures)? Like other commenters, I'm excited to push more of the power query ETL workload into a server and free up my local machine to do...well, I'll figure out something else for it to do!
It's Power Query, so you can just copy and paste your queries, like with dataflows.
Q: Is it possible to use a gateway with a datamart? I need to pull data from Jira, which is on-premises.
I am trying to create a data group (grouping) like in Power BI Desktop, but this option is blocked on the web. What can I do?
Thank you for this video!! However, I have questions:
- Do we need a gateway for datamart syncing? If not, that will be great, in fact.
- Can we do ETL on an existing Azure SQL database? If yes, this feature is a revolution!!
I think the only situation where we need an on-prem gateway is when you have data behind a VNET; please correct me if I am wrong!
Thank you again!!
I have a small doubt: can we push Power BI service data into a database? Is that possible?
Can you use a PBI datamart as a data source in Power Apps? If the backend of the Power App is SQL, and the app captures data and sends it to SQL, which feeds the datamart, can I refresh the datamart from Power Apps?
What's the data size limitation? Or is it the same as the data model size limit of the Premium capacity?
Hi guys!
I have been exploring datamarts this month and I have found a big issue with RLS. If I use a dataflow to extract data from a datamart with RLS, the RLS does not work. I can still see all the rows in the table. I have done the test in Power BI Desktop connecting to the datamart or the dataset (using both connectors), and here the RLS does apply.
I have also connected to a datamart with management studio, and the RLS here does not work.
So, there are still some BIG SECURITY issues using the SQL or Azure SQL connection.
This looks amazing
So the main feature is creating the dataset directly on the web?
Hey guys, it's a cool feature, but can we let the business self-serve just by loading related tables? And how much data capacity does it allow, and what kind of licensing would one need to hold that data and do the analysis?
Love you guys!
Thanks for the video.
What are the main differences with using dataflows?
It can also be seen as basic warehousing.
Wow this looks amazing!...
Does this replace dataflows?
How does the dataset refresh happen with a datamart? Does it work like a Power BI dataset, where the refresh is done from the model, or can the refresh be triggered from Power BI Desktop? Can it be used outside the workspace it was created in?
I would like to explore how we can sync the datamart back to an on-premises SQL server: a datamart SQL connection to a SQL connection, to keep a copy on the local SQL server.
Hi! great video! What is the difference between using an Excel file uploaded to OneDrive/SharePoint and using a datamart? The first thing that comes to mind is update speed: if I have a table that pulls data from an Excel file in OneDrive/SharePoint and this Excel file gets updated with new rows or changes, it takes some time for these changes to be reflected in Power BI. On the other hand, with a datamart, changes to the data source could lead to an immediate update of data in Power BI. Is that correct? Thank you.😃
0:32 "Power BI dataflows was just announced at Build 2022" ... was it? Dataflows have been around for a while so why would they just be announcing it now?
Does it support field parameters? :-) That would be a must have.
Can I consume the DataMart with just a PBI Pro licence?
I don't get it, tbh. Is it just like a shared dataset, but it can be queried using SQL?
How does it relate to Synapse?
Hi,
Does the auto-created dataset have to be in DirectQuery (will it allow different storage modes), or can it have agg models?
I believe that, just like the database endpoint, there's a dataset endpoint which we can connect to Tabular Editor or any external tool.
[PS: learned from Reza Rad that it has a dataset endpoint too]
I wish read/write permissions to the database would be available instead of just read, in case IT has to be involved in something like query optimization.
Finally, is it possible to download the bim / pbix , for version control?
Thanks.
I too am wondering about version control. If you (or your dog :)) accidentally delete the wrong table, can you go back to an earlier version?
Is there a way to use calculation groups in datamarts? I tested by loading my dataflows into one, and it seems much faster than my current dataset, but all of my time intelligence is built using calculation groups, so I'm hoping there's a way to accomplish the same thing in datamarts.
Hi, did you find an answer to this? I have the same question.
Thanks for your amazing video!
If I have three environments (dev, UAT, and prod), how can I copy/sync the datamart from one environment to another?
How would you go about migrating an existing Power BI report to push its data instead into a Datamart?
How is this different from creating a PBI dashboard with no visuals but the same data sources, and using the dashboard's dataset instead?
That is amazing!!
I am just curious whether we could connect datamarts to SSRS; I mean using the same sources for both Power BI and SSRS reporting.
WOW!! This is a game changer
Nice explanation!
Can we use the output of a visual or SQL query from a datamart in a visual of a Power BI report?
The visual/SQL queries that I have written in the datamart are not visible in Power BI Desktop once I connect to or import the datamart into a PBIX.
Please suggest. Thanks!
Hi, same here. I am wondering: since we are able to write queries, we should be able to use them to create dashboards in Power BI; if not, what would be the use of writing them in the datamart?
I've been experimenting with a datamart in Power BI, but it seems like it doesn't support date hierarchies... Does that sound correct, or am I missing something?
This is awesome really!!
Great video, great new feature!! A question I have is related to refreshes. Can we set up separate refreshes for each table, or only for the datamart itself?
Yes, this is what I wanted to know; otherwise I cannot see the purpose of using this datamart and would stick to dataflows.
The "DataMart builder" enables quick 'Business built/requested DataMarts'.
The term "self service" was used in the 1950s for gas stations ...
This seems like SSAS in the PBI service but using Power Query and DAX. Is there a benefit to using SSAS over Datamarts or Datamarts over SSAS?
I think SSAS is probably better for large scale, and datamarts are better for ease of use.
I was trying it out and was getting a "jsonreaderexception unexpected character" error after saving. I tried with two data sources, with the same results.
This is great functionality! Is there a chance it will ever reach GA?
Question: Can I add more data to tables in a datamart? For example, I need to take regular snapshots of real-time data in order to extrapolate trends over time. Is there a way to insert the next snapshot into the same table where the previous snapshot lies?
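On the snapshot question: since the datamart's SQL endpoint is read-only, a common workaround is to accumulate timestamped snapshots in an external store that the datamart then imports. A minimal sketch of the append pattern, using sqlite3 as a stand-in store; the table and column names are made up for illustration:

```python
# Hypothetical snapshot-accumulation sketch. The datamart itself cannot
# be written to, so snapshots land in an external table it imports.
import sqlite3
from datetime import datetime, timezone

def append_snapshot(conn, rows):
    """Append a batch of (item, value) rows, stamped with capture time."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS snapshots "
        "(captured_at TEXT, item TEXT, value REAL)"
    )
    ts = datetime.now(timezone.utc).isoformat()
    conn.executemany(
        "INSERT INTO snapshots VALUES (?, ?, ?)",
        [(ts, item, value) for item, value in rows],
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
append_snapshot(conn, [("widgets", 42.0)])   # first snapshot
append_snapshot(conn, [("widgets", 43.5)])   # later snapshot, same table
```

Each call adds new timestamped rows alongside the earlier ones, so trends can be read back with a simple GROUP BY on `captured_at`.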
Does a data mart replace an Azure Analysis Services environment?
I tried using it to connect to another dataflow (used both the Power BI and Power Platform connectors), but as of now it doesn't support linked entities/tables; I hope to see that soon. So should I start using datamarts instead of dataflows??
Calling these data marts is a little disingenuous. Is it possible to persist data in these tables? Meaning: do an upsert, keep records that were deleted in the source, create a slowly changing dimension, persist fact records regardless of what is happening in the sources. If a source system goes away, the data should persist in the data mart. Maybe these are features that will come in the future. I like the spirit here, and I like the idea of separating data models from Power BI reports. This feels closer to low-code SSAS Tabular. I saw they only allow reads for now, but maybe at some point you'll be able to do DML statements via a low-code interface.
Great news!
This looks really promising. Any issues with connecting to an Azure Synapse pool?
Hi guys,
Amazing video as always.
Any idea if Dynamic RLS works? I can only see capabilities for basic RLS.
Thanks
Same here; DAX-style USERPRINCIPALNAME() filtering would for sure be amazing. Any ideas?
If I understand correctly, we can easily set up a curated DB through datamarts, so that anyone I share it with can access it regardless of whether they can actually access the sources I populate the datamart from?
Yes - you can create a curated DB, apply row-level security (RLS) as needed, and share it with other recipients (regardless of their access to the original source).
@@watchthewaj That's awesome! Because you can secure your server access, and for data confidentiality you can do the ETL within the datamart, so people can access just what's needed without directly accessing the source. That's awesome.
Hey guys,
Do we know if we can do incremental refresh or hybrid tables here in datamarts?
Yes to incremental refresh.