Perfect video - it helped me write back to PostgreSQL. I published my PBIX and tried an incremental refresh so that every week I write back only the new records to the PostgreSQL table. The write-back isn't happening, but the refresh is successful and the report shows a new modified date.
It's so amazing that when you need to find something related to MSBI, you get it on the BI Elite channel.
Thank you so much for lending me this logic I've been struggling to find.
Love from India❤❤❤
Haha glad to hear it, Ifran!
WOW!!! That's a really awesome idea to write the INSERT statement in a custom column and use a custom function to run it. Great work Parker!
Thanks for sharing, and rest assured I will totally "steal" this and use it in my daily work! Thanks again!
No problem! Yes, please use this!
@@BIElite I've just tested it getting data from a SharePoint list with Power Query in Excel and it totally works!!!
This opens up a lot of use cases!!!
Thanks again for sharing!
Love it, Cristian! Thanks for sharing your use case.
I have been looking for something like this to be done from Power BI itself. My firm doesn't allow me to use Power Apps for write-back to a database. You just gave me a solution that will be very beneficial to me.
It's a great video, and hats off to you for sharing this. In case there are multiple sources, just make sure to set the privacy settings to the same level, such as Private or Organizational, in both sources. Many thanks.
You saved my day! Thank you.
This is really great! Congrats and you have a new sub :)
This is an excellent solution. We just have to be mindful that we don't start using this method as a semi-ETL :)
Sir, it is a really good tutorial.
Thanks!
Great video Parker 👍
It took a little re-jiggering for my purposes, but I got it to work. If you want to replace the data, you would somehow need to run a DROP TABLE statement at some point. Trying to figure that part out now. Thanks!
Very very interesting. Well done
Thanks!
This can be very useful! Thank you for this tutorial.
No problem!
A simple and very intuitive solution. It was easy for me to implement! Thanks a lot, much appreciated.
That's really cool. Thanks for sharing.
Thanks for sharing. It is really helpful
Glad to hear, Rakesh!
Excellent as always! Thanks!
Great writeback! Thanks!
Excellent demo. I am looking at something like this to insert records into an Access DB housed in SharePoint. One concern I have with this approach is that there is no way (that I am aware of) to write the records asynchronously, so if multiple users try to insert records at the same time, Access sometimes has real problems. But I am continually amazed by what Power Query can do. I would estimate that at least 50% of VBA code can now be automated with Power Query. I have subscribed, and I look forward to your other videos.
This is an interesting solution to my challenge of storing historical copies of the data produced inside Power Query. I do have a problem, though, with initiating a new SQL session for every insert. It's not truly "bulk", and may be quite slow with a decent number of rows. I will try to move it outside of the function, and maybe pass the session object as a parameter.
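One way to reduce that per-row overhead, sketched below in M under the assumption that the source is SQL Server and the table matches the video's FromView (rownum, Category) example (the server, database, and step names are hypothetical), is to build a single multi-row INSERT and issue one Value.NativeQuery call:

```m
let
    // Hypothetical connection and input table
    Source = Sql.Database("myserver", "mydb"),
    Data = #"Previous Step",
    // One "('1', 'A')" tuple per row
    WithTuples = Table.AddColumn(Data, "Tuple", each
        "('" & Text.From([rownum]) & "', '" & Text.From([Category]) & "')"),
    // A single multi-row INSERT; note SQL Server caps a VALUES list at
    // 1000 rows, so chunk the table first if it is larger than that
    Statement = "INSERT INTO FromView (rownum, Category) VALUES "
        & Text.Combine(WithTuples[Tuple], ", ") & ";",
    Result = Value.NativeQuery(Source, Statement)
in
    Result
```

This trades the one-statement-per-row pattern from the video for one round trip per batch, at the cost of building a larger statement string in memory.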
That’s a very good point… please let me know what you come up with.
Good thought, I shall also try the same
This has been BRILLIANT!
I do have one request, though - you created an ID column to load, but you don't cover the creation of that column. I'm trying to piece loose ends together now.
It's a really awesome technique.
This really helps a lot. I am actually searching and struggling for this option to write back to SQL, where I have billions of records to be written back to SQL from my Power BI report.
Hi, I followed your steps and script exactly, but somehow the Message column doesn't show: "INSERT INTO FromView (rownum,Category)
VALUES (
'" & Text.From([rownum]) & "',
'" & Text.From([Category]) & "',
);
SELECT 'Inserted Row' AS [Message];"
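Note for anyone copying the statement above: the comma after the last value makes the SQL invalid, which would explain the Message column staying empty. A corrected custom-column expression (same pattern as the video, column names as quoted) would be:

```m
"INSERT INTO FromView (rownum, Category) VALUES (
    '" & Text.From([rownum]) & "',
    '" & Text.From([Category]) & "'
); SELECT 'Inserted Row' AS [Message];"
```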
Great one, thank you. Does it work with an MS Access database?
As always, great video.
Would it be possible to write data based on DAX conditions?
Let's say the user interacts with the report and changes some slicers, etc.
Then the final table/results should be written back?
Thanks
@BI Elite
I tried the above solution, but at the end I'm getting a "please rebuild the data combination" error.
It is pointing exactly at the invoked custom function step in Power Query.
BI Elite, please guide us from here.
Thanks
Something very new :) thank you!
Very useful. Thank you
No problem!
Thanks for sharing the great video! Would this require the premium license of PowerBI?
Parker, nice video to explain this. For many rows wouldn't it be more efficient to create a stored procedure in SQL and pass the rows as one JSON value, chunking into a reasonable number of rows?
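The Power Query side of that idea could look like the sketch below. The procedure name dbo.InsertRows and its @json parameter are made up; on SQL Server 2016+ the procedure would unpack the payload with OPENJSON and insert all rows in one batch:

```m
let
    // Hypothetical connection and input table
    Source = Sql.Database("myserver", "mydb"),
    // Serialize the whole table to one JSON text payload
    Payload = Text.FromBinary(Json.FromValue(#"Previous Step")),
    // Double single quotes so the payload survives embedding in a SQL literal
    Escaped = Text.Replace(Payload, "'", "''"),
    // One round trip: the stored procedure parses and inserts the rows
    Result = Value.NativeQuery(Source,
        "EXEC dbo.InsertRows @json = '" & Escaped & "';")
in
    Result
```

Chunking, as suggested above, would mean splitting the table into groups of rows and making one EXEC call per group rather than one giant payload.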
Yip - way more efficient.
Kindly share what the procedure would look like.
Thanks
Any follow-up on this?
Hi Parker,
can we then publish and configure scheduled refreshes so that the process continues in the Power BI Service, or is this only done in Power BI Desktop?
What about SQL injection? Does Power BI or Forms handle that behind the scenes?
Very good point. No, it runs the “native query” against your database, so anything goes.
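Since the statement is built by string concatenation, any value containing a single quote will break (or hijack) the query. At minimum, doubling single quotes before embedding limits the damage - a sketch only, not a full defense; a parameterized stored procedure is the safer route. The helper name below is hypothetical:

```m
let
    // Double single quotes so a value can't close the SQL string literal early
    EscapeSql = (value as any) as text =>
        Text.Replace(Text.From(value), "'", "''"),
    // e.g. a name like O'Brien becomes O''Brien inside the statement
    Example = "INSERT INTO FromView (rownum, Category) VALUES ('"
        & EscapeSql(1) & "', '" & EscapeSql("O'Brien") & "');"
in
    Example
```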
Parker - Great video. I applied it to a local MySQL database and it worked like a charm. I had to modify my table a bit to create a unique constraint, but everything is working fine now. Do you think this same approach would work with a Snowflake database? It seems like a database server is a server and an insert statement is an insert statement. Might be a good weekend project. Thanks again.
Hi Parker,
Thank you very much for making this video.
I have copied your code, and when I publish it to the Power BI Service it is not able to refresh. Is that something on my end, or do you experience the same?
In the Desktop app it works great!
Great video, thanks - it gives me some ideas to work on. This isn't true write-back, though, is it? It's just moving data from one source to another. Is it possible for users to write a comment on the report and write that back, without Power Apps?
I use a paginated report with an insert query and pass parameters to it, but surely there is an easier way?
This is really awesome!!
But from which form are we writing back?
Can this be done using Power BI Report Server, or is this for the cloud/service only?
After I write the insert statement and click OK, I get null for about 90% of my rows. The odd part is... it worked for about 10%!? Any ideas?
Is it possible to do updates in a similar way? What about Bulk Updates - such as every record with the same issue?
Can this be done using Excel Power Query?
Excellent solution as always by Parker! Thanks a ton! Your solutions keep us coming back to check and implement something new in our reports. I would also like to know if you are planning to cover Power Apps integration with Power BI, financial reporting, or NLP-style implementations with Power Apps and Power BI on TH-cam! Thanks a ton again :)
Trying to take this process and modify it to write data to a SQLite .db via an ODBC connection. It works OK until the final step (Invoke Custom Function), where it fails with this message: Expression.Error: Native queries aren't supported by this value.
Any ideas?
Same issue here
Kindly reply: if we need to replace data with fresh records while keeping it unique, what do I need to write as code?
Astonishing !!!!!!
It’s really awesome, and thank you for sharing! Can we write measures and calculated column values with this trick?
Logically you won't be able to write measures back using this method, because a measure is a model calculation. Any column you want to write back to SQL must be available in the query data when applying this logic. You can add a custom column while transforming your query and then include it in the function column created. That should work.
Can this type of query work with MariaDB too?
Hi Parker, many thanks for sharing this excellent video. Just one question: would there be any efficiency issue if the table is very large, like 50,000+ rows?
use ADF :)
It works, but when I run the refresh of the semantic model in a workspace, nothing happens.
Good one. Can you also delete from SQL too? I assume I need to delete some rows as well...
P.S. Your BIElite page is not very responsive... I can't access it easily.
Amazing technique! How do we add another column as an insert timestamp as well, please? And is it better to create that column in Power Query and then insert into SQL, or to use a SQL function while inserting?
Add one more column for the timestamp in the SQL format you like (I prefer UNIX timestamps)... He's basically passing a SQL query, and it's always best to set the insert timestamp at the place where you're actually inserting the data (this makes sure there isn't any discrepancy in case of an insert failure for a specific row).
@@ThankYouGoogle4 I'm not good at SQL; can you please help expand the code?
@@jimihieu Add a custom column with a hardcoded value for all rows: UNIX_TIMESTAMP().
Then just add the column in the M query and pass it just like the other 3 column values are passed in the insert statement. Use: "&Text.From([column_name])&"
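Put together, the custom-column expression would embed the function call unquoted so the server evaluates it at insert time. The inserted_at column name is hypothetical, and UNIX_TIMESTAMP() is MySQL-specific (on SQL Server you would use GETDATE() instead):

```m
"INSERT INTO FromView (rownum, Category, inserted_at) VALUES ('"
    & Text.From([rownum]) & "', '"
    & Text.From([Category]) & "', UNIX_TIMESTAMP());"
```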
Are you using MS SQL? What I did was create a datetime NOT NULL date column, then set its Default Value or Binding to (CONVERT([datetime],datediff(day,(0),getdate()))) in the SQL table. Whenever you add a new row, it will automatically stamp the date for you. No need to pass or stamp the date manually through Power BI or SQL.
How do I get the PBIX file for this?
How do I make it automatically detect the item code, system date and time, and user email? I just want users to add comments without being able to edit the date/time, email, or item code. I have tables with relationships based on item code.
But how will it take input from the front end if I give it to the user?
Wow. Well done.
Is it possible to publish this PBIX file and schedule a refresh in the Power BI Service?
Yes! That's one of the great things about this. Scheduled refreshes will write data to the DB on a schedule, up to 8 times per day with Pro.
@@BIElite Can you show that in a video as well?
@@BIElite Great video. In Desktop it's perfect, but when I attempt it in the Power BI Service nothing happens (the refresh is successful, but no write-back occurs and I receive no error messages). Can you show this working?
It doesn't seem to write new rows when the dataset is refreshed through the gateway :( or is it just me?
When I open the PBI report and refresh manually, it writes records into the DB.
I guess this solution also works in Excel. If it does, we can design a template in Excel, ask users to input the data, and eventually save the data into a data lake.
Thank you again for an amazing, useful video. But I need more guidance to overcome a challenge I am facing: my Power BI is connected to a file that is replaced by a new file every day. The structure doesn't change, of course. My question is: since I don't have indexes to keep track of which rows to replace, is there a way to make sure that I have replaced all the old data in the "old" table with the new data that just came in? Thank you again.
If you are receiving files of the same structure, create an SFTP (legacy or custom) location and put the original file there. Create a shell script to rename any incoming file to the original file's name and then replace the file. This way you'll always have a full load in Power BI. If you can create an index (highly recommended - maybe try a composite key) to determine the changes, then go for incremental refresh on the composite key.
@@ThankYouGoogle4 Thank you for your answer. Theoretically this seems feasible to me; I have to try it in practice, though. Thanks again.
@@ThankYouGoogle4 Remaining in Power Query, during function creation, what if code is added there with two purposes: 1. drop the existing table (with the data to replace, if it exists), and 2. create another table in the database and finally load the data...?
@@padreloco9733 I don't think I fully understood the requirement here, because what I understood is that you want to drop the data and then load the new data. But that is what a full load (refresh) does. Can you post your question on the Power BI community with your requirements, and I'll be able to create a sample PBI file for you?
Hi there, I am getting the following error - An error occurred in the ‘’ query. Expression.Error: Native queries aren't supported by this value.
Details: [Table]
Can you provide the PBIX file used for this demo ?
The PBIX would be very personal to my data and SQL table, but I will try to post something. Might not be too helpful though.
Now do the SSIS version. I tried it, but none of my column names from Power BI showed up.
Getting an error. I don't think you can do this anymore.
I like the idea for the possibilities it brings in very specific use cases, but ouch!
I can already see cyber security teams pulling their hair out and having a heart attack over this kind of trick! Hopefully people will understand what they are doing; otherwise it might get messy.
Power Automate might be a "cleaner" alternative for moving data, though I'm not sure bulk upload exists there either.
⭐⭐⭐⭐⭐
I can't see Chris Wagner in the comments
Should we be expecting Chris?
You know what else starts with "SQL In-" ...