I've been trying to get something like this working in Fabric. Thanks for a great demo!
Thank you @DataBard255!! Really glad the video was helpful!
Can I call the parameterized notebook from another notebook too, or can I only call it from a pipeline?
Hi @VivekDevalkar per Bradley "Hi Vivek, great question! Yes this is possible. As a matter of fact this inspired my video this week. A full demo of this should be posted at 11 am EST with links to the documentation on how to do this!"
Hello, thank you for this demonstration.
Would it be possible to pass an 'Array' type as a base parameter for our notebook?
Hi @acxhgg, per Bradley: "I hate to start off with 'it depends', but kinda. You currently only have 4 data types for parameters that can be passed from pipelines to notebooks: Int (integer), Bool (boolean), Float, or String. You would need the data type of the parameter to be String. However, you could pass a small JSON array as text and then have the notebook parse the string to make it an array, but it wouldn't be native. I hope this helps!"
@Tales-from-the-Field Thank you for your answer!
I tried implementing logic that converts the data from an array to a single String variable. My current use case is to retrieve the latest JSON files from a specific folder of a lakehouse and to call a notebook with these latest files as a parameter, in order to perform certain actions on my JSON files.
Hopefully it'll be native soon!
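The workaround described in this thread (pass the array as a JSON string, parse it in the notebook) could look like this minimal sketch; the file names are placeholders for illustration:

```python
import json

# String parameter as it would arrive from the pipeline: a JSON array
# serialized into the one type that supports it (String).
latest_files = '["2024-01-01.json", "2024-01-02.json"]'

# Parse it back into a native Python list inside the notebook.
file_list = json.loads(latest_files)

for path in file_list:
    print(path)  # act on each file here
```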
Very useful. Is it possible to pass dynamic parameters from a Copy Data activity to a Notebook activity in the data pipeline?
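For context, Fabric data pipelines use the same expression language as Azure Data Factory, so a Copy Data activity's output can be referenced in the Notebook activity's base parameter value; a sketch (the activity name 'Copy data1' is hypothetical):

```
@activity('Copy data1').output.rowsCopied
```

This assumes the Copy Data activity runs before the Notebook activity in the pipeline graph, so its output is available at evaluation time.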
Cool demo. Am I seeing correctly that you didn't have to get rid of myYear = 2013 in your notebook? It looks like you used the same variable name when passing the parameter to the activity, but you didn't remove the line myYear = 2013 in the notebook itself. I would think that this would override the parameter value (2015) that was passed into the activity.
Hi @johncochran5852, per Bradley: "Hi John, nice work, sir! You saw that correctly. The default parameter in the notebook will be overridden by the parameter passed from the pipeline. If you want, you could make the default a value that will always throw an error, which would probably be a best practice, but as long as we pass a valid parameter in the pipeline this will work just fine!"
Thanks for the demo @Tales-from-the-Field! In my case, I had to comment out the parameter in the notebook, as it kept prevailing over the parameter passed from the pipeline!
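Bradley's "default that always throws an error" suggestion could be sketched like this; the validation helper and its name are made up for illustration:

```python
def require_year(my_year):
    """Fail fast when the pipeline did not override the default."""
    if my_year is None:
        raise ValueError("myYear was not supplied by the pipeline")
    return my_year

# Parameter cell default: deliberately invalid, so a run that forgets to
# pass the pipeline parameter errors out immediately instead of silently
# processing a stale hard-coded year.
myYear = None

# First regular notebook cell would then validate before doing any work:
#   myYear = require_year(myYear)
```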
Great demo!!
Thank you @TheSQLPro!
I am trying to research whether Fabric pipelines can be triggered from external CI/CD tools like Jenkins, GitLab, GitHub Actions, etc. Basically, I am looking for this so that Fabric projects align with all the other projects that follow a CI/CD process from a central location. I would also love to see if we can create any scaffolding with IaC tools like Terraform, which I could then include in my CI/CD process.
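One way an external CI/CD job can trigger a Fabric pipeline is the Fabric REST API's on-demand item job endpoint. A hedged sketch below only builds the request URL; the workspace and item IDs are placeholders, and a real job would obtain a bearer token from Entra ID (e.g. a service principal client-credentials flow) and POST with an Authorization header:

```python
# Placeholder IDs for illustration only.
WORKSPACE_ID = "11111111-1111-1111-1111-111111111111"
PIPELINE_ID = "22222222-2222-2222-2222-222222222222"

def run_job_url(workspace_id: str, item_id: str) -> str:
    """Build the Fabric on-demand job endpoint for a data pipeline item."""
    return (
        f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}"
        f"/items/{item_id}/jobs/instances?jobType=Pipeline"
    )

# In a CI/CD step this URL would be POSTed, e.g.:
#   requests.post(run_job_url(WORKSPACE_ID, PIPELINE_ID),
#                 headers={"Authorization": f"Bearer {token}"})
```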
May I ask a security/permission question, please?
I assume that on Fabric notebooks we are fully UPN SSO with regard to permissions?
Permission to access workspace objects like Lakehouses and Power BI semantic models (via the semantic link library)
is fully SSO in interactive notebook sessions, but what happens in "batch" sessions, i.e. data pipeline runs?
- Is it the notebook creator's UPN (the owner) that is used for batch pipeline runs?
- What happens during a lifecycle CI/CD deployment pipeline, alongside workspaces?
- Do we need the notebook owner to be granted access on each environment workspace?
- It sounds like Microsoft is working on a substitute UPN for pipeline runs.
Thanks a lot for any help.
How can you pass parameters to a WHERE clause in a SELECT statement?
Hi @claudiovasquezcampos9558, are you looking for this in a Spark SQL statement in a notebook, or a T-SQL statement as part of a data pipeline task against an Azure SQL DB, Azure SQL MI, SQL Server, or Fabric DW?
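For the Spark SQL case, one common approach is to interpolate the notebook parameter into the statement string before handing it to spark.sql(). A minimal sketch; the sales table and OrderYear column are hypothetical:

```python
# Parameter cell value, overridable by the pipeline.
myYear = 2015

def year_filter_query(year: int) -> str:
    """Build a SELECT with the parameter applied in the WHERE clause.

    int() coerces the value so only a number can reach the SQL text,
    guarding against injection through the string parameter.
    """
    return f"SELECT * FROM sales WHERE OrderYear = {int(year)}"

query = year_filter_query(myYear)
# In the notebook: df = spark.sql(query)
```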