Thanks so much !
You're welcome!
Thanks Aleksi. Great stuff!
You’re welcome! :)
Great demo - thank you.
Glad you liked it!
Great tutorial, but how can I connect it with the Power BI experience to visualize it?
Connecting Power BI should be a quite straightforward operation in Fabric.
What is the purpose of the Event Hub step? Is there an issue with consuming the data directly in the Eventstream?
It could be that you are now able to connect to an Eventstream directly by using a custom app as a source. I haven't been playing around with Eventstreams lately, so I'm not totally aware of the latest status since Fabric is updating so fast. I would be curious to know if you have tried connecting to it directly with the custom app option? :)
@AleksiPartanenTech Yes, you can use the Eventstream to create a custom endpoint, which gives you an endpoint you can add in your source / Python script.
@hallvardkuly3313 Good to know that it works that way as well. Thanks! :)
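In case it helps anyone, here is a minimal sketch of what sending events to such a custom endpoint from a Python script could look like, assuming the endpoint exposes an Event Hub-compatible connection string (the connection string, entity name, and sample payload below are placeholders, not values from the video):

```python
import json

from azure.eventhub import EventHubProducerClient, EventData

# Placeholder values - copy these from the custom endpoint's connection details in Fabric.
CONNECTION_STRING = "Endpoint=sb://<your-namespace>.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=..."
EVENTHUB_NAME = "<your-eventstream-entity-name>"

producer = EventHubProducerClient.from_connection_string(
    conn_str=CONNECTION_STRING,
    eventhub_name=EVENTHUB_NAME,
)

# Example event - replace with your own data.
event = {"deviceId": "sensor-01", "temperature": 21.5}

with producer:
    batch = producer.create_batch()
    batch.add(EventData(json.dumps(event)))
    producer.send_batch(batch)
```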
So we have two Azure SQL sources with CDC enabled. I can connect them using Eventstream and it works, but my issue is that in the Eventstream's data preview tab the data arrives as separate columns such as "schema" and "payload" (with "before" and "after") instead of the actual columns of our data. The payload column does contain what we need: the 'before' key in the JSON gives us the original data and the 'after' key gives us the changed data. But to process this in real time we are currently using a notebook to send the data received from the Eventstream to a Kafka endpoint, then use Spark with a streaming notebook to read that data, and only then update the Lakehouse in real time.
Can anyone help us? Basically the pipeline is:
Azure SQL(CDC) -> Event stream -> Custom Endpoint(Kafka) -> Spark streaming notebook in fabric -> Lakehouse
This has become a problem in terms of resources and managing so much code.
I would have to dig deeper into this problem in order to be able to answer this one.
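For anyone hitting the same envelope format, here is a rough sketch of how the 'payload.after' part could be unpacked directly in the Spark streaming notebook, assuming a Debezium-style JSON envelope; the Kafka connection details, column schema, and table names below are placeholders, not the actual setup described above:

```python
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, IntegerType

# Assumed schema for the CDC envelope - adjust the 'before'/'after' fields
# to match the columns of your actual source table.
row_schema = StructType([
    StructField("id", IntegerType()),
    StructField("name", StringType()),
])
envelope_schema = StructType([
    StructField("payload", StructType([
        StructField("before", row_schema),
        StructField("after", row_schema),
        StructField("op", StringType()),
    ])),
])

# Placeholder Kafka connection details for the custom endpoint.
raw = (
    spark.readStream  # 'spark' is the built-in session in a Fabric notebook
    .format("kafka")
    .option("kafka.bootstrap.servers", "<bootstrap-server>:9093")
    .option("subscribe", "<topic-name>")
    .option("startingOffsets", "latest")
    .load()
)

# Parse the JSON envelope and keep only the changed row ('after').
changes = (
    raw.select(from_json(col("value").cast("string"), envelope_schema).alias("e"))
    .select("e.payload.after.*")
)

# Append the parsed rows to a Lakehouse Delta table (placeholder names).
query = (
    changes.writeStream
    .format("delta")
    .option("checkpointLocation", "Files/checkpoints/cdc_demo")
    .toTable("cdc_demo_table")
)
```

This only covers inserts/updates appended as new rows; handling deletes or merging 'before'/'after' into the target table would still need extra logic on top of this sketch.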