Microsoft Fabric Event Stream - No-Code data stream to Delta table!

  • Published Nov 7, 2024

Comments • 14

  • @azobensadio260
    @azobensadio260 2 months ago +1

    Thanks so much!

  • @janisdata7874
    @janisdata7874 4 months ago +1

    Thanks Aleksi. Great stuff!

  • @geehaf
    @geehaf 7 months ago +1

    Great demo - thank you.

  • @syednayyar
    @syednayyar 2 months ago +1

    Great tutorial, but how can I connect it with the Power BI experience to visualize it?

    • @AleksiPartanenTech
      @AleksiPartanenTech  2 months ago

      Connecting Power BI should be a fairly straightforward operation in Fabric.

  • @hallvardkuly3313
    @hallvardkuly3313 22 days ago +1

    What is the purpose of the Event Hub step? Is there an issue with consuming the data directly in the eventstream?

    • @AleksiPartanenTech
      @AleksiPartanenTech  21 days ago

      It could be that you are now able to connect to an Eventstream directly by using a custom app as a source. I haven't been playing around with Eventstreams lately, and I'm not totally sure what the latest status is since Fabric is updating so fast. I would be curious to know if you have tried connecting to it directly with the custom app option? :)

    • @hallvardkuly3313
      @hallvardkuly3313 16 days ago +1

      @@AleksiPartanenTech Yes, you can use the eventstream to create a custom endpoint, which gives you an endpoint you can add to your source / Python script (a minimal sketch follows this thread).

    • @AleksiPartanenTech
      @AleksiPartanenTech  16 days ago +1

      @@hallvardkuly3313 Good to know that it works that way as well. Thanks! :)
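
      A minimal sketch of that custom-endpoint approach from the producer side, assuming the Eventstream custom endpoint exposes an Event Hub-compatible connection string; the connection string, entity name, and payload below are placeholders, not values from the video:

      ```python
      # Minimal sketch: push JSON events to a Fabric Eventstream custom endpoint
      # via its Event Hub-compatible connection string (all values are placeholders).
      import json
      from azure.eventhub import EventHubProducerClient, EventData

      CONNECTION_STR = "<custom-endpoint-connection-string>"   # copied from the Eventstream UI
      ENTITY_NAME = "<entity-name-from-connection-details>"    # placeholder

      producer = EventHubProducerClient.from_connection_string(
          conn_str=CONNECTION_STR, eventhub_name=ENTITY_NAME
      )

      event = {"deviceId": "sensor-01", "temperature": 21.5}   # hypothetical payload

      with producer:
          batch = producer.create_batch()
          batch.add(EventData(json.dumps(event)))
          producer.send_batch(batch)
      ```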

  • @SauravRajJoshi
    @SauravRajJoshi 18 days ago

    So we have two Azure SQL sources with CDC enabled. I can connect them using an eventstream and that works, but my issue is that the data arrives as separate columns ("schema", "payload" with "before"/"after", etc.) in the eventstream's data preview tab instead of the actual columns of our data. The payload column does contain what we need: the 'before' key in the JSON gives us the original data and the 'after' key gives us the changed data. But to process this in real time we are currently using a notebook to send the data received from the eventstream to a Kafka endpoint, then Spark with a streaming notebook to get the data, and only then updating the lakehouse in real time.
    Can anyone help us? Basically the pipeline is
    Azure SQL (CDC) -> Eventstream -> Custom Endpoint (Kafka) -> Spark streaming notebook in Fabric -> Lakehouse
    This has become a problem in terms of resources and managing so much code. (A sketch of the Spark side is included after this thread.)

    • @AleksiPartanenTech
      @AleksiPartanenTech  16 days ago +1

      I would have to dig deeper into this problem to be able to answer this one.
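
      For reference, a hedged sketch of the Spark streaming step in the pipeline described above: read the CDC envelope from the Kafka custom endpoint, keep only the "after" image of each change, and append it to a Lakehouse table. It assumes a Debezium-style schema/payload envelope and a Fabric notebook where a `spark` session already exists; the bootstrap server, topic, column schema, checkpoint path, and table name are placeholders:

      ```python
      # Minimal sketch (placeholder endpoint/topic/schema): flatten a Debezium-style
      # CDC envelope coming from Kafka and append the "after" rows to a Delta table.
      from pyspark.sql import functions as F
      from pyspark.sql.types import StructType, StructField, StringType, IntegerType

      # Actual columns of the source table -- replace with the real schema.
      after_schema = StructType([
          StructField("id", IntegerType()),
          StructField("name", StringType()),
      ])

      # Only the parts of the envelope we need here.
      envelope_schema = StructType([
          StructField("payload", StructType([
              StructField("before", after_schema),
              StructField("after", after_schema),
              StructField("op", StringType()),
          ])),
      ])

      raw = (
          spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "<custom-endpoint-bootstrap-server>")  # placeholder
          .option("subscribe", "<topic-name>")                                      # placeholder
          .option("startingOffsets", "latest")
          .load()
      )

      changes = (
          raw.select(F.from_json(F.col("value").cast("string"), envelope_schema).alias("msg"))
             .select("msg.payload.op", "msg.payload.after.*")   # flatten to the real columns
             .where(F.col("op").isin("c", "u", "r"))            # inserts, updates, snapshot reads
      )

      query = (
          changes.drop("op")
          .writeStream
          .format("delta")
          .option("checkpointLocation", "Files/checkpoints/cdc_demo")  # placeholder path
          .outputMode("append")
          .toTable("my_lakehouse_table")                               # placeholder table name
      )
      ```

      This covers the "Spark streaming notebook in Fabric" step of the pipeline above; the schema/payload envelope is unpacked with from_json in the same query that writes to the Lakehouse, so no separate parsing code is needed.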