CI CD with Dataflow on Google Cloud

  • Published Oct 3, 2024
  • This video shows full examples of CI/CD pipelines with Dataflow.
    It compares previous approaches (before) with the recommended approach (today) based on the Dataflow Flex Template.
    The CI/CD pipelines are presented with different CI/CD tools:
    Cloud Build
    GitLab CI
    GitHub Actions
    Dagger.io
    Small detail 🤒: the Dataflow job executions failed because of a temporary Compute Engine availability issue in the europe-west1 region.
    ▸ GitHub Dataflow Java:
    github.com/tos...
    ▸ GitHub Dataflow Python:
    github.com/tos...
    ▸ Medium article, Dataflow Java: / ci-cd-for-dataflow-jav...
    ▸ LinkedIn: www.linkedin.c...
    ▸ X: x.com/MazlumTo...
    ▸ Slides: docs.google.co...
    #GoogleCloud #Dataflow #ApacheBeam #CICD #FlexTemplate #CloudBuild #GitlabCI #GithubActions #Dagger #Terraform
    Feel free to subscribe to the channel and click on the bell 🔔 to receive notifications for the next videos.
    📲 Follow me on social networks:
    ▸ Articles : / mazlum.tosun
    ▸ X : / mazlumtosun3
    ▸ LinkedIn : / mazlum-tosun-900b1812
    ▸ WhatsApp : whatsapp.com/c...

Comments • 7

  • @sounishnath513 · 8 months ago +2

    Fantastic, as always. This is one of the best GCP learning channels for practical implementations.
    One request: could you prepare a tutorial on data sharing? For example, people send a CSV file for a bulk report to a Cloud Function or similar; through event-based Pub/Sub it triggers a BigQuery job, and once the query output is ready it is stored in a GCS bucket and an email with a downloadable signed URL is sent back to the user.

    • @GCPLearning-ce9bg · 8 months ago +1

      Thank you so much for your feedback 🙏🏻🙂, I am glad you like the videos and the channel.
      Your proposal is interesting and, if I understood it correctly, it corresponds to an event-driven architecture.
      The architecture and steps are:
      - upload a CSV file to GCS
      - store the data in BigQuery (possibly with transformations)
      - export the data from BigQuery to GCS
      - then send an email with the downloadable signed URL for the GCS file
      Don't hesitate to correct me if I am wrong.
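The steps above could be sketched as a small handler like the one below. This is a minimal sketch, not the channel's implementation: the bucket, dataset, and table names are hypothetical, it assumes the google-cloud-bigquery and google-cloud-storage client libraries, and the final email send is left out.

```python
import datetime


def export_uri(bucket_name: str, blob_name: str) -> str:
    """Build the GCS destination for the exported file (hypothetical layout)."""
    return f"gs://{bucket_name}/exports/{blob_name}"


def handle_csv_upload(bucket_name: str, blob_name: str) -> str:
    """Triggered when a CSV lands in GCS: load it into BigQuery,
    export the result, and return a signed download URL."""
    # Imported lazily so the module loads without the GCP packages;
    # requires google-cloud-bigquery and google-cloud-storage at runtime.
    from google.cloud import bigquery, storage

    bq = bigquery.Client()

    # 1) Load the uploaded CSV into BigQuery (autodetect the schema).
    load_job = bq.load_table_from_uri(
        f"gs://{bucket_name}/{blob_name}",
        "my_dataset.my_table",  # hypothetical destination table
        job_config=bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            skip_leading_rows=1,
            autodetect=True,
        ),
    )
    load_job.result()  # wait for the load to finish

    # 2) Export the (possibly transformed) table back to GCS.
    extract_job = bq.extract_table(
        "my_dataset.my_table", export_uri(bucket_name, blob_name)
    )
    extract_job.result()

    # 3) Generate a time-limited, downloadable signed URL for the export.
    #    (Sending the email with this URL is omitted from the sketch.)
    blob = storage.Client().bucket(bucket_name).blob(f"exports/{blob_name}")
    return blob.generate_signed_url(
        version="v4",
        expiration=datetime.timedelta(hours=1),
        method="GET",
    )
```

In practice the handler would be wired to the GCS finalize event of the upload bucket, so the whole flow stays event-driven.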

    • @sounishnath513 · 8 months ago

      You got exactly what I meant. Just to add: you may assume the data is already in BigQuery. The CSV file contains certain column values that I need to search for, to get the data as a bulk export driven by what the CSV file requires.
      Since it might be a heavy query job, it can execute asynchronously. Once done, it puts the event in Pub/Sub and sends the email to the users.
      Note: you may make it simpler if that suits you, but you got the gist. 😊😊
      Anyway, I look forward to a demo video.
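The asynchronous bulk-export variant described here might look like the following sketch. The table, column, and topic names are hypothetical, and it assumes the google-cloud-bigquery and google-cloud-pubsub libraries; the query builder itself is pure Python.

```python
def build_bulk_query(table: str, column: str, values: list[str]) -> str:
    """Build the bulk-export query from the keys found in the CSV file."""
    quoted = ", ".join(f"'{v}'" for v in values)
    return f"SELECT * FROM `{table}` WHERE {column} IN ({quoted})"


def run_bulk_export(project: str, keys: list[str], topic: str) -> None:
    """Run the (possibly heavy) query, then notify via Pub/Sub."""
    # Imported lazily; requires google-cloud-bigquery and google-cloud-pubsub.
    from google.cloud import bigquery, pubsub_v1

    bq = bigquery.Client(project=project)
    query = build_bulk_query("my_dataset.my_table", "report_id", keys)
    job = bq.query(query)  # submits the job; BigQuery executes it server-side
    job.result()           # block here, or poll job.done() to stay fully async

    # Publish a "query finished" event; a subscriber can then send the email.
    publisher = pubsub_v1.PublisherClient()
    publisher.publish(
        publisher.topic_path(project, topic),
        b"bulk-export-finished",
        job_id=job.job_id,  # Pub/Sub attributes must be strings
    )
```

Because BigQuery runs the query server-side, the caller only holds a job reference, which is what makes the asynchronous, event-then-email flow possible.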

  • @heenachhabra2977 · 4 months ago +1

    How would a Python Beam pipeline look in this case? I understand only the Cloud Build part may vary.

    • @GCPLearning-ce9bg · 27 days ago

      Thanks for your question.
      In the video and the GitHub repositories, I also show a Python Beam pipeline, its Flex Template, and the CI/CD part for Python.
      Please check the video and keep me posted if anything is unclear.
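For reference, a minimal Python Beam pipeline could look like the sketch below. The CSV schema, field names, and paths are hypothetical (the actual pipeline is in the linked repository); passing --runner=DataflowRunner in the pipeline options is what moves it from local execution to Dataflow.

```python
import csv


def parse_line(line: str) -> dict:
    """Parse one CSV line into a dict (hypothetical name,score schema)."""
    name, score = next(csv.reader([line]))
    return {"name": name, "score": int(score)}


def run(input_path: str, output_path: str) -> None:
    # Imported lazily; requires the apache-beam package.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Defaults to the local DirectRunner; add --runner=DataflowRunner,
    # --project, --region, etc. to execute on Dataflow.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "Read" >> beam.io.ReadFromText(input_path, skip_header_lines=1)
            | "Parse" >> beam.Map(parse_line)
            | "KeepHighScores" >> beam.Filter(lambda row: row["score"] >= 10)
            | "Format" >> beam.Map(lambda row: f"{row['name']},{row['score']}")
            | "Write" >> beam.io.WriteToText(output_path)
        )
```

With a Flex Template, this same pipeline code is packaged in a container image, so only the CI/CD build step differs between the Java and Python repositories.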

  • @shamilak1 · 8 months ago

    Please share the documents, sir.

    • @GCPLearning-ce9bg · 8 months ago

      Thanks for your comment. I shared all the links in the video description.