How to Create an ELT Pipeline Using Airflow, Snowflake, and dbt!

  • Published Oct 10, 2024
  • In this video, I'll go through how you can create an ELT pipeline using Airflow, Snowflake, and dbt, with cosmos to visualize your dbt workflows! Check out the repo below!
    github.com/ast...

Comments • 20

  • @sainithinreddy6633
    @sainithinreddy6633 1 year ago +3

    Very clear explanation 👍

  • @shubhamkawade7351
    @shubhamkawade7351 7 months ago +1

    Nice explanation! For a specific DAG, how do I run a specific dbt command? E.g., how would you execute 'dbt run --select +third_day_avg_cost_run' for the project in the video?

    • @thedataguygeorge
      @thedataguygeorge  7 months ago

      You could use the Cosmos filtering mechanism to run just that specific step, but by default Cosmos will automatically render each individual dbt model as its own task in the DAG
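
For reference, dbt's `+model` selector means "the model plus all of its upstream ancestors." Below is a minimal, illustrative sketch of that closure over a toy dependency graph; the model names other than `third_day_avg_cost_run` are hypothetical stand-ins, not taken from the video's project:

```python
# Toy dependency graph: model name -> list of models it depends on.
# These names (except third_day_avg_cost_run) are hypothetical.
deps = {
    "raw_costs": [],
    "stg_costs": ["raw_costs"],
    "third_day_avg_cost_run": ["stg_costs"],
    "unrelated_model": [],
}

def select_plus(model, deps):
    """Return the model plus all upstream ancestors,
    mirroring what 'dbt run --select +model' resolves to."""
    selected, stack = set(), [model]
    while stack:
        node = stack.pop()
        if node not in selected:
            selected.add(node)
            stack.extend(deps[node])
    return selected

print(sorted(select_plus("third_day_avg_cost_run", deps)))
# → ['raw_costs', 'stg_costs', 'third_day_avg_cost_run']
```

In Cosmos you would express the same thing declaratively through its render configuration (a select list on the DAG's rendering options) rather than computing it yourself; check the Cosmos docs for the exact parameter names in your version.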

  • @maxpatrickoliviermorin2489
    @maxpatrickoliviermorin2489 11 months ago +1

    Thank you!
    Would you mind making a much more elaborate version please?

    • @thedataguygeorge
      @thedataguygeorge  11 months ago

      Sure! What would you like to see?

  • @WiseSteps.D
    @WiseSteps.D 6 months ago +1

    All good, but where were we able to see the Snowflake connection details?

    • @thedataguygeorge
      @thedataguygeorge  6 months ago

      Go to the connection management UI and select the Snowflake connection there!

    • @Rajdeep6452
      @Rajdeep6452 4 months ago

      In Airflow, go to Admin > Connections and enter your details in the connection's extra field, e.g.:
      {
        "account": "-",
        "warehouse": "",
        "database": "",
        "role": "",
        "insecure_mode": false
      }
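
Besides the Admin > Connections UI, Airflow can also pick up a connection from an `AIRFLOW_CONN_<CONN_ID>` environment variable, which in recent Airflow versions (2.3+) may be a JSON blob. A minimal sketch, where the connection ID and every credential value are placeholders, not real settings:

```python
import json
import os

# Sketch: defining a Snowflake connection via an environment variable instead
# of the UI. All values below are placeholders, not real credentials.
conn = {
    "conn_type": "snowflake",
    "login": "my_user",            # placeholder
    "password": "my_password",     # placeholder
    "extra": {
        "account": "my_account",   # placeholder
        "warehouse": "my_wh",
        "database": "my_db",
        "role": "my_role",
        "insecure_mode": False,
    },
}

# Airflow resolves AIRFLOW_CONN_SNOWFLAKE_CONN to a connection id of
# "snowflake_conn" (the suffix, lowercased).
os.environ["AIRFLOW_CONN_SNOWFLAKE_CONN"] = json.dumps(conn)
print(os.environ["AIRFLOW_CONN_SNOWFLAKE_CONN"])
```

This is handy for containerized deployments where you'd rather inject secrets through the environment than click through the UI; consult the Airflow connections documentation for the exact JSON fields your provider expects.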

  • @ameyajoshi6588
    @ameyajoshi6588 3 months ago

    Can we have a cyclic pre-hook applied for a model? If so, how do we achieve it using dbt and Airflow?

    • @thedataguygeorge
      @thedataguygeorge  3 months ago

      Yes, definitely. If you have the pre-hook applied as part of your dbt model build process, it should still work!

  • @vijayjoshi-mw8cr
    @vijayjoshi-mw8cr 3 months ago

    Hello, I have built an ETL pipeline using Python, pandas, Airflow, and Snowflake, but the problem is that when I run the task, it does not load the data into Snowflake. Please, can you help us?

    • @thedataguygeorge
      @thedataguygeorge  3 months ago

      What errors are you getting?

    • @VijayJoshi-eg2zq
      @VijayJoshi-eg2zq 3 months ago

      @@thedataguygeorge When I run the task, it finishes with a success message, but when I check the Snowflake warehouse, I am not able to see the table

  • @Rajdeep6452
    @Rajdeep6452 4 months ago +1

    So we are supposed to create the dbt init project first and then create the databases in Snowflake? And what schema are you using?

    • @thedataguygeorge
      @thedataguygeorge  4 months ago

      The dbt init that's part of the project should create the databases for you as long as you have the proper permissions/setup