#Snowflake

  • Published Jan 4, 2025

Comments • 44

  • @rajwedhikar2955 • 4 months ago • +1

    Thanks a lot, very helpful

  • @zohebshumaeel5804 • 1 month ago • +1

    Hello Anupam, great insight for us aspiring data engineers.
    Just one request: could you please create a video on auto-ingestion from Google Cloud Storage into Snowflake, just as you have done for AWS?

  • @hirenparikh1364 • 1 year ago

    Excellent resource!

  • @TexasCoffeeBeans • 1 year ago

    Very informative video. Thank you.

  • @archanapattanaik8713 • 10 months ago • +1

    Thank you for this video; I appreciate your effort in sharing knowledge. Can you please make some videos on Acryl DataHub / DataHub: metadata, glossary, domains, and the data lineage produced by ingesting data?

  • @ashishchaurasiya106 • 5 months ago • +1

    Nice

  • @pratz3198 • 10 months ago

    Hello Anupam, do we have to create the SQS queue in AWS first, or does creating the Snowpipe create the queue automatically?

    • @anupam.kushwah • 10 months ago

      The SQS notification channel will be created automatically when you create the Snowpipe.
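
      A minimal sketch of how to find that auto-created queue (the pipe name is illustrative): the SQS queue ARN appears in the notification_channel column.

      ```sql
      -- After CREATE PIPE ... AUTO_INGEST = TRUE, Snowflake provisions
      -- the SQS queue itself; list pipes to see its ARN:
      SHOW PIPES;

      -- Or inspect a single pipe (name is illustrative):
      DESC PIPE my_pipe;

      -- Copy the notification_channel value into the S3 bucket's
      -- event notification configuration in the AWS console.
      ```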

  • @mickyman753 • 8 months ago

    For that SNS, do we need to create a new SNS topic in our AWS account for it to work, or is that SNS already provided by AWS? Also, do we need to change the trust policy of the S3 bucket or add any policy?

  • @vaibhavverma1340 • 1 year ago

    Thank you for the video. I have a doubt: suppose the source file in S3 is deleted, are we still able to get the rows when querying the table with select * from it?

    • @anupam.kushwah • 1 year ago • +1

      If the data has been loaded into the Snowflake table and the S3 file is then deleted, you will still be able to see the data, since it now resides in the table and can be used for further analysis. It is not like an external table, where you only define the table structure in Snowflake and the actual data stays in S3.
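
      For contrast, this is roughly what the external-table case looks like (stage and column names are illustrative): queries read the S3 files at query time, so deleting a file makes its rows disappear.

      ```sql
      -- External table: only metadata lives in Snowflake; the data
      -- stays in S3 and is read on every query.
      CREATE EXTERNAL TABLE my_ext_table (
          id INT AS (VALUE:c1::INT)
      )
      LOCATION = @my_s3_stage
      FILE_FORMAT = (TYPE = CSV);
      ```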

  • @nvkishorekumar • 1 year ago

    Thanks for the video. I have a small doubt: let's say we are using an internal stage and there are a few different files, i.e. employee, salary, dept, all of type CSV. Now, if I have to load them into their respective tables in Snowflake (employee, salary, dept) using a pipe, is it possible? If yes, what will be the notification channel?

    • @anupam.kushwah • 1 year ago

      It is possible to load data from a stage with different file names using a pattern. Please refer to this document for more info - docs.snowflake.com/en/sql-reference/sql/copy-into-table#syntax
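
      A sketch of the pattern approach (all names are illustrative): one pipe per target table, each filtering the shared stage by file-name pattern.

      ```sql
      -- PATTERN takes a regular expression matched against staged
      -- file paths; AUTO_INGEST applies to external stages.
      CREATE OR REPLACE PIPE employee_pipe AUTO_INGEST = TRUE AS
          COPY INTO employee
          FROM @my_stage
          PATTERN = '.*employee.*[.]csv';

      CREATE OR REPLACE PIPE salary_pipe AUTO_INGEST = TRUE AS
          COPY INTO salary
          FROM @my_stage
          PATTERN = '.*salary.*[.]csv';
      ```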

  • @hienngo8761 • 7 months ago

    Thank you for the video. It is short but has enough information. I have two questions: how does Snowflake detect duplicate files? Is there any query to identify the duplicate files already processed in Snowflake?

    • @anupam.kushwah • 6 months ago

      It keeps track of the file names in its internal metadata tables. Check this page if it helps - community.snowflake.com/s/article/COPY-INTO-completes-with-Copy-executed-with-0-files-processed
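
      One way to see which files have already been processed (the table name is illustrative) is the COPY_HISTORY table function:

      ```sql
      -- Lists load activity for a table over the last 24 hours,
      -- including each file name and its load status.
      SELECT file_name, last_load_time, status, row_count
      FROM TABLE(
          INFORMATION_SCHEMA.COPY_HISTORY(
              TABLE_NAME => 'MY_TABLE',
              START_TIME => DATEADD(hour, -24, CURRENT_TIMESTAMP())
          )
      );
      ```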

  • @maryam4071 • 9 months ago

    Hi, where can I find the previous video? You mention it in the last session.

    • @anupam.kushwah • 6 months ago

      Please check the videos on this channel.

  • @muralikrishna-gk4hx • 8 months ago

    Thanks, sir.
    Can I create a Snowpipe on an internal named stage?

  • @aniketkardile • 5 months ago

    For ingesting multiple different tables into multiple different stages

  • @rameshram1117 • 1 year ago

    Is it possible to load multiple files with different metadata using one single pipe and a single stage?

    • @anupam.kushwah • 1 year ago

      It's better to keep different types of files in different stages and load them with separate Snowpipes.

    • @anupam.kushwah • 1 year ago

      With one pipe it's not possible.
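
      A sketch of the separate-stage approach suggested above (all names are illustrative): one S3 prefix per file type, one stage per prefix, one pipe per stage.

      ```sql
      CREATE OR REPLACE STAGE employee_stage
          URL = 's3://my-bucket/employee/'
          STORAGE_INTEGRATION = my_integration
          FILE_FORMAT = (TYPE = CSV);

      CREATE OR REPLACE PIPE employee_pipe AUTO_INGEST = TRUE AS
          COPY INTO employee FROM @employee_stage;

      -- Repeat the same stage/pipe pair for salary and dept.
      ```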

    • @rameshram1117 • 1 year ago

      @anupam.kushwah OK sir, thank you

  • @princynandan9389 • 1 year ago

    Thanks for the video, that's very informative :). However, I have a doubt. Let's say my CSV file has more columns than the table, and AUTO_INGEST = TRUE. In this case, will it load the data into the table, or will it not process the file? If it doesn't process it, is there any log that keeps track of this?

    • @anupam.kushwah • 1 year ago

      Hello Princy,
      Please follow the document below for more details on it:
      docs.snowflake.com/en/sql-reference/functions/pipe_usage_history
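
      For reference, the linked table function reports how much data a pipe has ingested over a time window (the pipe name is illustrative):

      ```sql
      SELECT *
      FROM TABLE(
          INFORMATION_SCHEMA.PIPE_USAGE_HISTORY(
              DATE_RANGE_START => DATEADD(day, -1, CURRENT_DATE()),
              PIPE_NAME => 'my_db.public.my_pipe'
          )
      );
      ```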

  • @basavaraju1466 • 1 year ago

    Hi bro, I need to configure the CREATE PIPE connector, especially the COPY statement; please help me.

    • @anupam.kushwah • 1 year ago • +1

      Please use this link to create a Snowpipe: anupamkushwah.medium.com/snowflake-snowpipe-automated-data-loading-from-s3-bucket-b395f8d508da
      If you have any further questions, please ask.

    • @basavaraju1466 • 1 year ago

      @anupam.kushwah Thank you for your response. We don't want to use S3 or any other cloud; we just need a direct integration between Mule and Snowflake, specifically using CREATE PIPE, due to cost concerns.

    • @anupam.kushwah • 1 year ago • +1

      Snowpipe only supports loading data from public cloud stages. Please refer to this documentation - docs.snowflake.com/en/user-guide/data-load-snowpipe-intro

  • @FurqaanShafi • 1 year ago

    Sir, how do I load data from internal storage, i.e. a CSV file on my laptop?

    • @anupam.kushwah • 1 year ago

      You can load CSV files from a laptop using the SnowSQL tool and the PUT command. Follow this link for step-by-step instructions:
      docs.snowflake.com/en/user-guide/data-load-internal-tutorial
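
      The typical SnowSQL flow looks roughly like this (the file path and table name are illustrative):

      ```sql
      -- 1. Upload the local file to the table's internal stage (@%).
      PUT file:///Users/me/data/employees.csv @%employees;

      -- 2. Load the staged file into the table.
      COPY INTO employees
          FROM @%employees
          FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
      ```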

  • @aakashyadav4062 • 1 year ago

    I have done everything, but the data isn't showing in the table.

    • @anupam.kushwah • 1 year ago

      Could you please elaborate on the problem and share your scripts here?

    • @aakashyadav4062 • 1 year ago

      @anupam.kushwah In the beginning, when I was using the COPY command to load data from the stage into my table, it was showing an error on the integer-type columns. But when I took another dataset, it worked with the COPY command. Now it is not working with the pipe.

    • @aakashyadav4062 • 1 year ago

      @anupam.kushwah CREATE OR REPLACE TABLE HEALTHCARE (
          Patientid VARCHAR(15),
          gender CHAR(8),
          age VARCHAR(5),
          hypertension CHAR(20),
          heart_disease CHAR(20),
          ever_married CHAR(30),
          work_type VARCHAR(60),
          Residence_type CHAR(30),
          avg_glucose_level VARCHAR(20),
          bmi VARCHAR(20),
          smoking_status VARCHAR(20),
          stroke CHAR(20)
      );

      CREATE FILE FORMAT CSV
          TYPE = CSV;

      -- create storage integration
      CREATE OR REPLACE STORAGE INTEGRATION snowpipe_integration
          TYPE = external_stage
          STORAGE_PROVIDER = s3
          STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::922199053730:role/my-snowpipe-role'
          ENABLED = true
          STORAGE_ALLOWED_LOCATIONS = ('*');

      DESC INTEGRATION snowpipe_integration;

      -- create stage
      CREATE OR REPLACE STAGE patient_snow_stage
          URL = 's3://my-snowflake-bucket-akas'
          FILE_FORMAT = CSV
          STORAGE_INTEGRATION = snowpipe_integration;

      LIST @patient_snow_stage;
      SHOW STAGES;

      -- pull data from the stage directly
      SELECT $1, $2 FROM @patient_snow_stage/test1.csv;

      CREATE OR REPLACE PIPE patients_snowpipe AUTO_INGEST = TRUE AS
          COPY INTO DEMO_DATABASE.PUBLIC.patients3
          FROM @patient_snow_stage
          ON_ERROR = 'skip_file';

      SHOW PIPES;
      ALTER PIPE patients_snowpipe REFRESH;

      CREATE OR REPLACE TABLE patients3 LIKE HEALTHCARE;
      SELECT * FROM patients3;

    • @anupam.kushwah • 1 year ago

      @aakashyadav4062 Did you set up the SQS event notification in AWS for the Snowflake pipe?

    • @aakashyadav4062 • 1 year ago

      @anupam.kushwah Yes, I have created the event notification and pasted the SQS ARN of the notification channel into it.