Data World Solution
How to Use Cursors in Snowflake: Step-by-Step Tutorial
Learn how to efficiently use cursors in Snowflake for data processing. This tutorial covers the basics, practical examples, and advanced techniques to help you master cursors in your Snowflake workflow.
----------------------------------------------------------------
Video Code:
drive.google.com/file/d/1d33aFp5jqgwT7uV49Sa2M1w6XjnHxSFh/view?usp=sharing
----------------------------------------------------------------
#Snowflake, #SnowflakeTutorial, #DataProcessing, #SQL, #DataEngineering, #SnowflakeDB, #Cursor, #SQLTutorial, #BigData, #DataAnalysis, #TechTutorial #dataworldsolution
Views: 15
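The basic cursor pattern covered here can be sketched as an anonymous Snowflake Scripting block (the employees table and salary column are illustrative, not from the video):

```sql
-- Declare a cursor over a query, then iterate its rows with FOR ... IN
DECLARE
  total_salary NUMBER := 0;
  c1 CURSOR FOR SELECT salary FROM employees;
BEGIN
  FOR rec IN c1 DO
    total_salary := total_salary + rec.salary;
  END FOR;
  RETURN total_salary;
END;
```

The same cursor can also be driven manually with OPEN, FETCH ... INTO, and CLOSE when row-by-row control is needed.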

Videos

How to Create and Attach Event Triggers to Pipelines | Type of Trigger |Azure Data Factory Tutorial
Views: 25 · 2 hours ago
Learn how to create and attach event triggers to pipelines in Azure Data Factory. This tutorial walks you through setting up event-driven workflows to automate your data processes efficiently #AzureDataFactory #EventTriggers #DataAutomation #PipelineAutomation #DataEngineering #CloudComputing #AzureTutorial #DataIntegration #TechTutorial #CloudData #dataworldsolution
How to Use FOR Loops in Snowflake Scripting | Type of for Loop | Snowflake tutorial | SQL
Views: 14 · 4 hours ago
Learn how to implement and master FOR loops in Snowflake scripting. This tutorial will guide you through the basics, including syntax, practical examples, and how to optimize loops for efficient data processing in Snowflake Video Code drive.google.com/file/d/1Q4VrYbtKLAwTDDaMsgDQj1oHLogRM1-v/view?usp=sharing #Snowflake #SQLTutorial #ForLoop #DataEngineering #SnowflakeScripting #DataAutomation #...
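A quick illustration of the counter-based form described here, as a minimal anonymous block (the summed range is illustrative):

```sql
-- Counter-based FOR loop: sum the integers 1..5
DECLARE
  total NUMBER := 0;
BEGIN
  FOR i IN 1 TO 5 DO
    total := total + i;
  END FOR;
  RETURN total;
END;
```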
ADF Mapping Design: Identify Employees Earning Below Departmental Average Salary.
Views: 101 · 14 days ago
ADF mapping to find employees earning below their department's average salary. #AzureDataFactory #ADF #DataMapping #EmployeeSalaries #DataTransformation #DepartmentalAverage #dataworldsolution
WHILE Loops in Snowflake Scripting Explained | Practical Examples| Snowflake |SQL
Views: 50 · 14 days ago
Learn how to use WHILE loops in Snowflake scripting to efficiently automate and control data processing tasks. This tutorial covers the basics, provides practical examples, and walks you through common use cases. Perfect for both beginners and experienced users looking to enhance their Snowflake scripting skills! Video Code drive.google.com/file/d/1Tfk2qtxaEbpgY8TiGmMpCAk_w_d1Pc2e/view?usp=shar...
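A minimal sketch of the WHILE form described above:

```sql
-- WHILE repeats its body as long as the condition holds
DECLARE
  counter NUMBER := 0;
BEGIN
  WHILE (counter < 5) DO
    counter := counter + 1;
  END WHILE;
  RETURN counter;
END;
```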
Basics of Loops in Snowflake Scripting | Snowflake Scripting Tutorial | Loop
Views: 64 · 14 days ago
Learn the basics of loops in Snowflake SQL with this comprehensive tutorial. We'll cover the syntax, usage, and practical examples to help you effectively utilize loops in your Snowflake scripting. Perfect for beginners looking to enhance their Snowflake skills! code file :- drive.google.com/file/d/1wGef1bJOrczJdM1NqxvRuX72BCHyaqwS/view?usp=sharing #Snowflake #SnowflakeSQL #SnowflakeScripting #...
Snowflake Scripting Tutorial: CASE Statements and Types of Case Statement Explained
Views: 52 · 14 days ago
In this tutorial, we'll dive into using CASE statements in Snowflake scripting. Learn about the different types of CASE statements and how to apply them to enhance your SQL scripts for more dynamic and flexible data processing.
Code:
CREATE OR REPLACE TABLE employees (
  employee_id NUMBER,
  first_name VARCHAR2(50),
  salary NUMBER,
  department_id NUMBER
);
INSERT INTO employees (employee_id, first_na...
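A simple CASE in an anonymous block might look like this (the department-number mapping is illustrative, not from the video):

```sql
-- Simple CASE: branch on the value of an expression
DECLARE
  dept NUMBER := 10;
BEGIN
  CASE (dept)
    WHEN 10 THEN
      RETURN 'Sales';
    WHEN 20 THEN
      RETURN 'Engineering';
    ELSE
      RETURN 'Other';
  END CASE;
END;
```

A searched CASE uses full boolean conditions (e.g. WHEN dept > 10 THEN ...) instead of matching values.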
Snowflake Scripting Tutorial: Using IF Statements with Example
Views: 73 · 21 days ago
In this tutorial, we'll explore how to use IF statements in Snowflake scripting with practical examples. Learn how to incorporate conditional logic into your SQL scripts to make your data processing more efficient and dynamic. Code :- drive.google.com/file/d/134-g1Nlywrai9hhCf-N1kWMY7NW0pnUU/view?usp=sharing #Snowflake #SQL #Scripting #IFStatements #SnowflakeTutorial #SQLScripting #DataEngineer...
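The conditional logic described above can be sketched like this (the salary thresholds are illustrative):

```sql
-- IF / ELSEIF / ELSE branching on a variable
BEGIN
  LET sal NUMBER := 4500;
  IF (sal > 4000) THEN
    RETURN 'high';
  ELSEIF (sal > 2000) THEN
    RETURN 'medium';
  ELSE
    RETURN 'low';
  END IF;
END;
```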
Load Records with 2nd Highest Salary into Target Table using Azure Data Factory | ADF tutorial
Views: 159 · 21 days ago
In this tutorial, you'll learn how to use Azure Data Factory to extract and load records from a source dataset into a target table, focusing specifically on records with the 2nd highest salary. We'll walk you through the step-by-step process of setting up the data pipeline, configuring activities, and ensuring accurate data transformation and transfer. Perfect for data engineers and anyone look...
How to Load Unique and Duplicate Rows into Separate Tables Using Azure Data Factory | ADF Tutorial
Views: 195 · 21 days ago
Learn how to use Azure Data Factory to separate unique and duplicate rows from a source table, loading them into two different target tables. This step-by-step guide will help you efficiently manage and organize your data. Business Scenario There is a source dataset that contains duplicate rows. Using Azure Data Factory (ADF), create a process to put all the unique rows into one target table an...
How to Use the RETURN Command in Snowflake Scripting | Return as Tables
Views: 50 · 21 days ago
In this video, we dive into the RETURN command in Snowflake scripting. Learn how to effectively use RETURN to control the flow of your scripts, return values from stored procedures, and enhance your Snowflake scripts' functionality. Whether you're new to Snowflake or looking to refine your scripting skills, this tutorial will guide you through the essentials of the RETURN command with practical...
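Returning rows from a block, as the title's "Return as Tables" suggests, can be sketched with a RESULTSET (the employees query is illustrative):

```sql
-- RETURN TABLE(...) hands a RESULTSET back to the caller as rows
DECLARE
  res RESULTSET;
BEGIN
  res := (SELECT employee_id, salary FROM employees);
  RETURN TABLE(res);
END;
```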
How to Declare Variables in Snowflake Scripting: Declaration, Scope, Assignment, SELECT Statements
Views: 53 · 21 days ago
In this video, you'll learn how to declare variables in Snowflake scripting, understand their scope, assign values, and use them effectively in SELECT SQL statements. Whether you're new to Snowflake or looking to deepen your scripting skills, this tutorial will guide you through the essentials of working with variables in Snowflake.
code -
declare
  v_date date;
  v_no number:=10;
  v_name varchar2(1...
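A minimal sketch of declaring, assigning, and binding variables in a SELECT (the employees table is illustrative):

```sql
-- Declare and assign variables; bind them into SQL with a colon prefix
DECLARE
  v_no NUMBER := 10;
  v_name VARCHAR DEFAULT 'employees';
BEGIN
  SELECT COUNT(*) INTO :v_no FROM employees;
  RETURN v_name || ' has ' || v_no || ' rows';
END;
```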
From Raw to Refined: Clean, Customize, and Save Data as .tds in Tableau
Views: 17 · 21 days ago
Learn how to clean and customize your data in Tableau and save it as a .tds file. This step-by-step guide will help you streamline your data preparation process for more efficient analysis #Tableau #DataCleaning #DataCustomization #TDSFile #DataPreparation #DataAnalytics #TableauTutorial #DataScience #DataVisualization #ETL
Bulk Data Transfer: Azure Data Storage to SQL Database Using ADF
Views: 289 · a month ago
Tableau Data Connections: A Comprehensive Tutorial
Views: 29 · a month ago
Introduction to Snowflake Scripting: Basics of Begin and End Blocks
Views: 84 · a month ago
How to Use FIRST_VALUE and LAST_VALUE Functions in Snowflake
Views: 105 · a month ago
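The FIRST_VALUE/LAST_VALUE pattern can be sketched as follows (table and columns are illustrative):

```sql
-- Lowest and highest salary per department, shown alongside each row
SELECT department_id, salary,
       FIRST_VALUE(salary) OVER (PARTITION BY department_id ORDER BY salary) AS lowest_salary,
       LAST_VALUE(salary)  OVER (PARTITION BY department_id ORDER BY salary
                                 ROWS BETWEEN UNBOUNDED PRECEDING
                                          AND UNBOUNDED FOLLOWING) AS highest_salary
FROM employees;
```

Note that LAST_VALUE needs the explicit full-partition frame; with the default frame it would only see rows up to the current one.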
How To Calculate Running Totals in Snowflake SQL | Cumulative Sum Using Analytical Function #sql
Views: 115 · a month ago
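The running-total technique named in the title can be sketched as (table and columns are illustrative):

```sql
-- Cumulative sum of amount, ordered by order_id
SELECT order_id, amount,
       SUM(amount) OVER (ORDER BY order_id
                         ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) AS running_total
FROM orders;
```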
How to Implement SCD Type 2 Using Azure Data Factory: Best Practices
Views: 361 · a month ago
How to Calculate Moving Averages with Window Functions in Snowflake SQL For Data Analysis.
Views: 127 · a month ago
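The moving-average window named in the title can be sketched as (table, columns, and window width are illustrative):

```sql
-- 3-row moving average: current row plus the two preceding rows
SELECT order_id, amount,
       AVG(amount) OVER (ORDER BY order_id
                         ROWS BETWEEN 2 PRECEDING AND CURRENT ROW) AS moving_avg_3
FROM orders;
```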
Mastering Density Maps in Tableau for Data Visualization
Views: 41 · a month ago
How to Use LEAD and LAG Functions in Snowflake Database
Views: 142 · a month ago
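A minimal sketch of LEAD and LAG (table and columns are illustrative):

```sql
-- Previous and next salary relative to each row, in employee_id order
SELECT employee_id, salary,
       LAG(salary)  OVER (ORDER BY employee_id) AS prev_salary,
       LEAD(salary) OVER (ORDER BY employee_id) AS next_salary
FROM employees;
```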
SCD Type 1 Implementation in Azure Data Factory: Beginner to Expert
Views: 291 · a month ago
Azure Data Factory Tutorial: Splitting Columns into Rows Using Data Flow
Views: 259 · a month ago
Visualize Data with Symbol & Filled Maps in Tableau
Views: 68 · a month ago
How to create Bullet Graph to Compare Sales Performance to Targets
Views: 67 · a month ago
Deep Dive into Analytical Function ROW_NUMBER() in Snowflake: Practical Examples and Use Cases
Views: 74 · a month ago
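A common ROW_NUMBER() use case can be sketched as (table and columns are illustrative):

```sql
-- Top earner per department: number rows within each partition, keep rank 1
SELECT *
FROM (
  SELECT employee_id, department_id, salary,
         ROW_NUMBER() OVER (PARTITION BY department_id ORDER BY salary DESC) AS rn
  FROM employees
)
WHERE rn = 1;
```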
How to Make Bar-in-Bar Charts in Tableau: Easy Tutorial
Views: 38 · a month ago
Mastering Unpivot Transformation in Azure Data Factory: Step-by-Step Tutorial
Views: 63 · a month ago
Load data from Azure SQL and Blob Storage to Snowflake | using Sort and Union Transformations | ADF
Views: 141 · a month ago

Comments

  • @mohammedvahid5099
    @mohammedvahid5099 · 2 days ago

    Amazing ❤ bro

  • @rajeevjosephk
    @rajeevjosephk · 2 days ago

    how to convert 0,00 as 0.00

  • @rajeevjosephk
    @rajeevjosephk · 2 days ago

    how to convert 0,00 as 0.00 ??

  • @mohammedvahid5099
    @mohammedvahid5099 · 25 days ago

    Ur awesome bro and ur explaining is very well, thnk u ❤

  • @RajeevKumar_SQH
    @RajeevKumar_SQH · a month ago

    This is helpful.. Thank you 😊

  • @suryasurendrareddy
    @suryasurendrareddy · a month ago

    How to implement SCD Type 2 in multiple tables?

    • @dataworldsolution
      @dataworldsolution · a month ago

      I will make a video on this

    • @tiago5a
      @tiago5a · 15 days ago

      Does it work on Fabric?

  • @Prashanthi248
    @Prashanthi248 · a month ago

    Good projection

  • @NomanKhan80USA
    @NomanKhan80USA · a month ago

    Hi, please can you upload SCD Type 3 Implementation in Azure Data Factory? There is no video on this. Thanks

  • @mohammedvahid5099
    @mohammedvahid5099 · a month ago

    Thnk u bro

  • @NirmalJoseph-r4p
    @NirmalJoseph-r4p · a month ago

    Thank you

  • @ryanhancox6921
    @ryanhancox6921 · a month ago

    Thank you this was very helpful!

  • @AnkitBilone
    @AnkitBilone · 2 months ago

    Super helpful!

  • @mohammedvahid5099
    @mohammedvahid5099 · 2 months ago

    Pleas bro can u make videos on how to split bigger files, like large volumes (TBs) of data, into small chunks (files) through Python code scripts, and make a playlist of those videos bro🙏

  • @mohammedvahid5099
    @mohammedvahid5099 · 2 months ago

    Nice video bro thnk u

  • @cherukurid0835
    @cherukurid0835 · 2 months ago

    Hi Bro, thanks for the video. Which one should we select if a client asks: loading data using ADF, or loading the data using the Snowflake copy command from Azure storage? And what is the reason?

    • @dataworldsolution
      @dataworldsolution · 2 months ago

      If you want to automate the data ingestion process through ETL then we need the ADF option as well; it also depends on the work process.

    • @cherukurid0835
      @cherukurid0835 · 2 months ago

      @@dataworldsolution Bro, can you elaborate on what the work process means?

  • @c.4800
    @c.4800 · 2 months ago

    Fantastic tutorial, thank you!

  • @mohammedvahid5099
    @mohammedvahid5099 · 2 months ago

    Excellent bro thnk u ❤🎉

  • @mohammedvahid5099
    @mohammedvahid5099 · 2 months ago

    Great job bro well explained ❤

  • @vikaspanchal1746
    @vikaspanchal1746 · 3 months ago

    Nice brother 🎉

  • @Thehonestreview99
    @Thehonestreview99 · 3 months ago

    Help me with resume building for snowflake developer

  • @Thehonestreview99
    @Thehonestreview99 · 3 months ago

    You are doing a great job keep posting

  • @mohammedvahid5099
    @mohammedvahid5099 · 3 months ago

    BRO PLEASE PROVIDE the orders dataset csv FILE for practice.... where did u get that csv file? pleas give the link bro

    • @dataworldsolution
      @dataworldsolution · 3 months ago

      drive.google.com/drive/folders/1E0rGggx27gZdcfLeR0g5zpP1JqaSbli-?usp=sharing

    • @dataworldsolution
      @dataworldsolution · 3 months ago

      # python script to generate the dataset
      import random
      import csv

      # Define the categories and subcategories
      categories = ["Electronics", "Books", "Clothing"]
      electronics_subcategories = ["Laptops", "Headphones", "Smartphones", "Accessories"]
      books_subcategories = ["Fiction", "Non-Fiction", "Science"]
      clothing_subcategories = ["T-Shirts", "Jeans", "Hoodies"]

      # Function to generate random data for a row
      def generate_row():
          order_id = "ORD" + str(random.randint(1, 10000)).zfill(4)
          amount = random.randint(50, 500)
          profit = int(amount * random.uniform(0.1, 0.3))
          quantity = random.randint(1, 5)
          category = random.choice(categories)
          if category == "Electronics":
              subcategory = random.choice(electronics_subcategories)
          elif category == "Books":
              subcategory = random.choice(books_subcategories)
          else:
              subcategory = random.choice(clothing_subcategories)
          return [order_id, amount, profit, quantity, category, subcategory]

      # Generate 25 rows of data
      rows = []
      for _ in range(25):
          rows.append(generate_row())

      # Write the data to a CSV file
      with open("OrdersAzuredataset2.csv", "w", newline="") as csvfile:
          writer = csv.writer(csvfile)
          writer.writerow(["ORDER_ID", "AMOUNT", "PROFIT", "QUANTITY", "CATEGORY", "SUBCATEGORY"])
          writer.writerows(rows)

      print("Dataset created successfully!")

  • @mohammedvahid5099
    @mohammedvahid5099 · 3 months ago

    Nice explain bro thnk u

  • @shanezinno8077
    @shanezinno8077 · 3 months ago

    🎶 'Promo sm'

  • @mohammedvahid5099
    @mohammedvahid5099 · 3 months ago

    Ur video on tasks is excellent 👌 bro, also want to know how to use SCD Type 1, 2 and 3, thnk u

  • @mohammedvahid5099
    @mohammedvahid5099 · 3 months ago

    Thnk u bro... Bhai, will you make more ADF videos? I'm waiting for the ADF videos bro. Pls make the ADF series completely ❤

    • @dataworldsolution
      @dataworldsolution · 3 months ago

      Busy in some family functions bro, I will also prepare videos covering ADF completely, thanks bro

  • @user-ex2xe6es1w
    @user-ex2xe6es1w · 4 months ago

    i am not getting your point, i checked my wh as well, i use C:\Users\ANJI>snowsql -a UK82993.AWS_AP_SOUTHEAST_1 -u RAMA

  • @user-ex2xe6es1w
    @user-ex2xe6es1w · 4 months ago

    I am getting this type of error, could you please give me any suggestion!!!!
    250001 (n/a): Could not connect to Snowflake backend after 2 attempt(s). Aborting.
    If the error message is unclear, enable logging using -o log_level=DEBUG and see the log to find out the cause. Contact support for further help. Goodbye!

    • @dataworldsolution
      @dataworldsolution · 4 months ago

      Hi, which statement did you use? This looks like a connectivity issue; plz check that your data warehouse is activated

  • @ashok6644
    @ashok6644 · 4 months ago

    good one

  • @mohammedvahid5099
    @mohammedvahid5099 · 4 months ago

    Superb bro ur like master pro ❤

  • @mohammedvahid5099
    @mohammedvahid5099 · 4 months ago

    NICE APPROACH BRO THNK U

  • @mohammedvahid5099
    @mohammedvahid5099 · 4 months ago

    WELL EXPLAINED BRO THNK U

  • @astrohomedecor3516
    @astrohomedecor3516 · 4 months ago

    Nice sanu bhai

  • @mohammedvahid5099
    @mohammedvahid5099 · 4 months ago

    Amazing, pls work on more Azure too, and also SCD Type 1 and SCD Type 2 bro, after finishing streams and tasks.. and I want to see how u work on query optimization for query performance tuning, ur scenarios bro... thnks, ur work is amazing bro

  • @cherukurid0835
    @cherukurid0835 · 4 months ago

    Hi Bro, thanks for the videos. Which method is preferable to copy data from Azure storage to Snowflake tables: using the Snowflake copy command, or using the ADF tool?

    • @dataworldsolution
      @dataworldsolution · 4 months ago

      Hi, I am using ADF for automation and transformation with Databricks, but for ad hoc loads the copy command is best. So it depends on the type of work.

    • @cherukurid0835
      @cherukurid0835 · 4 months ago

      Bro, do we have to provide storage access permission for ADF to all the storage accounts we created?

  • @mohammedvahid5099
    @mohammedvahid5099 · 4 months ago

    please make more videos; i just created an azure account and adf also, ples post further videos on how to integrate with snowflake and migrating from adf to snowflake.. am waiting bro, thnk u so much for this video

  • @mohammedvahid5099
    @mohammedvahid5099 · 4 months ago

    Nice session bro thnk u, ples make videos on the validation_mode option of the copy into command, and also how to use it with azure blob in a full video bro ❤ using full load and incremental; also want to know more about merge on STREAMS for SCD Type 1 and SCD Type 2 also🎉❤❤❤

  • @vru5696
    @vru5696 · 5 months ago

    Suppose we have a varchar date col where the value is MM/DD/YYYY; how can we convert it to the date datatype while loading into the target in the select query? Any function?

    • @dataworldsolution
      @dataworldsolution · 5 months ago

      Bro, use the cast function, or to_date

    • @dataworldsolution
      @dataworldsolution · 5 months ago

      Thanks for asking

    • @vru5696
      @vru5696 · 5 months ago

      @@dataworldsolution not able to give the required format in cast. to_date is also not working - select to_date(datecol,'DD/MM/YYYY'). Can you help with code?
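For the question in this thread, the format string passed to TO_DATE has to match the stored text. Since the column holds MM/DD/YYYY (month first), a sketch (datecol is the column name from the comment; source_table is a placeholder):

```sql
-- Format string must match the stored layout: month, then day, then year
SELECT TO_DATE(datecol, 'MM/DD/YYYY') AS date_col
FROM source_table;
```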

  • @mohammedvahid5099
    @mohammedvahid5099 · 5 months ago

    Nice session bro, where can I get the source loan datasets 1 and 2 bro, can u tell me pls..

    • @dataworldsolution
      @dataworldsolution · 5 months ago

      I have a python script to create the dataset, I will provide it to you.

    • @dataworldsolution
      @dataworldsolution · 5 months ago

      use this python script to generate the dataset:
      # conda install Faker
      import csv
      from faker import Faker
      import random
      from datetime import datetime, timedelta

      fake = Faker()

      # Define loan statuses
      loan_statuses = ['PAIDOFF', 'COLLECTION', 'COLLECTION_PAIDOFF']
      # Define education levels
      education_levels = ['High School or Below', 'College', 'Bechalor', 'Master or Above']
      # Define genders
      genders = ['male', 'female']

      # Generate 4000 loan payment records
      num_records = 4000
      with open('loan_payment_dataset4.csv', mode='w', newline='') as file:
          writer = csv.writer(file)
          writer.writerow(["Loan_ID", "loan_status", "Principal", "terms",
                           "effective_date", "due_date", "paid_off_time",
                           "past_due_days", "age", "education", "Gender"])
          for i in range(num_records):
              loan_id = fake.uuid4()
              loan_status = random.choice(loan_statuses)
              principal = str(random.randint(500, 5000))
              terms = str(random.randint(7, 30)) + " days"
              effective_date = fake.date_time_between(start_date="-1y", end_date="now", tzinfo=None)
              due_date = effective_date + timedelta(days=int(terms.split()[0]))
              paid_off_time = ''
              past_due_days = ''
              if loan_status == 'PAIDOFF':
                  paid_off_time = fake.date_time_between(start_date=effective_date, end_date="now", tzinfo=None)
              elif loan_status == 'COLLECTION_PAIDOFF':
                  paid_off_time = fake.date_time_between(start_date=due_date, end_date="now", tzinfo=None)
                  past_due_days = str((paid_off_time - due_date).days)
              elif loan_status == 'COLLECTION':
                  past_due_days = str((datetime.now() - due_date).days)
              age = str(random.randint(18, 70))
              education = random.choice(education_levels)
              gender = random.choice(genders)
              writer.writerow([loan_id, loan_status, principal, terms, effective_date,
                               due_date, paid_off_time, past_due_days, age, education, gender])

      print("Dataset generation completed.")

  • @jaga8956
    @jaga8956 · 5 months ago

    Wow, learnt the sub query concept with a detailed explanation. Thanks BRO, it helped me a lot. Hope to view further videos.

  • @mohammedvahid5099
    @mohammedvahid5099 · 5 months ago

    Nice session bro❤

  • @mohammedvahid5099
    @mohammedvahid5099 · 5 months ago

    Excellent, explained in a simple manner, thnks bro. Also explain the ON_ERROR and VALIDATION_MODE options with the COPY command bro thnk u🎉 After this pls explain internal stages and use cases, with error handling also bro❤

  • @mohammedvahid5099
    @mohammedvahid5099 · 5 months ago

    Amazing video thnk u, it's very useful for hands-on exp bro🎉❤

  • @mohammedvahid5099
    @mohammedvahid5099 · 5 months ago

    Ur SQL essentials are simply excellent for intermediate level bro

  • @mohammedvahid5099
    @mohammedvahid5099 · 5 months ago

    Nice bro, also please explain snowflake table functions, to check on bulk loads and for snowpipes also bro...

  • @astrohomedecor3516
    @astrohomedecor3516 · 5 months ago

    Nice bro

  • @mohammedvahid5099
    @mohammedvahid5099 · 5 months ago

    Nice bro ❤

  • @mohammedvahid5099
    @mohammedvahid5099 · 6 months ago

    Amazing videos, pleas make more videos bro

  • @sisindrisatteti830
    @sisindrisatteti830 · 6 months ago

    Hi sir. Please continue making videos on these kinds of real-time use-case topics