Create event-based projects using S3, Lambda and SQS

  • Published Aug 31, 2022
  • The main purpose of this video is to simulate a small piece of an event-driven project, a pattern many companies use very frequently.
    Prerequisite:
    --------------------
    AWS SQS | AWS Simple Queue Service | How SQS Works | AWS Tutorial
    Create and Use an Amazon SQS Queue | Practical
    Fan out Architecture in AWS with SNS + SQS + Lambda + Python
    Lambda Code:
    ------------------
    import json

    def lambda_handler(event, context):
        # TODO implement
        print(event)
        try:
            # Each SQS record carries an S3 event notification as a JSON string in 'body'
            for i in event['Records']:
                s3_event = json.loads(i['body'])
                # S3 sends a one-time s3:TestEvent when the notification is first configured
                if 'Event' in s3_event and s3_event['Event'] == 's3:TestEvent':
                    print("Test Event")
                else:
                    for j in s3_event['Records']:
                        print("Bucket Name : {} ".format(j['s3']['bucket']['name']))
                        print("Object Name : {} ".format(j['s3']['object']['key']))
        except Exception as exception:
            print(exception)
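    To sanity-check the handler locally, here is a minimal sketch of the SQS-wrapped S3 event shape it parses (not from the video); the bucket and key are made-up placeholders, and it assumes lambda_handler above is defined in the same module:

    import json

    # Hypothetical payload: one SQS record whose 'body' is a JSON-encoded S3 event
    simulated_event = {
        "Records": [
            {
                "body": json.dumps({
                    "Records": [
                        {
                            "s3": {
                                "bucket": {"name": "my-demo-bucket"},
                                "object": {"key": "incoming/data.csv"}
                            }
                        }
                    ]
                })
            }
        ]
    }

    lambda_handler(simulated_event, None)  # prints the bucket and object names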
    SQS-s3 Access Policy:
    ----------------------------------
    {
        "Version": "2012-10-17",
        "Id": "Policy1662050523224",
        "Statement": [
            {
                "Sid": "Stmt1662050521697",
                "Effect": "Allow",
                "Principal": "*",
                "Action": "sqs:*",
                "Resource": "{SQS Queue ARN}",
                "Condition": {
                    "ArnEquals": {
                        "aws:SourceArn": "{s3 bucket ARN}"
                    }
                }
            }
        ]
    }
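    With this policy attached to the queue, the S3-to-SQS notification itself can be configured from the console (as shown in the video) or programmatically. A minimal boto3 sketch, with a placeholder bucket name and queue ARN:

    import boto3

    s3 = boto3.client('s3')

    # Placeholder values -- substitute your own bucket name and queue ARN
    s3.put_bucket_notification_configuration(
        Bucket='my-demo-bucket',
        NotificationConfiguration={
            'QueueConfigurations': [
                {
                    'QueueArn': 'arn:aws:sqs:us-east-1:123456789012:my-demo-queue',
                    'Events': ['s3:ObjectCreated:*']
                }
            ]
        }
    )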
    Check this playlist for more Data Engineering related videos:
    • Demystifying Data Engi...
    Snowflake Complete Course from scratch, with an end-to-end project and in-depth explanation --
    doc.clickup.com/37466271/d/h/...
    🙏🙏🙏🙏🙏🙏🙏🙏
    YOU JUST NEED TO DO
    3 THINGS to support my channel
    LIKE
    SHARE
    &
    SUBSCRIBE
    TO MY YouTube CHANNEL
  • Science & Technology

Comments • 26

  • @deepaksingh9318
    @deepaksingh9318 11 months ago +2

    A perfect explanation with an appropriate example and use case.
    So anyone who is new to the concept can easily understand, after watching the video:
    1. What it is
    2. Why it is used and what the need for it is
    3. And how to do it end to end.
    So a perfect video, I would say, covering everything.

    • @KnowledgeAmplifier1
      @KnowledgeAmplifier1  5 months ago

      Thank you so much for your positive feedback @deepaksingh9318! I'm glad to hear that the explanation resonated well with you, and you found it comprehensive and helpful.

  • @RajasthaniINAmerica
    @RajasthaniINAmerica 7 months ago +1

    Simple & straightforward.

  • @SK-gn3rs
    @SK-gn3rs 1 year ago +1

    Thanks for the code. I was struggling to read the event to extract the bucket name and the object... this made my life easy.

    • @KnowledgeAmplifier1
      @KnowledgeAmplifier1  1 year ago

      Glad to hear the video is helpful to you, S K! Happy Learning!

  • @mandarkulkarni9525
    @mandarkulkarni9525 1 year ago

    What is an efficient and cost-effective way of moving messages from an SQS queue to an S3 bucket?
    I have a Lambda function that processes messages from the SQS queue and deletes them once processing is done. I need to persist the SQS messages in S3 for compliance. Thank you.
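    One possible approach, not covered in the video: before deleting each message, have the Lambda archive the raw message body to S3 with put_object. A minimal sketch, where the bucket name and key prefix are hypothetical placeholders:

    import boto3

    s3 = boto3.client('s3')

    def archive_message(record, bucket='compliance-archive-bucket'):
        # Persist the raw SQS message body to S3, keyed by its message ID
        # (bucket name and key prefix are placeholders)
        s3.put_object(
            Bucket=bucket,
            Key='sqs-archive/{}.json'.format(record['messageId']),
            Body=record['body'].encode('utf-8')
        )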

  • @diptarghyachatterjee6018
    @diptarghyachatterjee6018 1 year ago

    Great explanation... Is there any way, instead of SQS, we can use AWS EventBridge to trigger the Lambda?
    2. Also, can you provide any Python or PySpark script through which we can load the CSV file into a Snowflake DB?

  • @harrior1
    @harrior1 1 year ago

    Thanks a lot!

  • @manubansal9197
    @manubansal9197 1 month ago

    Can you tell me whether all the things you performed and used are free to use? I mean, if I build the same integration of S3, SQS, and Lambda as yours, AWS would not apply a charge, right? And can you provide all the code and steps in a docx format?

  • @ravikreddy7470
    @ravikreddy7470 1 year ago +1

    Quick question: Don't we have to upload a deployment zip with the json package in it? How does Lambda install that library?

    • @KnowledgeAmplifier1
      @KnowledgeAmplifier1  1 year ago +1

      Hello Ravi K Reddy, json is available by default in the AWS Lambda execution environment, so there is no need for a deployment zip or Lambda layer to use json. You can find the list of modules available in the Lambda execution environment for different Python versions here -- gist.github.com/gene1wood/4a052f39490fae00e0c3 Happy Learning!

  • @likitan4076
    @likitan4076 4 months ago +2

    Also, without adding an SQS trigger to the Lambda, how did it detect the S3 file uploads from the SQS trigger, as seen in the CloudWatch logs?

    • @KnowledgeAmplifier1
      @KnowledgeAmplifier1  4 months ago

      @likitan4076, I added the trigger at 9:27.

    • @likitan4076
      @likitan4076 4 months ago +1

      @@KnowledgeAmplifier1 To the SQS you added a Lambda trigger... got it. I was thinking of adding an SQS trigger to the Lambda function.

  • @DineshKumar-bk5vv
    @DineshKumar-bk5vv 1 year ago

    Hello Sir, could you please make a video on integrating an application using Amazon SQS?

  • @kspremkumar4869
    @kspremkumar4869 1 year ago +1

    Hi. I have a few doubts on Kafka. Can you please explain?

    • @KnowledgeAmplifier1
      @KnowledgeAmplifier1  1 year ago

      Hello KS Prem Kumar, please share your doubt here. If I know that topic, I will surely try to help as much as possible.

  • @Polly10189
    @Polly10189 5 months ago +1

    Is it possible to somehow get the uploaded file's content in the SQS message as well?

    • @KnowledgeAmplifier1
      @KnowledgeAmplifier1  5 months ago +1

      SQS has a message size limitation, and it's recommended to keep messages as small as possible. Including the actual content of a large file in an SQS message could potentially lead to exceeding these limitations. Moreover, SQS is more efficient when used to transmit metadata or information necessary to trigger subsequent actions.

    • @Polly10189
      @Polly10189 5 months ago +1

      @@KnowledgeAmplifier1 Thanks for your reply. I need to get the actual data of the uploaded file; can we do this using any AWS service?

    • @KnowledgeAmplifier1
      @KnowledgeAmplifier1  5 months ago +1

      @@Polly10189 From the code explained in the video, you can get the S3 bucket name and key name; then you can use any Python module like boto3 or s3fs to read the data from S3 and perform various computations.
      For example, if you want to read CSV data from S3, here is the code --
      import csv
      import boto3

      # Hard-coded credentials are shown for illustration only; an IAM role is preferable in practice
      s3 = boto3.client(
          's3',
          aws_access_key_id='XYZACCESSKEY',
          aws_secret_access_key='XYZSECRETKEY',
          region_name='us-east-1'
      )
      obj = s3.get_object(Bucket='bucket-name', Key='myreadcsvfile.csv')
      data = obj['Body'].read().decode('utf-8').splitlines()
      records = csv.reader(data)
      headers = next(records)
      print('headers: %s' % (headers))
      for eachRecord in records:
          print(eachRecord)
      In this way, for different file formats, you can create the code and read from S3...

    • @Polly10189
      @Polly10189 4 months ago +1

      @@KnowledgeAmplifier1 I am reading the path of the file uploaded to S3. It's working, thanks!

    • @KnowledgeAmplifier1
      @KnowledgeAmplifier1  4 months ago

      @@Polly10189 Glad to hear this! Happy Learning

  • @DineshKumar-bk5vv
    @DineshKumar-bk5vv 1 year ago

    How do I reach out for more information... can I get contact details, please?