Comments •

  • @bryantTheFatBadger 3 years ago +1

    This is literally the best solution walk-thru I have watched on YT. Clear, instructive, and it actually works. You are a hero!

  • @MrElsocio 3 years ago

    This is pretty awesome. Thanks! We can also do the same with CRR or SRR within S3. But a very helpful video for understanding Lambda. Thanks again :).

  • @eminedogan3125 2 years ago

    Great video, Thank you for the clear explanation!

  • @multitaskprueba1 2 years ago

    Fantastic video! Thank you! You are a genius!

  • @renukasrivastava1167 5 months ago

    Thank you for such a simple and good explanation

  • @kalyanijagtap4448 3 years ago

    Great video sir, you have explained it very well

  • @sunnysandeep202 3 years ago +1

    Great article. It helped me a lot.

  • @kudaykumar1261 3 years ago

    Thank you so much sir ... it really works.

  • @nithinbhandari3075 2 years ago

    Nice Video.
    Thanks.

  • @dodokwak 3 years ago

    Thank you. I also use two buckets (destination and incoming) plus a Lambda function for resizing images.
    On the server side I use django-storages and its AWS_S3_CUSTOM_DOMAIN constant, which points to the bucket with resized images. The buckets and their objects have public access. Everything works almost well, but I've got a strange bug: a 404 error when trying to get an image for the first time, which turns into 200 OK after a refresh. Has anybody had the same issue?
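
A hedged note on the 404-then-200 symptom above: it often means the page requested the resized image before the resize Lambda finished writing it. One generic client-side mitigation is retry with exponential backoff; below is a minimal, library-free sketch, where `fetch` is a hypothetical stand-in for whatever call loads the image:

```python
import time

def fetch_with_retry(fetch, attempts=3, delay=0.5):
    """Call `fetch` until it succeeds or attempts run out.

    `fetch` is any zero-argument callable that returns a value on
    success and raises an exception (e.g. on a 404) on failure.
    """
    last_err = None
    for i in range(attempts):
        try:
            return fetch()
        except Exception as err:  # e.g. a 404 from the bucket
            last_err = err
            time.sleep(delay * (2 ** i))  # exponential backoff
    raise last_err

# Demo with a stand-in that fails once, then succeeds:
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("404 Not Found")
    return "200 OK"

print(fetch_with_retry(flaky, delay=0.1))  # → 200 OK
```

This only masks the race; the real fix is to serve the resized URL only after the Lambda has written the object.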

  • @PraveenKumar-ic5zo 1 year ago

    Nice Video.

  • @PawanKumar-gl4yw 1 year ago

    Great explanation. I would like to know: when Lambda copies data from the source bucket to the target bucket, where does it store the data? And if the data is, let's say, 1 TB, how would Lambda work?

  • @abelrozario2757 2 years ago

    Thank you 👍🏻, can we do a similar copy using a different AWS account for the input S3 bucket?
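
On the cross-account question above: in principle yes, as long as the Lambda's execution role in the destination account is granted read access on the source bucket. A hedged sketch of the source-account bucket policy (the account ID, role name, and bucket name are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowCrossAccountLambdaRead",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::111122223333:role/lambda-copy-role"
      },
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::source-bucket-name",
        "arn:aws:s3:::source-bucket-name/*"
      ]
    }
  ]
}
```

Note that the S3 event notification must also be allowed to invoke the Lambda across accounts (a resource-based permission on the function), so this policy alone may not be sufficient.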

  • @krishnamurali8522 3 years ago

    Super

  • @knandi73 2 years ago +1

    There is a CSV file on the local computer and it is uploaded to AWS S3.
    If any changes are made to that CSV file on the local computer, the changes should be reflected in AWS S3 automatically using an AWS Lambda function.
    What are the steps to achieve this?

  • @ramyahello 3 years ago

    Good video. Please upload a video on what you would do if you want to add a prefix and suffix filter, e.g. TXT files to one bucket and JPG files to another.
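
On the routing request above, one option is a single Lambda that picks the target bucket from the key's extension. The bucket names below are hypothetical; a minimal sketch of the routing logic:

```python
import os

# Hypothetical mapping from file extension to target bucket.
ROUTES = {
    ".txt": "my-text-bucket",
    ".jpg": "my-image-bucket",
}
DEFAULT_BUCKET = "my-misc-bucket"

def choose_target_bucket(object_key):
    """Return the target bucket for a key based on its extension."""
    ext = os.path.splitext(object_key)[1].lower()
    return ROUTES.get(ext, DEFAULT_BUCKET)

print(choose_target_bucket("reports/2021/notes.TXT"))  # → my-text-bucket
print(choose_target_bucket("photos/cat.jpg"))          # → my-image-bucket
```

Alternatively, S3 event notifications support suffix filters natively, so you could attach two triggers (one for `.txt`, one for `.jpg`) without any routing code in the function.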

  • @poppadoesitpropa 2 years ago

    Great demo, any chance there is an AWS Lambda to copy from S3 to FSx for Windows?

  • @gandheshiva8943 3 years ago

    thank you

  • @jatin_khera 3 years ago

    I have one doubt: if the bucket contains multiple objects and a file in one particular folder is overwritten, will it be reflected in the new bucket as well?

  • @tokunbokazeem1299 3 years ago

    Great video. Can you upload the text to Secrets Manager instead of another S3 bucket?

  • @AshokSharma-yv6mw 1 year ago

    Good tutorial. However, the first 15 lines of Python/Boto3 code for the Lambda trigger are not readable. Please share.

  • @balajikubendran9120 3 years ago +1

    Hi @Prabhakar,
    I need to unzip a zip file in the sub-bucket. Is it possible to extract the zip file into its sub-bucket? Can you please inform?

  • @sunnysandeep202 3 years ago

    Sir, I want to add data to and fetch data from PostgreSQL through C# using Lambda. Kindly help me here.

  • @AjishPrabhakar 9 months ago

    But this can easily fail when uploading large files, say over 500 GB, for example. The Lambda runtime execution timeout will be hit.
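
A hedged note on the size concern above: `copy_object` is a single server-side call capped at 5 GB per object, and Lambda's timeout caps total run time at 15 minutes. boto3's managed `copy` transfer switches to a server-side multipart copy above a configurable threshold, so the data never flows through the Lambda itself; a sketch (bucket names are placeholders, and very large jobs are still better served by replication or S3 Batch Operations):

```python
def copy_large_object(source_bucket, target_bucket, key,
                      threshold_mb=100, chunk_mb=100):
    """Server-side copy that switches to multipart above a threshold.

    boto3 is imported lazily so the pure helper below stays usable
    without AWS credentials.
    """
    import boto3
    from boto3.s3.transfer import TransferConfig

    config = TransferConfig(
        multipart_threshold=threshold_mb * 1024 * 1024,
        multipart_chunksize=chunk_mb * 1024 * 1024,
    )
    s3 = boto3.client("s3")
    # `copy` (unlike `copy_object`) issues UploadPartCopy calls under
    # the hood, so objects larger than 5 GB can be copied.
    s3.copy({"Bucket": source_bucket, "Key": key},
            target_bucket, key, Config=config)

def part_count(size_bytes, chunk_mb=100):
    """How many multipart chunks a copy of `size_bytes` needs."""
    chunk = chunk_mb * 1024 * 1024
    return max(1, -(-size_bytes // chunk))  # ceiling division

print(part_count(500 * 1024**3))  # a 500 GB object → 5120 parts
```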

  • @baluchittela3016 2 years ago

    Thanks for your clear explanation. I followed your steps as you said, but I am getting errors while running the Lambda function. Could you please help me ASAP?
    Error:
    {
      "errorMessage": "module 'urllib' has no attribute 'unquote_plus'",
      "errorType": "AttributeError",
      "requestId": "0cb2294e-a023-4ab2-8395-05f70689e10f",
      "stackTrace": [
        "  File \"/var/task/lambda_function.py\", line 17, in lambda_handler\n    object_key = urllib.unquote_plus(event['Records'][0]['s3']['object']['key'])\n"
      ]
    }

  • @Erwingupta1987 1 year ago

    Hi, that's great info and thanks for the tutorial. I have a question, and if it can be answered it would solve my problem. I have a custom app integrated with AWS EventBridge, and I want events to be targeted outside of AWS; the target we are using is Google Cloud Storage. Would a similar Python script solve my problem?

  • @shreyashmakadia8951 2 years ago

    Getting an error when creating the trigger: "Unable to validate the following destination configurations"

  • @Videos-rj1ek 2 years ago

    Can we see the log of this copy event (Lambda copying)? You put print statements; do they publish to CloudWatch?

  • @vishwarajgupta1963 3 years ago

    Hi Sir, do you teach as well? I am looking for Lambda coaching.

  • @franklinbulmez4989 1 year ago

    How can I copy only the file and not the whole prefix where it resides?
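
For the question above about dropping the prefix, one hedged approach: keep the full key when reading the source object, but write only its basename into the target. A minimal sketch of the key handling:

```python
import posixpath

def target_key(object_key):
    """Strip the S3 'folder' prefix, keeping only the file name.

    S3 keys always use '/' separators, so posixpath is used rather
    than os.path (which would break on Windows).
    """
    return posixpath.basename(object_key)

print(target_key("incoming/2021/reports/summary.csv"))  # → summary.csv
```

In the handler you would then call something like `s3.copy_object(Bucket=target_bucket, Key=target_key(object_key), CopySource=copy_source)`. Note the trade-off: two source keys with the same basename will overwrite each other in the target.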

  • @kudlamolka1429 2 years ago

    Is the source_bucket name obtained from the trigger?

  • @bikramchandradas4120 3 years ago

    Anybody help me to create a website.
    In the website dashboard I have to put AWS start/stop options,
    to let users manage their own VPS server.

  • @mohammadanas6755 1 year ago

    Sir, I want to transfer a file from one AWS S3 bucket to a different AWS S3 bucket using a bash script.

  • @dianaan2080 3 years ago

    I am getting KeyError: 'Records'. What to do?

  • @shivamgarg4958 1 year ago

    Can you create a Lambda function to compress images using Python?

  • @lakshmisharon5756 1 year ago

    I tried the code but it's not working for me, I don't know why. It's not getting copied to the target bucket.
    Can anyone help?

  • @satyamKumar-mr5gc 1 year ago

    How can I run this program for a long time?

  • @hardikmaghrola 2 years ago

    Create an AWS Lambda function to count the number of words in a text file. The general requirements are as follows:
    Use the AWS Management Console to develop a Lambda function in Python and to create its required resources.
    Report the word count in an email using an Amazon Simple Notification Service (SNS) topic. Optionally, also send the result in an SMS (text) message.
    Format the response message as follows:
    The word count in textFileName is nnn.
    Replace textFileName with the name of the file.
    Specify the email subject line as: Word Count Result
    Automatically trigger the function when the text file is uploaded to an Amazon S3 bucket.
    Test the function by uploading several text files with different word counts to the S3 bucket.
    Forward the email produced by one of your tests to your instructor along with a screenshot of your Lambda function.

  • @shivagyaneshwar1106 1 year ago +1

    {
      "errorMessage": "'Records'",
      "errorType": "KeyError",
      "stackTrace": [
        "  File \"/var/task/lambda_function.py\", line 17, in lambda_handler\n    source_bucket = event['Records'][0]['s3']['bucket']['name']\n"
      ]
    }
    Getting this error.

  • @shreerangaraju1013 2 years ago

    where's the event json for this?

  • @sunnysandeep202 3 years ago

    Can I delete an object from the destination bucket as soon as the object with the same name is deleted from the source bucket, using a Lambda function? If yes, how can I do that?

    • @technologyhub1503 3 years ago +2

      Yes, we can tweak the Lambda function code as per our requirement. We can delete an object or copy an object; we can use the copied object's data to insert into MySQL, PostgreSQL, or DynamoDB; and we can also use this data for an Alexa training data set, etc.
      If we want to delete an object from the destination bucket as soon as the source bucket object with that name is deleted:
      1. Apply the Lambda function on the source bucket with a DELETE event.
      2. As soon as you delete a file from the source bucket, first cross-verify whether the same object/file already exists in the destination bucket, then run a code snippet to delete the object from the destination bucket:
      s3.delete_object(Bucket=bucket, Key=destination_object_key)
      Please let me know if you need any help on the same.
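
The steps in the reply above can be sketched as a handler for the `s3:ObjectRemoved:*` event. The target bucket name is a placeholder, and boto3 is imported inside the handler so the key-parsing helper stays usable offline:

```python
import urllib.parse

TARGET_BUCKET = "techhub-output-data"  # placeholder name

def removed_key(event):
    """Extract the deleted object's key from an S3 delete event."""
    return urllib.parse.unquote_plus(
        event["Records"][0]["s3"]["object"]["key"])

def lambda_handler(event, context):
    import boto3
    s3 = boto3.client("s3")
    key = removed_key(event)
    try:
        # Cross-verify the object exists in the target before deleting.
        s3.head_object(Bucket=TARGET_BUCKET, Key=key)
    except s3.exceptions.ClientError:
        return "nothing to delete"
    s3.delete_object(Bucket=TARGET_BUCKET, Key=key)
    return "deleted " + key
```

This is a sketch under the assumption that the trigger is configured for delete events only; if the same function also handles uploads, it would need to branch on `eventName`.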

  • @syedahmadzada3166 1 year ago

    Your video is really helpful, but the code keeps giving me an issue on line 16.

  • @mejiger 2 years ago +1

    Nice one, but Python 2.7 is not supported on AWS anymore and the code is not working for me on Python 3+.

    • @carlosperal5163 1 year ago

      Same

    • @TonySpark-er2hj 9 months ago

      @@carlosperal5163
      from __future__ import print_function
      import boto3
      import urllib.parse

      """Code snippet for copying objects from the AWS source S3 bucket to the target S3 bucket as soon as objects are uploaded to the source S3 bucket.
      @author: Prabhakar G
      """

      print("*" * 80)
      print("Initializing..")
      print("*" * 80)

      s3 = boto3.client('s3')

      def lambda_handler(event, context):
          source_bucket = event['Records'][0]['s3']['bucket']['name']
          object_key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'])
          target_bucket = 'techhub-output-data-andy'
          copy_source = {'Bucket': source_bucket, 'Key': object_key}
          print("Source bucket : ", source_bucket)
          print("Target bucket : ", target_bucket)
          print("Log Stream name: ", context.log_stream_name)
          print("Log Group name: ", context.log_group_name)
          print("Request ID: ", context.aws_request_id)
          print("Mem. limits(MB): ", context.memory_limit_in_mb)
          try:
              print("Using waiter to wait for object to persist through the S3 service")
              waiter = s3.get_waiter('object_exists')
              waiter.wait(Bucket=source_bucket, Key=object_key)
              response = s3.copy_object(Bucket=target_bucket, Key=object_key, CopySource=copy_source)
              return response['CopyObjectResult']
          except Exception as err:
              print("Error - " + str(err))
              return err

      This works for me with the newer version of Python (well, 3.7 anyway :) cheers

  • @eladlevi47 2 years ago

    Does someone have a manual for the same procedure but with Python version 3.x?

    • @devashree8884 2 years ago

      import boto3
      import urllib.parse

      print("*" * 80)
      print("Initializing..")
      print("*" * 80)

      s3 = boto3.client('s3')

      def lambda_handler(event, context):
          source_bucket = event['Records'][0]['s3']['bucket']['name']
          object_key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'])
          target_bucket = 'name of your target bucket'
          copy_source = {'Bucket': source_bucket, 'Key': object_key}
          print("Source bucket : ", source_bucket)
          print("Target bucket : ", target_bucket)
          print("Log Stream name: ", context.log_stream_name)
          print("Log Group name: ", context.log_group_name)
          print("Request ID: ", context.aws_request_id)
          print("Mem. limits(MB): ", context.memory_limit_in_mb)
          try:
              print("Using waiter to wait for object to persist through the S3 service")
              waiter = s3.get_waiter('object_exists')
              waiter.wait(Bucket=source_bucket, Key=object_key)
              s3.copy_object(Bucket=target_bucket, Key=object_key, CopySource=copy_source)
              return 'Successfully copied files'
          except Exception as err:
              print("Error - " + str(err))
              return err

  • @abhishekroxz 2 years ago

    What if the data size is huge, say 10 TB? Can we transfer the entire data within 15 minutes?

    • @technologyhub1503 2 years ago

      The above Lambda example demonstrates Lambda's capabilities to perform operations on an S3 bucket.
      For huge data, around 10 TB+, we can perform the data transfer between buckets using one of the following options:
      1. Cross-region replication or same-region replication
      2. S3 Batch Operations
      3. S3DistCp with Amazon EMR
      4. AWS DataSync
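
For option 1 in the reply above, replication is configured on the bucket rather than in Lambda. A hedged sketch of building the configuration with boto3 (the ARNs and bucket names are placeholders, and versioning must already be enabled on both buckets):

```python
def replication_config(role_arn, destination_bucket_arn):
    """Build a minimal S3 replication configuration dict."""
    return {
        "Role": role_arn,
        "Rules": [{
            "ID": "copy-everything",
            "Status": "Enabled",
            "Priority": 1,
            "Filter": {},  # empty filter = replicate all objects
            "DeleteMarkerReplication": {"Status": "Disabled"},
            "Destination": {"Bucket": destination_bucket_arn},
        }],
    }

def enable_replication(source_bucket, role_arn, destination_bucket_arn):
    # boto3 imported lazily so the builder above is usable offline.
    import boto3
    s3 = boto3.client("s3")
    s3.put_bucket_replication(
        Bucket=source_bucket,
        ReplicationConfiguration=replication_config(
            role_arn, destination_bucket_arn))

cfg = replication_config(
    "arn:aws:iam::111122223333:role/replication-role",
    "arn:aws:s3:::target-bucket")
print(cfg["Rules"][0]["Status"])  # → Enabled
```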

  • @gridofmemories 3 years ago

    I followed the video, but on uploading to my source bucket my file is not copied to the target bucket.

    • @davidcloes9048 3 years ago

      My uploaded file was also not copying to the target bucket. I had inadvertently not attached the AmazonS3FullAccess policy to the role I had created. I only noticed because I had also neglected to add the AWSLambdaBasicExecutionRole policy, so monitoring wasn't working either. Attached them both and voila! The file was copied to the second bucket.

    • @sekmer009 3 years ago

      @@davidcloes9048 I followed all the steps, but it still didn't copy to the destination. Can you help please? Is there anything to set in permissions or enable on the S3 bucket? One more observation: I don't see an Enable checkbox while creating the trigger. Won't this work on an AWS basic user login?

    • @RaamVersion2O 3 years ago +1

      You should keep the runtime at Python 2.7; only then will you get it to work.

    • @dianaan2080 3 years ago +2

      I am getting KeyError: 'Records'

  • @whathowwhywhenandhere9168 3 years ago +3

    While running the above code I am getting this error, please can somebody help?
    Response:
    {
      "errorMessage": "'Records'",
      "errorType": "KeyError",
      "stackTrace": [
        "  File \"/var/task/lambda_function.py\", line 17, in lambda_handler\n    source_bucket = event['Records'][0]['s3']['bucket']['name']\n"
      ]
    }

    • @jaimearielchitaybautista6719 2 years ago

      I have the same error

    • @saipranav3153 2 years ago

      Upload a file to S3 and execute the function through the trigger.
      If I am not wrong, I guess you have used the Test button to execute the Lambda.
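
As the reply above suggests, the `KeyError: 'Records'` usually means the function was run from the console's default test event, which has no `Records` array. A hedged sample of a minimal S3 PUT test event that can be pasted into the Lambda test console (the bucket and key names are placeholders, and real events carry additional fields this sketch omits):

```json
{
  "Records": [
    {
      "eventSource": "aws:s3",
      "eventName": "ObjectCreated:Put",
      "s3": {
        "bucket": {"name": "my-source-bucket"},
        "object": {"key": "sample-file.txt"}
      }
    }
  ]
}
```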

    • @shaikfarheen8906 2 years ago

      I am getting the same thing... please give me a solution.

    • @smdmatheen2245 1 year ago

      Did anyone fix the error?

  • @divyanshjha7672 1 year ago

    It didn't work at all :(

  • @abhilashak1628 2 years ago

    Hi, the solution did not work for me. Can you help me? Can you share your mail id so that I can share the error details?