Why I often use AWS Lambda and serverless architecture

  • Published Apr 20, 2023
  • 📘 T3 Stack Tutorial: 1017897100294.gumroad.com/l/j...
    🤖 SaaS I'm Building: www.icongeneratorai.com/
    💬 Discord: / discord
    🔔 Newsletter: newsletter.webdevcody.com/
    📁 GitHub: github.com/webdevcody
    📺 Twitch: / webdevcody
    🤖 Website: webdevcody.com
    🐦 Twitter: / webdevcody

Comments • 62

  • @vytasgavelis · 2 months ago +1

    Would be keen to see a complete production setup that you are running.
    There are so many ways to deploy lambdas: SST, CDK on its own, Terraform, etc. All of these have different drawbacks, especially when it comes to the local development experience, but it's hard to find resources on the internet about what real companies are actually doing.

  • @w9914420 · 1 year ago +4

    Hi Cody, many thanks for your insights into Lambda. Would be cool to see an example of how you would use Puppeteer to generate PDFs from HTML (with CSS styling) 😊

  • @SeibertSwirl · 1 year ago +1

    Good job babe!!!! I’m finally first again! But most of all I’m so proud of you and all this work you’ve done ❤

    • @WebDevCody · 1 year ago +1

      Awww thanks babe!

  • @habong17359 · 1 year ago +2

    If you get a chance, can you walk through your CI/CD for Lambda functions? Great video btw!

  • @INKILU · 1 year ago +2

    Enjoying the AWS vids 👍🏼

  • @baracka448 · 1 year ago +3

    8:30 Lambda deployed as a Docker image supports a max of 10 GB, so that would help you with the 250 MB limit.

  • @B1TCH35K1LL3R · 1 year ago +5

    Serverless works fine when there are budget limitations, but another good approach might be to have your Express/Nest API web server as a container and deploy it using an orchestration service such as ECS or, even better, Kubernetes. However, as I mentioned, that won't apply to every project (especially if there are budget restrictions).

  • @TwenTV · 1 year ago +4

    You can also get around the 250 MB constraint by wrapping your lambda in a Docker image and building your lambda from ECR instead. It requires a bit more on the development side with versioning, but it allows up to 10 GB of dependencies.

    • @WebDevCody · 1 year ago

      Any clue if that slows down cold starts?

    • @constponf7403 · 1 year ago

      @@WebDevCody Not too much, since AWS provides specific Docker base images. I am currently deploying every lambda as a Docker image because it simplifies the deployment process.

    • @uome2k7 · 1 year ago +1

      Why would you want to do that? Lambdas should be short-running (there's a time limit on how long they can run) and stateless. Anything that needs more than 250 MB throws up a lot of red flags, because it sounds like it would be doing way too much. Wrapping them in a Docker image means you have to add the Docker registry lookup, download that image and all those dependencies, and then start up that image, even if it's Docker running inside Docker.
      Being able to run a lambda inside Docker for local testing makes sense, but I think keeping it within the base restrictions should be the goal.
      Remember you are paying for compute cycles and memory space on AWS, so you want to be as small and as fast as you can get.

    • @TwenTV · 1 year ago

      @@uome2k7 It's 250 MB for dependencies. There are many cases where you have a quick and useful lambda, but the library requirements just surpass the 250 MB allowed in layers :)

    • @WebDevCody · 1 year ago +4

      @@uome2k7 There are many use cases; for example, I need a lambda to generate a PDF for my users. To generate a PDF you need a bunch of binaries, which use a LOT of space. Generating the PDF takes maybe 1 second, but a Chromium binary can be like 80 MB by itself, not to mention all your required node_modules. I'd also say using Docker as a way to standardize your lambdas is actually very useful. I've wasted hours debugging issues that work locally but fail on Lambda because of incorrect setup of binary paths, etc.
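
      A minimal sketch of what such a PDF lambda can look like, assuming the puppeteer-core and @sparticuz/chromium packages (not confirmed as the exact stack used here); the handler shape and HTML payload are illustrative:

      ```typescript
      import chromium from "@sparticuz/chromium";
      import puppeteer from "puppeteer-core";

      // Renders the supplied HTML to a PDF with headless Chromium and returns it base64-encoded.
      export const handler = async (event: { html?: string }) => {
        const browser = await puppeteer.launch({
          args: chromium.args,                              // flags tuned for the Lambda sandbox
          defaultViewport: chromium.defaultViewport,
          executablePath: await chromium.executablePath(),  // path to the bundled Chromium binary
          headless: chromium.headless,
        });

        try {
          const page = await browser.newPage();
          await page.setContent(event.html ?? "<h1>Hello PDF</h1>", { waitUntil: "networkidle0" });
          const pdf = await page.pdf({ format: "A4", printBackground: true });

          return {
            statusCode: 200,
            headers: { "Content-Type": "application/pdf" },
            isBase64Encoded: true,
            body: Buffer.from(pdf).toString("base64"),
          };
        } finally {
          await browser.close(); // release the browser so the warm container stays healthy
        }
      };
      ```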

  • @xorlop · 1 year ago +1

    I am very interested to hear your thoughts on CF Workers. You can even create them programmatically on an enterprise plan. CF Workers also have environments, so you can do a test deploy, and there is built-in monitoring now too. I have some basic experience with Lambda, but not much. The CF edge also doesn't have cold starts, I think (I could be very wrong about this!). Idk about limits, but I do know there are ways to change them to be unbound, I think.
    I only use it for work, so idk about pricing.

    • @WebDevCody · 1 year ago +1

      I've never worked with CF Workers; if I ever do work with them, maybe I'll make a video.

  • @driden1987 · 1 year ago +3

    I've been using Serverless Stack (SST) lately; it's pretty awesome.

    • @male3399 · 4 months ago

      Do you use Lambda with SST?

    • @driden1987 · 3 months ago

      @@male3399 Yes

  • @jatinhemnani1029 · 1 year ago

    We can get an API URL without API Gateway, right? Directly using the function URL.
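
    Right: a function URL gives the lambda an HTTPS endpoint with no API Gateway in front, and it receives the API Gateway HTTP API 2.0 payload format. A minimal sketch of such a handler (the response fields are illustrative):

    ```typescript
    import type { APIGatewayProxyEventV2, APIGatewayProxyResultV2 } from "aws-lambda";

    // Invoked directly through the function URL; no API Gateway resources involved.
    export const handler = async (
      event: APIGatewayProxyEventV2
    ): Promise<APIGatewayProxyResultV2> => ({
      statusCode: 200,
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ path: event.rawPath, message: "served via a function URL" }),
    });
    ```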

  • @saman6199 · 1 year ago

    Hey Cody, would it be possible to make a very small Express app and run it on Lambda, to show us how we could configure it? It would be appreciated.

  • @Pyrospower · 1 year ago

    thanks for the interesting video!

  • @corygrewohl8180 · 1 year ago +1

    So one thing I've wondered for a while: does lambda/serverless architecture replace a normal backend built in Express, Django, etc.? I'm beginning to work on a web portal for an app startup idea, and I'm generally just confused about what the best way to build a back end is. In the past I've used Lambda for a lot, but I'm wondering if building an Express API is better, or if companies generally use both. I guess I'm just confused about where each comes into play. Thanks for the help!

    • @WebDevCody · 1 year ago

      I typically have a single Express app that I wrap and deploy to a single lambda function. There is a library called aws-serverless-express you can use to wrap your Express app and deploy it to a lambda; it's super easy to use. You can also use existing tools such as the Serverless Framework to host an Express app directly on Lambda. Honestly, deploying a Django app to Heroku or some other container host works perfectly fine; I'm just a big fan of only paying for what you use. Some people deploy a separate lambda for each endpoint, but that causes deployments to take forever.
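
      A minimal sketch of that wrapping pattern, using the aws-serverless-express library named above (the route and setup are illustrative, not the exact production code):

      ```typescript
      import express from "express";
      import awsServerlessExpress from "aws-serverless-express";
      import type { APIGatewayProxyEvent, Context } from "aws-lambda";

      const app = express();
      app.get("/hello", (_req, res) => res.json({ message: "hello from a single lambda" }));

      // Created once per container, so warm invocations reuse the same server instance.
      const server = awsServerlessExpress.createServer(app);

      // API Gateway proxies every route to this one handler.
      export const handler = (event: APIGatewayProxyEvent, context: Context) =>
        awsServerlessExpress.proxy(server, event, context);
      ```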

    • @corygrewohl8180 · 1 year ago

      @@WebDevCody Oh, that's actually really cool that you can wrap it and still reap the benefits of Lambda. I guess I'm just wondering what the benefit of deploying it to a server is, then.

  • @rustystrings0908 · 1 year ago +2

    Can you show your setup for how you deploy this stuff to Lambda specifically? Do you use the AWS SDK, or are you writing shell scripts to do it?

    • @WebDevCody · 1 year ago +3

      Usually I'd use the Serverless Framework. I can try making a video on it if I get time.

    • @chris94kennedy · 1 year ago

      @@WebDevCody I'd also like to see that, Cody. Thanks as always.

  • @dandogamer · 1 year ago +4

    I've worked with serverless and AWS for 3 years and never had any fun developing on it. Debugging was always painful, the documentation is poor, and you end up having to buy into a ton of other services just to have a functioning yet complicated system. Not to mention the developer experience sucks: issues always crop up on AWS but not locally, or you accidentally forget a permission and have to wait 10+ minutes each time for the build to upload.
    If you're a small company that needs to move fast, I would stop and consider whether it's worth slowing down your team for the sake of penny-pinching.
    On the other hand, if you are a larger team, I would probably adopt a platform engineering approach to help your developers, so they don't have to worry about all the intricacies.

    • @WebDevCody · 1 year ago

      I agree with everything you said 😂 AWS can turn into a convoluted nightmare.

  • @Xmasparol · 1 year ago +1

    Somehow Lambda containerization is good. I get pip issues with psycopg2 in Python, and layers don't work. I googled and tried ChatGPT, neither helped, and I got to the point of calling AWS support.

  • @johndebord7802 · 1 year ago +1

    Is it best practice to just have one lambda function for your application? Or one lambda function per DynamoDB? Or multiple lambda functions for multiple REST-like requests? Kind of confused about this.

    • @WebDevCody · 1 year ago +1

      I typically do one lambda for my entire API. You can deploy each lambda separately, but it'll take a while to deploy.

    • @johndebord7802 · 1 year ago

      @@WebDevCody I suppose having one lambda would be best because it would minimize the possibility of cold starts as well. But I'm almost sure that cold starts will be a relic of the past someday.

    • @WebDevCody · 1 year ago +1

      @@johndebord7802 Yeah, there's a much higher chance stuff will be warm, but it does mean every endpoint will be configured exactly the same. So if one endpoint requires more memory, you need to increase it for all endpoints; if one endpoint needs a higher timeout, you need to increase it for all. It's just trade-offs.

  • @cringelord511 · 1 year ago

    Are they similar to Azure Functions?

  • @digitnomad · 1 year ago

    Hi Cody, when AppSync alone can do the whole job, why bother with API Gateway + Lambda?

    • @WebDevCody · 1 year ago

      I don't use GraphQL.

  • @roach_iam · 1 year ago

    Curious, what did you guys decide to do about the PDF issue?

    • @WebDevCody · 1 year ago

      We managed to get Puppeteer working on lambdas by making sure the deployed .zip had as little as possible in it. If we hit limits again, I think we'll need to move to running Docker containers on Lambda.

  • 1 year ago

    What about when you have your lambda backend connected to a database and there is a spike in the number of users (the number of concurrent lambda executions), but your DB has a maximum number of connections? How do you handle that?

    • @uome2k7 · 1 year ago

      You have to scale everything based on expected demand, ideally with some extra margin, but none of these should have unlimited growth allowances, because nobody has unlimited pockets.

    • @WebDevCody · 1 year ago

      You'd either need to scale or configure your DB to accept more concurrent connections, or you need to wrap your database behind a connection-pooling service/gateway that limits how many connections can be made, and your lambdas invoke it via REST. Lambdas do stay warm, so you can potentially keep a database connection open between requests, as long as you put the connection outside the scope of the handler, at the global/module level. But like Joe mentioned, there is no silver bullet; you need to re-analyze your system when you hit certain levels of scale. If you choose to just use PlanetScale from the start, you'll more than likely get that scaling in the future, for a price tag.
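
      A small sketch of that "connection outside the handler" idea, assuming a Postgres database and the pg package (the table, env var, and event shape are illustrative):

      ```typescript
      import { Pool } from "pg";

      // Module scope: created once per Lambda container and reused across warm invocations,
      // instead of opening a fresh database connection on every request.
      const pool = new Pool({
        connectionString: process.env.DATABASE_URL, // illustrative env var
        max: 1, // keep per-container connections low so high concurrency doesn't exhaust the DB
      });

      export const handler = async (event: { userId: string }) => {
        const { rows } = await pool.query("SELECT * FROM users WHERE id = $1", [event.userId]);
        return { statusCode: 200, body: JSON.stringify(rows[0] ?? null) };
      };
      ```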

    • @yuhanna_kapali · 1 year ago

      Using RDS Proxy to connect the lambda will help with the number of connections.

    • @moodyhamoudi · 9 months ago

      HTTP- or WebSocket-based connection models support serverless by design. There's no reason to try to finagle a stateful connection model to work with a stateless compute platform; you'd only be band-aiding a gunshot wound, and you will hit a wall if you intend on going past ~100 concurrent users. Firestore is probably the most well-established solution in the category, but there are a few reasonable providers for both SQL and NoSQL, like AWS Aurora, PlanetScale (not actually stateless, but it can handle a ton of connections), DynamoDB, and MongoDB Atlas (only if you're using the new Data API), to name a few.
      Sorry for venting; this exact issue nearly ended me.

  • @michaelscofield2469 · 1 year ago

    Please make a project series.

    • @WebDevCody · 1 year ago +4

      Starting tomorrow

  • @Harish-rz4gv · 1 year ago

    When do you start the project series?

  • @devippo · 1 year ago

    I think there are many devs who love maintaining and fiddling with AWS.
    To be honest, I just want the app to work for the client ASAP.

  • @andriisukhariev · 1 year ago

    Cool thanks

  • @illiakhomenko6405 · 8 months ago

    Ligma-lambda is now forever in my mind 😂😂😂😂

    • @WebDevCody · 8 months ago

      🤣

  • @Grahamaan27 · 5 months ago

    OK, but the competition to serverless is not EC2, it's ECS or containers. Why compare against VMs?

  • @freshhorizonswithjakub · 1 year ago +1

    Roll it to 69, right? I see what you did there.

    • @WebDevCody · 1 year ago

      By accident, but it works out

  • @rohangodha6725 · 1 year ago +1

    leaked env vars unlucky 💀

    • @WebDevCody · 1 year ago +1

      Nothing on that db

  • @Chris-se3nc · 1 year ago +2

    What a sticky mess and a terrible local dev experience. I'll stick to Kubernetes: multi-cloud out of the box.

  • @Grahamaan27 · 5 months ago

    None of the benefits I heard are missing from ECS on AWS: logging, auto scaling, and metrics are all supported out of the box. Lambda has so much overhead and duplicated effort; if you like being inefficient but stupidly simple, I see the appeal. But if you're running a production enterprise environment, I can't imagine you wouldn't want to save money and execution time by using autoscaled container services.