The last DevOpsified project was done on GitHub Actions, and this time it is Jenkins. However, we have end-to-end setup videos for GitLab CI, Azure Pipelines, and AWS CodePipeline on the channel as well.
So if you want to replace the Jenkins part in the video with, for example, GitHub Actions, you can follow the DevOpsified 1 project.
I hope this project gives you an end-to-end overview. All the best ❤️
Hi bro, please do a video on disaster recovery for an EKS cluster. I am using Velero as the backup tool for my production-grade cluster; please do a video on this one!
Thank you for this wonderful piece of work. Kudos!! Well done
Please, where can I find the user data shell script in the GitHub repo that Aman used in provisioning the EC2 Jenkins server?
@@festusebin It's in the repo itself: tools-install.sh
github.com/AmanPathak-DevOps/End-to-End-Kubernetes-Three-Tier-DevSecOps-Project/blob/master/Jenkins-Server-TF/tools-install.sh
@@AbhishekVeeramalla Yes, I got it. Let's focus on disaster recovery tools like Velero and other DR tools 😉😉
Successfully completed the end-to-end project, and learned a lot through the troubleshooting process. Big thanks to Abhi and Aman for the amazing work!
Can you please tell me how much you spent to complete this project?
"This series is pure gold! 🎉 The way you've broken down the DevOps implementation on a MERN stack is incredibly detailed and easy to follow. As someone with experience in AWS, Kubernetes, and Docker, I appreciate how you've covered the end-to-end process. The integration of CI/CD, monitoring, and infrastructure as code is spot on! Can't wait to implement some of these strategies in my next project. Thank you so much, Abhishek & Amen! Keep up the fantastic work! 🚀"
I want to implement this, but I only know Docker. What should I learn, and which resources should I follow, before starting this to understand everything properly?
00:02 Learn how to implement end-to-end DevOps on a MERN stack application
02:58 Complete DevOps Implementation on MERN Stack
09:15 DevOps implementation using Terraform, Jenkins, Argo CD, and MERN stack
12:20 MERN Stack simplifies web application development.
18:02 Setting up Jenkins server using Terraform on MERN stack application
20:09 Setting up Jenkins and SonarQube on an EC2 instance with specific configurations
24:19 Using session manager for AWS Instance connectivity
26:33 Complete setup of DevOps tools including Jenkins, Sonar, Terraform, AWS CLI
31:15 Storing and managing credentials in Jenkins for AWS.
33:31 Installing Terraform plugin and configuring global settings for tools.
37:51 Deploy EKS cluster using Jenkins file
39:45 DevOps implementation stages on MERN stack
43:50 Running Terraform plan for creating necessary resources
45:59 Creation and importance of a jump server in the VPC for private EKS clusters
49:55 Setting up tools like AWS CLI, kubectl, Helm, and eksctl for DevOps automation
52:06 Setting up environment and configuration using Terraform and Jenkins
56:10 Configure Argo CD and Application Load Balancer for DevOps implementation
58:06 Configuring Prometheus and Kubernetes
1:02:41 Troubleshooting connection issues with the VPC architecture
1:04:54 Setting up load balancer for EKS cluster
1:09:36 Implementing service account creation using CloudFormation
1:11:32 Configuring AWS Load Balancer Controller for EKS
1:15:41 Configuring ALB and EC2 permissions for DevOps project
1:17:35 Configuring and exposing Argo CD with load balancer
1:23:13 DevOps implementation on MERN Stack application
1:25:32 Setting up SonarQube and Webhook in Jenkins
1:29:53 Setting up back end project and configuring Jenkins credentials
1:32:07 Creation of private repositories for front end and back end Docker images on ECR
1:36:21 Installing and configuring necessary tools for Docker and Node.js.
1:38:32 Installation of Node.js, SonarQube, Dependency Check, and Docker
1:42:42 Setting up DevOps pipeline for MERN Stack application
1:44:43 Implementing quality check and dependency check in the CI/CD pipeline
1:48:39 Implementing image scanning and updating tags in DevOps workflow
1:50:43 Configuring username and email for tracking changes
1:55:08 Troubleshooting missing personal access token for GitHub repository update
1:57:13 Successfully updated deployment image to version three
2:02:05 Setting up repository for Helm usage
2:04:04 Configuring Deployment, Service, and Persistence in DevOps Project
2:08:31 Setting up database, backend, and services dependencies
2:10:36 Setting up front end for the application on MERN stack
2:14:54 Mapping local domain to Route 53 domain
2:16:55 Provisioning and monitoring load balancers
2:21:14 Configuring Prometheus and Grafana using Helm repository
2:23:46 Changing service type from ClusterIP to LoadBalancer
2:28:48 Setting up access to Grafana dashboard
2:30:57 Import and utilize pre-configured dashboards in Prometheus
If I don't have any knowledge of Terraform and Jenkins, should I still proceed with this? I was thinking of learning them while doing this project...
Really, thank you so much for this, Abhishek. Much needed, especially along with EKS, Terraform, Prometheus, and Grafana. Hats off to you, bro.
I'm incredibly grateful for your YouTube channel. I've gained a lot of knowledge about technology through your videos, and I must say that your explanations are even better than those of some DevOps instructors on Udemy.
I wanted to extend my sincere gratitude for your outstanding DevOps implementation series on the MERN stack application. Your comprehensive walkthrough, covering everything from infrastructure setup to deployment, was incredibly insightful and well-organized.
The way you broke down complex concepts and demonstrated the end-to-end process was invaluable for someone like me, looking to deepen my understanding of DevOps in a practical setting. Your content has made a significant impact on my learning journey.
Thank you for sharing your knowledge and expertise with the community, @Abhishek & Aman.
Thanks a lot ❤️
Thank you, Abhishek and Aman, for your hard work in recording this end-to-end deployment video and for providing valuable insights to the DevOps community. Your efforts are helping others to learn and grow. Happy learning, and much appreciation to both of you!
Thank you both for the work you have done. This is an entire DevOps course in one video; I learned a lot of things and will review it a couple more times.
The DevOpsified videos are a really great idea... they give the real end-to-end picture in one go.
Thank you for all the hard work and effort you put into creating such valuable content. You are making a significant difference in the lives of many DevOps engineers and aspirants in gaining practical knowledge, and I am genuinely grateful for your contributions to our DevOps community. Thanks a lot 😊
My pleasure!
Thanks!! Please continue doing devopsified videos from time to time! This section is gold! :)
Abhishek, it's really helpful. Actually, I'm waiting for your videos, which are very informative and provide fruitful knowledge. Thank you to both Aman and Abhishek.
Abhishek, we waited for you. Thank you for your hard work.
Most welcome 😊
@@AbhishekVeeramalla We missed your explanation; the way you explain is better compared to his.
Thanks for providing me an assignment for the weekend. I will get this implemented
Excellent!
@@AbhishekVeeramalla Could you give the location of the Jenkinsfile that creates the infrastructure? We only see the front-end Jenkinsfile and the back-end Jenkinsfile. Please check it once!
It's great, thank you. I will make sure to implement this project; I believe it could add a lot of value to my resume. Thanks a lot for your efforts in creating such a valuable project for learners 🙏
Abhishek, thanks for all your videos.
You are a gem, bro. Loved this MERN (MongoDB, Express, React, Node) application and the npm build tooling. Brilliant, Abhishek bhai; no one else has covered a Node.js application.
Amazing, Abhi and Aman! I will definitely try to implement this!!
Great, thanks 🤝 Abhishek and Aman for your effort in sharing this. It is on my to-do list now 🎉💫👍
Thanks for sharing your knowledge with the devops community
Thank you so much @AbhishekVeeramalla and @AmanPathak, this is the BEST END-TO-END SETUP of the CICD project 👌✊🙏
I have 6 accounts
I opened 6 accounts and liked and subscribed to your channel so you can reach a bigger audience.
What quality of presentation and explanation!
Thank you so much for saving our money and helping the community 💓
Wow, thanks
You are God's gift to us ❤
Eagerly waited, Abhishek. Thank you!
Thank you for sharing your knowledge with the community.
Thanks Abhishek.
I tried doing this lab and ran into some issues when trying to run the Jenkins job for the first time.
You need to create the S3 bucket and DynamoDB table manually.
This is going to be interesting to debug and run the pipeline.
Edit: Deployed all things working fine 🙂
Same issue; can you tell me how to resolve that problem?
@@LuffyMonkey-sy5qt You can comment out the DynamoDB lock table configuration.
If the region is different, then change it in the dev.tfvars file as well.
How do you do the domain part for the Ingress? Did you buy a domain?
@HellCRICKET Same error; can you elaborate on how to do it?
thanks @HellCRICKET
Many thanks for this wonderful project
You are the boss of DevOps
Hello Ganesh Anna... I am also from Udupi... where in Udupi?
Hello. I am from Manipal.
Thanks Abhi. Will watch the video ASAP.
One of the greatest videos 🙏🙏
This is a very good video, Abhishek. I have a few questions:
1) Why are the images scanned with Trivy after uploading to ECR and not before uploading?
2) If I have 5 EKS clusters and I need to set up centralized monitoring with Prometheus, how do we configure a centralized Prometheus?
Can you please help me with these questions?
It was a great practical exercise for building hands-on experience. It would have been great to also show how to delete the AWS resources after creation 😊
Use Terraform destroy
Awesome guys🎉🎉🎉
Great session!!!!!
Thank you, Abhishek sir ❤
With respect to locking the state file: did you create the DynamoDB table and S3 bucket separately before running the Terraform script?
you da best < 3
Awesome ❤❤❤
thanks we need more videos
Hello Abhishek Sir & @Amanpathak! First of all, I have never seen this kind of amazing project on YouTube. I am grateful and can't thank both of you enough; I extend my sincere gratitude to both of you 🙏🙏🙏... I have a doubt: in the monitoring setup part, while deploying Prometheus, my prometheus-alertmanager and prometheus-server pods are not coming up. I found that the PVCs are not getting bound. Can you help me with that?
Remember to create your own S3 bucket and DynamoDB table if you fork his GitHub repo.
Thanks
Sir, you didn't monitor the application.
Thanks for the video .
Can you please consider creating one more session in which you do all the Jenkins parts with Terraform + Helm and configure the Jenkins plugins, jobs, etc. as code?
Thanks
Thanks, both, for the wonderful demo. Just a thought: would it be better to use an AWS IAM role for the jump box rather than storing credentials on the box? From a security point of view, storing credentials is not best practice, I think.
Start using CloudShell instead of using jump host.
There is no mention of a jump server in the blog post; the method in the blog post is different from the method followed in the video.
Maybe I am missing something, but using Session Manager requires opening port 443 outbound; did anyone encounter this issue? Secondly, the user data didn't run and I was unable to install the tools on the server. Lastly, the blog didn't align with this video at all. I stand to be corrected.
Hi Abhishek, thanks for this video.
Do you have an estimate of the cost of this project, assuming I delete all resources immediately afterwards? Usually LBs and other resources come with a cost, if I'm not wrong.
If you got to know the estimated cost, please let me know.
Awesome !!!!!!!
Thank you for your endless efforts. Abhishek, how can we secure Jenkins, given that we store all the secrets like Docker, Sonar, AWS, ECR, and so on? Of course HTTPS/SSL is there, but how is Jenkins configured in a real production environment to be secure, scalable, and highly available?
Thank you Aman Pathak and Abhishek sir......
Hi Abhishek thank you for this excellent content.
I'm requesting you, please make an end-to-end GitLab CI/CD project.
A GitLab CI project is already available. The rest, i.e., the CD part, monitoring, and everything else, is covered in this video ☺️
Can we use the same stages, the same YAML files, and the same Docker configuration for our own product development as well, on the MERN stack?
At around 42 min we only copy the Terraform code from the Jenkinsfile and run it via a Jenkins job... do we need to copy the other Terraform code somewhere? Might be a silly question, but at the moment it's a bit confusing to me.
Hey bro, if I don't have any knowledge of Terraform and Jenkins, should I still proceed with this? I was thinking of learning them while doing this project...
Thanks a lot, Bhaiya, for this gold. Can you please help us with a course on pipeline as code, through which we can learn how to work in a real production environment? Most people are aware of the structure of a pipeline, but they don't know what to write in the stages, so we want an in-depth course on the same. Waiting for your response.
Thanks a lot for the help.
Many thanks, Abhishek bro & Aman. I was able to implement it as you showed in the video; it took me around 10 hrs.
take a bow !
Amazing. Great Dedication
how much did it cost bro?
nice work bro! inspiring
I have some errors. Would you help me?
Which is the best tool to draw an architecture diagram? Which tool was used to draw this one?
@2:14:54 Mapping local domain to Route 53 domain: if someone doesn't have a domain, do we write localhost in the file instead?
Bro, please make a video on a multi-branch pipeline setup for microservices.
Ok soon
I will try and see this for myself
Here Aman Pathak is looking more famous than Phalguni Pathak.
Great 👍
Hi Abhishek, Great stuff..
What is the name of the domain provider that Aman used?
Thanks anna🎉❤
Hi Abhishek, I am stuck at the beginning: my EC2 IP is not working on port 8080. I think the problem is in the SG; I allowed SSH, HTTP, and HTTPS. Please help me.
Thank you for the video. Where can I get the user data for the Jenkins server? It's not in the attached links.
Check the GitHub repository.
@@AbhishekVeeramalla I could not find the user data for the Jenkins server; could you please provide it, sir?
Good throw nice
A question to all: shouldn't we see 2 ALBs, one for the Argo CD service and one for the AWS ALB Ingress Controller? Thanks
my GOAT!!!!!!!! 🐐🐐🐐
Thank you, Aman and Abhishek, for the exceptional project. Here, setting up secrets for the various services was mind-boggling to me, i.e., which service each secret token was created from and which service it was copied into; if you could perhaps create a separate video on that, it would be awesome.
Also, in this project, if the Ingress Controller automatically creates a load balancer, why was there a need to create an AWS Application Load Balancer?
Tokens are created for authentication; here most are copied into Jenkins so it can pull the source code (GitHub) and the Docker image (ECR, for which we added the AWS creds/account details), etc. The application load balancer was created for Argo CD.
@@aliakber786 Thanks, Ali, for the token clarification. In the DevOpsified 1 video, Argo CD was used and the Ingress Controller automatically took care of the load balancer (I think the EKS cluster used an NLB under the hood, automatically). In this video, implementing the load balancer was more of a manual approach, which bothers me: why the manual approach... :)
@@faisalraj6654 I think he created a LoadBalancer service for Argo CD even in the DevOpsified 1 video. Look at the Argo CD files in the repo.
@AbhishekVeeramalla/Aman The aws-load-balancer-controller in Kubernetes creates what type of external load balancer in AWS, an ALB or an NLB, and how does it decide which to create? Please answer this; it would be of great help...
ALB
@@Karthickcoach Why an ALB? What is the reason?
Thanks sir ❤🙏
Hello Abhishek, first of all thanks for providing this kind of advanced content. But when I was trying to do this project,
I faced an error when adding a task to my to-do list and was unable to do it.
Sir, what is the point of using a single-node Jenkins architecture and a two-worker-node EKS cluster in the above demo? I mean, is there any significance?
Does he use the same keys that we used for aws configure in Jenkins and the other steps?
What is the reason for the two load balancers?
Hi bro..
Can you please do a video on creating a multi-master-node Kubernetes cluster with kubeadm & kops?
Can you please create a project where we also promote an application from dev to staging and prod using CI/CD? That would give us a complete picture, not just building and deploying to dev but also promoting it all the way to production.
It's a humble request; please consider it, as this is going to help in interviews.
Is this for beginners, or for those who already know DevOps?
Hi @AbhishekVeeramalla Sir. Thanks for this great project. I've been trying to connect to the EC2 instance using Session Manager, but I keep receiving the error below and was not able to resolve it. Could you please help me with this?
"SSM Agent is not online
The SSM Agent was unable to connect to a Systems Manager endpoint to register itself with the service."
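A minimal sketch to narrow this down, assuming boto3 is installed and your local AWS credentials can call SSM; the region and instance ID below are placeholders. If the instance never shows up here, the usual causes are a missing IAM instance profile with the AmazonSSMManagedInstanceCore policy or no outbound HTTPS (443) path from the subnet to the SSM endpoints.

```python
# Check whether the EC2 instance has registered with Systems Manager.
import boto3

ssm = boto3.client("ssm", region_name="us-east-1")  # placeholder region

resp = ssm.describe_instance_information(
    Filters=[{"Key": "InstanceIds", "Values": ["i-0123456789abcdef0"]}]  # placeholder ID
)

if resp["InstanceInformationList"]:
    info = resp["InstanceInformationList"][0]
    print("Ping status:", info["PingStatus"])      # "Online" means Session Manager should work
    print("Agent version:", info["AgentVersion"])
else:
    # Not registered: check the instance profile (AmazonSSMManagedInstanceCore)
    # and the outbound 443 route to the SSM endpoints.
    print("Instance is not registered with SSM.")
```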
So this is basically hosting everything on one single huge server?
Why not make it like microservices?
Can we have application monitoring with K8s too, implementing OpenTelemetry and Jaeger?
I understand everything except the database part: how do you connect a database to this project, for example a private AWS RDS Postgres or MySQL instance?
Thanks for the video Abhishek!
Are the below steps missing from the video? I am following the video, but I faced an issue when I reached the step of running the Jenkins build for the first time; I got this error:
Error : Error refreshing state: Unable to access object "eks/terraform.tfstate" in S3 bucket "my-ews-baket1": operation error S3: HeadObject, https response error StatusCode: 403
Where exactly do we have to do this step of setting up the backend S3 bucket and DynamoDB table?
Create the bucket manually if it is not there, and also create the DynamoDB table.
@@traveldiaries1999 thanks
Abhi sir, in all your DevOps and AWS videos you never used (optional) user data when launching EC2, but Aman used it. Why?
Great
Thank you, very helpful.
Having an error while running Jenkins for the first time: "Error refreshing state: Unable to access object "eks/terraform.tfstate" in S3 bucket "my-ews-baket1": operation error S3: HeadObject, https response error StatusCode: 403". Any solution for it?
Did you give all the permissions to the AWS user whose credentials you are using? Also, I think we have to create the S3 bucket manually.
It would be very nice if we had a community, like a Discord, to discuss things and meet like-minded people to connect and collaborate with on building projects. Please consider this suggestion.
He does have one on Slack.
Could you please start a series on Google Cloud also?
Do we have to configure the SonarQube checks or test cases ourselves? Like, how is it able to detect the code smells?
Bro, could you please also do a Flutter application, with backend APIs and a data dump into an S3 bucket, please.....
Will this infra project work on a free-tier AWS account, or do we need a paid AWS account?
Hey Abhishek, how difficult is the F5 Networks coding interview?
Aman is using one EC2 instance for Jenkins and all the integrated CI tools, and another as a jump server... what if we use the first EC2 instance as the jump server as well?
Moreover, on the first EC2 instance we set up a security group, but for the jump server EC2 Aman said no security group is needed. Why, sir?
Hi Abhishek, my question might seem silly, but how much does it cost to implement this in AWS?
If you keep the entire project live for 3 days, the cost will be around 50 USD. I have done the same.
Why don't we always follow best practices, like creating a least-privileged user, when writing Dockerfiles? Is it okay to keep Dockerfiles simple instead?
There are videos on the channel that cover that as well. You can definitely try to include multi-stage builds and distroless images.
@@AbhishekVeeramalla Yeah, I personally use these steps to write Dockerfiles. But I wanted to know how Dockerfiles are written at the industry level. Do they write simple Dockerfiles, or do they follow best practices like least-privileged users and multi-stage builds?
Yes, multi-stage.
Create the S3 bucket and the DynamoDB table, with LockID as the key for the table.
Thanks
Error refreshing state: Unable to access object "eks/terraform.tfstate" in S3 bucket "my-ews-baket1": operation error S3: HeadObject, https response error StatusCode: 403.
You have to create an S3 bucket and a DynamoDB table and replace their values in the file eks/backend.tf.
This error occurs because the script couldn't find the S3 backend bucket, and you can't reuse the bucket name "my-ews-bucket", since bucket names must be globally unique.
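A minimal sketch of that one-time backend setup using boto3, assuming your credentials can create the bucket and table; the bucket and table names below are placeholders and must match whatever eks/backend.tf references (the lock table key is LockID, as noted above).

```python
# One-time setup: S3 bucket for Terraform state + DynamoDB table for state locking.
import boto3

REGION = "us-east-1"                        # placeholder; match backend.tf
BUCKET = "my-unique-tf-state-bucket-12345"  # placeholder; S3 names are globally unique
TABLE = "terraform-lock"                    # placeholder; match backend.tf

s3 = boto3.client("s3", region_name=REGION)
if REGION == "us-east-1":
    s3.create_bucket(Bucket=BUCKET)         # us-east-1 takes no LocationConstraint
else:
    s3.create_bucket(
        Bucket=BUCKET,
        CreateBucketConfiguration={"LocationConstraint": REGION},
    )
s3.put_bucket_versioning(                   # versioning keeps a history of the state file
    Bucket=BUCKET,
    VersioningConfiguration={"Status": "Enabled"},
)

dynamodb = boto3.client("dynamodb", region_name=REGION)
dynamodb.create_table(                      # Terraform's S3 backend expects a "LockID" string key
    TableName=TABLE,
    AttributeDefinitions=[{"AttributeName": "LockID", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "LockID", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
)
```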