I recently transitioned from sysadmin work to devops, and your videos have been an amazing resource. They provide a lot of clarity when confronted with so many moving parts. Thank you!
Thanks Kirk!! Hope all is going well.
How the hell do you already have a video for every idea I have in the shower? You absolutely kill it, man! Best channel in the world!
This was just released! Thank you for this!
This guy is amazing, thank you!
The material and knowledge helped me a lot, better than Udemy, Pluralsight, etc. Super job!!! Thx
Thanks for the great videos, Ned!
Super informative video, can you please make more videos on Azure DevOps and Terraform?
Sure thing! This is the first of a series.
We have built-in Git repositories with YAML tasks in Azure DevOps now, similar to GitHub workflows, so I'm not seeing the point of this nowadays, but great video.
Hello Ned, great video and thanks for sharing! One quick question: I don't get which ADO feature you are referring to when you mention the automatic creation of a service principal that would be used by the task runner. Can you be more specific?
Great video, Ned. Do you have any more information on how you went about setting up Terraform Cloud, as in what needs to be created to hold the var? Is it a workspace?
Hey Ned,
Thanks for the awesome content as always!
I see that you configured your pipeline with an Azure-hosted DevOps agent. Yet, there is a known issue if the DevOps agent and the storage account are in the same region.
If your storage account has its firewall enabled and the DevOps agent is deployed in the same region, the pipeline job will fail because the storage account firewall will block it. An agent deployed in the same region as the storage account communicates via a private IP instead of a public one (the storage account firewall does not support private IP whitelisting).
Do you have any suggestions on how to implement a similar pipeline without self-hosting the DevOps agent?
Great ❤!! Do you have the same, but for ArgoCD instead of Azure DevOps?
I don't, but def thinking about it!
At 17:32 you mention running the "setup script" to create the initial project with pipeline in Azure DevOps. What setup script? What steps did you take to get the infrastructure in the "setup" directory deployed?
Did you get an answer to this?
@@dkleinm unfortunately not!
I am also struggling to configure this. I think we should buy a Udemy course to learn about the configuration process, because some configuration seems to be missing.
Error: Either an Access Key / SAS Token or the Resource Group for the Storage Account must be specified - or Azure AD Authentication must be enabled. Please help me get this to execute.
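For anyone hitting the same error: the azurerm backend needs credentials at init time, so the pipeline step has to expose one of the ARM_* settings the error lists. A minimal sketch of one way to do that, assuming a secret pipeline variable named sas_token (the variable name is just an example):

  - script: terraform init
    displayName: 'Terraform Init'
    env:
      ARM_SAS_TOKEN: $(sas_token)   # alternatively ARM_ACCESS_KEY, or ARM_USE_AZUREAD: true with service principal credentials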
Was the workspace created within Terraform Cloud a version control workflow or a CLI-driven workflow, please?
Hi @Ned, thanks for the great tutorial. I followed the exact same steps but somehow the VNets are never created. The pipeline runs but there is nothing in the plan. The pipeline path and everything else are exactly the same.
Can you list the prerequisites please? You utilise Terraform Cloud and the Terraform CLI! Can you list the steps? Thank you
Nice ❤
Would love to see this again but deploying AWS resources using azure pipelines
The process would be largely the same, except you would need to supply AWS credentials through environment variables. And you would need an AWS Terraform config you want to deploy.
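As a rough sketch of that, assuming the keys are stored as secret pipeline variables (the variable names here are examples), the apply step could look like this; the AWS provider picks up the standard AWS_* environment variables automatically:

  - script: |
      terraform init
      terraform apply -auto-approve
    displayName: 'Terraform Apply (AWS)'
    env:
      AWS_ACCESS_KEY_ID: $(aws_access_key_id)         # secret pipeline variable (example name)
      AWS_SECRET_ACCESS_KEY: $(aws_secret_access_key) # secret pipeline variable (example name)
      AWS_DEFAULT_REGION: us-east-1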
Hi, that's stupendous, Ned. I have a query: can we access variable groups in a Terraform file? I have a client ID, secret, tenant ID and so on, and I have added them to a variable group; I want to use those variables in the Terraform file. Is that possible? Waiting for your reply, thanks in advance.
You could pass them as input variables to the configuration, or if they are stored in environment variables you can use this provider: registry.terraform.io/providers/EppO/environment/latest
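A minimal sketch of the input-variable route, assuming a variable group named infra-secrets and Terraform variables named client_id, client_secret, and tenant_id (all example names); the TF_VAR_ prefix is how Terraform maps an environment variable onto a declared input variable:

  variables:
    - group: infra-secrets                       # example variable group holding the values

  steps:
    - script: terraform plan
      env:
        TF_VAR_client_id: $(client_id)           # read by: variable "client_id" {}
        TF_VAR_client_secret: $(client_secret)   # read by: variable "client_secret" {}
        TF_VAR_tenant_id: $(tenant_id)           # read by: variable "tenant_id" {}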
Can I have a link to the other parts of this pipeline where you used Azure Key Vault to store those secrets and variables?
Could you please help me in resolving this issue? I am getting an error in the Azure DevOps pipeline: "/providers/Microsoft.Web/sourceControls/GitHub" already exists. When I put terraform import in the additional commands column in the Azure DevOps pipeline, I got the error "Too many arguments. Need one positional argument". Could you please help?
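In case it helps anyone: terraform import has to run as its own command with exactly two arguments, the Terraform resource address and the provider resource ID; that kind of argument error often means the two values didn't reach the import command as separate arguments, or were appended to a different command such as plan. A hedged sketch in a pipeline script step, where the resource address is a hypothetical placeholder:

  - script: |
      terraform import azurerm_source_control_token.example "/providers/Microsoft.Web/sourceControls/GitHub"
    displayName: 'Import existing resource'   # the resource address above is only an example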
Solid, I hope it's with AWS rather than Azure resources
How have you only got 4.4k subscribers!
Ned, u rock! Could you go over Checkov in one of your episodes? There are a good number of videos out there, but your way is always unique :)
Definitely planning to incorporate Checkov. My next video is going to add Azure Key Vault to hold variable values.
Where is this ARM_SAS_TOKEN: $(sas_token) defined? I don't see it in your video. I am struggling to configure this project. I cloned it to my machine but am not able to run it. Is there any video on configuring this project on our own machine?
Can you tell me which folder on the Terraform Tuesday repository you're following? It's been a while and I have a few different versions on there.
@Ned.. can you make something on Terraform -> Azure -> GitHub integration [for using GitHub as code]?
Hi Ned, I would like to ask why you used both backends, I mean Terraform Cloud and Azure (storage account etc.)?
The 'setup' portion of the demo used Terraform Cloud. Once the pipeline executed (creating the VNet), the storage account would keep the state of that resource.
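To make that concrete, here is a rough sketch of the init step that points the VNet configuration at the storage-account backend (the resource names and the sas_token variable are placeholders, not the exact values from the demo):

  - script: |
      terraform init \
        -backend-config="resource_group_name=rg-tfstate" \
        -backend-config="storage_account_name=sttfstatedemo" \
        -backend-config="container_name=tfstate" \
        -backend-config="key=vnet.terraform.tfstate"
    displayName: 'Terraform Init (azurerm backend)'
    env:
      ARM_SAS_TOKEN: $(sas_token)   # placeholder secret variable granting access to the state container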
Thanks for the super informative video!
Can Azure Pipelines use Terraform to deploy infrastructure in other cloud vendors like Oracle? In that case, how will Azure Pipelines connect to OCI? By adding OCI credentials to Azure Pipelines variables? Is there any way other than that?
As far as I'm aware, yes. You just need to add the corresponding provider to the config file. I haven't tried that out.
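Roughly, the OCI credentials would sit in secret pipeline variables (or a variable group) and be handed to the configuration, for example through TF_VAR_ environment variables feeding declared Terraform variables that the OCI provider block reads. A sketch with placeholder names; the exact provider arguments may differ:

  - script: terraform apply -auto-approve
    env:
      TF_VAR_tenancy_ocid: $(oci_tenancy_ocid)   # placeholder secret variables feeding
      TF_VAR_user_ocid: $(oci_user_ocid)         # variable "..." blocks referenced by
      TF_VAR_fingerprint: $(oci_fingerprint)     # the oci provider configuration
      TF_VAR_private_key: $(oci_private_key)
      TF_VAR_region: us-ashburn-1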
Can you do videos on strategy-based YAML pipelines?
I'm not familiar with a strategy based pipeline. Could you elaborate?
@@NedintheCloud github.com/microsoft/azure-pipelines-yaml/blob/master/design/deployment-strategies.md
It'll give you a clear idea.
OK, seems that the Microsoft extension for Terraform didn't do the trick for me. Installed Charles Zipp's instead and boom.
Yeah, after some reflection I think I'd rather control the version of Terraform that's installed instead of using the one that comes with the runner VM.
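For reference, pinning the version with the installer task from that extension looks roughly like this (the task name and inputs may vary between extension versions):

  - task: TerraformInstaller@0
    displayName: 'Install Terraform'
    inputs:
      terraformVersion: '1.5.7'   # pin the exact CLI version instead of relying on the agent image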
I see no reason why you don't start with best practices, even if you want to show people the basic configurations! There's no reason to use environment variables! You can use a variable group and integrate it with your Azure Key Vault to get secrets or sensitive data.
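For readers who want to try that: once a variable group has been linked to a Key Vault under Pipelines > Library, referencing it from YAML is a one-liner. A sketch with example names:

  variables:
    - group: kv-secrets                       # example variable group linked to Azure Key Vault

  steps:
    - script: terraform plan
      env:
        ARM_CLIENT_SECRET: $(client-secret)   # secret pulled from Key Vault via the variable group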
What I've been doing for my Terraform variables that are tokens or otherwise sensitive is that I create a secrets.tfvars that is in the .gitignore, then I copy the file to secrets.tfvars.vault and encrypt it with ansible-vault, so it's stored as an encrypted file.
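A sketch of how that encrypted file could then be used from a pipeline step before Terraform runs, assuming a secret variable named vault_password holding the ansible-vault password (all names are examples):

  - script: |
      echo "$(vault_password)" > .vault_pass
      ansible-vault decrypt secrets.tfvars.vault --vault-password-file .vault_pass --output secrets.tfvars
      terraform plan -var-file=secrets.tfvars
      rm -f .vault_pass secrets.tfvars
    displayName: 'Decrypt secrets and plan'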