I am having a hard time understanding when Terraform CDK would create a workspace just from being passed a unique name.
I have a regular template with dynamic values. For each invocation I would like to create a new workspace at will. How would I be able to do that?
Also, I want all the variables to be stored in Terraform Cloud rather than hard-coded. My plan is to retrieve data such as API keys on each invocation and save it in the workspace itself. Is that possible?
Yes. I want to create variables on Terraform Cloud. I understand the TFE provider is the way to go, but how would we be able to use it with the CDK?
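For context, here is a minimal sketch (plain Python, no cdktf dependency) of the kind of Terraform JSON a CDKTF stack using the TFE provider would roughly synthesize: a `tfe_workspace` plus `tfe_variable` resources, so the variables live in Terraform Cloud instead of in the code. The exact attribute set is an assumption; check the TFE provider docs for the real schema.

```python
# Sketch of the tf.json fragment a CDKTF stack using the TFE provider
# might synthesize. Attribute names follow the tfe_workspace and
# tfe_variable resources; treat the exact fields as assumptions.

def synth_workspace_config(organization: str, workspace: str, variables: dict) -> dict:
    config = {
        "provider": {"tfe": [{}]},  # token typically supplied via TFE_TOKEN env var
        "resource": {
            "tfe_workspace": {
                workspace: {
                    "name": workspace,
                    "organization": organization,
                }
            },
            "tfe_variable": {},
        },
    }
    for key, value in variables.items():
        config["resource"]["tfe_variable"][f"{workspace}_{key}"] = {
            "key": key,
            "value": value,
            "category": "terraform",
            "sensitive": True,  # keep secrets out of plan output and state listings
            "workspace_id": f"${{tfe_workspace.{workspace}.id}}",
        }
    return config

cfg = synth_workspace_config("my-org", "project-a", {"api_key": "secret"})
```

In CDKTF itself this would be a `TerraformStack` instantiating the generated TFE provider constructs; the dict above only shows the shape of the output.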
I want to use the CDK as a reusable module that takes in configuration and creates workspaces on the fly based on the received params. @DanielMSchmidt, when do you think this on-the-fly workspace creation would be added as a feature to the CDK?
The issue is currently not in a milestone, so it’s hard to estimate. The project is open source and we already have code for creating a workspace, so I would guess the PR is not too big. If you like, I can guide you through the PR and we’ll get it done together.
Yes. I want to rely on TF Cloud to manage execution as well as secrets. I want to manage state and variables within TF Cloud itself, so if I need to change anything I can do it there directly. I am not 100% sure that is the way to go, but I want to make sure I don’t store the secrets in the JSON file.
What are your thoughts on this? The point is to write a generic module using the CDK that spits out a unique stack every time (updating the old one if a known name is passed) and serves as the source of truth. This stack would then be copied to some central directory or similar, and terraform plan and apply would run from there (since, as of now, I cannot rely on the current implementation, which deletes the old stack once a new synth is executed).
I want the CDK to behave as a generic module and run it for separate projects (with configuration saved per project). But synth deletes what I have already generated, which invalidates my use case.
@DanielMSchmidt I don’t think this is a CDK issue. The error occurs only when deploy is executed, so it needs to be handled by the Terraform CLI itself, since the CDK internally depends on the CLI for execution.
Is there any other way of achieving this? I could go the API route and create the workspace before the synth phase, then let everything else proceed as is.
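The API route could look roughly like the sketch below, which only builds the request for creating a workspace via the Terraform Cloud v2 API (JSON:API payload, bearer token). The actual HTTP call is left out; endpoint and payload shape should be verified against the TFC API docs.

```python
import json

TFC_API = "https://app.terraform.io/api/v2"

def workspace_request(organization: str, name: str, token: str):
    """Build the POST request for creating a Terraform Cloud workspace.

    Follows the TFC v2 workspaces endpoint and JSON:API body shape;
    sending it (e.g. with urllib or requests) is outside this sketch.
    """
    url = f"{TFC_API}/organizations/{organization}/workspaces"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/vnd.api+json",
    }
    body = {"data": {"type": "workspaces", "attributes": {"name": name}}}
    return url, headers, json.dumps(body)

url, headers, body = workspace_request("my-org", "project-a", "TOKEN")
```

Running this once per project before synth would give each invocation its workspace, independent of what the CDK supports today.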
I think an example repository on GitHub would help us get a common understanding of the exact workflow you’re trying to model. Also, are we talking about a few generated stacks, or thousands or even more?
It’s more like a PaaS that I am talking about. I would generate a configuration once per project, which internally creates a Cloud workspace, and manage all the configuration and variables through Terraform Cloud. So this module essentially serves as a one-time config generator (projectname → tf.json).
So I cannot rely on the normal CDK flow, which clears all the stacks that were generated before, right?
So I raised this as an issue (or maybe an enhancement) on GitHub: I want to preserve a stack once it is generated, and if there is a change I want, I could pass a param that ignores all previously generated stacks and only updates the one I passed, maybe something like --stackname=myproject or --override=false.
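Until something like that flag exists, the "copy to a central directory" workaround described above can be sketched like this (paths are assumptions; by default cdktf synthesizes stacks under `cdktf.out/stacks/<name>`): copy only the freshly synthesized stack into the central directory, leaving previously published stacks untouched.

```python
import shutil
from pathlib import Path

def publish_stack(synth_dir: str, central_dir: str, stack_name: str) -> Path:
    """Copy one synthesized stack into the central directory.

    Only the named stack is replaced; other stacks already present in
    central_dir are preserved, so a fresh synth (which clears its own
    output directory) cannot wipe older projects.
    """
    src = Path(synth_dir) / stack_name
    dest = Path(central_dir) / stack_name
    if dest.exists():
        shutil.rmtree(dest)  # update the known stack in place
    shutil.copytree(src, dest)
    return dest
```

terraform plan and apply would then run inside `central_dir/<stack_name>`, which stays stable across synth runs.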
When I think of using the CDK, I want to use it to the fullest: dynamically generate blocks of configuration, attach them to the main config, and pass a project name that generates a new overall stack and preserves it.
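That "generate a block and attach it to the main config" step can be sketched as a recursive merge over Terraform JSON fragments (illustrative only; the fragment shapes are assumptions, and in CDKTF itself this composition would happen through constructs rather than raw dicts):

```python
def attach_fragment(main_config: dict, fragment: dict) -> dict:
    """Deep-merge a generated Terraform JSON fragment into the main config.

    Nested dicts are merged recursively, so two fragments can both add
    entries under e.g. "resource" without clobbering each other.
    """
    for key, value in fragment.items():
        if isinstance(value, dict) and isinstance(main_config.get(key), dict):
            attach_fragment(main_config[key], value)
        else:
            main_config[key] = value
    return main_config

main = {"resource": {"tfe_workspace": {"a": {"name": "a"}}}}
attach_fragment(main, {"resource": {"tfe_workspace": {"b": {"name": "b"}}}})
```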