Terrars is a tool for building Terraform stacks in Rust. It is an alternative to the CDK.
Current status: Usable, but may have some rough edges and missing features. I may continue to tweak things to improve ergonomics.
Why use this/CDK instead of raw Terraform?

Why use this instead of the CDK?

The CDK requires terraform, a cdk CLI, Javascript tools, Javascript package directories, and (depending on which language you use) that language itself as well. CDK generation requires a json spec -> typescript -> generated javascript -> final language translation process. Terrars only requires terraform, both during generation and at runtime, and goes directly from the JSON spec to Rust.

Why not use this instead of the CDK/raw Terraform?
Install pre-generated bindings such as terrars-andrewbaxter-stripe, or else generate your own (see Generation below).

Then develop your code. First, create a Stack and set up providers:
```rust
let mut stack = BuildStack {}.build();
BuildProviderStripe {
    token: STRIPE_TOKEN,
}.build(&mut stack);
```
The first provider instance for a provider type will be used by default for that provider's resources, so you don't need to bind it.
Then create resources:
```rust
let my_product = BuildProduct {
    name: "My Product".into(),
}.build(&mut stack);
let my_price = BuildPrice {
    ...
}.build(&mut stack);
my_price.set_product(my_product.id());
...
```
Finally, write the stack out:
```rust
stack.serialize("mystack.tf.json")?;
```
Call terraform on your stack as usual. (Stack also has run() and get_output() methods to call terraform for you; you must have terraform in your path.)
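If you'd rather drive terraform yourself instead of using Stack's helpers, the invocation is just the usual CLI. Here is a minimal sketch using only the standard library; the `deploy` directory name is an assumption, so adjust it to wherever you wrote the stack json:

```rust
use std::process::Command;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Assumes the stack json was serialized into the `deploy` directory
    // and that `terraform` is on the PATH.
    for args in [vec!["init"], vec!["apply", "-auto-approve"]] {
        let status = Command::new("terraform")
            .arg("-chdir=deploy")
            .args(&args)
            .status()?;
        if !status.success() {
            return Err(format!("terraform {:?} failed", args).into());
        }
    }
    Ok(())
}
```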
As an example, to use hashicorp/aws, create a json file (ex: terrars_aws.json) with the specification of what you want to generate:
```json
{
    "provider": "hashicorp/aws",
    "version": "4.48.0",
    "include": [
        "cognito_user_pool",
        "cognito_user_pool_client",
        "cognito_user_pool_domain",
        "cognito_user_pool_ui_customization",
        "route53_zone",
        "route53_record",
        "aws_acm_certificate",
        "aws_acm_certificate_validation"
    ],
    "dest": "src/bin/mydeploy/tfschema/aws"
}
```
tfschema/aws must be an otherwise unused directory - it will be wiped when you generate the code. If include is missing or empty, this will generate everything (alternatively, you can use exclude to blacklist resources/datasources). Resources and datasources don't include the provider prefix (aws_ in this example). Datasources start with data_.
Make sure you have terraform in your PATH. Run cargo install terrars, then terrars-generate terrars_aws.json.
The first time you do this, create a src/bin/mydeploy/tfschema/mod.rs file with the following contents to root the generated provider:

```rust
pub mod aws;
```
There are Build* structs containing required parameters and a build method for most schema items (resources, stack, variables, outputs, etc). The build method registers the item in the Stack if applicable. Optional parameters can be set on the value returned from build.
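As a sketch of that pattern, continuing the Stripe example from above (set_description here is a hypothetical optional setter; the actual setters depend on the generated provider schema):

```rust
// Required parameters go in the Build* struct and are fixed at build time...
let my_product = BuildProduct {
    name: "My Product".into(),
}.build(&mut stack);

// ...while optional parameters are set on the value build() returns.
// (set_description is hypothetical, for illustration only.)
my_product.set_description("Product managed by Terrars");
```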
In Terraform, all fields regardless of type can be assigned a string template expression for values computed during stack application. Since all strings can potentially be templates, non-template strings must be escaped to avoid accidental interpolation.
When defining resources and calling methods, String and &str will be treated as non-template strings and appropriately escaped. To avoid the escaping, you can produce a PrimExpr object via stack.str_expr (to produce an expr that evaluates to a string) or stack.expr for other expression types. To produce the expression itself you can use format!() as usual, but note: you must call .raw() on any PrimExprs you use in the new expression to avoid double-unescaping issues.
If Terraform gives you an error about something with the text _TERRARS_SENTINEL*, it means you probably missed a .raw() call on that value.
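As a rough sketch of how these pieces compose, again using the Stripe example (whether str_expr takes a String, and whether name() and set_nickname exist on these generated types, are assumptions; the format!() plus .raw() pattern is the point):

```rust
// Build a new expression that references another resource's attribute.
// .raw() yields the underlying expression text so it isn't escaped again
// when embedded in the new expression.
// (str_expr's argument type, name(), and set_nickname are assumptions here.)
let nickname = stack.str_expr(format!("upper({})", my_product.name().raw()));
my_price.set_nickname(nickname);
```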
Lists, sets, and record references have a .map method which takes care of all the different "for" methods in Terraform. Specifically:

- .map and define a resource: does resource-level for-each (per Terraform limitations, this cannot be done on lists derived from other resources, so it has very limited use)
- .map and define a block element: does block-level for-each
- .map and return an attribute reference: produces an attribute "for" expression

.map always produces a list reference, but this can be assigned to set fields as well. .map_rec is similar to .map but results in a record.
Terraform provides a way to output provider schemas as json. This tool uses that schema to generate structures that output matching json Terraform stack files.
Take as an example:

```rust
format!("{}{}", my_expr, verbatim_string)
```

This code would somehow need to escape the pattern and verbatim_string while leaving my_expr unescaped, and the result would need to be treated as an "expression" to prevent escaping if it's used again in another format! or similar. This applies not just to format! but also to serde serialization (json) and other methods.
For now, Terrars uses a simple (somewhat dirty) hack to avoid this. All expressions are put into a replacement table, and a sentinel string (ex: _TERRARS_SENTINEL_99_) is used in their place. During final stack json serialization, the strings are escaped and then the original expression text is substituted back in, replacing the sentinel text. This way, all normal string formatting methods should retain the expected expressions.
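To make the idea concrete, here is a small self-contained sketch of the same technique. This is only an illustration of the approach described above, not Terrars' actual implementation; names like ExprTable are made up.

```rust
use std::collections::HashMap;

// Illustration only: a table mapping sentinel strings to raw expression text.
struct ExprTable {
    next_id: usize,
    exprs: HashMap<String, String>,
}

impl ExprTable {
    fn new() -> Self {
        ExprTable { next_id: 0, exprs: HashMap::new() }
    }

    // Register an expression and hand back a sentinel that can be passed
    // through format!(), serde, etc. like any ordinary string.
    fn add(&mut self, expr: &str) -> String {
        let sentinel = format!("_TERRARS_SENTINEL_{}_", self.next_id);
        self.next_id += 1;
        self.exprs.insert(sentinel.clone(), expr.to_string());
        sentinel
    }

    // At serialization time, after literal text has been escaped, splice the
    // original expression text back in over the sentinels.
    fn finalize(&self, escaped: &str) -> String {
        let mut out = escaped.to_string();
        for (sentinel, expr) in &self.exprs {
            out = out.replace(sentinel, expr);
        }
        out
    }
}

fn main() {
    let mut table = ExprTable::new();
    let id_expr = table.add("${aws_s3_bucket.mybucket.id}");
    // The sentinel rides through ordinary string formatting as plain text.
    let value = format!("name-{}", id_expr);
    // Substituting sentinels back restores the real expression in the output.
    println!("{}", table.finalize(&value));
}
```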
Current limitations:

- Not all Terraform features have been implemented. The only one I'm aware of missing at the moment is resource "provisioning".
- ignore_changes takes strings rather than an enum.
- No variable or output static type checking. I'd like to add a derive macro for generating variables/outputs automatically from a structure at some point.
- Non-local deployment methods. I think this is easy, but I haven't looked into it yet.
I originally called this terrarust, but then I realized it sounded like "terrorist", so I decided to play it safe and went with terrars.