I'm trying to understand how to do parameter substitution when launching AWS Batch jobs. What I need to do is provide an S3 object key to my AWS Batch job. I haven't managed to find a Terraform example where parameters are passed to a Batch job, and I can't seem to get it to work.
The documentation for aws_batch_job_definition contains the following example:
resource "aws_batch_job_definition" "test" {
  name = "tf_test_batch_job_definition"
  type = "container"

  container_properties = <<CONTAINER_PROPERTIES
{
  "command": ["ls", "-la"],
  "image": "busybox",
  "memory": 1024,
  "vcpus": 1,
  "volumes": [
    {
      "host": {
        "sourcePath": "/tmp"
      },
      "name": "tmp"
    }
  ],
  "environment": [
    {"name": "VARNAME", "value": "VARVAL"}
  ],
  "mountPoints": [
    {
      "sourceVolume": "tmp",
      "containerPath": "/tmp",
      "readOnly": false
    }
  ],
  "ulimits": [
    {
      "hardLimit": 1024,
      "name": "nofile",
      "softLimit": 1024
    }
  ]
}
CONTAINER_PROPERTIES
}
Let's say that I would like VARNAME to be a parameter, so that when I launch the job through the AWS Batch API I can specify its value. How is this accomplished? According to the docs for the aws_batch_job_definition resource, there's an argument called parameters. However, this is a map and not a list, which is what I would have expected. What are the keys and values given in this map?
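For reference, here is a minimal sketch of what I gather the mechanism might be, based on the AWS Batch documentation on parameter substitution: the parameters map supplies default key/value pairs, and a Ref::name placeholder in the container command gets replaced with the value supplied at submission time. The object_key name below is my own placeholder, not anything from the docs:

```hcl
resource "aws_batch_job_definition" "with_params" {
  name = "tf_test_batch_job_definition_params"
  type = "container"

  # Default values; each key can be overridden when the job is submitted.
  # "object_key" is a hypothetical parameter name for this sketch.
  parameters = {
    object_key = "some/default/key"
  }

  container_properties = <<CONTAINER_PROPERTIES
{
  "command": ["echo", "Ref::object_key"],
  "image": "busybox",
  "memory": 1024,
  "vcpus": 1
}
CONTAINER_PROPERTIES
}
```

At launch time, I assume the value would then be overridden through the SubmitJob API, e.g. `aws batch submit-job ... --parameters object_key=my/real/key`. Is this the intended approach?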
question from:
https://stackoverflow.com/questions/65932741/terraform-aws-batch-job-definition-parameters-aws-batch-job-definition