Update website/docs/language/modules (#230)

Signed-off-by: Marcin Białoń <mbialon@spacelift.io>
This commit is contained in:
Marcin Białoń 2023-08-30 13:06:23 +02:00 committed by GitHub
parent fb8d960e1d
commit 1a3eee582f
10 changed files with 221 additions and 403 deletions


@@ -7,8 +7,8 @@ description: |-
# Module Composition
In a simple OpenTF configuration with only one root module, we create a
flat set of resources and use OpenTF's expression syntax to describe the
relationships between these resources:
```hcl
@@ -57,7 +57,7 @@ can therefore connect the same modules in different ways to produce different
results.
The rest of this page discusses some more specific composition patterns that
may be useful when describing larger systems with OpenTF.
## Dependency Inversion
@@ -114,7 +114,7 @@ exists and create it if not, we recommend applying the dependency inversion
approach: making the module accept the object it needs as an argument, via
an input variable.
For example, consider a situation where an OpenTF module deploys compute
instances based on a disk image, and in some environments there is a
specialized disk image available while other environments share a common
base disk image. Rather than having the module itself handle both of these
@@ -126,7 +126,7 @@ of the `aws_ami` resource type and data source schemas:
variable "ami" {
type = object({
# Declare an object using only the subset of attributes the module
# needs. OpenTF will allow any object that has at least these
# attributes.
id = string
architecture = string
@@ -172,9 +172,9 @@ module "example" {
}
```
This is consistent with OpenTF's declarative style: rather than creating
modules with complex conditional branches, we directly describe what
should already exist and what we want OpenTF to manage itself.
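For example, a calling configuration in a shared environment might satisfy the module's `ami` contract with a data source lookup, while a specialized environment could pass in an image resource it manages itself. A minimal sketch, assuming a hypothetical `./modules/compute` module and illustrative filter values:

```hcl
# Shared environments: look up the common base image and pass it in.
data "aws_ami" "base" {
  owners      = ["self"]
  most_recent = true

  filter {
    name   = "name"
    values = ["base-image-*"]
  }
}

module "example" {
  source = "./modules/compute"

  # Any object with at least the attributes the module declared
  # (here "id" and "architecture") is accepted.
  ami = data.aws_ami.base
}
```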
By following this pattern, we can be explicit about in which situations we
expect the AMI to already be present and which we don't. A future reader
@@ -193,7 +193,7 @@ Every module has implicit assumptions and guarantees that define what data it ex
- **Assumption:** A condition that must be true in order for the configuration of a particular resource to be usable. For example, an `aws_instance` configuration can have the assumption that the given AMI will always be configured for the `x86_64` CPU architecture.
- **Guarantee:** A characteristic or behavior of an object that the rest of the configuration should be able to rely on. For example, an `aws_instance` configuration can have the guarantee that an EC2 instance will be running in a network that assigns it a private DNS record.
We recommend using [custom conditions](/opentf/language/expressions/custom-conditions) to help capture and test for assumptions and guarantees. This helps future maintainers understand the configuration design and intent. Custom conditions also return useful information about errors earlier and in context, helping consumers more easily diagnose issues in their configurations.
The following example creates a precondition that checks whether the EC2 instance has an encrypted root volume.
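A sketch of such a precondition, assuming the root volume is looked up through an `aws_ebs_volume` data source (the resource and data source names here are illustrative):

```hcl
data "aws_ebs_volume" "root" {
  filter {
    name   = "volume-id"
    values = [aws_instance.example.root_block_device[0].volume_id]
  }
}

resource "aws_instance" "example" {
  ami           = "ami-abc123"
  instance_type = "t3.micro"

  lifecycle {
    precondition {
      condition     = data.aws_ebs_volume.root.encrypted
      error_message = "The server's root volume must be encrypted."
    }
  }
}
```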
@@ -211,12 +211,12 @@ output "api_base_url" {
## Multi-cloud Abstractions
OpenTF itself intentionally does not attempt to abstract over similar
services offered by different vendors, because we want to expose the full
functionality of each offering, and unifying multiple offerings behind a
single interface tends to require a "lowest common denominator" approach.
However, through composition of modules it is possible to create
your own lightweight multi-cloud abstractions by making your own tradeoffs
about which platform features are important to you.
@@ -275,7 +275,7 @@ replace the `dns_records` module with a new implementation targeting that
provider, and all of the configuration that _produces_ the recordset
definitions can remain unchanged.
We can create lightweight abstractions like these by defining OpenTF object
types representing the concepts involved and then using these object types
for module input variables. In this case, all of our "DNS records"
implementations would have the following variable declared:
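One plausible shape for that shared variable; the exact attribute set is an assumption, but the point is that every implementation agrees on a single object type:

```hcl
variable "recordsets" {
  type = list(object({
    name    = string
    type    = string
    ttl     = number
    records = set(string)
  }))
}
```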
@@ -331,7 +331,7 @@ Most modules contain `resource` blocks and thus describe infrastructure to be
created and managed. It may sometimes be useful to write modules that do not
describe any new infrastructure at all, but merely retrieve information about
existing infrastructure that was created elsewhere using
[data sources](/opentf/language/data-sources).
As with conventional modules, we suggest using this technique only when the
module raises the level of abstraction in some way, in this case by
@@ -367,7 +367,7 @@ data sources, or it could read saved information from a Consul cluster using
[`consul_keys`](https://registry.terraform.io/providers/hashicorp/consul/latest/docs/data-sources/keys),
or it might read the outputs directly from the state of the configuration that
manages the network using
[`terraform_remote_state`](/opentf/language/state/remote-state-data).
The key benefit of this approach is that the source of this information can
change over time without updating every configuration that depends on it.
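A minimal sketch of such a data-only module, assuming the shared network is discoverable by tags (all names are illustrative):

```hcl
# modules/join-network/main.tf: no resources, only lookups and outputs.
variable "environment" {
  type = string
}

data "aws_vpc" "environment" {
  tags = {
    Environment = var.environment
  }
}

output "vpc_id" {
  value = data.aws_vpc.environment.id
}
```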


@@ -7,51 +7,49 @@ description: >-
# Creating Modules
A _module_ is a container for multiple resources that are used together.
You can use modules to create lightweight abstractions, so that you can
describe your infrastructure in terms of its architecture, rather than
directly in terms of physical objects.
The `.tf` files in your working directory when you run [`opentf plan`](/opentf/cli/commands/plan)
or [`opentf apply`](/opentf/cli/commands/apply) together form the _root_
module. That module may [call other modules](/opentf/language/modules/syntax#calling-a-child-module)
and connect them together by passing output values from one to input values
of another.
To learn how to _use_ modules, see [the Modules configuration section](/opentf/language/modules).
This section is about _creating_ re-usable modules that other configurations
can include using `module` blocks.
## Module structure
Re-usable modules are defined using all of the same
[configuration language](/opentf/language) concepts we use in root modules.
Most commonly, modules use:
* [Input variables](/opentf/language/values/variables) to accept values from
the calling module.
* [Output values](/opentf/language/values/outputs) to return results to the
calling module, which it can then use to populate arguments elsewhere.
* [Resources](/opentf/language/resources) to define one or more
infrastructure objects that the module will manage.
To define a module, create a new directory for it and place one or more `.tf`
files inside just as you would do for a root module. OpenTF can load modules
either from local relative paths or from remote repositories; if a module will
be re-used by lots of configurations you may wish to place it in its own
version control repository.
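A minimal module directory might contain just one file combining all three of the elements listed above (the path and names are illustrative):

```hcl
# ./modules/bucket/main.tf

variable "name" {
  type        = string
  description = "Name of the S3 bucket this module manages."
}

resource "aws_s3_bucket" "this" {
  bucket = var.name
}

output "arn" {
  value = aws_s3_bucket.this.arn
}
```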
Modules can also call other modules using a `module` block, but we recommend
keeping the module tree relatively flat and using [module composition](/opentf/language/modules/develop/composition)
as an alternative to a deeply-nested tree of modules, because this makes
the individual modules easier to re-use in different combinations.
## When to write a module
In principle any combination of resources and other constructs can be factored
out into a module, but over-using modules can make your overall OpenTF
configuration harder to understand and maintain, so we recommend moderation.
A good module should raise the level of abstraction by describing a new concept
@@ -70,16 +68,10 @@ your module is not creating any new abstraction and so the module is
adding unnecessary complexity. Just use the resource type directly in the
calling module instead.
## Refactoring module resources
You can include [refactoring blocks](/opentf/language/modules/develop/refactoring) to record how resource
names and module structure have changed from previous module versions.
OpenTF uses that information during planning to reinterpret existing objects
as if they had been created at the corresponding new addresses, eliminating a
separate workflow step to replace or migrate existing objects.


@@ -1,7 +1,7 @@
---
page_title: Providers Within Modules - Configuration Language
description: >-
Use providers within modules. Learn about version constraints, aliases, implicit inheritance, and passing providers to modules.
---
# Providers Within Modules
@@ -13,9 +13,9 @@ for how resources are associated with provider configurations.
Each resource in the configuration must be associated with one provider
configuration. Provider configurations, unlike most other concepts in
OpenTF, are global to an entire OpenTF configuration and can be shared
across module boundaries. Provider configurations can be defined only in a
root module.
Providers can be passed down to descendant modules in two ways: either
_implicitly_ through inheritance, or _explicitly_ via the `providers` argument
@@ -24,12 +24,10 @@ following sections.
A module intended to be called by one or more other modules must not contain
any `provider` blocks. A module containing its own provider configurations is
not compatible with the `for_each`, `count`, and `depends_on` arguments.
Provider configurations are used for all operations on associated resources,
including destroying remote objects and refreshing state. OpenTF retains, as
part of its state, a reference to the provider configuration that was most
recently used to apply changes to each resource. When a `resource` block is
removed from the configuration, this record in the state will be used to locate
@@ -38,7 +36,7 @@ the appropriate configuration because the resource's `provider` argument
As a consequence, you must ensure that all resources that belong to a
particular provider configuration are destroyed before you can remove that
provider configuration's block from your configuration. If OpenTF finds
a resource instance tracked in the state whose provider configuration block is
no longer available then it will return an error during planning, prompting you
to reintroduce the provider configuration.
@@ -46,10 +44,10 @@ to reintroduce the provider configuration.
## Provider Version Constraints in Modules
Although provider _configurations_ are shared between modules, each module must
declare its own [provider requirements](/opentf/language/providers/requirements), so that
OpenTF can ensure that there is a single version of the provider that is
compatible with all modules in the configuration and to specify the
[source address](/opentf/language/providers/requirements#source-addresses) that serves as
the global (module-agnostic) identifier for a provider.
To declare that a module requires particular versions of a specific provider,
@@ -70,9 +68,9 @@ A provider requirement says, for example, "This module requires version v2.7.0
of the provider `hashicorp/aws` and will refer to it as `aws`." It doesn't,
however, specify any of the configuration settings that determine what remote
endpoints the provider will access, such as an AWS region; configuration
settings come from provider _configurations_, and a particular overall OpenTF
configuration can potentially have
[several different configurations for the same provider](/opentf/language/providers/configuration#alias-multiple-provider-configurations).
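Such a requirement is declared in a `required_providers` block; a minimal sketch matching the `hashicorp/aws` example above:

```hcl
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = ">= 2.7.0"
    }
  }
}
```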
## Provider Aliases Within Modules
@@ -95,7 +93,7 @@ The above requirements are identical to the previous, with the addition of the
alias provider configuration name `aws.alternate`, which can be referenced by
resources using the `provider` argument.
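A sketch of a requirements block that adds the `aws.alternate` configuration name via `configuration_aliases`:

```hcl
terraform {
  required_providers {
    aws = {
      source                = "hashicorp/aws"
      version               = ">= 2.7.0"
      configuration_aliases = [aws.alternate]
    }
  }
}
```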
If you are writing a shared module, constrain only the minimum
required provider version using a `>=` constraint. This should specify the
minimum version containing the features your module relies on, and thus allow a
user of your module to potentially select a newer provider version if other
@@ -104,7 +102,7 @@ features are needed by other parts of their overall configuration.
## Implicit Provider Inheritance
For convenience in simple configurations, a child module automatically inherits
[default provider configurations](/opentf/language/providers/configuration#default-provider-configurations) from its parent. This means that
explicit `provider` blocks appear only in the root module, and downstream
modules can simply declare resources for that provider and have them
automatically associated with the root provider configurations.
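For example, a root module can configure a provider once and call a child module with no `providers` argument at all (a sketch):

```hcl
provider "aws" {
  region = "us-west-2"
}

module "child" {
  source = "./child"
  # No "providers" argument: resources in ./child that belong to the
  # "aws" provider automatically use this default configuration.
}
```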
@@ -134,10 +132,10 @@ resource "aws_s3_bucket" "example" {
We recommend using this approach when a single configuration for each provider
is sufficient for an entire configuration.
~> **Note:** Only provider configurations are inherited by child modules, not provider source or version requirements. Each module must [declare its own provider requirements](/opentf/language/providers/requirements). This is especially important for non-HashiCorp providers.
In more complex situations there may be
[multiple provider configurations](/opentf/language/providers/configuration#alias-multiple-provider-configurations),
or a child module may need to use different provider settings than
its parent. For such situations, you must pass providers explicitly.
@@ -174,7 +172,7 @@ module "example" {
```
The `providers` argument within a `module` block is similar to
[the `provider` argument](/opentf/language/meta-arguments/resource-provider)
within a resource, but is a map rather than a single string because a module may
contain resources from many different providers.
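A sketch of that map, passing two differently-aliased AWS configurations (the alias names are illustrative):

```hcl
module "example" {
  source = "./example"

  providers = {
    # Keys are provider names as the child module declares them;
    # values are provider configurations in the calling module.
    aws      = aws.usw1
    aws.east = aws.use1
  }
}
```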
@@ -232,135 +230,3 @@ terraform {
Each resource should then have its own `provider` attribute set to either
`aws.src` or `aws.dst` to choose which of the two provider configurations to
use.


@@ -6,14 +6,14 @@ description: A module is a container for multiple resources that are used togeth
# Publishing Modules
If you've built a module that you intend to be reused, we recommend
[publishing the module](/opentf/registry/modules/publish) on the
[Public Terraform Registry](https://registry.terraform.io). This will version
your module, generate documentation, and more.
Published modules can be easily consumed by OpenTF, and users can
[constrain module versions](/opentf/language/modules/syntax#version)
for safe and predictable updates. The following example shows how a caller
might use a module from the Module Registry:
```hcl
module "consul" {
@@ -22,20 +22,20 @@ module "consul" {
```
If you do not wish to publish your modules in the public registry, you can
instead use a [private registry](/opentf/registry/private) to get
the same benefits.
We welcome contributions of modules from our community members, partners, and customers. Our ecosystem is made richer by each new module created or an existing one updated, as they reflect the wide range of experience and technical requirements of the community that uses them. Our cloud provider partners often seek to develop specific modules for popular or challenging use cases on their platform and utilize them as valuable learning experiences to empathize with their users. Similarly, our community module developers incorporate a variety of opinions and use cases from the broader OpenTF community. Both types of modules have their place in the registry, accessible to practitioners who can decide which modules best fit their requirements.
## Distribution via other sources
Although the registry is the native mechanism for distributing re-usable
modules, OpenTF can also install modules from
[various other sources](/opentf/language/modules/sources). The alternative sources
do not support the first-class versioning mechanism, but some sources have
their own mechanisms for selecting particular VCS commits, etc.
We recommend that modules distributed via other protocols still use the
[standard module structure](/opentf/language/modules/develop/structure) so that they can
be used in a similar way as a registry module or be published on the registry
at a later time.


@@ -5,26 +5,20 @@ description: How to make backward-compatible changes to modules already in use.
# Refactoring
In shared modules and long-lived configurations, you may eventually outgrow
your initial module structure and resource names. For example, you might decide
that what was previously one child module makes more sense as two separate
modules and move a subset of the existing resources to the new one.
OpenTF compares previous state with new configuration, correlating by
each module or resource's unique address. Therefore _by default_ OpenTF
understands moving or renaming an object as an intent to destroy the object
at the old address and to create a new object at the new address.
When you add `moved` blocks in your configuration to record where you've
historically moved or renamed an object, OpenTF treats an existing object at
the old address as if it now belongs to the new address.
## `moved` Block Syntax
A `moved` block expects no labels and contains only `from` and `to` arguments:
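A minimal example of the syntax:

```hcl
moved {
  from = aws_instance.a
  to   = aws_instance.b
}
```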
@@ -39,9 +33,9 @@ moved {
The example above records that the resource currently known as `aws_instance.b`
was known as `aws_instance.a` in a previous version of this module.
Before creating a new plan for `aws_instance.b`, OpenTF first checks
whether there is an existing object for `aws_instance.a` recorded in the state.
If there is an existing object, OpenTF renames that object to
`aws_instance.b` and then proceeds with creating a plan. The resulting plan is
as if the object had originally been created at `aws_instance.b`, avoiding any
need to destroy it during apply.
@@ -70,7 +64,7 @@ resource "aws_instance" "a" {
}
```
Applying this configuration for the first time would cause OpenTF to
create `aws_instance.a[0]` and `aws_instance.a[1]`.
If you later choose a different name for this resource, then you can change the
@@ -89,7 +83,7 @@ moved {
}
```
When creating the next plan for each configuration using this module, OpenTF
treats any existing objects belonging to `aws_instance.a` as if they had
been created for `aws_instance.b`: `aws_instance.a[0]` will be treated as
`aws_instance.b[0]`, and `aws_instance.a[1]` as `aws_instance.b[1]`.
@@ -99,7 +93,7 @@ New instances of the module, which _never_ had an
`aws_instance.b[0]` and `aws_instance.b[1]` as normal.
Both of the addresses in this example referred to a resource as a whole, and
so OpenTF recognizes the move for all instances of the resource. That is,
it covers both `aws_instance.a[0]` and `aws_instance.a[1]` without the need
to identify each one separately.
@@ -119,10 +113,10 @@ resource "aws_instance" "a" {
}
```
Applying this configuration would lead to OpenTF creating an object
bound to the address `aws_instance.a`.
Later, you use [`for_each`](/opentf/language/meta-arguments/for_each) with this
resource to systematically declare multiple instances. To preserve an object
that was previously associated with `aws_instance.a` alone, you must add a
`moved` block to specify which instance key the object will take in the new
@@ -153,12 +147,12 @@ moved {
}
```
The above will keep OpenTF from planning to destroy any existing object at
`aws_instance.a`, treating that object instead as if it were originally
created as `aws_instance.a["small"]`.
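A sketch of that `moved` block:

```hcl
moved {
  from = aws_instance.a
  to   = aws_instance.a["small"]
}
```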
When at least one of the two addresses includes an instance key, like
`["small"]` in the above example, OpenTF understands both addresses as
referring to specific _instances_ of a resource rather than the resource as a
whole. That means you can use `moved` to switch between keys and to add and
remove keys as you switch between `count`, `for_each`, or neither.
@@ -196,7 +190,7 @@ moved {
```
-> **Note:** When you add `count` to an existing resource that didn't use it,
OpenTF automatically proposes to move the original object to instance zero,
unless you write a `moved` block explicitly mentioning that resource.
However, we recommend still writing out the corresponding `moved` block
explicitly, to make the change clearer to future readers of the module.
@ -214,7 +208,7 @@ module "a" {
}
```
When applying this configuration, Terraform would prefix the addresses for
When applying this configuration, OpenTF would prefix the addresses for
any resources declared in this module with the module path `module.a`.
For example, a resource `aws_instance.example` would have the full address
`module.a.aws_instance.example`.
@ -235,13 +229,13 @@ moved {
}
```
When creating the next plan for each configuration using this module, Terraform
When creating the next plan for each configuration using this module, OpenTF
will treat any existing object addresses beginning with `module.a` as if
they had instead been created in `module.b`. `module.a.aws_instance.example`
would be treated as `module.b.aws_instance.example`.
Both of the addresses in this example referred to a module call as a whole, and
so Terraform recognizes the move for all instances of the call. If this
so OpenTF recognizes the move for all instances of the call. If this
module call used `count` or `for_each` then it would apply to all of the
instances, without the need to specify each one separately.
@ -257,11 +251,11 @@ module "a" {
}
```
Applying this configuration would cause Terraform to create objects whose
Applying this configuration would cause OpenTF to create objects whose
addresses begin with `module.a`.
In later module versions, you may need to use
[`count`](/terraform/language/meta-arguments/count) with this resource to systematically
[`count`](/opentf/language/meta-arguments/count) with this module call to systematically
declare multiple instances. To preserve an object that was previously associated
with `module.a` alone, you can add a `moved` block to specify which
instance key that object will take in the new configuration:
@ -280,12 +274,12 @@ moved {
}
```
The configuration above directs Terraform to treat all objects in `module.a` as
if they were originally created in `module.a[2]`. As a result, Terraform plans
The configuration above directs OpenTF to treat all objects in `module.a` as
if they were originally created in `module.a[2]`. As a result, OpenTF plans
to create new objects only for `module.a[0]` and `module.a[1]`.
When at least one of the two addresses includes an instance key, like
`[2]` in the above example, Terraform will understand both addresses as
`[2]` in the above example, OpenTF will understand both addresses as
referring to specific _instances_ of a module call rather than the module
call as a whole. That means you can use `moved` to switch between keys and to
add and remove keys as you switch between `count`, `for_each`, or neither.
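As a hedged sketch, the equivalent move between module instance keys (the `"large"` key is an assumption for illustration) could read:

```hcl
moved {
  from = module.a[2]
  to   = module.a["large"]
}
```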
@ -384,7 +378,7 @@ moved {
```
When an existing user of the original module upgrades to the new "shim"
version, Terraform notices these three `moved` blocks and behaves
version, OpenTF notices these three `moved` blocks and behaves
as if the objects associated with the three old resource addresses were
originally created inside the two new modules.
@ -398,9 +392,9 @@ typical rule that a parent module sees its child module as a "closed box",
unaware of exactly which resources are declared inside it. This compromise
assumes that all three of these modules are maintained by the same people
and distributed together in a single
[module package](/terraform/language/modules/sources#modules-in-package-sub-directories).
[module package](/opentf/language/modules/sources#modules-in-package-sub-directories).
Terraform resolves module references in `moved` blocks relative to the module
OpenTF resolves module references in `moved` blocks relative to the module
instance they are defined in. For example, if the original module above were
already a child module named `module.original`, the reference to
`module.x.aws_instance.a` would resolve as
@ -425,7 +419,7 @@ Over time, a long-lasting module may accumulate many `moved` blocks.
Removing a `moved` block is generally a breaking change because any configurations that refer to the old address will plan to delete that existing object instead of moving it. We strongly recommend that you retain all historical `moved` blocks from earlier versions of your modules to preserve the upgrade path for users of any previous version.
If you do decide to remove `moved` blocks, proceed with caution. It can be safe to remove `moved` blocks when you are maintaining private modules within an organization and you are certain that all users have successfully run `terraform apply` with your new module version.
If you do decide to remove `moved` blocks, proceed with caution. It can be safe to remove `moved` blocks when you are maintaining private modules within an organization and you are certain that all users have successfully run `opentf apply` with your new module version.
If you need to rename or move the same object twice, we recommend documenting the full history
using _chained_ `moved` blocks, where the new block refers to the existing block:
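A sketch of such a chain, using the `aws_instance.a`, `aws_instance.b`, and `aws_instance.c` addresses as illustrative placeholders:

```hcl
moved {
  from = aws_instance.a
  to   = aws_instance.b
}

moved {
  from = aws_instance.b
  to   = aws_instance.c
}
```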
@ -444,5 +438,5 @@ moved {
Recording a sequence of moves in this way allows for successful upgrades for
both configurations with objects at `aws_instance.a` _and_ configurations with
objects at `aws_instance.b`. In both cases, Terraform treats the existing
objects at `aws_instance.b`. In both cases, OpenTF treats the existing
object as if it had been originally created as `aws_instance.c`.

@ -7,7 +7,7 @@ description: >-
# Standard Module Structure
The standard module structure is a file and directory layout we recommend for
reusable modules distributed in separate repositories. Terraform tooling is
reusable modules distributed in separate repositories. OpenTF tooling is
built to understand the standard module structure and use that structure to
generate documentation, index modules for the module registry, and more.
@ -16,7 +16,7 @@ appear long, but everything is optional except for the root module. Most modules
don't need to do any extra work to follow the standard structure.
* **Root module**. This is the **only required element** for the standard
module structure. Terraform files must exist in the root directory of
module structure. OpenTF files must exist in the root directory of
the repository. This should be the primary entrypoint for the module and is
expected to be opinionated. For the
[Consul module](https://registry.terraform.io/modules/hashicorp/consul)
@ -55,13 +55,13 @@ don't need to do any extra work to follow the standard structure.
* **Variables and outputs should have descriptions.** All variables and
outputs should have one or two sentence descriptions that explain their
purpose. This is used for documentation. See the documentation for
[variable configuration](/terraform/language/values/variables) and
[output configuration](/terraform/language/values/outputs) for more details.
[variable configuration](/opentf/language/values/variables) and
[output configuration](/opentf/language/values/outputs) for more details.
* **Nested modules**. Nested modules should exist under the `modules/`
subdirectory. Any nested module with a `README.md` is considered usable
by an external user. If a README doesn't exist, it is considered for internal
use only. These are purely advisory; Terraform will not actively deny usage
use only. These are purely advisory; OpenTF will not actively deny usage
of internal modules. Nested modules should be used to split complex behavior
into multiple small modules that advanced users can carefully pick and
choose. For example, the
@ -71,12 +71,12 @@ don't need to do any extra work to follow the standard structure.
own IAM policy choices.
If the root module includes calls to nested modules, they should use relative
paths like `./modules/consul-cluster` so that Terraform will consider them
paths like `./modules/consul-cluster` so that OpenTF will consider them
to be part of the same repository or package, rather than downloading them
again separately.
If a repository or package contains multiple nested modules, they should
ideally be [composable](/terraform/language/modules/develop/composition) by the caller, rather than
ideally be [composable](/opentf/language/modules/develop/composition) by the caller, rather than
calling directly to each other and creating a deeply-nested tree of modules.
* **Examples**. Examples of using the module should exist under the

@ -7,24 +7,22 @@ description: >-
# Modules
> **Hands-on:** Try the [Reuse Configuration with Modules](/terraform/tutorials/modules?utm_source=WEBSITE&utm_medium=WEB_IO&utm_offer=ARTICLE_PAGE&utm_content=DOCS) tutorials.
_Modules_ are containers for multiple resources that are used together. A module
consists of a collection of `.tf` and/or `.tf.json` files kept together in a
directory.
Modules are the main way to package and reuse resource configurations with
Terraform.
OpenTF.
## The Root Module
Every Terraform configuration has at least one module, known as its
Every OpenTF configuration has at least one module, known as its
_root module_, which consists of the resources defined in the `.tf` files in
the main working directory.
## Child Modules
A Terraform module (usually the root module of a configuration) can _call_ other
An OpenTF module (usually the root module of a configuration) can _call_ other
modules to include their resources into the configuration. A module that has
been called by another module is often referred to as a _child module._
@ -33,38 +31,37 @@ multiple configurations can use the same child module.
## Published Modules
In addition to modules from the local filesystem, Terraform can load modules
In addition to modules from the local filesystem, OpenTF can load modules
from a public or private registry. This makes it possible to publish modules for
others to use, and to use modules that others have published.
The [Terraform Registry](https://registry.terraform.io/browse/modules) hosts a
broad collection of publicly available Terraform modules for configuring many
kinds of common infrastructure. These modules are free to use, and Terraform can
The [Public Terraform Registry](https://registry.terraform.io/browse/modules) hosts a
broad collection of publicly available OpenTF modules for configuring many
kinds of common infrastructure. These modules are free to use, and OpenTF can
download them automatically if you specify the appropriate source and version in
a module call block.
Also, members of your organization might produce modules specifically crafted
for your own infrastructure needs. [Terraform Cloud](https://cloud.hashicorp.com/products/terraform) and
[Terraform Enterprise](/terraform/enterprise) both include a private
for your own infrastructure needs. TACOS (TF Automation and Collaboration Software) platforms include a private
module registry for sharing modules internally within your organization.
## Using Modules
- [Module Blocks](/terraform/language/modules/syntax) documents the syntax for
- [Module Blocks](/opentf/language/modules/syntax) documents the syntax for
calling a child module from a parent module, including meta-arguments like
`for_each`.
- [Module Sources](/terraform/language/modules/sources) documents what kinds of paths,
- [Module Sources](/opentf/language/modules/sources) documents what kinds of paths,
addresses, and URIs can be used in the `source` argument of a module block.
- The Meta-Arguments section documents special arguments that can be used with
every module, including
[`providers`](/terraform/language/meta-arguments/module-providers),
[`depends_on`](/terraform/language/meta-arguments/depends_on),
[`count`](/terraform/language/meta-arguments/count),
and [`for_each`](/terraform/language/meta-arguments/for_each).
[`providers`](/opentf/language/meta-arguments/module-providers),
[`depends_on`](/opentf/language/meta-arguments/depends_on),
[`count`](/opentf/language/meta-arguments/count),
and [`for_each`](/opentf/language/meta-arguments/for_each).
## Developing Modules
For information about developing reusable modules, see
[Module Development](/terraform/language/modules/develop).
[Module Development](/opentf/language/modules/develop).

@ -1,27 +1,25 @@
---
page_title: Module Sources
description: >-
The source argument tells Terraform where to find child modules's
configurations in locations like GitHub, the Terraform Registry, Bitbucket,
The source argument tells OpenTF where to find child modules'
configurations in locations like GitHub, a module registry, Bitbucket,
Git, Mercurial, S3, and GCS.
---
# Module Sources
The `source` argument in [a `module` block](/terraform/language/modules/syntax)
tells Terraform where to find the source code for the desired child module.
The `source` argument in [a `module` block](/opentf/language/modules/syntax)
tells OpenTF where to find the source code for the desired child module.
Terraform uses this during the module installation step of `terraform init`
to download the source code to a directory on local disk so that other Terraform commands can use it.
> **Hands-on:** Try the [Use Modules From the Registry](/terraform/tutorials/modules/module-use) or [Build and Use a Local Module](/terraform/tutorials/modules/module-create) tutorials.
OpenTF uses this during the module installation step of `opentf init`
to download the source code to a directory on local disk so that other OpenTF commands can use it.
The module installer supports installation from a number of different source
types.
- [Local paths](#local-paths)
- [Terraform Registry](#terraform-registry)
- [Module Registry](#module-registry)
- [GitHub](#github)
@ -43,12 +41,12 @@ of sources and additional features.
We recommend using local file paths for closely-related modules used primarily
for the purpose of factoring out repeated code elements, and using a native
Terraform module registry for modules intended to be shared by multiple calling
OpenTF module registry for modules intended to be shared by multiple calling
configurations. We support other sources so that you can potentially distribute
Terraform modules internally with existing infrastructure.
OpenTF modules internally with existing infrastructure.
Many of the source types will make use of "ambient" credentials available
when Terraform is run, such as from environment variables or credentials files
when OpenTF is run, such as from environment variables or credentials files
in your home directory. This is covered in more detail in each of the following
sections.
@ -69,40 +67,40 @@ module "consul" {
A local path must begin with either `./` or `../` to indicate that a local
path is intended, to distinguish from
[a module registry address](#terraform-registry).
[a module registry address](#module-registry).
Local paths are special in that they are not "installed" in the same sense
that other sources are: the files are already present on local disk (possibly
as a result of installing a parent module) and so can just be used directly.
Their source code is automatically updated if the parent module is upgraded.
Note that Terraform does not consider an _absolute_ filesystem path (starting
Note that OpenTF does not consider an _absolute_ filesystem path (starting
with a slash, a drive letter, or similar) to be a local path. Instead,
Terraform will treat that in a similar way as a remote module and copy it into
OpenTF will treat that in a similar way as a remote module and copy it into
the local module cache. An absolute path is a "package" in the sense described
in [Modules in Package Sub-directories](#modules-in-package-sub-directories).
We don't recommend using absolute filesystem paths to refer to Terraform
modules, because it will tend to couple your configuration to the filesystem
We don't recommend using absolute filesystem paths to refer to modules,
because it will tend to couple your configuration to the filesystem
layout of a particular computer.
## Terraform Registry
## Module Registry
A module registry is the native way of distributing Terraform modules for use
across multiple configurations, using a Terraform-specific protocol that
A module registry is the native way of distributing modules for use
across multiple configurations, using an OpenTF-specific protocol that
has full support for module versioning.
[Terraform Registry](https://registry.terraform.io/) is an index of modules
[Public Terraform Registry](https://registry.terraform.io/) is an index of modules
shared publicly using this protocol. This public registry is the easiest way
to get started with Terraform and find modules created by others in the
to get started with OpenTF and find modules created by others in the
community.
You can also use a
[private registry](/terraform/registry/private), either
via the built-in feature from Terraform Cloud, or by running a custom
[private registry](/opentf/registry/private), either
via TACOS (TF Automation and Collaboration Software), or by running a custom
service that implements
[the module registry protocol](/terraform/registry/api-docs).
[the module registry protocol](/opentf/registry/api-docs).
Modules on the public Terraform Registry can be referenced using a registry
Modules on the public registry can be referenced using a registry
source address of the form `<NAMESPACE>/<NAME>/<PROVIDER>`, with each
module's information page on the registry site including the exact address
to use.
@ -116,7 +114,7 @@ module "consul" {
The above example will use the
[Consul module for AWS](https://registry.terraform.io/modules/hashicorp/consul/aws)
from the public registry.
from a public registry.
For modules hosted in other registries, prefix the source address with an
additional `<HOSTNAME>/` portion, giving the hostname of the private registry:
@ -128,28 +126,22 @@ module "consul" {
}
```
If you are using the SaaS version of Terraform Cloud, its private
registry hostname is `app.terraform.io`. If you use a self-hosted Terraform
Enterprise instance, its private registry hostname is the same as the host
where you'd access the web UI and the host you'd use when configuring
the [Terraform Cloud CLI integration](/terraform/cli/cloud).
Registry modules support versioning. You can provide a specific version as shown
in the above examples, or use flexible
[version constraints](/terraform/language/modules/syntax#version).
[version constraints](/opentf/language/modules/syntax#version).
You can learn more about the registry at the
[Terraform Registry documentation](/terraform/registry/modules/use#using-modules).
[Module Registry documentation](/opentf/registry/modules/use#using-modules).
To access modules from a private registry, you may need to configure an access
token [in the CLI config](/terraform/cli/config/config-file#credentials). Use the
token [in the CLI config](/opentf/cli/config/config-file#credentials). Use the
same hostname as used in the module source string. For a private registry
within Terraform Cloud, use the same authentication token as you would
use with the Enterprise API or command-line clients.
within TACOS (TF Automation and Collaboration Software), use the same authentication token as you would
use with the API or command-line clients.
## GitHub
Terraform will recognize unprefixed `github.com` URLs and interpret them
OpenTF will recognize unprefixed `github.com` URLs and interpret them
automatically as Git repository sources.
```hcl
@ -175,19 +167,19 @@ particular to access private repositories.
## Bitbucket
Terraform will recognize unprefixed `bitbucket.org` URLs and interpret them
OpenTF will recognize unprefixed `bitbucket.org` URLs and interpret them
automatically as Bitbucket repositories:
```hcl
module "consul" {
source = "bitbucket.org/hashicorp/terraform-consul-aws"
source = "bitbucket.org/example-corp/opentf-consul-aws"
}
```
This shorthand works only for public repositories, because Terraform must
This shorthand works only for public repositories, because OpenTF must
access the Bitbucket API to learn if the given repository uses Git or Mercurial.
Terraform treats the result either as [a Git source](#generic-git-repository)
OpenTF treats the result either as [a Git source](#generic-git-repository)
or [a Mercurial source](#generic-mercurial-repository) depending on the
repository type. See the sections on each version control type for information
on how to configure credentials for private repositories and how to specify
@ -212,7 +204,7 @@ module "storage" {
}
```
Terraform installs modules from Git repositories by running `git clone`, and
OpenTF installs modules from Git repositories by running `git clone`, and
so it will respect any local Git configuration set on your system, including
credentials. To access a non-public Git repository, configure Git with
suitable credentials for that repository.
@ -227,13 +219,9 @@ username/password credentials, configure
[Git Credentials Storage](https://git-scm.com/book/en/v2/Git-Tools-Credential-Storage)
to select a suitable source of credentials for your environment.
If your Terraform configuration will be used within [Terraform Cloud](https://www.hashicorp.com/products/terraform),
only SSH key authentication is supported, and
[keys can be configured on a per-workspace basis](/terraform/cloud-docs/workspaces/settings/ssh-keys).
### Selecting a Revision
By default, Terraform will clone and use the default branch (referenced by
By default, OpenTF will clone and use the default branch (referenced by
`HEAD`) in the selected repository. You can override this using the
`ref` argument. The value of the `ref` argument can be any reference that would be accepted
by the `git checkout` command, such as branch, SHA-1 hash (short or full), or tag names.
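For instance, selecting a tagged release might look like the following sketch (the repository URL is a placeholder, as elsewhere on this page):

```hcl
module "vpc" {
  source = "git::https://example.com/network.git//modules/vpc?ref=v1.2.0"
}
```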
@ -264,13 +252,13 @@ telling Git to create a shallow clone with the history truncated to only
the specified number of commits.
However, because a shallow clone requires different Git protocol behavior,
setting the `depth` argument makes Terraform pass your [`ref` argument](#selecting-a-revision),
setting the `depth` argument makes OpenTF pass your [`ref` argument](#selecting-a-revision),
if any, to
[the `--branch` argument to `git clone`](https://git-scm.com/docs/git-clone#Documentation/git-clone.txt---branchltnamegt)
instead. That means it must specify a named branch or tag known to the remote
repository, and that raw commit IDs are not acceptable.
Because Terraform only uses the most recent selected commit to find the source
Because OpenTF only uses the most recent selected commit to find the source
code of your specified module, it is not typically useful to set `depth`
to any value other than `1`.
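Combining `depth` with a named `ref`, a sketch (repository URL assumed) could be:

```hcl
module "vpc" {
  source = "git::https://example.com/network.git//modules/vpc?depth=1&ref=v1.2.0"
}
```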
@ -288,10 +276,10 @@ module "storage" {
}
```
If you use the `ssh://` URL scheme then Terraform will assume that the colon
If you use the `ssh://` URL scheme then OpenTF will assume that the colon
marks the beginning of a port number, rather than the beginning of the path.
This matches how Git itself interprets these different forms, aside from
the Terraform-specific `git::` selector prefix.
the OpenTF-specific `git::` selector prefix.
## Generic Mercurial Repository
@ -306,7 +294,7 @@ module "vpc" {
}
```
Terraform installs modules from Mercurial repositories by running `hg clone`, and
OpenTF installs modules from Mercurial repositories by running `hg clone`, and
so it will respect any local Mercurial configuration set on your system,
including credentials. To access a non-public repository, configure Mercurial
with suitable credentials for that repository.
@ -316,10 +304,6 @@ automatically. This is the most common way to access non-public Mercurial
repositories from automated systems because it allows access to private
repositories without interactive prompts.
If your Terraform configuration will be used within [Terraform Cloud](https://www.hashicorp.com/products/terraform),
only SSH key authentication is supported, and
[keys can be configured on a per-workspace basis](/terraform/cloud-docs/workspaces/settings/ssh-keys).
### Selecting a Revision
You can select a non-default branch or tag using the optional `ref` argument:
@ -332,39 +316,39 @@ module "vpc" {
## HTTP URLs
When you use an HTTP or HTTPS URL, Terraform will make a `GET` request to
When you use an HTTP or HTTPS URL, OpenTF will make a `GET` request to
the given URL, which can return _another_ source address. This indirection
allows using HTTP URLs as a sort of "vanity redirect" over a more complicated
module source address.
Terraform will append an additional query string argument `terraform-get=1` to
OpenTF will append an additional query string argument `opentf-get=1` to
the given URL before sending the `GET` request, allowing the server to
optionally return a different result when Terraform is requesting it.
optionally return a different result when OpenTF is requesting it.
If the response is successful (`200`-range status code), Terraform looks in
If the response is successful (`200`-range status code), OpenTF looks in
the following locations in order for the next address to access:
- The value of a response header field named `X-Terraform-Get`.
- The value of a response header field named `X-OpenTF-Get`.
- If the response is an HTML page, a `meta` element with the name `terraform-get`:
- If the response is an HTML page, a `meta` element with the name `opentf-get`:
```html
<meta name="terraform-get" content="github.com/hashicorp/example" />
<meta name="opentf-get" content="github.com/hashicorp/example" />
```
In either case, the result is interpreted as another module source address
using one of the forms documented elsewhere on this page.
If an HTTP/HTTPS URL requires authentication credentials, use a `.netrc`
file to configure the credentials. By default, Terraform searches for the `.netrc` file
file to configure the credentials. By default, OpenTF searches for the `.netrc` file
in your HOME directory. However, you can override the default filesystem location by setting the `NETRC` environment variable. For information on the `.netrc` format,
refer to [the documentation for using it in `curl`](https://everything.curl.dev/usingcurl/netrc).
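A minimal `.netrc` entry might look like the following sketch, where the hostname and credentials are placeholders:

```
machine modules.example.com
login apiuser
password s3cret
```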
### Fetching archives over HTTP
As a special case, if Terraform detects that the URL has a common file
As a special case, if OpenTF detects that the URL has a common file
extension associated with an archive file format then it will bypass the
special `terraform-get=1` redirection described above and instead just use
special `opentf-get=1` redirection described above and instead just use
the contents of the referenced archive as the module source code:
```hcl
@ -373,7 +357,7 @@ module "vpc" {
}
```
The extensions that Terraform recognizes for this special behavior are:
The extensions that OpenTF recognizes for this special behavior are:
- `zip`
- `tar.bz2` and `tbz2`
@ -402,20 +386,20 @@ prefix, followed by
```hcl
module "consul" {
source = "s3::https://s3-eu-west-1.amazonaws.com/examplecorp-terraform-modules/vpc.zip"
source = "s3::https://s3-eu-west-1.amazonaws.com/examplecorp-opentf-modules/vpc.zip"
}
```
-> **Note:** Buckets in AWS's us-east-1 region must use the hostname `s3.amazonaws.com` (instead of `s3-us-east-1.amazonaws.com`).
The `s3::` prefix causes Terraform to use AWS-style authentication when
The `s3::` prefix causes OpenTF to use AWS-style authentication when
accessing the given URL. As a result, this scheme may also work for other
services that mimic the S3 API, as long as they handle authentication in the
same way as AWS.
The resulting object must be an archive with one of the same file
extensions as for [archives over standard HTTP](#fetching-archives-over-http).
Terraform will extract the archive to obtain the module source tree.
OpenTF will extract the archive to obtain the module source tree.
The module installer looks for AWS credentials in the following locations,
preferring those earlier in the list when multiple are available:
@ -447,7 +431,7 @@ use any of the following methods to set Google Cloud Platform credentials:
* Set the `GOOGLE_OAUTH_ACCESS_TOKEN` environment variable to a raw Google Cloud Platform OAuth access token.
* Enter the path of your service account key file in the `GOOGLE_APPLICATION_CREDENTIALS` environment variable.
* If you're running Terraform from a GCE instance, default credentials are automatically available. See [Creating and Enabling Service Accounts](https://cloud.google.com/compute/docs/access/create-enable-service-accounts-for-instances) for Instances for more details.
* If you're running OpenTF from a GCE instance, default credentials are automatically available. See [Creating and Enabling Service Accounts for Instances](https://cloud.google.com/compute/docs/access/create-enable-service-accounts-for-instances) for more details.
* On your computer, you can make your Google identity available by running `gcloud auth application-default login`.
## Modules in Package Sub-directories
@ -456,14 +440,14 @@ When the source of a module is a version control repository or archive file
(generically, a "package"), the module itself may be in a sub-directory relative
to the root of the package.
A special double-slash syntax is interpreted by Terraform to indicate that
A special double-slash syntax is interpreted by OpenTF to indicate that
the remaining path after that point is a sub-directory within the package.
For example:
- `hashicorp/consul/aws//modules/consul-cluster`
- `git::https://example.com/network.git//modules/vpc`
- `https://example.com/network-module.zip//modules/vpc`
- `s3::https://s3-eu-west-1.amazonaws.com/examplecorp-terraform-modules/network.zip//modules/vpc`
- `s3::https://s3-eu-west-1.amazonaws.com/examplecorp-opentf-modules/network.zip//modules/vpc`
If the source address has arguments, such as the `ref` argument supported for
the version control sources, the sub-directory portion must be _before_ those
@ -472,7 +456,7 @@ arguments:
- `git::https://example.com/network.git//modules/vpc?ref=v1.2.0`
- `github.com/hashicorp/example//modules/vpc?ref=v1.2.0`
Terraform will still extract the entire package to local disk, but will read
OpenTF will still extract the entire package to local disk, but will read
the module from the subdirectory. As a result, it is safe for a module in
a sub-directory of a package to use [a local path](#local-paths) to another
module as long as it is in the _same_ package.

@ -8,11 +8,9 @@ description: >-
# Module Blocks
> **Hands-on:** Try the [Reuse Configuration with Modules](/terraform/tutorials/modules?utm_source=WEBSITE&utm_medium=WEB_IO&utm_offer=ARTICLE_PAGE&utm_content=DOCS) tutorials.
A _module_ is a container for multiple resources that are used together.
Every Terraform configuration has at least one module, known as its
Every OpenTF configuration has at least one module, known as its
_root module_, which consists of the resources defined in the `.tf` files in
the main working directory.
@ -23,13 +21,13 @@ in separate configurations, allowing resource configurations to be packaged
and re-used.
This page describes how to call one module from another. For more information
about creating re-usable child modules, see [Module Development](/terraform/language/modules/develop).
about creating re-usable child modules, see [Module Development](/opentf/language/modules/develop).
## Calling a Child Module
To _call_ a module means to include the contents of that module into the
configuration with specific values for its
[input variables](/terraform/language/values/variables). Modules are called
[input variables](/opentf/language/values/variables). Modules are called
from within other modules using `module` blocks:
```hcl
@ -53,28 +51,28 @@ Module calls use the following kinds of arguments:
- The `version` argument is recommended for modules from a registry.
- Most other arguments correspond to [input variables](/terraform/language/values/variables)
- Most other arguments correspond to [input variables](/opentf/language/values/variables)
defined by the module. (The `servers` argument in the example above is one of
these.)
- Terraform defines a few other meta-arguments that can be used with all
- OpenTF defines a few other meta-arguments that can be used with all
modules, including `for_each` and `depends_on`.
### Source
All modules **require** a `source` argument, which is a meta-argument defined by
Terraform. Its value is either the path to a local directory containing the
module's configuration files, or a remote module source that Terraform should
OpenTF. Its value is either the path to a local directory containing the
module's configuration files, or a remote module source that OpenTF should
download and use. This value must be a literal string with no template
sequences; arbitrary expressions are not allowed. For more information on
possible values for this argument, see [Module Sources](/terraform/language/modules/sources).
possible values for this argument, see [Module Sources](/opentf/language/modules/sources).
The same source address can be specified in multiple `module` blocks to create
multiple copies of the resources defined within, possibly with different
variable values.
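For instance, a single local module might be instantiated twice with different inputs (module and variable names below are hypothetical):

```hcl
# Two copies of the same module source, each producing its own
# set of resources with different variable values.
module "network_us" {
  source     = "./modules/network"
  cidr_block = "10.0.0.0/16"
}

module "network_eu" {
  source     = "./modules/network"
  cidr_block = "10.1.0.0/16"
}
```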
After adding, removing, or modifying `module` blocks, you must re-run
`terraform init` to allow Terraform the opportunity to adjust the installed
`opentf init` to allow OpenTF the opportunity to adjust the installed
modules. By default this command will not upgrade an already-installed module;
use the `-upgrade` option to instead upgrade to the newest available version.
@ -95,14 +93,14 @@ module "consul" {
}
```
The `version` argument accepts a [version constraint string](/terraform/language/expressions/version-constraints).
Terraform will use the newest installed version of the module that meets the
The `version` argument accepts a [version constraint string](/opentf/language/expressions/version-constraints).
OpenTF will use the newest installed version of the module that meets the
constraint; if no acceptable versions are installed, it will download the newest
version that meets the constraint.
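As a minimal sketch, a constraint string can pin a registry module to a range of acceptable releases:

```hcl
module "consul" {
  source = "hashicorp/consul/aws"

  # Accept any release at or above 0.0.5 but below 0.1.0;
  # the newest installed version in that range is used.
  version = ">= 0.0.5, < 0.1.0"
}
```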
Version constraints are supported only for modules installed from a module
registry, such as the public [Terraform Registry](https://registry.terraform.io/)
or [Terraform Cloud's private module registry](/terraform/cloud-docs/registry).
registry, such as the [Public Terraform Registry](https://registry.terraform.io/)
or any TACOS (TF Automation and Collaboration Software) private module registry.
Other module sources can provide their own versioning mechanisms within the
source string itself, or might not support versions at all. In particular,
modules sourced from local file paths do not support `version`; since
@ -111,36 +109,36 @@ version as their caller.
### Meta-arguments
Along with `source` and `version`, Terraform defines a few more
Along with `source` and `version`, OpenTF defines a few more
optional meta-arguments that have special meaning across all modules,
described in more detail in the following pages:
- `count` - Creates multiple instances of a module from a single `module` block.
See [the `count` page](/terraform/language/meta-arguments/count)
See [the `count` page](/opentf/language/meta-arguments/count)
for details.
- `for_each` - Creates multiple instances of a module from a single `module`
block. See
[the `for_each` page](/terraform/language/meta-arguments/for_each)
[the `for_each` page](/opentf/language/meta-arguments/for_each)
for details.
- `providers` - Passes provider configurations to a child module. See
[the `providers` page](/terraform/language/meta-arguments/module-providers)
[the `providers` page](/opentf/language/meta-arguments/module-providers)
for details. If not specified, the child module inherits all of the default
(un-aliased) provider configurations from the calling module.
- `depends_on` - Creates explicit dependencies between the entire
module and the listed targets. See
[the `depends_on` page](/terraform/language/meta-arguments/depends_on)
[the `depends_on` page](/opentf/language/meta-arguments/depends_on)
for details.
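As a brief sketch of the `for_each` meta-argument on a module (module and variable names here are hypothetical), one `module` block can expand into an instance per collection element:

```hcl
# Creates module.bucket["assets"] and module.bucket["media"],
# each a separate instance of the same child module.
module "bucket" {
  source   = "./modules/bucket"
  for_each = toset(["assets", "media"])

  name = each.key
}
```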
Terraform does not use the `lifecycle` argument. However, the `lifecycle` block is reserved for future versions.
OpenTF does not use the `lifecycle` argument. However, the `lifecycle` block is reserved for future versions.
## Accessing Module Output Values
The resources defined in a module are encapsulated, so the calling module
cannot access their attributes directly. However, the child module can
declare [output values](/terraform/language/values/outputs) to selectively
declare [output values](/opentf/language/values/outputs) to selectively
export certain values to be accessed by the calling module.
For example, if the `./app-cluster` module referenced in the example above
@ -156,40 +154,40 @@ resource "aws_elb" "example" {
```
For more information about referring to named values, see
[Expressions](/terraform/language/expressions).
[Expressions](/opentf/language/expressions).
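A minimal sketch of the two halves of this pattern (the output and attribute names are hypothetical):

```hcl
# In the child module, e.g. ./app-cluster/outputs.tf:
output "instance_ids" {
  value = aws_instance.cluster[*].id
}

# In the calling module, the exported value is then
# available as module.app_cluster.instance_ids.
```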
## Transferring Resource State Into Modules
Moving `resource` blocks from one module into several child modules causes
Terraform to see the new location as an entirely different resource. As a
result, Terraform plans to destroy all resource instances at the old address
OpenTF to see the new location as an entirely different resource. As a
result, OpenTF plans to destroy all resource instances at the old address
and create new instances at the new address.
To preserve existing objects, you can use
[refactoring blocks](/terraform/language/modules/develop/refactoring) to record the old and new
addresses for each resource instance. This directs Terraform to treat existing
[refactoring blocks](/opentf/language/modules/develop/refactoring) to record the old and new
addresses for each resource instance. This directs OpenTF to treat existing
objects at the old addresses as if they had originally been created at the
corresponding new addresses.
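A minimal sketch of such a refactoring block (resource and module names hypothetical), recording that a resource formerly at a root-module address is now managed inside a child module:

```hcl
moved {
  from = aws_instance.example
  to   = module.app_cluster.aws_instance.example
}
```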
## Replacing resources within a module
You may have an object that needs to be replaced with a new object for a reason
that isn't automatically visible to Terraform, such as if a particular virtual
that isn't automatically visible to OpenTF, such as if a particular virtual
machine is running on degraded underlying hardware. In this case, you can use
[the `-replace=...` planning option](/terraform/cli/commands/plan#replace-address)
to force Terraform to propose replacing that object.
[the `-replace=...` planning option](/opentf/cli/commands/plan#replace-address)
to force OpenTF to propose replacing that object.
If the object belongs to a resource within a nested module, specify the full
path to that resource including all of the nested module steps leading to it.
For example:
```shellsession
$ terraform plan -replace=module.example.aws_instance.example
$ opentf plan -replace=module.example.aws_instance.example
```
The above selects a `resource "aws_instance" "example"` declared inside a
`module "example"` child module declared inside your root module.
Because replacing is a very disruptive action, Terraform only allows selecting
Because replacing is a very disruptive action, OpenTF only allows selecting
individual resource instances. There is no syntax to force replacing _all_
resource instances belonging to a particular module.

View File

@ -6,17 +6,17 @@ description: Part of the ongoing design research for module integration testing.
# Module Testing Experiment
This page is about some experimental features available in recent versions of
Terraform CLI related to integration testing of shared modules.
OpenTF CLI related to integration testing of shared modules.
The Terraform team is aiming to use these features to gather feedback as part
of ongoing research into different strategies for testing Terraform modules.
The OpenTF team is aiming to use these features to gather feedback as part
of ongoing research into different strategies for testing modules.
These features are likely to change significantly in future releases based on
feedback.
## Current Research Goals
Our initial area of research is into the question of whether it's helpful and
productive to write module integration tests in the Terraform language itself,
productive to write module integration tests in the OpenTF language itself,
or whether it's better to handle that as a separate concern orchestrated by
code written in other languages.
@ -24,36 +24,29 @@ Some existing efforts have piloted both approaches:
* [Terratest](https://terratest.gruntwork.io/) and
[kitchen-terraform](https://github.com/newcontext-oss/kitchen-terraform)
both pioneered the idea of writing tests for Terraform modules with explicit
both pioneered the idea of writing tests for modules with explicit
orchestration written in the Go and Ruby programming languages, respectively.
* The Terraform provider
* The OpenTF provider
[`apparentlymart/testing`](https://registry.terraform.io/providers/apparentlymart/testing/latest)
introduced the idea of writing Terraform module tests in the Terraform
introduced the idea of writing module tests in the OpenTF
language itself, using a special provider that can evaluate assertions
and fail `terraform apply` if they don't pass.
and fail `opentf apply` if they don't pass.
Both of these approaches have advantages and disadvantages, and so it's
likely that both will coexist for different situations, but the community
efforts have already explored the external-language testing model quite deeply
while the Terraform-integrated testing model has not yet been widely trialled.
while the OpenTF-integrated testing model has not yet been widely trialled.
For that reason, the current iteration of the module testing experiment is
aimed at trying to make the Terraform-integrated approach more accessible so
aimed at trying to make the OpenTF-integrated approach more accessible so
that more module authors can hopefully try it and share their experiences.
## Current Experimental Features
-> This page describes the incarnation of the experimental features introduced
in **Terraform CLI v0.15.0**. If you are using an earlier version of Terraform
then you'll need to upgrade to v0.15.0 or later to use the experimental features
described here, though you only need to use v0.15.0 or later for running tests;
your module itself can remain compatible with earlier Terraform versions, if
needed.
Our current area of interest is in what sorts of tests can and cannot be
written using features integrated into the Terraform language itself. As a
means to investigate that without invasive, cross-cutting changes to Terraform
Core we're using a special built-in Terraform provider as a placeholder for
written using features integrated into the OpenTF language itself. As a
means to investigate that without invasive, cross-cutting changes to OpenTF
Core we're using a special built-in OpenTF provider as a placeholder for
potential new features.
If this experiment is successful then we expect to run a second round of
@ -61,17 +54,17 @@ research and design about exactly what syntax is most ergonomic for writing
tests, but for the moment we're interested less in the specific syntax and more
in the capabilities of this approach.
The temporary extensions to Terraform for this experiment consist of the
The temporary extensions to OpenTF for this experiment consist of the
following parts:
* A temporary experimental provider `terraform.io/builtin/test`, which acts as
a placeholder for potential new language features related to test assertions.
* A `terraform test` command for more conveniently running multiple tests in
* An `opentf test` command for more conveniently running multiple tests in
a single action.
* An experimental convention of placing test configurations in subdirectories
of a `tests` directory within your module, which `terraform test` will then
of a `tests` directory within your module, which `opentf test` will then
discover and run.
We would like to invite adventurous module authors to try writing integration
@ -87,7 +80,7 @@ likely not be required in a final design.
### Writing Tests for a Module
For the purposes of the current experiment, module tests are arranged into
_test suites_, each of which is a root Terraform module which includes a
_test suites_, each of which is a root OpenTF module which includes a
`module` block calling the module under test, and ideally also a number of
test assertions to verify that the module outputs match expectations.
@ -119,7 +112,7 @@ syntax for defining test assertions. For example:
terraform {
required_providers {
# Because we're currently using a built-in provider as
# a substitute for dedicated Terraform language syntax
# a substitute for dedicated OpenTF language syntax
# for now, test suite modules must always declare a
# dependency on this provider. This provider is only
# available when running tests, so you shouldn't use it
@ -144,7 +137,7 @@ module "main" {
# this module, so it doesn't set any input variables
# and just lets their default values be selected instead.
}
# As with all Terraform modules, we can use local values
# As with all OpenTF modules, we can use local values
# to do any necessary post-processing of the results from
# the module in preparation for writing test assertions.
locals {
@ -179,7 +172,7 @@ resource "test_assertions" "api_url" {
}
# We can also use data resources to respond to the
# behavior of the real remote system, rather than
# just to values within the Terraform configuration.
# just to values within the OpenTF configuration.
data "http" "api_response" {
depends_on = [
# make sure the syntax assertions run first, so
@ -206,7 +199,7 @@ in the expected way.
### Running Your Tests
The `terraform test` command aims to make it easier to exercise all of your
The `opentf test` command aims to make it easier to exercise all of your
defined test suites at once, and see only the output related to any test
failures or errors.
@ -217,7 +210,7 @@ suite directory containing `test_defaults.tf`.
Because these test suites are integration tests rather than unit tests, you'll
need to set up any credentials files or environment variables needed by the
providers your module uses before running `terraform test`. The test command
providers your module uses before running `opentf test`. The test command
will, for each suite:
* Install the providers and any external modules the test configuration depends
@ -227,10 +220,10 @@ on.
* Collect all of the test results from the apply step, which would also have
"created" the `test_assertions` resources.
* Destroy all of the objects recorded in the temporary test state, as if running
`terraform destroy` against the test configuration.
`opentf destroy` against the test configuration.
```shellsession
$ terraform test
$ opentf test
─── Failed: defaults.api_url.scheme (default scheme is https) ───────────────
wrong value
got: "http"
@ -240,19 +233,19 @@ wrong value
In this case, it seems like the module returned an `http` rather than an
`https` URL in the default case, and so the `defaults.api_url.scheme`
assertion failed, and the `terraform test` command detected and reported it.
assertion failed, and the `opentf test` command detected and reported it.
The `test_assertions` resource captures any assertion failures but does not
return an error, so that downstream assertions can still run and capture
as much context as possible.
However, if Terraform encounters any _errors_ while processing the test
However, if OpenTF encounters any _errors_ while processing the test
configuration it will halt processing, which may cause some of the test
assertions to be skipped.
## Known Limitations
The design above is very much a prototype aimed at gathering more experience
with the possibilities of testing inside the Terraform language. We know it's
with the possibilities of testing inside the OpenTF language. We know it's
currently somewhat non-ergonomic, and hope to improve on that in later phases
of research and design, but the main focus of this iteration is on available
functionality and so with that in mind there are some specific possibilities
@ -270,11 +263,11 @@ failure if it doesn't. We'll hopefully be able to improve on this in a future
iteration with the test assertions better integrated into the language.
* Capturing context about failures. Due to this prototype using a provider as
an approximation for new assertion syntax, the `terraform test` command is
an approximation for new assertion syntax, the `opentf test` command is
limited in how much context it's able to gather about failures. A design
more integrated into the language could potentially capture the source
expressions and input values to give better feedback about what went wrong,
similar to what Terraform typically returns from expression evaluation errors
similar to what OpenTF typically returns from expression evaluation errors
in the main language.
* Unit testing without creating real objects. Although we do hope to spend more
@ -300,12 +293,6 @@ well in each and what didn't work so well.
Our ultimate goal is to work towards an integration testing methodology which
strikes the best compromise between the capabilities of these different
approaches, ideally avoiding a hard requirement on any particular external
language and fitting well into the Terraform workflow.
language and fitting well into the OpenTF workflow.
Since this is still early work and likely to lead to unstructured discussion,
we'd like to gather feedback primarily via new topics in
[the community forum](https://discuss.hashicorp.com/c/terraform-core/27). That
way we can have some more freedom to explore different ideas and approaches
without the structural requirements we typically impose on GitHub issues.
Any feedback you'd like to share would be very welcome!