backend/remote: Support HCL variable values in local operations

For remote operations, the remote system (Terraform Cloud or Enterprise)
writes the stored variable values into a .tfvars file before running the
remote copy of Terraform CLI.

By contrast, for operations that only run locally (like
"terraform import"), we fetch the stored variable values from the remote
API and add them to the set of available variables directly as part of
creating the local execution context.

Previously, in the local-only case, we assumed that all stored variables
are strings, which isn't true: the Terraform Cloud/Enterprise UI allows
users to specify that a particular variable is given as an HCL
expression, in which case the correct behavior is to parse and evaluate
the expression to obtain the final value.

This also addresses a related issue whereby we previously forced all
sensitive values to be represented as the special string "<sensitive>".
That led to type-checking errors for any variable declared with a type
other than string, so instead we now use an unknown value as a
placeholder so that type checking can pass.

Unpopulated sensitive values may still cause errors downstream, so we
also produce a warning for each of them to let the user know that those
variables are not available for local-only operations. It's a warning
rather than an error so that operations that don't rely on known values
for those variables can still complete successfully.

This can produce errors in situations that were silently ignored before:
if a remote variable is marked as HCL syntax but is not valid HCL, it
will now fail parsing at this early stage, whereas previously it would
have passed through as a string and failed only if the operation tried
to interpret it as a non-string. However, in situations like these,
remote operations like "terraform plan" would already have been failing
with an equivalent error message, so it's unlikely that any existing
workspace in routine use has such a broken configuration.
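
The sketch below is a hypothetical illustration, not the code from this
change: it shows one way a stored workspace variable could be turned into
a value for local use, assuming the "fmt", hcl/v2, hclsyntax, and go-cty
packages; the function name storedVariableValueSketch is invented for the
example.

    // Hypothetical sketch only - not the backend's actual implementation.
    func storedVariableValueSketch(name, raw string, isHCL, sensitive bool) (cty.Value, hcl.Diagnostics) {
        if sensitive {
            // The API redacts sensitive values, so stand in an unknown value
            // that still satisfies type checking; the backend described above
            // also emits a warning that the value is unavailable locally.
            if isHCL {
                return cty.DynamicVal, nil // unknown value of unknown type
            }
            return cty.UnknownVal(cty.String), nil
        }
        if !isHCL {
            // Values not flagged as HCL are always plain strings.
            return cty.StringVal(raw), nil
        }
        // HCL-flagged values are parsed and evaluated as literal expressions.
        // A nil EvalContext means references to other symbols and calls to
        // functions fail here rather than later in the operation.
        expr, diags := hclsyntax.ParseExpression([]byte(raw), fmt.Sprintf("<value for var.%s>", name), hcl.Pos{Line: 1, Column: 1})
        if diags.HasErrors() {
            return cty.DynamicVal, diags
        }
        val, valDiags := expr.Value(nil)
        return val, append(diags, valDiags...)
    }

Evaluating with a nil EvalContext is what makes stored expressions that
contain references or function calls fail; the backend wraps such failures
in its own "Invalid expression for var..." diagnostic, which is the text
the tests below assert on.
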
package remote

import (
	"context"
	"reflect"
	"testing"

	tfe "github.com/hashicorp/go-tfe"
	"github.com/zclconf/go-cty/cty"

	"github.com/hashicorp/terraform/internal/backend"
	"github.com/hashicorp/terraform/internal/command/arguments"
	"github.com/hashicorp/terraform/internal/command/clistate"
	"github.com/hashicorp/terraform/internal/command/views"
	"github.com/hashicorp/terraform/internal/configs"
	"github.com/hashicorp/terraform/internal/initwd"
	"github.com/hashicorp/terraform/internal/states/statemgr"
	"github.com/hashicorp/terraform/internal/terminal"
	"github.com/hashicorp/terraform/internal/terraform"
	"github.com/hashicorp/terraform/internal/tfdiags"
)
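
// TestRemoteStoredVariableValue exercises remoteStoredVariableValue's
// ParseVariableValue for values stored in the remote workspace: literal
// strings, HCL-flagged expressions, sensitive values (which become unknown
// placeholders), and HCL values that fail to parse or evaluate.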
func TestRemoteStoredVariableValue(t *testing.T) {
	tests := map[string]struct {
		Def       *tfe.Variable
		Want      cty.Value
		WantError string
	}{
		"string literal": {
			&tfe.Variable{
				Key:       "test",
				Value:     "foo",
				HCL:       false,
				Sensitive: false,
			},
			cty.StringVal("foo"),
			``,
		},
		"string HCL": {
			&tfe.Variable{
				Key:       "test",
				Value:     `"foo"`,
				HCL:       true,
				Sensitive: false,
			},
			cty.StringVal("foo"),
			``,
		},
		"list HCL": {
			&tfe.Variable{
				Key:       "test",
				Value:     `[]`,
				HCL:       true,
				Sensitive: false,
			},
			cty.EmptyTupleVal,
			``,
		},
		"null HCL": {
			&tfe.Variable{
				Key:       "test",
				Value:     `null`,
				HCL:       true,
				Sensitive: false,
			},
			cty.NullVal(cty.DynamicPseudoType),
			``,
		},
		"literal sensitive": {
			&tfe.Variable{
				Key:       "test",
				HCL:       false,
				Sensitive: true,
			},
			cty.UnknownVal(cty.String),
			``,
		},
		"HCL sensitive": {
			&tfe.Variable{
				Key:       "test",
				HCL:       true,
				Sensitive: true,
			},
			cty.DynamicVal,
			``,
		},
		"HCL computation": {
			// This (stored expressions containing computation) is not a case
			// we intentionally supported, but it became possible for remote
			// operations in Terraform 0.12 (due to Terraform Cloud/Enterprise
			// just writing the HCL verbatim into generated `.tfvars` files).
			// We support it here for consistency, and we continue to support
			// it in both places for backward-compatibility. In practice,
			// there's little reason to do computation in a stored variable
			// value because references are not supported.
			&tfe.Variable{
				Key:       "test",
				Value:     `[for v in ["a"] : v]`,
				HCL:       true,
				Sensitive: false,
			},
			cty.TupleVal([]cty.Value{cty.StringVal("a")}),
			``,
		},
		"HCL syntax error": {
			&tfe.Variable{
				Key:       "test",
				Value:     `[`,
				HCL:       true,
				Sensitive: false,
			},
			cty.DynamicVal,
			`Invalid expression for var.test: The value of variable "test" is marked in the remote workspace as being specified in HCL syntax, but the given value is not valid HCL. Stored variable values must be valid literal expressions and may not contain references to other variables or calls to functions.`,
		},
		"HCL with references": {
			&tfe.Variable{
				Key:       "test",
				Value:     `foo.bar`,
				HCL:       true,
				Sensitive: false,
			},
			cty.DynamicVal,
			`Invalid expression for var.test: The value of variable "test" is marked in the remote workspace as being specified in HCL syntax, but the given value is not valid HCL. Stored variable values must be valid literal expressions and may not contain references to other variables or calls to functions.`,
		},
	}

	for name, test := range tests {
		t.Run(name, func(t *testing.T) {
			v := &remoteStoredVariableValue{
				definition: test.Def,
			}
			// This ParseVariableValue implementation ignores the parsing mode,
			// so we'll just always parse literal here. (The parsing mode is
			// selected by the remote server, not by our local configuration.)
			gotIV, diags := v.ParseVariableValue(configs.VariableParseLiteral)
			if test.WantError != "" {
				if !diags.HasErrors() {
					t.Fatalf("missing expected error\ngot: <no error>\nwant: %s", test.WantError)
				}
				errStr := diags.Err().Error()
				if errStr != test.WantError {
					t.Fatalf("wrong error\ngot: %s\nwant: %s", errStr, test.WantError)
				}
			} else {
				if diags.HasErrors() {
					t.Fatalf("unexpected error\ngot: %s\nwant: <no error>", diags.Err().Error())
				}
				got := gotIV.Value
				if !test.Want.RawEquals(got) {
					t.Errorf("wrong result\ngot: %#v\nwant: %#v", got, test.Want)
				}
			}
		})
	}
}
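
// TestRemoteContextWithVars verifies how remote workspace variables surface
// when building a local run: a Terraform-category variable that the root
// module does not declare produces an "undeclared variable" error, while an
// environment-category variable is not treated as an input variable and so
// produces no error.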
func TestRemoteContextWithVars(t *testing.T) {
	catTerraform := tfe.CategoryTerraform
	catEnv := tfe.CategoryEnv

	tests := map[string]struct {
		Opts      *tfe.VariableCreateOptions
		WantError string
	}{
		"Terraform variable": {
			&tfe.VariableCreateOptions{
				Category: &catTerraform,
			},
			`Value for undeclared variable: A variable named "key" was assigned a value, but the root module does not declare a variable of that name. To use this value, add a "variable" block to the configuration.`,
		},
		"environment variable": {
			&tfe.VariableCreateOptions{
				Category: &catEnv,
			},
			``,
		},
	}

	for name, test := range tests {
		t.Run(name, func(t *testing.T) {
			configDir := "./testdata/empty"

			b, bCleanup := testBackendDefault(t)
			defer bCleanup()

			_, configLoader, configCleanup := initwd.MustLoadConfigForTests(t, configDir)
			defer configCleanup()

			workspaceID, err := b.getRemoteWorkspaceID(context.Background(), backend.DefaultStateName)
			if err != nil {
				t.Fatal(err)
			}

			streams, _ := terminal.StreamsForTesting(t)
			view := views.NewStateLocker(arguments.ViewHuman, views.NewView(streams))

			op := &backend.Operation{
				ConfigDir:    configDir,
				ConfigLoader: configLoader,
				StateLocker:  clistate.NewLocker(0, view),
				Workspace:    backend.DefaultStateName,
			}

			v := test.Opts
			if v.Key == nil {
				key := "key"
				v.Key = &key
			}
			b.client.Variables.Create(context.TODO(), workspaceID, *v)

			_, _, diags := b.LocalRun(op)

			if test.WantError != "" {
				if !diags.HasErrors() {
					t.Fatalf("missing expected error\ngot: <no error>\nwant: %s", test.WantError)
				}
				errStr := diags.Err().Error()
				if errStr != test.WantError {
					t.Fatalf("wrong error\ngot: %s\nwant: %s", errStr, test.WantError)
				}

				// When LocalRun returns an error, it should unlock the state,
				// so re-locking it here is expected to succeed.
				stateMgr, _ := b.StateMgr(backend.DefaultStateName)
				if _, err := stateMgr.Lock(statemgr.NewLockInfo()); err != nil {
					t.Fatalf("unexpected error locking state: %s", err.Error())
				}
			} else {
				if diags.HasErrors() {
					t.Fatalf("unexpected error\ngot: %s\nwant: <no error>", diags.Err().Error())
				}

				// When LocalRun succeeds it holds the lock, so locking again
				// should fail with "workspace already locked".
				stateMgr, _ := b.StateMgr(backend.DefaultStateName)
				if _, err := stateMgr.Lock(statemgr.NewLockInfo()); err == nil {
					t.Fatal("unexpected success locking state after LocalRun")
				}
			}
		})
	}
}
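
// TestRemoteVariablesDoNotOverride verifies that variables provided locally
// (here via a fake .tfvars source) take precedence over values stored in the
// remote workspace when both define the same variable name.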
func TestRemoteVariablesDoNotOverride(t *testing.T) {
	catTerraform := tfe.CategoryTerraform

	varName1 := "key1"
	varName2 := "key2"
	varName3 := "key3"

	varValue1 := "value1"
	varValue2 := "value2"
	varValue3 := "value3"

	tests := map[string]struct {
		localVariables    map[string]backend.UnparsedVariableValue
		remoteVariables   []*tfe.VariableCreateOptions
		expectedVariables terraform.InputValues
	}{
		"no local variables": {
			map[string]backend.UnparsedVariableValue{},
			[]*tfe.VariableCreateOptions{
				{
					Key:      &varName1,
					Value:    &varValue1,
					Category: &catTerraform,
				},
				{
					Key:      &varName2,
					Value:    &varValue2,
					Category: &catTerraform,
				},
				{
					Key:      &varName3,
					Value:    &varValue3,
					Category: &catTerraform,
				},
			},
			terraform.InputValues{
				varName1: &terraform.InputValue{
					Value:      cty.StringVal(varValue1),
					SourceType: terraform.ValueFromInput,
					SourceRange: tfdiags.SourceRange{
						Filename: "",
						Start:    tfdiags.SourcePos{Line: 0, Column: 0, Byte: 0},
						End:      tfdiags.SourcePos{Line: 0, Column: 0, Byte: 0},
					},
				},
				varName2: &terraform.InputValue{
					Value:      cty.StringVal(varValue2),
					SourceType: terraform.ValueFromInput,
					SourceRange: tfdiags.SourceRange{
						Filename: "",
						Start:    tfdiags.SourcePos{Line: 0, Column: 0, Byte: 0},
						End:      tfdiags.SourcePos{Line: 0, Column: 0, Byte: 0},
					},
				},
				varName3: &terraform.InputValue{
					Value:      cty.StringVal(varValue3),
					SourceType: terraform.ValueFromInput,
					SourceRange: tfdiags.SourceRange{
						Filename: "",
						Start:    tfdiags.SourcePos{Line: 0, Column: 0, Byte: 0},
						End:      tfdiags.SourcePos{Line: 0, Column: 0, Byte: 0},
					},
				},
			},
		},
		"single conflicting local variable": {
			map[string]backend.UnparsedVariableValue{
				varName3: testUnparsedVariableValue(varValue3),
			},
			[]*tfe.VariableCreateOptions{
				{
					Key:      &varName1,
					Value:    &varValue1,
					Category: &catTerraform,
				}, {
					Key:      &varName2,
					Value:    &varValue2,
					Category: &catTerraform,
				}, {
					Key:      &varName3,
					Value:    &varValue3,
					Category: &catTerraform,
				},
			},
			terraform.InputValues{
				varName1: &terraform.InputValue{
					Value:      cty.StringVal(varValue1),
					SourceType: terraform.ValueFromInput,
					SourceRange: tfdiags.SourceRange{
						Filename: "",
						Start:    tfdiags.SourcePos{Line: 0, Column: 0, Byte: 0},
						End:      tfdiags.SourcePos{Line: 0, Column: 0, Byte: 0},
					},
				},
				varName2: &terraform.InputValue{
					Value:      cty.StringVal(varValue2),
					SourceType: terraform.ValueFromInput,
					SourceRange: tfdiags.SourceRange{
						Filename: "",
						Start:    tfdiags.SourcePos{Line: 0, Column: 0, Byte: 0},
						End:      tfdiags.SourcePos{Line: 0, Column: 0, Byte: 0},
					},
				},
				varName3: &terraform.InputValue{
					Value:      cty.StringVal(varValue3),
					SourceType: terraform.ValueFromNamedFile,
					SourceRange: tfdiags.SourceRange{
						Filename: "fake.tfvars",
						Start:    tfdiags.SourcePos{Line: 1, Column: 1, Byte: 0},
						End:      tfdiags.SourcePos{Line: 1, Column: 1, Byte: 0},
					},
				},
			},
		},
		"no conflicting local variable": {
			map[string]backend.UnparsedVariableValue{
				varName3: testUnparsedVariableValue(varValue3),
			},
			[]*tfe.VariableCreateOptions{
				{
					Key:      &varName1,
					Value:    &varValue1,
					Category: &catTerraform,
				}, {
					Key:      &varName2,
					Value:    &varValue2,
					Category: &catTerraform,
				},
			},
			terraform.InputValues{
				varName1: &terraform.InputValue{
					Value:      cty.StringVal(varValue1),
					SourceType: terraform.ValueFromInput,
					SourceRange: tfdiags.SourceRange{
						Filename: "",
						Start:    tfdiags.SourcePos{Line: 0, Column: 0, Byte: 0},
						End:      tfdiags.SourcePos{Line: 0, Column: 0, Byte: 0},
					},
				},
				varName2: &terraform.InputValue{
					Value:      cty.StringVal(varValue2),
					SourceType: terraform.ValueFromInput,
					SourceRange: tfdiags.SourceRange{
						Filename: "",
						Start:    tfdiags.SourcePos{Line: 0, Column: 0, Byte: 0},
						End:      tfdiags.SourcePos{Line: 0, Column: 0, Byte: 0},
					},
				},
				varName3: &terraform.InputValue{
					Value:      cty.StringVal(varValue3),
					SourceType: terraform.ValueFromNamedFile,
					SourceRange: tfdiags.SourceRange{
						Filename: "fake.tfvars",
						Start:    tfdiags.SourcePos{Line: 1, Column: 1, Byte: 0},
						End:      tfdiags.SourcePos{Line: 1, Column: 1, Byte: 0},
					},
				},
			},
		},
	}

	for name, test := range tests {
		t.Run(name, func(t *testing.T) {
			configDir := "./testdata/variables"

			b, bCleanup := testBackendDefault(t)
			defer bCleanup()

			_, configLoader, configCleanup := initwd.MustLoadConfigForTests(t, configDir)
			defer configCleanup()

			workspaceID, err := b.getRemoteWorkspaceID(context.Background(), backend.DefaultStateName)
			if err != nil {
				t.Fatal(err)
			}

			streams, _ := terminal.StreamsForTesting(t)
			view := views.NewStateLocker(arguments.ViewHuman, views.NewView(streams))

			op := &backend.Operation{
				ConfigDir:    configDir,
				ConfigLoader: configLoader,
				StateLocker:  clistate.NewLocker(0, view),
				Workspace:    backend.DefaultStateName,
				Variables:    test.localVariables,
			}

			for _, v := range test.remoteVariables {
				b.client.Variables.Create(context.TODO(), workspaceID, *v)
			}

			lr, _, diags := b.LocalRun(op)
			if diags.HasErrors() {
				t.Fatalf("unexpected error\ngot: %s\nwant: <no error>", diags.Err().Error())
			}

			// When LocalRun succeeds it holds the lock, so locking again
			// should fail with "workspace already locked".
			stateMgr, _ := b.StateMgr(backend.DefaultStateName)
			if _, err := stateMgr.Lock(statemgr.NewLockInfo()); err == nil {
				t.Fatal("unexpected success locking state after LocalRun")
			}

			actual := lr.PlanOpts.SetVariables
			expected := test.expectedVariables

			for expectedKey := range expected {
				actualValue := actual[expectedKey]
				expectedValue := expected[expectedKey]

				if !reflect.DeepEqual(*actualValue, *expectedValue) {
					t.Fatalf("unexpected variable '%s'\ngot: %v\nwant: %v", expectedKey, actualValue, expectedValue)
				}
			}
		})
	}
}
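
// testUnparsedVariableValue is a test double that always parses to a fixed
// string value attributed to a fake .tfvars file, standing in for a variable
// provided through a local variables file.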
type testUnparsedVariableValue string

func (v testUnparsedVariableValue) ParseVariableValue(mode configs.VariableParsingMode) (*terraform.InputValue, tfdiags.Diagnostics) {
	return &terraform.InputValue{
		Value:      cty.StringVal(string(v)),
		SourceType: terraform.ValueFromNamedFile,
		SourceRange: tfdiags.SourceRange{
			Filename: "fake.tfvars",
			Start:    tfdiags.SourcePos{Line: 1, Column: 1, Byte: 0},
			End:      tfdiags.SourcePos{Line: 1, Column: 1, Byte: 0},
		},
	}, nil
}