Deploy the Terraform MCP server
The Terraform Model Context Protocol (MCP) server enables AI models to generate Terraform configuration using up-to-date information from the Terraform Registry. This page explains how to install, configure, and integrate the MCP server with your AI client.
Overview
The Terraform MCP server is a specialized service that provides AI models with access to current Terraform provider documentation and module information. You can deploy the server to the following environments:
- Local deployment: Run the server on your workstation using `stdio` mode for direct communication through standard input and output.
- Remote deployment: Run the server on a remote instance using `streamable-http` mode for network-based communication.
Secure configuration
Implement the following recommendations to securely deploy the MCP server:
- Hosting: We recommend running the MCP server locally, bound to `127.0.0.1`, using the stdio or streamable HTTP transport protocol to limit public exposure of your Terraform environment. The default transport is stdio. If you host the service remotely, we recommend implementing additional security controls at the application and network layers. See the example after this list.
- CORS: By default, the Terraform MCP server runs in `strict` CORS (cross-origin resource sharing) mode with an empty list of allowed origins. As a result, all cross-origin requests are blocked unless you explicitly configure the server to allow them. Exercise caution when you need to change the allowed origins list.
- Terraform authentication: TLS certificate verification is controlled by the `TFE_SKIP_VERIFY` option. We recommend keeping verification enabled, which is the default, so that communication with your Terraform environment is encrypted and verified. We also recommend limiting the permissions of the `TFE_TOKEN` used to authenticate, as described in the Terraform documentation. Refer to API tokens for more information.
- Rate limiting: We recommend setting global and per-session rate limits to prevent the server or dependent resources from becoming overloaded by excessive requests.
- TLS: When you make your MCP server accessible remotely, we recommend adding a TLS certificate to protect communication in transit.
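The following command is a minimal sketch of the hosting recommendation, using only the flags documented later on this page under Start the server: it binds the streamable HTTP transport to the loopback interface so that only local processes can reach the server. TLS termination, rate limiting, and other controls recommended above would sit in front of it at the proxy or network layer.
# Bind the HTTP transport to loopback so the server is not reachable from other hosts
$ terraform-mcp-server streamable-http \
  --transport-port 8080 \
  --transport-host 127.0.0.1 \
  --mcp-endpoint /mcp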
Installation methods
Choose from three installation options based on your environment and preferences:
Method | Best for | Requirements |
---|---|---|
Docker | Most users, consistent environments | Docker Engine v20.10.21+ or Docker Desktop v4.14.0+. Refer to the Docker documentation for installation instructions. |
Compiled binary | Lightweight deployments, specific OS needs | Compatible operating system |
Source installation | Development, customization | Go development environment |
Requirements
Terraform MCP server v0.3.0 or newer is required to authenticate the MCP server with the registry. You must also obtain an authentication token from HCP Terraform or your Terraform Enterprise deployment so that the MCP server can present it. Refer to API tokens for instructions.
We recommend authenticating the server with the registry to ensure that communication in your Terraform environment is encrypted. Refer to Secure configuration for more information.
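For example, if you plan to run the server binary directly, you might export the variables used throughout this page before starting it. This is only an illustration; the values are placeholders, and the Docker and VS Code sections below show other ways to supply the same values.
# Placeholder values; replace them with your token and Terraform address
$ export TFE_TOKEN="<your-hcp-terraform-or-terraform-enterprise-api-token>"
$ export TFE_HOSTNAME="https://terraform.example.com"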
Run in Docker
Docker provides the most reliable and consistent way to run the Terraform MCP server across different environments. To run the server in Docker, start Docker on your system, then integrate with your AI assistant.
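Before you integrate the server with a client, you can optionally confirm that the image runs on your machine. This quick check is not part of the client setup: the container starts in stdio mode and waits for MCP messages on standard input, so exit with Ctrl+C when you are done.
# Pull the image and start it once to confirm it runs
$ docker pull hashicorp/terraform-mcp-server:0.3.0
$ docker run -i --rm hashicorp/terraform-mcp-server:0.3.0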
Enable authentication
Set the `TFE_TOKEN` environment variable to your HCP Terraform or Terraform Enterprise API token. Set the `TFE_HOSTNAME` environment variable to the URL of your Terraform Enterprise deployment. Refer to the environment variables reference for more information.
- Verify Visual Studio Code is installed.
- Verify the GitHub Copilot extension is installed and chats are configured to `Agent` mode.
- Verify MCP support is enabled. Refer to the VS Code MCP documentation for more information.
To use the MCP server in all workspaces, add the following configuration to your user settings JSON file:
{ "mcp": { "servers": { "terraform": { "command": "docker", "args": [ "run", "-i", "--rm", "-e", "${input:tfe_token}", "-e", "${input:tfe_address}", "hashicorp/terraform-mcp-server:0.3.0" ] } }, "inputs": [ { "type": "promptString", "id": "tfe_token", "description": "Terraform API Token", "password": true }, { "type": "promptString", "id": "tfe_address", "description": "Terraform Address", "password": false } ] } }
Alternatively, to use the server in a specific workspace, create an `mcp.json` file with the following configuration in your workspace's `.vscode` directory:
{
  "servers": {
    "terraform": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "TFE_TOKEN=${input:tfe_token}",
        "-e",
        "TFE_HOSTNAME=${input:tfe_address}",
        "hashicorp/terraform-mcp-server:0.3.0"
      ]
    }
  },
  "inputs": [
    {
      "type": "promptString",
      "id": "tfe_token",
      "description": "Terraform API Token",
      "password": true
    },
    {
      "type": "promptString",
      "id": "tfe_address",
      "description": "Terraform Address",
      "password": false
    }
  ]
}
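If you want to confirm that the token and hostname are accepted before you configure VS Code, you can start the same container manually with the environment variables set. This is a sketch; the values are placeholders, and the variable names are the ones used throughout this page.
# Placeholder values shown; replace them with your token and Terraform address
$ docker run -i --rm \
  -e TFE_TOKEN="<your-api-token>" \
  -e TFE_HOSTNAME="https://terraform.example.com" \
  hashicorp/terraform-mcp-server:0.3.0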
Run without authentication
- Verify Visual Studio Code is installed.
- Verify the GitHub Copilot extension is installed and chats are configured to `Agent` mode.
- Verify MCP support is enabled. Refer to the VS Code MCP documentation for more information.
To use the MCP server in all workspaces, add the following configuration to your user settings JSON file:
{ "mcp": { "servers": { "terraform": { "command": "docker", "args": [ "run", "-i", "--rm", "hashicorp/terraform-mcp-server:0.3.0" ] } } } }
Alternatively, to use the server in a specific workspace, create an `mcp.json` file with the following configuration in your workspace's `.vscode` directory:
{
  "servers": {
    "terraform": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "hashicorp/terraform-mcp-server:0.3.0"
      ]
    }
  }
}
Verify the integration by opening the chat interface and selecting Agent from the mode settings.
Click the tools icon to verify that Terraform MCP server tools appear in the available tools list.
Run the compiled binary
The compiled binary option provides a lightweight installation without Docker dependencies. This method is ideal when you want to minimize resource usage or work in environments with restricted container access.
Download the binary for your operating system and architecture from the release library. Then configure your client settings.
If you downloaded a supported version of the binary, you can enable authentication in the configuration. Otherwise, you can run the server without authentication.
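On Linux and macOS, you typically need to mark the downloaded binary as executable before a client can launch it. The following is a sketch; the extracted filename depends on the release artifact you downloaded, and you can exit the stdio session with Ctrl+C.
# Make the binary executable, then confirm it starts in stdio mode
$ chmod +x ./terraform-mcp-server
$ ./terraform-mcp-server stdio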
Enable authentication
Set the `TFE_TOKEN` environment variable to your HCP Terraform or Terraform Enterprise API token. Set the `TFE_HOSTNAME` environment variable to the URL of your Terraform Enterprise deployment. Refer to the environment variables reference for more information.
Replace `/path/to/terraform-mcp-server` with the path to your downloaded binary.
{
  "mcp": {
    "servers": {
      "terraform": {
        "type": "stdio",
        "command": "/path/to/terraform-mcp-server",
        "env": {
          "TFE_TOKEN": "<tfe-token>"
        }
      }
    }
  }
}
Run without authentication
{
"mcp": {
"servers": {
"terraform": {
"type": "stdio",
"command": "/path/to/terraform-mcp-server"
}
}
}
}
Install from source
Installing from source gives you access to the latest features and allows for customization. This method requires a Go development environment.
Install the latest stable release.
$ go install github.com/hashicorp/terraform-mcp-server/cmd/terraform-mcp-server@latest
Alternatively, you can install the development version from the `main` branch:
$ go install github.com/hashicorp/terraform-mcp-server/cmd/terraform-mcp-server@main
After installation, add the following configuration to your client.
{ "mcp": { "servers": { "terraform": { "command": "/path/to/terraform-mcp-server", "args": ["stdio"] } } } }
Replace `/path/to/terraform-mcp-server` with the actual path to the installed binary. The binary location depends on your Go installation and `GOPATH` configuration.
Use `which terraform-mcp-server` to find the installed binary path.
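For example, with a default Go setup the binary usually lands in `$(go env GOPATH)/bin` (often `$HOME/go/bin`). Either of the following locates it; the `which` variant assumes that directory is on your `PATH`.
# Locate the binary installed by go install
$ which terraform-mcp-server
$ ls "$(go env GOPATH)/bin/terraform-mcp-server"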
Start the server
You can use the `terraform-mcp-server` CLI and specify the transport protocol you want to use to start the server. Refer to the transport protocols reference for more information.
Start the server in `stdio` mode:
$ terraform-mcp-server stdio [--log-file /path/to/log]
Run the following command on the local instance to start the server in `streamable-http` mode:
$ terraform-mcp-server streamable-http \
[--transport-port 8080] \
[--transport-host 127.0.0.1] \
[--mcp-endpoint /mcp] \
[--log-file /path/to/log]
Instead of setting values manually, you can also use the supported environment variables. Refer to the environment variables reference for details.
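When the server runs in `streamable-http` mode, clients connect to the HTTP endpoint instead of launching a local process. The following is a sketch only, assuming a client such as VS Code that supports HTTP-based MCP servers through a `url` entry; check your client's MCP documentation for the exact schema. The address matches the default host, port, and endpoint shown above.
{
  "mcp": {
    "servers": {
      "terraform": {
        "type": "http",
        "url": "http://127.0.0.1:8080/mcp"
      }
    }
  }
}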
Next steps
- Begin prompting your AI model about Terraform configurations. Refer to Prompt an AI model for guidance on effective prompting techniques.
- Ask for help with specific Terraform resources and modules; the server gives your model access to up-to-date provider and module documentation.
- Explore advanced configuration options for your specific deployment needs.