Terraform
Terraform MCP server overview
Learn about the Terraform model context protocol (MCP) server and how it can help you write Terraform configuration using AI.
The Terraform Model Context Protocol (MCP) server enhances AI models with real-time access to current Terraform provider documentation, modules, and policies from the Terraform registry. This ensures AI-generated Terraform configurations use accurate, up-to-date information rather than potentially outdated training data.
What is the Terraform MCP server?
The Model Context Protocol (MCP) is an open standard that enables AI models to securely connect with external tools, applications, and data sources. MCP allows AI models to access information beyond their training data, providing more current and accurate responses.
The Terraform MCP server implements this protocol specifically for Terraform development, offering several key benefits:
- Real-time accuracy: Access current provider documentation instead of relying on potentially outdated training data
- Terraform registry integration: Direct integration with public Terraform registry APIs for providers, modules, and policies
- HCP Terraform and Terraform Enterprise support: Full workspace management, organization and project listing, and private registry access
- Workspace operations: Create, update, and delete workspaces with support for variables, tags, and run management
- AI enhancement: Enables more accurate and actionable Terraform configuration generation
How it works
When you connect an AI model to the Terraform MCP server, the model gains access to specialized tools that can perform the following actions:
- Search and retrieve current provider documentation
- Access module information, including inputs, outputs, and examples
- Find Sentinel policies for governance and compliance
- List HCP Terraform or Terraform Enterprise organizations and workspaces
- Create, read, update, and delete HCP Terraform or Terraform Enterprise workspaces, variables, variable sets, and tags
The AI model uses these tools automatically when you ask questions about Terraform configuration, ensuring responses are based on the most current information available.
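Under the hood, MCP tool calls are JSON-RPC 2.0 requests that the client sends on the model's behalf. The following sketch shows the general shape of a `tools/call` request; the tool name and arguments here are hypothetical placeholders, not the actual tool names exposed by the Terraform MCP server.

```python
import json


def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request as defined by MCP."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })


# "search_provider_docs" is an illustrative tool name only; check the
# Terraform MCP server's tool manifest for the real tool names.
request = make_tool_call(1, "search_provider_docs", {"query": "aws_s3_bucket"})
print(request)
```

In practice you never hand-craft these messages: the MCP client discovers the server's tools and issues requests like this automatically whenever the model decides a tool is needed.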
Deployment architecture
The deployment architecture for running an MCP server includes the following components:
- AI model: An algorithm trained on a large dataset that recognizes patterns, makes predictions, and performs tasks with minimal human intervention.
- MCP host: The AI application or environment in which a language model performs AI-driven tasks and that operates the MCP client, such as Claude Desktop.
- MCP client: An interface that discovers MCP server tools and translates model prompts into executable actions so that the MCP host can communicate with the MCP server.
- MCP server: A service that the MCP client calls to execute various tools, resources, and prompts. It provides a server or tool manifest that allows the model to dynamically discover available capabilities.
- MCP tool: A server-defined executable function or operation, such as a Terraform plan or apply operation, with defined inputs and outputs that clients can call.
- MCP transport: Handles the underlying communication of how messages are sent and received over the JSON-RPC 2.0 protocol. The stdio transport allows the MCP server to invoke tools directly using the standard input/output pipe. The streamable HTTP transport exposes a local server, such as `127.0.0.1:8080`, to receive and respond to MCP tool calls.
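To make the stdio transport concrete: the MCP specification frames stdio messages as newline-delimited JSON, so the host writes one JSON-RPC request per line to the server's standard input and reads one response per line from its standard output. The sketch below simulates that pipe with an in-memory buffer rather than launching a real server process.

```python
import io
import json


def write_message(stream, message: dict) -> None:
    # The MCP stdio transport delimits JSON-RPC messages with newlines,
    # so each message must serialize to a single line.
    stream.write(json.dumps(message) + "\n")


def read_message(stream) -> dict:
    # Read exactly one newline-delimited JSON-RPC message.
    return json.loads(stream.readline())


# Simulate the stdin/stdout pipe between an MCP client and server.
pipe = io.StringIO()
write_message(pipe, {"jsonrpc": "2.0", "id": 1, "method": "tools/list"})
pipe.seek(0)
message = read_message(pipe)
print(message["method"])
```

With a real server, the same `write_message`/`read_message` logic would operate on the subprocess's stdin and stdout pipes instead of a `StringIO` buffer; the streamable HTTP transport carries the same JSON-RPC payloads over HTTP instead.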
Additional resources
- Terraform MCP server repository: For source code, issues, and contributions.
- Terraform MCP server releases: Provides links for downloading prebuilt binaries.