Retrieve CI/CD secrets from Vault
Introduction
CI/CD pipelines require secure access to sensitive data such as API keys, credentials, and certificates. Manually managing these secrets creates security risks and operational overhead, so you should manage them in a centralized secrets store.
This article provides guidance and resources for securing popular CI/CD platforms with Vault, and highlights common authentication and secrets management anti-patterns.
HashiCorp Vault enables centralized secrets management to help secure your CI/CD workflows. Vault can manage identities and authentication with JWT/OIDC, LDAP, TLS certificates, tokens, and usernames and passwords. You can also use Vault to authenticate your CI/CD workloads with major cloud providers such as AWS, Azure, and GCP. This range of support enables you to build flexible workflows and choose how your CI/CD pipelines retrieve data.
HashiCorp resources:
Explore these resources to learn more about Vault authentication methods and best practices around secrets management.
Anti-patterns
Authentication and secrets management are critical components of any CI/CD system, but organizations can inadvertently introduce security risks by adopting anti-patterns around them.
These pitfalls can lead to compromised credentials, exposed data, or hijacking of the CI/CD pipeline itself, undermining the integrity of the entire software delivery process.
Vault can help you to avoid these common authentication and secrets management anti-patterns in your CI/CD pipelines.
Hard-coded secrets
Your CI/CD pipeline should not contain hard-coded secrets, nor should the code that your pipeline runs. Hard-coded secrets are an anti-pattern that puts your pipeline and sensitive information at risk of data exposure or unauthorized access. Refer to the Common Weakness Enumeration (CWE) on use of hard-coded passwords for more information: [CWE-259].
Instead, you should configure your pipelines to use secrets from a secret store. You can use HCP Vault Radar to help mitigate this anti-pattern. Vault Radar is a product that automates the detection and identification of unmanaged secrets in your code so that security teams can take appropriate actions to remediate issues.
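As a minimal sketch of this pattern, the following Vault CLI commands show how a pipeline step can read a secret at runtime instead of hard-coding it. The mount path, secret path, and field names are illustrative assumptions, and the commands assume VAULT_ADDR and a valid authentication method are already configured.

```shell
# An operator stores the secret once, outside the pipeline.
vault kv put secret/ci/deploy api_key="example-api-key"

# A pipeline step reads the secret at runtime instead of hard-coding it.
export API_KEY="$(vault kv get -field=api_key secret/ci/deploy)"
```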
There is a range of secret store choices, but these are the most common types:
- Platform-independent secret stores, like HashiCorp Vault.
- CI/CD native secrets stores, such as secure parameters in TeamCity.
- Cloud provider secret stores, like AWS Secrets Manager, Google Cloud Secret Manager, and Azure Key Vault.
HashiCorp resources:
External resources:
Hard-coded authentication
Do not store Vault authentication tokens or passwords in your code repository. Refer to the Common Weakness Enumeration (CWE) on use of hard-coded credentials for more information: [CWE-798].
Users or processes should instead use a secure authentication method, such as JWT/OIDC, or an external authentication method for dynamic authentication with a lifecycle policy.
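As a hedged sketch of dynamic authentication, the following commands enable the JWT auth method and create a role that CI pipelines can authenticate against. The discovery URL, audience, policy name, and TTL are illustrative assumptions.

```shell
# Enable the JWT auth method (run once by an operator).
vault auth enable jwt

# Trust the CI platform's OIDC issuer (URL is an example).
vault write auth/jwt/config \
  oidc_discovery_url="https://gitlab.example.com" \
  bound_issuer="https://gitlab.example.com"

# Create a role that issues short-lived tokens with a least-privilege policy.
vault write auth/jwt/role/ci-pipeline \
  role_type="jwt" \
  user_claim="sub" \
  bound_audiences="https://vault.example.com" \
  token_policies="ci-read" \
  token_ttl="15m"
```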
Lack of revocation or rotation
When you fail to revoke or rotate (change) keys, tokens, certificates, or other credentials, you leave the pipeline vulnerable to exploitation through the anti-pattern of long-lived secrets. Refer to the CWE on use of key past its expiration date for more information: [CWE-324].
You should establish a time-to-live (TTL) policy for credentials, and configure your CI/CD platform or external secrets manager to enforce it. For example, Vault can automatically revoke credentials when their TTL expires. You can also rotate dynamic credentials and certificates generated by Vault secrets engines.
HashiCorp resources:
- Read documentation on lease, renew, revoke.
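As an illustrative sketch, the following CLI commands show how you might enforce and manage credential lifetimes with Vault leases; the mount path and lease ID are placeholders.

```shell
# Tune a secrets engine so generated credentials default to a short TTL.
vault secrets tune -default-lease-ttl=1h -max-lease-ttl=24h database/

# Renew a lease before it expires (lease ID is a placeholder).
vault lease renew database/creds/readonly/abcd1234

# Revoke a lease immediately when a pipeline finishes with it.
vault lease revoke database/creds/readonly/abcd1234
```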
Static versus dynamic secrets
Static secrets are conventional, long-lived credentials that you manually create and store in Vault. They remain unchanged until explicitly updated or destroyed. In contrast, Vault generates dynamic secrets on-demand when a client requests them. Each client request for dynamic secrets generates a new set of credentials. Dynamic secrets also expire after a specified time. Vault can automatically revoke dynamic secrets when they're expired.
Key differences between static and dynamic secrets:
- Secret creation: Users manually create and store static secrets, while Vault generates dynamic secrets on demand.
- Lifecycle management: You must manually rotate or revoke static secrets, and they tend to be long-lived. Dynamic secrets let you configure an expiration time, and Vault automatically revokes them when they expire.
- Auditing: Dynamic secrets are more readily audited, as each credential is unique and tied to a specific request or user.
- Use cases: Static secrets tend to work better for shared credentials like API keys, whereas dynamic secrets are better suited for database credentials, cloud access, or temporary needs.
Dynamic secrets
When you request access to a secret, Vault generates a dynamic secret on demand. Dynamic secrets don't exist until a user or system reads them, which reduces the risk of theft or unauthorized use by another client. Vault's built-in revocation mechanisms allow it to revoke dynamic secrets after use, minimizing each secret's lifespan.
Vault supports a range of secret engines that integrate with services like CI/CD tools to generate dynamic credentials as needed. These secrets engines are plugins to external services, such as AWS, Azure, GCP, Kubernetes, databases, and more. After enabling a secrets engine, and authenticating Vault to an external resource, users can request credentials from Vault to access the external resource.
Consider a CI/CD pipeline job that needs to retrieve an object from Amazon S3. Instead of using hard-coded AWS credentials in code, plaintext files, or CI/CD environment variables, the pipeline can authenticate to Vault using one of the supported authentication methods. Once authenticated, Vault issues temporary credentials to the CI/CD job that automatically expire when the pipeline completes its task. The pipeline task can use these temporary credentials to securely retrieve the object from S3.
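A hedged sketch of the Vault-side setup for this scenario might look like the following; the role name, bucket, region, and policy document are illustrative assumptions.

```shell
# Enable the AWS secrets engine (run once by an operator).
vault secrets enable aws

# Give Vault credentials it can use to mint temporary IAM credentials.
vault write aws/config/root \
  access_key="$AWS_ACCESS_KEY_ID" \
  secret_key="$AWS_SECRET_ACCESS_KEY" \
  region="us-east-1"

# Define a role that only allows reading objects from one bucket.
vault write aws/roles/s3-read \
  credential_type="iam_user" \
  policy_document='{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Action":["s3:GetObject"],"Resource":"arn:aws:s3:::my-bucket/*"}]}'

# The pipeline job requests short-lived credentials at run time.
vault read aws/creds/s3-read
```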
Use-cases
Vault can generate and manage dynamic short-lived secrets for a range of common use cases. When you implement these dynamic secrets engines in your pipelines, you avoid anti-patterns like long-lived credentials and secret sprawl. Common use cases for dynamic secrets engines include:
- Database access
- Cloud access (for example: AWS, GCP, and Azure)
- SSH
- Authentication credentials
HashiCorp resources:
- Database credential rotation
- Dynamic secrets for AWS authentication for S3 access
- Understand dynamic secrets
- SSH Secrets engine: One-time SSH password
- Schedule-based static role rotation
External resources:
- From Vulnerabilities to Vault: How We Stopped Hard-coding Secrets and Started Using HashiCorp Vault
- External Secrets Operator Vault Dynamic Secret
- GitOps Secrets with Argo CD, HashiCorp Vault and the External Secret Operator
- Use HashiCorp Vault's Dynamic Secrets
Static secrets
Vault enables you to manage static secrets with the KV (key/value) secrets engine. After you enable a KV secrets engine in Vault, users can create static key/value secrets, like passwords, API keys, and certificates. CI/CD pipelines can authenticate with Vault, and retrieve these secrets instead of using secrets stored in code, files, or environment variables.
Consider a CI/CD pipeline job that needs to access a Google service. Vault can use the KV secrets engine to store the Google API key as a static secret. When the pipeline runs, the job authenticates with Vault and retrieves the API key, allowing secure access to the Google service without exposing sensitive credentials in code or configuration files.
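As a minimal sketch of this scenario, assuming a KV version 2 engine mounted at secret/ and illustrative path and field names:

```shell
# Enable a KV v2 secrets engine (run once by an operator).
vault secrets enable -path=secret kv-v2

# Store the Google API key as a static secret.
vault kv put secret/ci/google api_key="example-google-api-key"

# The pipeline job authenticates to Vault, then reads only the field it needs.
export GOOGLE_API_KEY="$(vault kv get -field=api_key secret/ci/google)"
```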
Use-cases
Vault can secure and manage the lifecycle of sensitive information that does not frequently change with static secrets engines. Unlike dynamic secrets, static secrets do not have an associated time to live value, and Vault does not automatically revoke such secrets. Common use cases for Vault static secrets engines include:
- Storing third-party API keys
- Certificates
- Authentication credentials
HashiCorp resources:
GitLab
GitLab uses a JSON Web Token (JWT) to authenticate with Vault to securely access secrets for CI/CD pipelines. Once authenticated, GitLab can pull static secrets from the KV secrets engine, or dynamic secrets from engines such as the AWS secrets engine.
Follow the guidance in Use HashiCorp Vault secrets in GitLab CI/CD to enable your GitLab pipeline to establish authentication, and use secrets in Vault. Review the Using external secrets in CI tutorial to learn more about using Vault secrets engines with your GitLab pipelines.
Static secrets
To use static secrets, reference the secrets:vault keyword in the secrets portion of your gitlab-ci.yml file.
In the following example, the GitLab pipeline automatically authenticates to Vault with an ID token. The pipeline then uses secrets:vault to pull a secret from the Vault KV secrets engine at the path /ops/production/db, and sets the value of the password field as the DATABASE_PASSWORD environment variable. Pipeline jobs can then use the secret stored in the environment variable to authenticate to the corresponding database. Refer to Use Vault secrets in a CI job for further documentation.
job_with_secrets:
  id_tokens:
    # Automatically authenticate to Vault with the GitLab ID token
    VAULT_ID_TOKEN:
      aud: https://vault.example.com
  secrets:
    # Store the secret value in the DATABASE_PASSWORD environment variable
    DATABASE_PASSWORD:
      # Secret path: ops/data/production/db, field: password
      vault: production/db/password@ops
      # Store the value directly in the environment variable, not a file
      file: false
You can also pull static secrets and set them to environment variables from the CLI with manual authentication as shown in the Manual ID Token authentication documentation example.
manual_authentication:
  variables:
    VAULT_ADDR: http://vault.example.com:8200
  image: vault:latest
  id_tokens:
    VAULT_ID_TOKEN:
      aud: http://vault.example.com
  script:
    # Store the secret value in the DATABASE_PASSWORD environment variable
    - export DATABASE_PASSWORD="$(vault kv get -field=password secret/myproject/example/db)"
The following diagram shows the steps a GitLab CI/CD pipeline takes to retrieve a secret from Vault.
Dynamic secrets
GitLab users can also take advantage of Vault dynamic secrets engines. Once you set up JWT authentication to Vault as described above, you can enable a dynamic secrets engine, such as the AWS secrets engine, in Vault. The AWS secrets engine allows GitLab CI/CD jobs to request short-lived dynamic AWS credentials.
The following is an example of using dynamic AWS credentials in a GitLab job.
Note
If you're signing requests to AWS, you may need to set AWS_SESSION_TOKEN in the following example.
read_secrets:
  image: hashicorp/vault:latest
  script:
    # jq must be installed
    # Set the dynamic AWS credentials to the AWS_CREDS variable
    - export AWS_CREDS="$(vault read -format=json aws/creds/my-role)"
    # Use jq to parse AWS_CREDS and set the AWS access_key as AWS_ACCESS_KEY_ID
    - AWS_ACCESS_KEY_ID=$(echo "${AWS_CREDS}" | jq -r .data.access_key)
    - export AWS_ACCESS_KEY_ID
    # Use jq to parse AWS_CREDS and set the AWS secret_key as AWS_SECRET_ACCESS_KEY
    - AWS_SECRET_ACCESS_KEY=$(echo "${AWS_CREDS}" | jq -r .data.secret_key)
    - export AWS_SECRET_ACCESS_KEY
    # Uncomment the next two lines to set a session token if required
    #- AWS_SESSION_TOKEN=$(echo "${AWS_CREDS}" | jq -r .data.security_token)
    #- export AWS_SESSION_TOKEN
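You can verify the jq extraction logic locally, without a Vault server, by running the same filters against a sample payload shaped like the output of vault read -format=json aws/creds/my-role (the values below are fabricated placeholders):

```shell
# Sample JSON shaped like the AWS secrets engine response.
AWS_CREDS='{"data":{"access_key":"AKIAEXAMPLE","secret_key":"wJalrEXAMPLE","security_token":null}}'

# The same jq filters the pipeline uses to extract each field.
AWS_ACCESS_KEY_ID=$(echo "${AWS_CREDS}" | jq -r .data.access_key)
AWS_SECRET_ACCESS_KEY=$(echo "${AWS_CREDS}" | jq -r .data.secret_key)

echo "${AWS_ACCESS_KEY_ID}"
echo "${AWS_SECRET_ACCESS_KEY}"
```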
HashiCorp resources:
External resources:
Guy Barros, a Senior Solutions Engineer at HashiCorp, maintains a repository with Terraform code to automate the JWT auth method integration between HCP Vault Dedicated and GitLab. Barros demonstrates how to use the Terraform code in the Codify your JWT-OIDC Vault auth method with Terraform HashiTalks video.
GitLab Unfiltered - How to integrate GitLab CI with HashiCorp Vault to retrieve secrets (via JWT or "secrets:"), uses AWS Quick Start to launch HashiCorp Vault on AWS, and demonstrates how to set up policies, roles, and authentication to Vault.
GitHub Actions
You can choose from several effective methods to integrate Vault with GitHub Actions to manage pipeline secrets. Use this section to discover approaches to authenticate with Vault and retrieve secrets with GitHub Actions, and to learn more about the solution that best fits your requirements.
Use the GitHub OIDC provider
You can use the GitHub OIDC provider with Vault's JWT auth method to authenticate, and retrieve pipeline secrets from Vault. The HashiCorp Validated Pattern Retrieve Vault secrets from GitHub Actions details the background and best practices for using this approach, along with validated architecture, and complete implementation examples using HCP Vault.
When you use this approach for managing GitHub Actions pipeline secrets, you gain a scalable solution that simplifies management, uses the principle of least privilege, and eliminates long-lived static credentials. You can also audit the complete secrets management solution.
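A hedged sketch of the Vault-side configuration for this approach follows; the repository, audience, policy, and TTL values are illustrative assumptions.

```shell
# Trust GitHub's OIDC issuer (run once by an operator).
vault auth enable jwt
vault write auth/jwt/config \
  oidc_discovery_url="https://token.actions.githubusercontent.com" \
  bound_issuer="https://token.actions.githubusercontent.com"

# Bind a role to one repository so only its workflows can authenticate.
vault write auth/jwt/role/github-actions \
  role_type="jwt" \
  user_claim="actor" \
  bound_claims='{"repository":"my-org/my-repo"}' \
  bound_audiences="https://github.com/my-org" \
  token_policies="ci-read" \
  token_ttl="10m"
```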
HashiCorp resources:
- Retrieve Vault secrets from GitHub Actions
- Using OIDC With HashiCorp Vault and GitHub Actions (video)
- Building Scalable Enterprise Secrets Management with GitHub OIDC and HashiCorp Vault (video)
GitHub resources:
Use Vault Secrets Sync
Vault Secrets Sync provides an alternative solution for managing GitHub Actions pipeline secrets that is useful in cases where using the GitHub OIDC provider is not a feasible approach. This approach simplifies configuration, and allows you to select specific secrets or secret paths for synchronization from Vault to GitHub as repository or environment secrets.
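As a hedged sketch (the endpoint paths and parameters below are assumptions based on the secrets sync API, not verified against your Vault version), you might create a GitHub destination and associate a KV secret with it like this:

```shell
# Create a GitHub sync destination; the token and repository are placeholders.
vault write sys/sync/destinations/gh/ci-secrets \
  access_token="$GITHUB_TOKEN" \
  repository_owner="my-org" \
  repository_name="my-repo"

# Associate a KV v2 secret with the destination so Vault keeps it in sync.
vault write sys/sync/destinations/gh/ci-secrets/associations/set \
  mount="secret" \
  secret_name="ci/google"
```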
When you use this approach you gain several benefits:
- Unlike with other approaches, you do not need to include authentication code in your workflow.
- Secrets Sync works with all runner types without extra configuration.
- When you rotate secrets in Vault, they automatically synchronize to GitHub.
- Simpler developer workflow; access secrets using familiar GitHub Actions syntax without the need to learn Vault details.
- Maintain centralized policy and audit controls.
- Repository-level access control.
The Secrets Sync approach does have some limitations:
- Supports only static key/value (KV) Vault secrets engines.
- Copies secrets into GitHub instead of providing just-in-time access.
- Can require extra management of GitHub personal access tokens or GitHub App tokens for authentication.
HashiCorp resources:
- Developer's Guide to HCP Vault, Part 3: Secrets sync (video)
External resources:
Use the HashiCorp Vault GitHub Action
HashiCorp provides an official Vault GitHub Action that integrates with your GitHub Actions CI/CD pipelines. The Vault GitHub Action supports several auth methods, allowing you to implement the approach that works best with your CI/CD workflow. We recommend using the JWT authentication method with GitHub OIDC tokens or the AppRole authentication method for this solution. Review the Vault GitHub Action documentation auth methods section to learn more.
Once authenticated with Vault, GitHub Actions pipelines can request secrets from any Vault secrets engine that supports retrieval via GET requests. For example, you can use the AWS secrets engine to generate and retrieve dynamic AWS credentials, as this secrets engine uses GET requests for credential retrieval.
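As an illustrative sketch, a workflow job using the Vault GitHub Action with JWT authentication might look like the following; the Vault address, role name, secret paths, and deploy script are assumptions.

```yaml
jobs:
  retrieve-secrets:
    runs-on: ubuntu-latest
    permissions:
      id-token: write   # required to request the GitHub OIDC token
      contents: read
    steps:
      - name: Import secrets from Vault
        uses: hashicorp/vault-action@v3
        with:
          url: https://vault.example.com:8200
          method: jwt
          role: github-actions
          secrets: |
            secret/data/ci/deploy api_key | API_KEY ;
            aws/creds/s3-read access_key | AWS_ACCESS_KEY_ID
      - name: Use the secrets
        run: ./deploy.sh   # API_KEY and AWS_ACCESS_KEY_ID are environment variables
```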
HashiCorp resources:
- Vault GitHub Action
- Vault GitHub Action repository
- Automate workflows with Vault GitHub actions
- Integrate with GitHub Actions
- Learn Vault GitHub Actions (example code)
- Secure Developer Workflows with Vault & GitHub Actions (video)
- Secure GitOps Workflows with GitHub Actions and HashiCorp Vault (video)
External resources:
- Push button security for your GitHub Actions
- How to Use HashiCorp Vault Action
- Vault GitHub Actions example
- Automate Workflows w/ HashiCorp Vault GitHub Actions (video)
Jenkins
Jenkins uses plugins to integrate with third-party tools. Traditionally, you would manage Jenkins secrets for pipelines with Jenkins credential management in the Jenkins Controller. This approach binds all secrets to environment variables, and masks these variables when they appear in pipeline logs. However, this conventional approach can lead to secret sprawl, with external system credentials (like tokens, database credentials, and other pipeline secrets) duplicated in Jenkins. The Jenkins Controller also relies on the Jenkins user database for role-based access control (RBAC) rather than implementing more secure, granular access control lists (ACLs) based on credential paths.
Use a best practice approach
A more secure method to inject and use credentials during pipeline runs is to use API integration with Vault for each pipeline step. When you use this approach in your pipelines, you gain the following:
- Vault manages authorization for stored secrets based on externally authenticated identities.
- The Jenkins Credentials Plugin secures the Vault authentication process.
- Vault auth methods (like AppRole or JWT) benefit from the Credentials Management plugin's binding capabilities.
- Authenticated pipelines can make API calls to Vault, and retrieve just the necessary secrets for completing specific jobs.
This integration enables stronger security controls, while reducing secret sprawl throughout your organization.
Use the Jenkins Vault plugin and other methods
Depending on your security requirements for protecting secrets in Jenkins pipeline logs, you can choose from several approaches to authenticate to Vault:
Jenkins Vault plugin
The Jenkins Vault plugin serves as an authentication helper, and provides secret binding during pipeline execution. This approach offers significant advantages:
- Automatically masks any secret retrieved from Vault in the pipeline logs
- Provides a streamlined and declarative syntax for secret retrieval
- Seamlessly integrates with Jenkins credentials management
CloudBees CI Vault plugin
CloudBees CI provides a shared, centrally managed, self-service experience for development teams running Jenkins either on-premises or in the cloud. You can use the CloudBees HashiCorp Vault Plugin to manage the lifecycle of your CloudBees controller's static or dynamic Vault credentials. Learn how to install and configure the HashiCorp Vault plugin with the Jenkins CLI or Plugin Manager in the CloudBees HashiCorp Vault Plugin documentation.
Here are some example code snippets that demonstrate how to use secrets in CloudBees CI pipelines.
Add a secret for the username and password auth method:
withCredentials([usernamePassword(credentialsId: 'vault-creds', passwordVariable: 'PASS', usernameVariable: 'USER')]) {
    sh 'echo USER=$USER'
    sh 'echo PASS=$PASS'
}
Use a static secret from a Vault key/value secrets engine:
withCredentials([string(credentialsId: 'vault-creds', variable: 'TOKEN')]) {
    sh 'echo TOKEN=$TOKEN'
}
Consult the CloudBees HashiCorp Vault Plugin documentation for more details.
External resources:
Credentials binding with REST API
You can use Jenkins credentials binding to securely manage environment variables, like VAULT_TOKEN, VAULT_ADDR, and VAULT_NAMESPACE. The credentials binding solution provides these features in your pipelines:
- Masks environment variables in the pipeline logs
- Allows you to make REST API calls to Vault with the bound credentials
- Offers more flexibility for complex secret retrieval patterns, where an application might need several types of credentials from different paths, using function abstraction, structured secret organization, secret composition, and Enterprise namespaces.
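As a hedged sketch of this pattern, the following pipeline stage binds a Vault token from Jenkins credentials (so Jenkins masks it in logs) and makes a REST API call to a KV v2 engine; the credential ID, secret path, and deploy script are illustrative assumptions.

```groovy
stage('Fetch secret') {
    steps {
        // Bind the token from Jenkins credentials; Jenkins masks it in logs.
        withCredentials([string(credentialsId: 'vault-token', variable: 'VAULT_TOKEN')]) {
            sh '''
                # REST API call to a KV v2 secrets engine; paths are illustrative.
                API_KEY=$(curl --silent --header "X-Vault-Token: $VAULT_TOKEN" \
                    "$VAULT_ADDR/v1/secret/data/ci/deploy" | jq -r .data.data.api_key)
                ./deploy.sh "$API_KEY"
            '''
        }
    }
}
```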
Vault Agent sidecar
When you implement Vault Agent directly on Jenkins Agents, you create a powerful pattern for accessing secrets across pipelines. Some advantages of using this approach include:
- Removes the need for extra Jenkins plugins
- Works well with ephemeral agents using a Vault Agent sidecar configuration
- Simplifies secret retrieval for containerized workloads
Note
Secrets retrieved from Vault with this method can appear in pipeline logs unless you add specific masking configuration.
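A hedged sketch of a Vault Agent configuration for this pattern follows; the auth method, file paths, and addresses are illustrative assumptions.

```hcl
pid_file = "/tmp/vault-agent.pid"

vault {
  address = "http://vault.example.com:8200"
}

auto_auth {
  method "approle" {
    config = {
      role_id_file_path   = "/etc/vault/role_id"
      secret_id_file_path = "/etc/vault/secret_id"
    }
  }

  # Write the resulting Vault token where pipeline steps can read it.
  sink "file" {
    config = {
      path = "/tmp/vault-token"
    }
  }
}

# Render a secret to a file that the Jenkins job consumes.
template {
  source      = "/etc/vault/db-creds.ctmpl"
  destination = "/tmp/db-creds.env"
}
```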
The approach that best suits your environment depends on your organization's security requirements, infrastructure configuration, and operational preferences.
How to use Vault dynamic secrets in Jenkins
You can use the Jenkins Vault plugin to retrieve dynamic Vault secrets, such as Terraform API tokens, GCP keys, and database credentials, and protect them in the pipeline logs. To do this in your pipeline-as-code definition, specify engine version 1 in the plugin configuration, as in the following example:
stage('My Stage') {
    steps {
        withVault(
            configuration: [
                vaultCredentialId: "my-vault-creds",
                vaultUrl: "http://vault:8200"
            ],
            vaultSecrets: [
                [
                    path: "terraform/creds/tfe-role",
                    engineVersion: 1,
                    secretValues: [
                        [envVar: "tfe_token", vaultKey: "token"]
                    ]
                ]
            ]
        ) {
            sh """
                curl -H "Authorization: Bearer ${env.tfe_token}" \
                     -H "Content-Type: application/vnd.api+json" \
                     -X GET \
                     "https://app.terraform.io/api/v2/workspaces/"
            """
        }
    }
}
If you do not use the Jenkins Vault plugin, you can make a REST API call to Vault and mask the VAULT_TOKEN environment variable in the pipeline logs using Jenkins credentials management. You can also use Vault Agent to manage token caching, avoiding the use of any plugin. In that case, an external process executed on the Jenkins agent, such as an admin or operator pipeline or an init process, can manage the Vault login, retrieve the token value, and store it in the Jenkins credential referenced by the VAULT_TOKEN environment variable.
Community resources:
- Learn how to secure CI/CD pipeline secrets in Jenkins on Kubernetes in this project by David Cañadillas, which includes code examples.
CircleCI
CircleCI uses OIDC tokens to authenticate with Vault. Vault supports OIDC and provides a lab to help practitioners learn Vault's OIDC auth method. We recommend completing the lab before setting up the CircleCI Vault integration.
Video resources:
- Rosemary Wang (Developer Advocate, HashiCorp) pairs with Angel Rivera (Developer Advocate, CircleCI) to inject static secrets from HashiCorp Vault into a CircleCI pipeline in their HashiTalks video.
TeamCity
TeamCity supports native parameters and tokens to secure CI/CD pipelines, and also features the ability to add extra security through external secret managers like Vault. You can use the TeamCity Vault plugin to store sensitive values in Vault KV secrets engines.
You can set up a Vault connection to authenticate with either the AppRole or LDAP auth methods. Here's example code to set up a connection with AppRole, defined in TeamCity's Kotlin domain-specific language (DSL):
project {
    features {
        hashiCorpVaultConnection {
            id = "HashiCups"
            name = "HashiCorp Vault"
            url = "http://127.0.0.1:8200/"
            vaultNamespace = "enterprise/vault/namespace"
            authMethod = appRole {
                roleId = "..."
                secretId = "..."
            }
        }
    }
}
The connection details specify metadata, the Vault server address, an optional Vault Enterprise namespace, and the AppRole auth method roleID and secretID values.
After you define the connection, you can define a parameter to use a Vault secret in your TeamCity pipeline. Here is an example that uses a static secret and sets it as the pipeline environment variable AWS_ACCESS_KEY_ID:
project {
    params {
        hashiCorpVaultParameter {
            name = "env.AWS_ACCESS_KEY_ID"
            query = "secret/data/awscreds!/access_key"
            vaultId = "HashiCups"
        }
    }
}
You can learn more in the TeamCity HashiCorp Vault integration documentation.
External resources: