Well-Architected Framework
Manage leaked secrets
In a technology-enabled world, almost everything relies on secure access to systems, whether you are checking your email, swiping a loyalty card at your favorite store, or receiving a payroll deposit from your company in your bank account. These systems and processes have one thing in common: they all require some sort of secret to access them and to ensure data confidentiality, integrity, and availability. A secret can be a password, an API key, or a JSON Web Token (JWT). Secrets authenticate the people accessing a system and the different systems interacting with each other.
Developer productivity relies on the ability to securely access and use secrets to connect applications and services. If teams are unable to efficiently access secrets, the likelihood of secrets being leaked increases.
A secret is considered leaked when it is published in a location it was not intended for. For example, an API key is leaked if it is committed to a Git repository or added to documentation in plain text. The API key is now potentially accessible to anyone with access to the repository or documentation, exposing the system to unauthorized access.
If a secret is leaked, the people, systems, and organizations that rely on the secure exchange of information are at risk. To manage secrets properly, store them in a secure location, verify they are used as intended, and ensure they are never stored or written down in plain text.
This article provides guidance and resources to help you prevent leaked secrets, identify them, and remediate the incident if a secret leaks.
How do secrets leak?
Some of the main ways organizations leak secrets are code exposure, process friction, insider threats, and trusting third-party tools such as AI or data processing software.
Code repositories are one of the most common sources of secret exposure. Developers accidentally commit API keys, passwords, and tokens directly into source code. Once secrets are in code and pushed to repositories (whether public or private), they become accessible to anyone with repository access and remain in the commit history until someone removes them.
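For illustration, the following Go sketch contrasts the hardcoded-credential anti-pattern with reading the value from the environment at runtime. The variable names, environment variable, and endpoint are hypothetical placeholders, not values from this article.

```go
package main

import (
	"fmt"
	"net/http"
	"os"
)

func main() {
	// Anti-pattern: a key hardcoded like this is captured in the Git history
	// and readable by anyone with access to the repository.
	// apiKey := "sk_live_placeholder_not_a_real_key"

	// Safer: load the secret from the environment (or a secrets manager) at
	// runtime so it never appears in source control.
	apiKey := os.Getenv("PAYMENTS_API_KEY")
	if apiKey == "" {
		fmt.Fprintln(os.Stderr, "PAYMENTS_API_KEY is not set")
		os.Exit(1)
	}

	// Attach the secret to an outbound request instead of embedding it in code.
	req, err := http.NewRequest("GET", "https://api.example.com/v1/charges", nil)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	req.Header.Set("Authorization", "Bearer "+apiKey)

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer resp.Body.Close()
	fmt.Println("request completed with status:", resp.Status)
}
```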
According to GitHub, over 39 million secrets were leaked in GitHub repositories in 2024. Bad actors automatically scan public code repositories, looking for leaked secrets, and use these secrets to gain access to systems.
Process friction occurs when securely accessing a secret is so difficult that people find ways to bypass the process to retrieve the secret. Process friction harms operational excellence and effectively lowers an organization's security posture.
The following are examples of process friction:
- Overly complex or lengthy approval processes.
- Lack of documentation or training on how to securely access a secret.
- Lack of centralized management services, such as secrets management or single sign-on (SSO).
- Secret sprawl, where teams manage secrets in several locations with no single source of truth.
Implementing systems to solve these problems can improve operational excellence and strengthen your organization's security posture. Some ways to mitigate process friction include centralizing secrets management with a system like Vault, creating clear documentation and training for creating and accessing secrets, and reducing the risk of human error by automating the use of secrets through CI/CD systems. The sketch below shows what the centralized approach can look like in application code.
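As a minimal sketch of centralized secrets management, the following Go example reads a secret from Vault's KV version 2 engine with the official client library. It assumes VAULT_ADDR and VAULT_TOKEN are set in the environment, and the mount ("secret"), path ("myapp/config"), and key ("api_key") are illustrative, not prescribed by this article.

```go
package main

import (
	"context"
	"fmt"
	"log"

	vault "github.com/hashicorp/vault/api"
)

func main() {
	// The client picks up VAULT_ADDR and VAULT_TOKEN from the environment,
	// so no credentials are hardcoded in the application.
	config := vault.DefaultConfig()
	client, err := vault.NewClient(config)
	if err != nil {
		log.Fatalf("unable to create Vault client: %v", err)
	}

	// Read a secret from the KV v2 engine mounted at "secret".
	secret, err := client.KVv2("secret").Get(context.Background(), "myapp/config")
	if err != nil {
		log.Fatalf("unable to read secret: %v", err)
	}

	apiKey, ok := secret.Data["api_key"].(string)
	if !ok {
		log.Fatal("api_key is missing or not a string")
	}

	// Use the secret at runtime; never write it to source control or logs.
	fmt.Println("retrieved API key of length", len(apiKey))
}
```

Because the application fetches the secret at runtime from a single source of truth, rotating the credential in Vault takes effect without code changes or redeploying configuration files.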
Another cause of leaked secrets is insider threats. An insider threat occurs when a current or former employee, either intentionally or unintentionally, leaks sensitive information.
An intentional insider threat is when someone purposely steals sensitive material, such as an employee taking private encryption keys. An unintentional insider threat is when someone accidentally exposes sensitive information, such as pushing a secret to a public repository or being socially engineered into providing credentials to bad actors.
The chance of insider threats increases dramatically when there is process friction. Consider an environment that does not use single sign-on and requires users to have different logins for each system. When a user leaves the organization, the operations team has to shut down the user's account in every system. If the account for one system is missed, that user can still access the system and its associated data.
A newer cause of leaked secrets is the improper use of AI or other data processing tools. AI increases developer productivity, but it also increases the risk of leaked secrets. A developer may use an AI tool to generate code that requires a secret. The AI tool may include the secret in plain text, and the developer unintentionally pushes that secret to a code repository.
An AI tool might also act with bad intent, using the code you provide for nefarious purposes, or it might lack adequate security for its server-side processing. Unless your organization self-hosts its AI systems, you should never pass sensitive information to an AI system. If you must provide it with a key, do so in a development environment and rotate that key when you are done.
Always consult with the security team at your organization before using AI tools.
Impact on the organization
Along with the technical implications of leaked secrets, there are also major business and professional implications. When organizations leak secrets, they often experience a loss of trust. Losing trust can lead to application downtime and the loss of new and existing customers, which may result in lost revenue.
From 2015 through 2024, the three major credit bureaus all suffered major data breaches that exposed the personal information of millions of people. These breaches not only caused a loss of trust in the credit bureaus, they also led to the United States government passing the Economic Growth, Regulatory Relief, and Consumer Protection Act. This law requires the credit bureaus to provide free credit freezes and fraud alerts to consumers, among other requirements. Credit freezes had previously been a source of revenue for the credit bureaus.
A popular cryptocurrency exchange suffered a data breach stemming from insider threats. The company expects to pay up to $400 million to make up for the loss of customer funds.
In addition to the loss of revenue, individuals have been fired, fined, sued, and even sentenced to prison for attempting to cover up data breaches.
HashiCorp resources:
- Centralized secrets management with HashiCorp Vault: learn to use Vault client libraries inside your application code to store and retrieve your first secret value.
- HashiCorp Vault Enterprise secrets sync
External resources:
- Google SRE book
- Microsoft Employees Exposed Own Company's Internal Logins
- AI programming copilots are worsening code security and leaking more secrets
- GitHub found 39M secret leaks in 2024
- IBM Security - The cost of a data breach
- Thomson Reuters - The cost of data breaches
- 7 security incidents that cost CISOs their jobs
- Economic Growth, Regulatory Relief, and Consumer Protection Act
- Coinbase data breach disclosure
Next steps
In this overview, you learned how secrets leak in your organization. Refer to the following documents to learn more about how to prevent, identify, and remediate leaked secrets: