Breaking Up With Legacy Security Solutions Can Create Peace Between Security and DevOps
Automation is woven into the strands of DevOps DNA. Any disruption, such as manual deployment or configuration steps, can significantly damage automation and speed, grinding a DevOps process to a halt.
Unfortunately, security doesn’t make it easier on development and operations teams. Incorporating security into the DevOps process can be complicated, cumbersome, and riddled with manual tasks and configurations. The delays can create tension between security and DevOps teams, leading the latter to avoid or bypass the security process altogether. However, without proper attention to security in the DevOps process, organizations open themselves up to cyberattacks due to inadequate controls around privileged identities, entitlements, data, and system access.
Privileged Access Management (PAM) is key to securing the DevOps pipeline. After all, research by the IDSA has revealed that nearly 80% of organizations have experienced an identity-related breach in the last two years, and 99% believe that the attacks were preventable. However, legacy PAM solutions can cause a slowdown in the DevOps process.
Security teams often simply want an olive branch from their DevOps counterparts. Unfortunately, security vendors are often the ones causing the friction. When it comes to aligning with the DevOps community, many solution vendors fall short in two ways:
- Believing security is the only priority. Many PAM vendors impose new processes and technologies on DevOps teams, with no consideration for the delicate procedures already in place. Forcing them to bend and adapt is a sure-fire way to alienate them by choking agility and productivity, leading to the PAM solution being side-lined.
- The tools do not fit in the Agile world. Most PAM solutions were built for centralized data center workloads and human IT admin login only. They were not designed for a distributed IT infrastructure in a modern world where non-human identities for systems, applications, and microservices proliferate. The operational overhead necessary for manual installation and configuration introduces yet more friction, slowing down the CI/CD pipeline.
Let’s look at an example to illustrate this problem. IT spins up a new project in AWS at a global enterprise, and engineers are cutting code to develop a business application running in a new virtual private cloud (VPC). To build the app, the engineers need hundreds of microservices spread across dozens of Docker containers. Following the launch of the app, microservices need configuration data to tell them what task to perform and how to do it. They also need credentials to authenticate to other applications and services.
The project has provisioned a password vault to store account passwords and configuration data as secrets. Developers can then write code to log in to the vault and obtain what they need via RESTful APIs. This programmatic vault access function is critical for security and performance — to mitigate the risk of credential abuse and support application auto-scaling in response to user demand.
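As a concrete illustration of that programmatic access, here is a minimal sketch of a workload reading a secret over a vault's REST API. It assumes a HashiCorp-Vault-style KV v2 endpoint; the address, token source, and secret path are all illustrative, not taken from the article.

```python
import json
import urllib.request

def build_secret_request(vault_addr: str, token: str, path: str):
    """Build the URL and headers for a KV v2 secret read (pure, no network)."""
    url = f"{vault_addr}/v1/secret/data/{path}"
    headers = {"X-Vault-Token": token}
    return url, headers

def read_secret(vault_addr: str, token: str, path: str) -> dict:
    """Fetch a secret from the vault's REST API (performs the network call)."""
    url, headers = build_secret_request(vault_addr, token, path)
    req = urllib.request.Request(url, headers=headers)
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["data"]["data"]  # KV v2 nests the secret payload under data.data

# Illustrative usage (hypothetical address and path):
# creds = read_secret("https://vault.example.com:8200",
#                     os.environ["VAULT_TOKEN"], "myapp/db-creds")
```

Because the credential is fetched at runtime rather than baked into an image or config file, the same code scales across auto-provisioned instances without any per-instance secret handling.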
Diving Into the Problem
At first glance, the organization might be doing everything right. However, a deeper dive into the project would show a lot of manual effort for both the developers and operations teams. Let’s take a deeper look into the setup for every workload that needs to check out a credential or secret from the vault:
- Ops creates an OAuth2 client application and service account in the vault to enable the application or service to log in to the vault and call vault APIs. Ops then exports this as a token that the developer will use in their code.
- Developers then incorporate the service accounts and tokens into applications or local configuration files.
- Following this, Ops can assign roles to the service accounts to check out specific credentials and secrets from the vault.
- Ops then configure scopes for the OAuth2 clients to constrain which vault APIs the various workloads can access.
- Finally, the developers obtain the OAuth2 client code and incorporate it into their code, enabling the workload to communicate OAuth2 to the vault. After, they can get a bearer token to access vaulted secrets and obtain other tokens to talk to alternative services.
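The developer-side piece of the steps above is a standard OAuth2 client-credentials exchange: present the client ID and secret, receive a bearer token, then attach that token to every vault API call. A minimal sketch follows; the token URL, client identifiers, and scope names are assumptions for illustration.

```python
import json
import urllib.parse
import urllib.request

def token_request_body(client_id: str, client_secret: str, scope: str) -> bytes:
    """Form-encode an OAuth2 client-credentials grant request (RFC 6749, section 4.4)."""
    return urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": scope,
    }).encode()

def fetch_bearer_token(token_url: str, client_id: str,
                       client_secret: str, scope: str) -> str:
    """Exchange client credentials for a bearer token (performs the network call)."""
    req = urllib.request.Request(
        token_url, data=token_request_body(client_id, client_secret, scope))
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]

# The workload then presents the bearer token on each vault API call:
# req = urllib.request.Request(vault_api_url,
#                              headers={"Authorization": f"Bearer {token}"})
```

Every one of these moving parts (the client registration, the token plumbing, the scope configuration) has to be repeated per workload, which is exactly where the manual overhead accumulates.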
Each workload is a rinse-and-repeat of this process — and there could be a few dozen, hundreds, or even thousands of workloads. The number of steps and the intricacy of the process contribute to DevOps teams kicking security measures such as legacy PAM to the curb.
So, how can teams incorporate security within the DevOps process without such friction? The answer lies in its DNA: automation.
What’s required is a modern PAM solution with automation at its core, essentially delivering PAM-as-code. Organizations looking to weave security into the DevOps process should look for vendors equipped with capabilities such as federation, temporary tokens, and delegated machine credentials. These methods take no time away from DevOps teams and help reduce the overall attack surface.
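The temporary-token pattern mentioned here is worth making concrete: instead of a long-lived secret stored in code or config, the workload exchanges its ambient identity for credentials that expire on their own. Below is a sketch of the refresh logic, with an AWS STS `AssumeRole` call shown in comments as one real-world example; the role ARN and session name are illustrative assumptions.

```python
from datetime import datetime, timedelta, timezone

def needs_refresh(expiration: datetime, leeway_seconds: int = 300) -> bool:
    """Refresh short-lived credentials a little before they actually expire,
    so in-flight requests never race the expiry time."""
    return datetime.now(timezone.utc) >= expiration - timedelta(seconds=leeway_seconds)

# With AWS STS (via boto3, shown for illustration only), a workload can assume
# a narrowly scoped role and receive self-expiring credentials — nothing
# long-lived ever lands in code or configuration:
#
#   import boto3
#   resp = boto3.client("sts").assume_role(
#       RoleArn="arn:aws:iam::123456789012:role/ci-deployer",  # illustrative ARN
#       RoleSessionName="pipeline-run",
#       DurationSeconds=900,  # 15-minute lifetime
#   )
#   creds = resp["Credentials"]  # AccessKeyId, SecretAccessKey, SessionToken, Expiration
#   if needs_refresh(creds["Expiration"]):
#       ...  # re-assume the role
```

Because expiry and renewal are automated, this approach imposes no recurring manual steps on the DevOps team while shrinking the window in which a leaked credential is useful.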
When combined with a least-privilege approach and best practices such as Zero Trust, zero standing privileges, just-in-time privilege access requests workflows, and adaptive multi-factor authentication, security can become a part of DevOps culture seamlessly.
Automation is essential to DevOps, and security needs to be as well. By working with security vendors that provide automated PAM solutions, organizations can make the friction between DevOps and security teams a distant memory.