Here's what you need to know about the state of CI/CD tools — and why you need to upgrade your tools and approach to deliver secure software at speed.
Adversaries have ramped up their targeting of software supply chains in recent months, inspired by headline-grabbing incidents like the attacks on SolarWinds and Kaseya. Research released earlier this year by NCC Group found that attacks on software supply chains jumped 51% in the last six months of 2021. "Despite this," the researchers noted, "many of the organizations that we spoke to planned to invest in new third-party software, hardware and SaaS security products in 2022, which could increase the third-party threat vector for malicious actors."
The rise in supply chain attacks appears to be setting off alarms at the top of many organizations. In its annual C-suite security survey released this month, CloudBees, which owns the open source automation server Jenkins, reported that more than four out of five executives (82%) were either "somewhat more concerned" (40%) or "much more concerned" (42%) about attacks on their supply chains than they were in 2019. What's more, their confidence in the security of their supply chains has also taken a hit, with 88% expressing confidence that their supply chains were secure, compared to 95% in 2021.
Idan Tendler, vice president of DevSecOps at Palo Alto Networks, wrote in Forbes that software supply chains are only as strong as their weakest link, and that continuous integration/continuous delivery (CI/CD) pipelines are the latest attack vectors left vulnerable by unassuming DevOps teams.
"Just one CI/CD misconfiguration can expose sensitive information and can then be used as an entry point for injecting malicious code and leaking sensitive data. Ultimately, this can corrupt the entire CI/CD pipeline and the software supply chain."
Continuous integration/continuous delivery (CI/CD) is about delivering quality software at speed. Modern software supply chain security depends on getting your tools right and focusing on the end-to-end software development life cycle. Here's a review of the state of CI/CD tools, and why you need to upgrade them to keep pace with threats.
Focus on the pipeline: Threat modeling and code scanning
The CI/CD pipeline needs to be secured throughout the software development life cycle (SDLC), from the planning, coding, and build phases through the testing, deployment, and monitoring phases.
During the planning phase of a project, security can be enhanced by applying threat modeling to the pipeline. Threat modeling identifies potential areas of attack on the pipeline so that countermeasures can be put in place to address those vulnerabilities. It answers questions such as: What are my high-value assets? Who is likely to attack me? Where is my pipeline most vulnerable to threat actors? Which threats are most relevant to me? Are any attack vectors going unnoticed?
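To make those questions concrete, a threat model can be kept as a lightweight, reviewable artifact alongside the pipeline's own configuration. The sketch below, in Python, maps a pipeline asset to STRIDE threat categories and flags the categories that have no countermeasure yet; the asset name and controls are purely illustrative assumptions.

```python
from dataclasses import dataclass, field

# The six STRIDE threat categories commonly used in threat modeling.
STRIDE = [
    "Spoofing", "Tampering", "Repudiation", "Information disclosure",
    "Denial of service", "Elevation of privilege",
]

@dataclass
class PipelineAsset:
    """A high-value pipeline component and the controls mapped to it."""
    name: str
    countermeasures: dict = field(default_factory=dict)  # category -> control

    def uncovered(self):
        """STRIDE categories with no countermeasure recorded yet."""
        return [c for c in STRIDE if c not in self.countermeasures]

# Illustrative model of one asset (names and controls are hypothetical).
build_server = PipelineAsset("CI build server")
build_server.countermeasures["Tampering"] = "signed, reproducible builds"
build_server.countermeasures["Spoofing"] = "short-lived OIDC credentials"

print(build_server.uncovered())
```

A structure like this makes the "are there any attack vectors going unnoticed?" question a simple query rather than a judgment call made once and forgotten.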
A useful tool for supply chain threat modeling is SLSA (Supply-chain Levels for Software Artifacts). SLSA is a checklist of standards and controls designed to prevent tampering and improve the integrity of the pipeline, as well as secure the packages and infrastructure in pipeline projects. Other tools available for threat modeling include Microsoft's free SDL Threat Modeling Tool, securiCAD by Foreseeti, ThreatModeler, and IriusRisk.
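As one concrete piece of SLSA, a build's provenance attestation is a JSON document describing how an artifact was produced. A minimal sketch of checking such a document for the core fields of the SLSA v0.2 provenance predicate might look like the following; the builder and buildType values are hypothetical examples, and a real verifier would also validate signatures rather than just field presence.

```python
# Core fields of the SLSA v0.2 provenance predicate (illustrative check,
# not a full verifier).
REQUIRED = ("builder", "buildType", "materials")

def missing_fields(predicate: dict) -> list:
    """Return the required provenance fields absent from a predicate."""
    return [f for f in REQUIRED if f not in predicate]

provenance = {
    "builder": {"id": "https://example.com/ci-runner"},   # hypothetical
    "buildType": "https://example.com/build-types/v1",    # hypothetical
    "materials": [
        {"uri": "git+https://example.com/repo", "digest": {"sha1": "abc123"}},
    ],
}

print(missing_fields(provenance))       # []
print(missing_fields({"builder": {}}))  # ['buildType', 'materials']
```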
During the coding phase, developers write the code that makes up the application in the pipeline. By scanning the code, security code analyzers can detect and report weaknesses that can lead to security vulnerabilities. A list compiled by the National Institute of Standards and Technology shows that there are dozens of these tools on the market to choose from.
Daniel Kennedy, research director for information security and networking at 451 Research, explained in an analysis of application security testing (AST) tools that the use of code scanning tools by developers has steadily increased as security responsibilities have shifted left in recent years.
"Information security professionals were, in 2015, the primary users of AST tools. That usage has ceded to developers a little more each year, and has flattened in 2020."
That's a positive development for security pros, he maintained. "The idea that security personnel in the typical enterprise will have time to review every code change or kick off manual scans to produce vulnerability reports for each pull request isn't realistic," he wrote.
He noted that in-depth peer reviews among developers on the same team are rare and often context-dependent, and that a security professional who isn't familiar with a code base or project specifics will have even less of a foundation to work from, given today's shorter project timelines.
"Giving developers the means to efficiently test for and respond to security vulnerabilities during code construction is the most efficient path to keeping up with newly introduced application security issues."
Build and test: Look to more comprehensive tools
During the build phase, developers commit their code to a shared repository. There, the code is subjected to tests for compliance with previously set requirements. At this point, it's a good idea to analyze the code with static application security testing (SAST) tools such as SonarQube, Veracode, AppScan, or Codacy, to name a few. Because applications use so much third-party code, it's also good practice to run the build through a software composition analysis (SCA) tool, such as those made by Veracode, Sonatype, or other vendors.
However, SCA may fall short when it comes to protecting the entire CI/CD pipeline because it's limited to scans of software repositories. An emerging product category—Pipeline Composition Analysis—is designed to identify dependencies across all phases of the SDLC, including application code dependencies, build modules and their dependencies, infrastructure as code dependencies, and more. If an organization understands what dependencies it has and where they reside in the pipeline, it can better identify, prioritize, and remediate any risks they create.
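A minimal version of that dependency-mapping idea can be sketched as a walk over a repository that buckets well-known manifest files by the pipeline phase they affect. The manifest-to-phase mapping below is an illustrative assumption, not a formal taxonomy, and a real pipeline composition analysis tool goes much deeper.

```python
import os

# Map well-known manifest files to the pipeline phase they affect.
# These phase labels are illustrative groupings, not a standard.
MANIFESTS = {
    "requirements.txt": "application code",
    "package.json":     "application code",
    "Dockerfile":       "build / image",
    "Jenkinsfile":      "build pipeline",
    "main.tf":          "infrastructure as code",
}

def bucket(paths):
    """Group manifest paths by the pipeline phase their filename implies."""
    found = {}
    for path in paths:
        phase = MANIFESTS.get(os.path.basename(path))
        if phase:
            found.setdefault(phase, []).append(path)
    return found

def inventory(root):
    """Walk a repository tree and bucket every manifest found in it."""
    paths = [os.path.join(d, f) for d, _, files in os.walk(root) for f in files]
    return bucket(paths)
```

Even an inventory this crude makes the point: dependencies live in far more places than the application's package manifest, and you can't prioritize risks in files you haven't located.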
Kennedy noted in his application security tool analysis that the ability of various kinds of AST to fit into different parts of the development process in a more automated way has driven a steady shift over the last six years: away from waiting until the production phase to apply AST, and toward applying those tools directly after code changes are introduced.
Following the build phase, the software is tested for quality. Bugs are squashed, and if new features are added, regression testing is performed. At this stage, dynamic application security testing can be performed with tools such as Netsparker and Acunetix, as well as container scans with tools like Datadog, Clair, Anchore, and Qualys.
Containers, in particular, have been flagged as a ripe source for software supply chain attacks. One study by Palo Alto Networks, for example, found that almost all third-party containers deployed on public clouds have vulnerabilities and misconfigurations that expose organizations to supply chain attacks. It reported that 96% of third-party container applications deployed in the cloud infrastructure had known vulnerabilities and 63% of third-party code templates used in building cloud infrastructure contained insecure configurations.
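In practice, container scan results feed a triage step that decides whether a build may proceed. The sketch below assumes a simplified report shape; real scanners such as Clair or Anchore each emit their own JSON schema, so treat the field names here as hypothetical.

```python
# Severities that should block a deployment (a common, but configurable,
# policy choice).
FAIL_ON = {"HIGH", "CRITICAL"}

def blocking_findings(report: list) -> list:
    """Return the IDs of findings severe enough to fail the build."""
    return [v["id"] for v in report if v.get("severity") in FAIL_ON]

# Hypothetical scan output for a third-party base image.
report = [
    {"id": "CVE-2021-0001", "severity": "LOW"},
    {"id": "CVE-2021-0002", "severity": "CRITICAL"},
    {"id": "CVE-2021-0003", "severity": "HIGH"},
]

print(blocking_findings(report))  # ['CVE-2021-0002', 'CVE-2021-0003']
```

Gating the pipeline on a policy like this is what turns the scan statistics above from a report into an enforced control.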
Deployment and monitoring: Protect privacy and secrets
When software enters the deployment phase, it's important to preserve privacy and protect sensitive data by removing things like passwords, tokens, and secrets from the deployed application. A recent study of 1,859 mobile applications by Symantec found that more than three-quarters of them (77%) contained valid AWS access tokens allowing access to private AWS cloud services, and close to half (47%) contained valid AWS tokens that also gave full access to numerous private files, often millions, via the Amazon Simple Storage Service (Amazon S3).
Scott Gerlach, co-founder and CSO at StackHawk, an API security testing provider, told VMblog that DevSecOps was key.
"Adding DevSecOps tools, like secret scanning to CI/CD, can help ferret out these types of secrets when building software. And it's critical that you understand how to manage and securely provision AWS and other API keys/tokens to prevent unwarranted access."
During deployment, secrets need to be moved out of repositories and configuration files and into digital vaults. In more advanced deployments, config files can be dynamically generated, and routine processes set in motion to detect and mitigate any unprotected secrets in the environment.
One way to prevent secrets from being hard-coded into an application's code base in the first place is to integrate secrets scanning into the workflow of developers through pre-commit and merge request scanning.
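A pre-commit secret scan can be as simple as a handful of regular expressions run over staged changes. The sketch below checks for AWS access key IDs (which begin with "AKIA") plus one rough generic heuristic; a dedicated secret scanner covers far more formats, so this is a floor, not a ceiling.

```python
import re

# Two example detection rules. AWS access key IDs begin with "AKIA";
# the generic rule is a crude heuristic, not a substitute for a
# dedicated secret-scanning tool.
PATTERNS = {
    "aws-access-key-id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "generic-api-key": re.compile(
        r"(?i)api[_-]?key\s*=\s*['\"][A-Za-z0-9]{16,}['\"]"),
}

def scan(text):
    """Return (rule, matched_text) pairs for every suspected secret."""
    hits = []
    for rule, pattern in PATTERNS.items():
        for match in pattern.finditer(text):
            hits.append((rule, match.group(0)))
    return hits

# AWS's documented example key ID, used here as a harmless test value.
sample = 'aws_key = "AKIAIOSFODNN7EXAMPLE"\n'
print(scan(sample))  # [('aws-access-key-id', 'AKIAIOSFODNN7EXAMPLE')]
```

Wired into a pre-commit hook or a merge-request check, a scan like this rejects the change before the secret ever lands in the repository's history, which is far cheaper than rotating a leaked credential after the fact.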
The growing adoption of containers has heightened the need for more security controls around their use. "Increasingly, threat actors have begun targeting container environments with DDoS attacks, exploits targeting kernel and container orchestration technologies and other attacks, putting enterprise cloud applications and assets at risk," wrote Jai Vijayan in a recent report for this blog.
With attacks today going beyond vulnerabilities alone — and including malware payloads, software signing, and secrets — it's important to think holistically about container security and the best practices for securing them.
Software security: Only as secure as your weakest link
An organization's software supply chain is only as secure as its weakest link. That's why security needs to reach beyond finding vulnerabilities in applications, and into the CI/CD pipeline and throughout the software development life cycle, from planning, coding, and building software through to its testing, deployment, and monitoring.