
Commercial software risk: New controls required

Legacy strategies and tooling can’t manage today’s software threats. Here’s why binary analysis is necessary.

Commercial software risk

Ask any CISO what’s inside their organization’s internally developed applications and you’ll get a pretty detailed answer. But ask them what’s in their commercial software and there’s a good chance you’ll get very little in the way of real information. 

With internally developed applications, security teams can scan source code, review dependencies, and inspect build pipelines. Commercial software rarely offers the same level of transparency. Not really knowing what they’re getting, software buyers place their trust in vendor reputation, contractual assurances, and compliance certifications. Even if the vendor provides documentation, there’s typically no way to verify what’s in it. All the while, what’s really included in that software is increasingly dangerous.

Last year, the Shai-hulud worm demonstrated the peril when it tore through the npm ecosystem. The self-replicating malware exposed roughly 400,000 developer secrets and compromised nearly 1,000 npm packages across two campaigns, as ReversingLabs’ Software Supply Chain Security Report 2026 noted. The fallout included cascading exposures as packages maintained by commercial vendors such as Zapier, PostHog, Postman, and CrowdStrike were compromised. 

Patrick Enderby, senior product marketing manager at RL, said a clear takeaway from Shai-hulud is that when it comes to software supply chain security, implicit trust is a problem.

As commercial vendors increasingly rely on shared build components and open-source ecosystems, attacks scale horizontally. A single poisoned dependency can quietly propagate into hundreds of trusted commercial products.

Patrick Enderby

And such attacks are on the rise. The 2026 SSCS Report showed that malicious open-source package detections jumped 73% in 2025 over the previous year. It’s a perilous situation, as RL researchers pointed out in the report:

Modern enterprises depend heavily on closed-source, third-party commercial software that is assumed to be trustworthy but is rarely assessed. That assumption has become a huge security liability.

Patrick Enderby

Today there’s a growing consensus among security experts, analyst firms, and CISOs that legacy approaches to evaluating commercial software aren’t cutting it. Security teams need to dig into commercial software at the binary level to understand what’s going on under the hood. What’s needed, they say, is binary composition analysis.

Get report: Gartner® CISO Playbook for Commercial Software Supply Chain Security

Why traditional tools aren’t up to the job

The fundamental problem with evaluating commercial software is access. Traditional application security tools, be they static, dynamic, or software composition analysis tools, were designed for reviewing first-party code in development pipelines. If you don’t have build pipelines to look into or a vendor willing to open the hood, they can’t tell you much. 

And most vendors simply won’t offer that level of transparency, often for legitimate reasons that might involve protecting their intellectual property or meeting legal and compliance constraints. Customers of commercial software, unable to verify what’s inside, have to trust the documentation. Making matters worse, SaaS deployment models often obliterate the security boundaries meant to isolate commercial software as it runs in customer environments. This was one of the main points made by JPMorganChase CISO Pat Opet in his widely circulated open letter last year: 

SaaS models are fundamentally reshaping how companies integrate services and data — a subtle yet profound shift eroding decades of carefully architected security boundaries. In the traditional model, security practices enforced strict segmentation between a firm’s trusted internal resources and untrusted external interactions using protocol termination, tiered access, and logical isolation. External interaction layers like APIs and websites were intentionally separated from a company’s core backend systems, applications, and data that powered them. Modern integration patterns, however, dismantle these essential boundaries.

Pat Opet

These architectural vulnerabilities would be concerning enough if organizations could test and validate the software they’re integrating, but contractual limitations often prohibit reverse engineering or independent security testing.

Software bills of materials hold some promise, but can you really trust that vendor-provided SBOMs are accurate, current, and complete? 

SBOMs shift the problem from ‘no visibility’ to ‘unverified visibility.’ Without corroboration, they function more like declarations than evidence.

Patrick Enderby

And because SBOM-generation processes and tooling are still in their infancy, even when vendors act with the best of intentions, the quality of their SBOM documentation will depend on both the tool they’re using and the point at which it’s introduced into the development lifecycle. Researchers at Endor Labs examined SBOM generators last year using a well-known piece of open-source software. Results varied widely. 
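Because vendor SBOM quality varies with the tooling and where it sits in the development lifecycle, one way to treat an SBOM as evidence rather than a declaration is to corroborate it against the component inventory actually recovered from the shipped binaries. The sketch below illustrates that diff; all component names, versions, and the SBOM fragment are hypothetical, and a real binary analysis tool would supply the `observed` set.

```python
import json

# Hypothetical vendor-supplied SBOM (CycloneDX-style JSON) for a product.
vendor_sbom = json.loads("""
{
  "components": [
    {"name": "openssl", "version": "3.0.13"},
    {"name": "zlib",    "version": "1.3.1"}
  ]
}
""")

# Components recovered independently -- e.g., by binary composition
# analysis of the shipped installer. Note the bundled component the
# SBOM omits entirely, and the version that drifted.
observed = {
    ("openssl", "3.0.13"),
    ("zlib", "1.2.11"),
    ("log4j-core", "2.14.1"),
}

declared = {(c["name"], c["version"]) for c in vendor_sbom["components"]}

# In the binary but absent from the SBOM: unverified or undisclosed code.
undeclared = sorted(observed - declared)

# In the SBOM but not found in the binary: stale or inaccurate entries.
stale = sorted(declared - observed)

print(undeclared)  # [('log4j-core', '2.14.1'), ('zlib', '1.2.11')]
print(stale)       # [('zlib', '1.3.1')]
```

Either non-empty set is a signal that the vendor’s documentation cannot be taken at face value, which is exactly the “unverified visibility” gap Enderby describes.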

Endor security researcher Henrik Plate succinctly explained why that is a problem. 

A comparable analysis is impossible for proprietary software products — as a result of which SBOM consumers cannot do more than blindly trust the vendor's SBOM.

Henrik Plate

Traditional third-party risk management (TPRM) evaluations and tooling add little to the visible risk signals for commercial software, since organizations mostly rely on static questionnaires and generalized annual assessments. Even TPRM continuous monitoring can do no more than gauge a vendor’s overall risk posture, which tells the customer nothing about the contents of the commercial software products it’s deploying. A recent study by Panorays illustrates the resulting gap: While 60% of U.S. CISOs reported an increase in third-party security incidents, only about 15% said they have full visibility into these risks. 

The role of binary composition analysis

Binary composition analysis will increasingly play an important role in evaluating and monitoring the riskiness of commercial software, Enderby said, either by shedding light on opaque pieces of software or by validating a vendor’s SBOM results.

Binary analysis aligns security responsibility with technical reality. It allows organizations to assess the software they are actually deploying — not the code the vendor says they wrote, but the package that will run in production.

Patrick Enderby

Major tech analysts are already indicating that SSCS must be integrated into third-party cyber-risk management (TPCRM) strategies — and that binary analysis will play a big part in this. And guidance from both the U.S. Department of Defense and the Cybersecurity and Infrastructure Security Agency is pointing in the same direction. In its document, A Shared Vision of Software Bill of Materials for Cybersecurity, CISA says that “binary analysis tools can use increasingly accurate heuristics and datasets to determine the underlying components.” And the agency’s Minimum Elements for a Software Bill of Materials guidance similarly advises that organizations can use binary analysis to analyze and contextualize the SBOMs their vendors provide.

Binary analysis can be useful at every stage of software acquisition and maintenance. It can help evaluate new commercial software, facilitate ongoing vendor management as software is updated, and play a role in incident response, offering up a rapid inventory that helps organizations quickly determine whether the commercial software they run has been impacted by the latest supply chain attack.
