Josh Corman takes on SBOM skeptics and talks about the importance of software supply chain transparency
In this special Café edition of ConversingLabs, host Paul Roberts interviews Joshua Corman, the Vice President of Cyber Safety Strategy at Claroty and the Founder of I Am The Cavalry on the sidelines of the RSA Conference 2023 in San Francisco. Josh speaks with Paul about his RSAC track session, The Opposite of Transparency, which takes on skepticism of software bill of materials (SBOMs) and makes an argument for greater transparency around software supply chain risk.
Josh Corman wears several hats: founder of IamtheCavalry.org, a daddy of SBOM, and Vice President of Cyber Safety Strategy and Public Policy at Claroty. Until recently, he was chief strategist for the CISA COVID Task Force through the pandemic.
Your talk at RSA addresses skepticism about SBOMs. Why is this important?
This should not be a controversial topic. SBOMs, or bills of materials, are a treasured staple of nearly every industry, pioneered in Toyota supply chains in the 1940s. Adopted by aviation, by food supply, by chemical supply, bills of materials are treasured for quality, for profitability... Those industries cannot imagine not having a bill of materials.
And I would like to take proven supply chain principles and apply them to modern software development. This should not be controversial, and yet it is incredibly controversial. So I wanted to peel that back and start to ask why, to arm people with information, but also with tools to identify who's acting in good faith and who isn't.
What is an SBOM?
In its simplest form, a bill of materials is a list of ingredients: the composition of the parts that go into your final goods assembly.
So we're all in a supply chain. Most of us are in the middle, and you should understand the upstream ingredients that went into your goods. Now, this data layer can support plural use cases. One of them you mentioned: potentially mapping known and/or exploited vulnerabilities so that you can answer, during an attack, "Am I affected?" and "Where am I affected?" instantly or quickly, as opposed to months or years later, or after you've been hurt. But that is not the only use case. In fact, early adopters of SBOM and software supply chain principles were in the early DevOps movement, the CI/CD pipelines, who wanted to cut down on elective developer waste: unplanned and unscheduled work, break-fixes. They felt they could go faster and be more profitable by doing this. So there are efficiency and profitability reasons to apply supply chain principles. There are license management and legal reasons. There are provenance and pedigree concerns. As a necessary but not complete set of tools and data layers, it unleashes, enables, and supports various use cases, including but not limited to vulnerability management.
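As a concrete illustration of the "list of ingredients" data layer and the vulnerability-mapping use case Josh describes, here is a minimal sketch in Python. The SBOM is shaped like a CycloneDX document, but the component names, versions, and the vulnerable-package feed are all hypothetical illustration data, not a real advisory:

```python
# A minimal, CycloneDX-shaped SBOM: at its core, a list of ingredients.
# All names and versions below are hypothetical examples.
sbom = {
    "bomFormat": "CycloneDX",
    "components": [
        {"name": "log4j-core", "version": "2.14.1"},
        {"name": "jackson-databind", "version": "2.13.4"},
        {"name": "commons-text", "version": "1.10.0"},
    ],
}

# Known-vulnerable (name, version) pairs, e.g. from a KEV-style feed.
vulnerable = {("log4j-core", "2.14.1"), ("spring-core", "5.3.17")}

def affected_components(sbom, vulnerable):
    """Answer 'am I affected?': return SBOM components matching a
    known-vulnerable (name, version) pair."""
    return [
        c for c in sbom["components"]
        if (c["name"], c["version"]) in vulnerable
    ]

hits = affected_components(sbom, vulnerable)
print([f"{c['name']}@{c['version']}" for c in hits])
```

The point of the sketch is how cheap the lookup becomes once the ingredient list exists: the question "am I affected?" reduces to a set membership check instead of a months-long audit.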
What are some of the objections to SBOMs?
Let me give you the flow of the talk in broad brush strokes. I want to start by grounding us in first principles. I've tried to apprentice myself to Dan Geer and other pillars of the industry, and we have very few first principles in cybersecurity.
But when someone deviates from one of our core beliefs, it should be noteworthy, and the burden should be on them to prove why there's a deviation. So I first establish the things that we believe and take as articles of faith in cybersecurity. I crowdsourced a whole bunch of them, but obviously one of them is that security through obscurity is no security at all.
And one of the principles related to that is why we believe it's better to have a CVE publish details that might be actionable. It's better to have coordinated vulnerability disclosure, or even full disclosure, than not to know. The adversary knows the system, and so should the defenders. The MITRE ATT&CK framework is not considered a roadmap for attackers; it is literally a roadmap of attackers' past successes.
So we tend to believe, on almost every other topic, that transparency to defenders enables informed risk decisions, and there seems to be quite a deviation from that first principle here. So I start by anchoring us in first principles. Then I pivot to some framing of mis-, dis-, and malinformation, because people still don't know what those mean.
Misinformation: sincere but false beliefs, not meant to harm. Disinformation: false beliefs intended to harm. Malinformation: truths that are used deceitfully or misapplied, quite insidiously. There's some of each at play here in some ways. And then there are bona fide truths. So I'll give some concrete examples.
Some are pretty spicy, and I'm sure there'll be some fallout. But then I pivot to say: okay, these are symptoms. These are smokescreens. What's beneath them? Behind them? Let's not fight the heads of the hydra. What are the core dirty secrets or sins that are driving this mis-, dis-, and malinformation?
Because the way you fight sincere but false beliefs is with information. But we have a signal-to-noise problem, where some of the louder voices continue to stoke the fears of people who have not yet gone through the journey. So if part one is our first principles, part two is some sources of mis-, dis-, and malinformation. Pushing past those, I say "three" in quotes, but there are four core drivers of this active opacity.
I suppose you're going to ask me what those are. Now that we've identified and named those demons, hopefully they have less power over us. It's still scary. So I apply the Kübler-Ross model, the five stages of grief, to SBOM, to get you from your current state to a desired state where everyone benefits from this newfound transparency, and where the benefits follow from acting in a transparent way.
There's a bonus section, now enabled by the President's National Cybersecurity Strategy. Pillar Three goes right at incentives, and the strategy identifies that free-market forces only take you so far. Too often the liability and the harm are placed at the feet of the victims: the owners and operators, or the contributors of free open source projects. They want to shift that burden to the least-cost avoiders, those in the best position to bear it.
So the concept of software liability changes this entire conversation as well. Between the time of submitting the talk and delivering it, we got a new chapter, which may invert the motives here: someone who has historically been trying to hide their software bill of materials may now be rushing to supply one as their argument for safe harbor and limited liability. "I was transparent with my customers."
"They were aware of the risk passed downstream to them. I gave them patches. If they chose to accept those risks and chose not to apply the patches, that's on them, not on me." Some of these incentives will fundamentally alter the conversation as well. So I'm also indicating how recent congressional and executive branch action, domestic and international, may change the pH balance of the conversation.
But at the heart, we have liability and accountability for products in every aspect of our lives except software. Historically, software designers and developers make a risk-based decision about their own risks, and they pass those risks downstream to their customers and their customers' customers, often without any obligation to tell them.
And that has created market inefficiencies. In cars, we would call this information asymmetry, a market for lemons. We used to have lemon laws, and now we have Carfax, which adds information to dampen that asymmetry. Again, these are proven techniques from other industries that are well past due for software and IT. And as society increasingly depends on connected technology, there is the ability to do harm in cyber-physical systems: the water we drink, the food we put on our table, the oil and gas pipelines that fuel our cars, homes, and supply chains, the schools your children attend, timely access to patient care during a pandemic, now with proven mortal consequences.
Software can affect the basic, lifeline human needs of critical infrastructure, and it needs to be held to a higher standard. Historically, people would put a bunch of software in a product; it had some level of risk, and in a low attack-density environment, maybe that's okay. But we're starting to see a whole lot more supply chain attacks on third-party open source libraries, which don't hit one piece of software; they hit everyone, or many of those, that depend on that software.
There are various flavors of supply chain attacks now. But we want vigilance and situational awareness. Whenever there's a new thing being attacked, we want to answer the two basic questions: am I affected, and where am I affected?
Will the federal guidelines for supply chain security have any impact outside of the federal government?
I'd like to quote William Gibson: "The future is already here; it's just not evenly distributed." Anyone in my SBOM working groups over the last several years is sick of hearing that. But software supply chain transparency has already been widely adopted in financial services.
The largest producers of software in the world are financial services companies. One of the big banks I used to help wrote more software than Microsoft, Google, Oracle, and Apple combined. Most of that software is written for themselves, owned and operated by themselves, so there's no organizational or legal boundary. But they track which parts go into which software packages such that, when something happens, they can have a prompt, agile response.
So those practices, whether called SBOM or not, whether in federal policy or not, have been maturing, and the toolchain around creating them, ingesting them, tracking them, and combining them has been maturing over time. Yes, there has been federal action in the form of Executive Order 14028 in the wake of SolarWinds. Even though SolarWinds was not a third-party open source supply chain attack, it brought to the fore that we need more trust and transparency in our digital infrastructure, especially in the highest national security contexts.
So that recognition said we have a lot of good and proven practices that we could bring to bear to overall raise the trust and transparency. I'm quite fond of the wording of part of it, which is that in the end, the trust we place in our digital infrastructure should be proportional to how trustworthy and transparent that infrastructure is, and to the consequences we will incur if that trust is misplaced.
So it's an important idea, long overdue. And yes, it would use the collective purchasing power of the federal government for any software, whether IoT or digital infrastructure, sold to federal agencies. You may think that's just one part of the market, but most of the people who supply the federal government also supply hospitals and other enterprises.
So it at least creates another wave of adoption and case studies to show the benefits and to advance and mature the tooling. But in parallel, I think people failed to pay attention to the fact that the PATCH Act passed into law in December, granting the Food and Drug Administration regulatory authorities in statute for several minimum cybersecurity hygiene features for any FDA-approved medical device.
And this has been harmonized in parallel with the IMDRF, the International Medical Device Regulators Forum. So what you have now is that any medical device sold to the US government or to any hospital has a series of minimum "seatbelt laws," like Ralph Nader won for automobiles. Nine years in the making, almost to the day. And these are things like: you should have a coordinated vulnerability disclosure program for your medical device, to identify and monitor post-market defects. You should be patchable, with the capability to patch quickly in an urgent manner. You should have a machine-generated, machine-readable software bill of materials.
So we gave the FDA new authorities in law for more aggressive enforcement of its prior guidance, but also the budget to staff up and do it effectively. While the executive order has had some pushback and a federal focus, this is the first private-sector mover, and it's already in effect.
In fact, they just put out a memorandum saying that for now, until October 1st, they will not outright reject a submission that's missing a software bill of materials, but you need one to complete your submission. The suggestion: don't even bother coming to them after October 1st without it.
Separate from the executive order, and separate from this regulatory change at the Food and Drug Administration, the open working groups we've been running through ntia.gov, and now cisa.gov, have significant pilot programs and adoption from free-market owners and operators in energy, in aviation, in automotive. There's significant international adoption of SBOMs: in Germany for automotive, in Singapore, in Japan. So this is happening. It's just a matter of how consistently.
Do you expect to see federal agencies enforcing supply chain security?
Again, referring to the recent publication of the President's National Cybersecurity Strategy out of the Office of the National Cyber Director and the National Security Council: they have reiterated that they would like to go further, faster, on their existing software bill of materials efforts.
They're currently working through the last stages of implementing Executive Order 14028, but they want to go further. The Department of Defense, and specifically the Army, has said: we are all-in on SBOM. We don't care if some of you opt not to do it; we're doing it, and you should too.
Perhaps they have reasons and prior experiences that have given them conviction that this is both necessary and valuable. And recently, I think even this week, a joint publication between CISA, Idaho National Labs, and the Department of Energy gave some guidance on how they're already doing this in the privately owned and operated energy sector.
Once these topics got discussed and bumped into the forefront of public discourse, many organizations started saying, "I want an SBOM." And I think one of the inflection points is what you see walking the show floor at RSA. Last year, a lot of people were talking about the importance of software supply chain risk.
Now people are demonstrating the ability to ingest, transform, enrich, and mature software bills of materials. It was fairly easy for new, modern software development, but maybe legacy code is a problem; well, there's a whole bevy of people helping you fingerprint and reverse engineer approximate, or even very precise, componentry within legacy tech or embedded systems.
The excuses and hand-waving that said "this could never happen" or "there are no tools" have been disproven. The people who say this can't happen at scale or at speed? Microsoft has it built in; GitHub and Jenkins can spit it out as a byproduct of building your software. This should be a frictionless byproduct of modern software development.
Not an additional cost, not an additional burden. And just like any information, people could misuse it or fail to use it, but we should be taking receipts on what goes into our digital infrastructure to equip us for all sorts of use cases. There will be errors and omissions, but simply knowing there is Log4j, even if you get the version wrong, can narrow you down from 20,000 devices to maybe the 200 that matter.
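That narrowing, from a whole fleet down to the devices that actually contain the suspect component, can be sketched in a few lines of Python. The device names and their component lists below are hypothetical, and matching on component name alone, rather than exact version, deliberately reflects the point that even imperfect SBOM data still shrinks the search space:

```python
# Sketch: given one ingredient list per device, answer
# "where am I affected?" All device names and components
# here are hypothetical illustration data.
fleet = {
    "pump-controller-01": ["openssl", "busybox"],
    "hmi-station-07": ["log4j-core", "tomcat"],
    "gateway-112": ["zlib", "curl"],
    "hmi-station-09": ["log4j-core", "jetty"],
}

def devices_containing(fleet, component_name):
    """Return the devices whose SBOM lists the named component.

    Matching by name only (not exact version) tolerates errors and
    omissions in the SBOM data: a wrong version still narrows the
    response from the whole fleet to the handful that matter.
    """
    return sorted(
        device for device, components in fleet.items()
        if component_name in components
    )

# Two of the four devices need urgent attention; the rest can wait.
print(devices_containing(fleet, "log4j-core"))
```

At real scale the fleet map would be built by ingesting thousands of SBOM documents, but the triage question stays this simple once the data layer exists.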
Some complain that SBOMs can’t scale. Thoughts?
I find it fascinating when cybersecurity people who don't do CI/CD speak authoritatively on behalf of CI/CD communities. The CI/CD communities were among the first adopters of this. They found it improved their quality, reduced developer waste, reduced break-fixes, and compressed mean time to identification and remediation. There are entire startups now, and developer ecosystems around CI/CD: Chainguard, SLSA, Sigstore, and GUAC.
They have a whole snack-food army emerging. The idea of knowing "am I affected, where am I affected" will manifest differently in, say, a constantly changing cloud stack or a set of APIs calling other APIs. But the ability to know whether a project is even using the affected library, or has ever used it, remains.
There's an entire CISA.gov working group on cloud use cases, populated by pioneers in these spaces, looking at how transparency, trustworthiness, accountability, and auditability manifest differently there. But we have not had much pushback. In fact, one of the exemplars I love to point to is JupiterOne. Sounil Yu, the author of the Cyber Defense Matrix, was an early and prolific adopter of software supply chain transparency through procurement and acquisition when he was at Bank of America.
The day he got to JupiterOne, they published their full SBOM at jupiterone.com/sbom. Nothing to hide. They change their software pretty frequently, and he believes SBOMs should be free, open, and accessible. There are different methods of sharing and different update intervals, but the pushback does not tend to come from the modern CI/CD pipeline people.
The pushback comes from the legacy vendors who have a lot of technical debt, security debt, and sins they do not want revealed.
So what’s holding back SBOM adoption?
Casually put, there are three main terrors driving the resistance and assault on SBOMs and transparency. Number one: there are people with license debt who are violating the terms and conditions of the open source libraries they're using, and they know they'll likely get sued, or have to settle, once this newfound transparency reveals the legal violations. So it's terrifying. I remember a seminal case when Cisco bought a home IoT router company and got sued. So it's not always that they committed the sins themselves; through mergers and acquisitions, you inherit quite a bit of legal risk.
This is why tools like Black Duck have existed for 20-plus years. So there's latent legal exposure, and they're afraid of getting caught. Number two: there are unfixable security debt items, or items cost-prohibitive to fix, that may force end-of-sale or end-of-life recalls that were never in the business plan. The component is so foundational to the product that the effort to replace it would exceed the value of doing so. They're concerned about scrutiny over the risk they have historically passed downstream to their customers, but also about potential disruptions to their business plan and revenue model.
The third is more of an ongoing concern: they have enjoyed a lack of accountability and scrutiny from their customers, and they're worried about the ongoing cost of accountability and transparency. For things like new known exploited vulnerabilities on the KEV list, they're going to be expected to answer in a timely manner. In the past, they could sit on it as long as they wanted; now they're probably going to face service-level agreements of "I need a response within X days for known exploited vulnerabilities, and Y days for other, lesser, second-tier criteria." So those three things terrify someone with a large portfolio: I might get sued for license violations. I might get harshly, but probably fairly, criticized for debts I've been passing downstream. Imagine if you've been selling to the federal government for years saying you're FIPS compliant, but you're using a library incapable of supporting those parameters.
In light of those three, I encourage, through the five stages of grief, that people have graduated expectations over the next several years, and perhaps even that we tolerate, as a social contract, some level of redaction, a "this house may contain lead paint" label, some transition period.
Because if we don't, we're going to have this continued opacity and fail to avoid preventable harm. I think we need to do some adulting and realize: look, none of us knew better. Those were the before times; we know better now. So make a commitment to transparency going forward. Many of these things can be fixed under an 80/20 rule. It's actually much easier than people expect once they get into it.