<img height="1" width="1" style="display:none" src="https://www.facebook.com/tr?id=1076912843267184&amp;ev=PageView&amp;noscript=1">
Season 6, EP3

Securing Medical Devices with SBOMs

In this episode, host Paul Roberts chats with Dr. Kevin Fu of the Archimedes Center at Northeastern University, about the new federal standards for the cybersecurity of medical devices, which includes the use of software bills of materials (SBOMs).

EPISODE TRANSCRIPT

Paul Roberts: And we are back for another episode of the ConversingLabs podcast. I'm your host, Paul Roberts, the cyber content lead here at ReversingLabs. I am really pleased to have with us today, Dr. Kevin Fu, of Northeastern University and the Archimedes Center for medical device research. Kevin, welcome back. I think we've talked before. I don't know if we've had you on ConversingLabs before. 

Kevin Fu: Yeah, it's great to be back. Always appreciate your thoughtful questions, Paul. 

Paul Roberts: Thank you. And you're not at Northeastern right now. You're actually down in Washington, D.C. Tell us what you're up to. 

Kevin Fu: Today there's an event at FDD, and they're hosting a special webinar with Jen Easterly and other principals on the PCAST report about how to build resilience into the cyber-physical systems that make up the nation's critical infrastructure, the 16 critical infrastructure sectors as defined by CISA.

Paul Roberts: There's a really big focus at CISA right now on building in security and trying to change culture, in many ways, across these areas. You're there because, obviously, medical devices and healthcare are one of those 16 critical sectors.

Kevin Fu: And changes both technical and cultural, as you point out. Engineering is relatively easy: we have consensus bodies and formal methods for getting the standards going. But culture change requires leadership execution. And that's what we'll be talking about today at the panel on the PCAST report.

Paul Roberts: So very cool. Talk about your journey into medical device security because you actually started working in a hospital very early in your career and that inspired you. 

Kevin Fu: That's right. I started working in healthcare around 1993 or so, at a small community hospital in Holland, Michigan. And I was the gopher of the room. At night I would have a beeper, and if something went wrong, such as a clinician losing their authentication tokens, they'd come down with me to get a new set of tokens. It was a really good experience and opportunity, and eye opening, because I realized just how complicated the security story would be because of the human factors involved. And because the mission of patient care is just very different from, say, logging into your cloud email. There's a lot more at stake.

Paul Roberts: So you started the Archimedes Center when you were at the University of Michigan. And before that, you started your career at the University of Massachusetts, focused very early on the cybersecurity of medical devices. Talk about, first of all, the research that you do at Archimedes with your students and the types of problems and devices you focus on. And also just give us a sense of how that work has evolved over the last 10 or 15 years, because you were in early; you were really one of the first people looking hard at this question.

Kevin Fu: To answer your last question first: what has evolved since then? We started around 2007 with some early research on pacemaker security. That was at UMass Amherst in Western Massachusetts, and it evolved over time toward more public policy with Archimedes at Michigan and now Northeastern. I would say the biggest change, if you take a look at some of our writing from 15 years ago, is that we would talk about risks to healthcare from cybersecurity threats, and the response was: that's very hypothetical. Today, healthcare organizations are paying $22 million ransoms because of cyber threats. We did not predict ransomware, but we did predict malware; of course, the instantiation is perhaps stranger than fiction, right? It's really changed from theoretical risks to actual risks with actual adversaries who are usually economically or politically motivated. Fifteen years ago we were worrying more about the bored kid in their basement breaking into hospitals, and now we've really shifted to sophisticated organizations and threat actors.

Paul Roberts: And of course, as we're speaking today, we're in the midst of a huge crisis in the healthcare space with the hack of United Healthcare, which owns a company that basically facilitates a lot of the billing that hospitals and doctors' offices do so they can get paid for their services. That company was the subject of a ransomware attack by the LockBit group, which a few weeks ago we were saying had been taken out by U.S. authorities, and then they reminded us that they hadn't. As I'm understanding this, this isn't new; ransomware attacks have affected healthcare organizations before. What's the bigger story behind an incident like the one at United?

Kevin Fu: I think the bigger story is not necessarily what's happening right now in this very week, which is quite concerning because of so many weeks of downtime affecting such a large percentage of all Medicare reimbursable procedures. But I think the bigger story is that this is just one big company, and there are dozens, if not hundreds, of other single points of failure that really cut to the main problem of: how do we stay resilient even if the cloud goes down? How do we stay resilient even if an adversary breaks through a series of firewalls? Through engineering, and the classes that many faculty teach, we know it's possible to build a system that remains resilient even if an adversary sits on the network and controls it. I have to say that today, most products I see don't build their systems under that threat model. So I think the bigger question is, how can we get away from reporting on these incidents and get more toward engineering and building in the security by design that we already know how to do but have not chosen to do? How do we get that out into the marketplace and incentivize it?
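
Kevin's point about designing for an adversary who already sits on the network can be made concrete with a small sketch. The Python below is a hypothetical illustration, not any vendor's actual protocol: each command carries an HMAC over its payload plus a monotonically increasing counter, so a man-in-the-middle who fully controls the network can neither forge nor replay commands. The shared key, function names, and the "adjust_rate" command are all invented for the example.

```python
# Minimal sketch: message authentication under a hostile-network threat model.
# Hypothetical illustration only; a real device protocol also needs key
# management, counter persistence, and regulatory-grade engineering.
import hashlib
import hmac
import json

SHARED_KEY = b"per-device-key-provisioned-out-of-band"  # assumption for the example

def sign_command(command: dict, counter: int) -> dict:
    """Attach a counter and an HMAC so tampering or replay is detectable."""
    body = {"command": command, "counter": counter}
    payload = json.dumps(body, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return {"body": body, "tag": tag}

def verify_command(message: dict, last_counter: int) -> dict:
    """Reject forged or replayed messages, even if the network routed them."""
    payload = json.dumps(message["body"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, message["tag"]):
        raise ValueError("authentication failed: message may have been tampered with")
    if message["body"]["counter"] <= last_counter:
        raise ValueError("replay detected: counter did not advance")
    return message["body"]["command"]

if __name__ == "__main__":
    msg = sign_command({"action": "adjust_rate", "value": 70}, counter=42)
    print(verify_command(msg, last_counter=41))  # accepted: valid tag, fresh counter
```

The design choice, under those assumptions, is that the firewall is not the security boundary; every message is verified at the receiving device, which is the spirit of the resilience Kevin describes.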

Paul Roberts: This is a conversation that's happening across our twenty-two trillion dollar economy, right? But in the healthcare space, in the medical device space, there's actually some progress that's been made, and certainly some reason for optimism. There's the passage of the PATCH Act, which is one of the most toothy, substantial federal cybersecurity regulations regarding medical devices. And then, as you noted on LinkedIn, the Biden administration's proposed federal 2025 budget contains quite a bit of money, more than a billion dollars, focused just on cybersecurity for healthcare environments. That's a proposed budget, not a passed budget, so we don't know. But where should some of that money be spent, in your mind? And when we look at the PATCH Act, have we seen a practical impact in terms of increased cybersecurity as a result of its passage?

Kevin Fu: Let me first talk about the PATCH Act, and then we can get into this new proposed $1.3 billion budget to incentivize cybersecurity practices at healthcare delivery organizations. The PATCH Act was introduced and proposed by Representative Burgess, a physician from Texas, and it required, for instance, the submission of a software bill of materials, among what I would consider just basic hygiene. But it really raised the bar from the voluntary recommended practices that FDA would introduce. In other words, it's a law, and a law is much stronger than an office's guidance documents or recommendations to manufacturers. The PATCH Act eventually became part of the language you'll find inside the Omnibus Bill of last year. Just yesterday, the FDA published guidance on how it's implementing the 524B requirements, and the 524B requirements, if you look at the language, really derive from that proposed PATCH Act legislation.

Paul Roberts: One of the notable features of the PATCH Act was this requirement around software bills of materials for medical devices, which manufacturers are now expected to provide. Again, this is part of a larger conversation that is bigger than just the medical device space. Has that had an impact, and are we seeing delivery on that requirement? And if so, have we learned anything from looking at some of these SBOMs?

Kevin Fu: I would say the jury's still out. I'm very optimistic. The finalized guidance from FDA that discusses the contents it would like to see in the software bill of materials in pre-market submissions was only released last year, so a lot of these things are still working their way through the marketplace. FDA only recently opened what's called eSTAR, their new electronic submission system for manufacturers, who submit what I would consider sort of their term projects: basically, the design documentation on why their devices are safe and effective. There's now a cybersecurity component to the submission process. I'm optimistic, but I think it's going to take at least a year, if not years, to really know more about the outcomes, or even to know some of the numbers: for instance, how many pre-market submissions were rejected for simply forgetting to include a cybersecurity case, which I suspect will be growing; or, when FDA reviewers are looking at the quality of the threat modeling or some of the missing risk mitigations, to what degree they accept the manufacturer's decisions or push back, saying, we need you to modify this and then send us an update and then we'll reconsider.
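
For readers who have never seen one, an SBOM is simply a structured inventory of the software components inside a product. The fragment below is a hypothetical, heavily trimmed CycloneDX-style SBOM expressed in Python; the device, component names, and versions are invented, and a real pre-market submission would carry far more detail (suppliers, hashes, dependency relationships).

```python
# A hypothetical, heavily trimmed SBOM using a CycloneDX-like JSON structure.
# Device, component names, and versions are invented for illustration.
import json

sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "metadata": {
        "component": {"type": "device", "name": "ExampleInfusionPump", "version": "3.2.0"}
    },
    "components": [
        {"type": "operating-system", "name": "ExampleRTOS", "version": "10.4"},
        {"type": "library", "name": "openssl", "version": "1.1.1w"},
        {"type": "library", "name": "sqlite", "version": "3.42.0"},
    ],
}

print(json.dumps(sbom, indent=2))
```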

Paul Roberts: One of the interesting things in the medical device space is you have this regulator, the FDA, that actually has a gatekeeper role and can keep products off the market if they don't meet the standards it has set. And as we talked about with the PATCH Act, cybersecurity becomes a much bigger part of that bar you've got to clear. But of course there are also the consumers, the healthcare organizations that end up buying and deploying this technology. Is it your sense that they're attuned to these problems? Or is it, hey, once you get past the FDA, once you're cleared to get on the market, people are just going to take it at face value that you've got adequate cybersecurity? Or are you seeing hospitals and healthcare organizations starting to look hard at issues like what's in that SBOM, and whether their procurement is going to be influenced by what's in that SBOM, for example?

Kevin Fu: I think you'll find different answers depending upon the type of organization. There are some healthcare systems that are leading the way, both on SBOM development and on the effective use of an SBOM. I can certainly just hand you an SBOM and you might say, what do I do with this? As a healthcare delivery organization, you could choose to ignore it, or you could choose a third-party provider to essentially write you the CliffsNotes and tell you, is this a good SBOM? At the end of the day, it's all about risk management, and so it's all about accepting risk and then understanding some of your weaknesses that you may need to spend a little bit more observation time on, just in case something goes wrong in what we call the post market. Now, with FDA serving in this role of clearing or approving devices into the U.S. market, that's not a get out of jail free card. What that's saying is, you've met our bar of safety and effectiveness, as well as, for the case of cybersecurity, the statutory requirements from that 524B section. Things can still go wrong in the post market, and there's an entire second set of processes for what happens when there's a vulnerability that affects a medical device, or, more concerning, an actual incident. So you can have a near miss, like a vulnerability, where you know something could go wrong, but there are also several documented cases of actual harm happening, where a medical device is compromised and then there needs to be a post-market action. Typically this involves what's known as a recall, and there's a whole set of language that I'm surprised a lot of manufacturers are not aware of. The general counsels, if they don't know about this, probably need to go take some of the classes we offer at Archimedes. But it talks about the basic requirements of what are called the 803 exemptions. There's a certain set of criteria where, if you meet it, FDA exercises its enforcement discretion, and you basically have an easier time as a manufacturer. One of the requirements, for instance, is giving notice within a certain number of days once you discover a vulnerability to, for instance, some of the affected entities in the healthcare community. There are other criteria to meet that exemption, but it's there as an incentive system to encourage the manufacturing community to be more transparent about these vulnerabilities, so that healthcare organizations, hospitals and small clinics for instance, at least have a chance to take some action before any harm happens to patients. And harm can take many forms. Harm can be delaying patient care; harm can be the inability to deliver patient care, as we're seeing in the case of ransomware hitting healthcare organizations.

Paul Roberts: Right, and that is what is so distinctive about this particular vertical. In this sector, you are literally talking about life and death. You're not just talking about availability, you're not talking about economic damage. You're talking about people potentially not being able to get life-saving treatment when they need it, and there are no stakes higher than that. And yes, this transparency issue is a really big one. We saw that with the analysis of the Ivanti Pulse VPN: it turned out it was running really old open source code, had vulnerabilities, and was exploited by Chinese actors and so on. In your research right now, what are you seeing as the main kinds of risks or threats that organizations face with regard to medical devices? Because hospitals all have IT infrastructure, just like any other organization, and that's typically what's being targeted by ransomware groups and the like. But when we look specifically at medical devices, what do you see as the biggest threats that are out there?

Kevin Fu: There are a couple of ways you could interpret this. There's actual risk, what's at stake, and then there's more decisional risk, what are we doing that we ought to be doing differently as a defender? On that note, I think one of the big risks organizationally is thinking about these problems through the IT lens rather than the OT lens. OT is operational technology; these tend to be kinetic, cyber-physical things like factory floors, autonomous vehicles, medical devices. We have a large group of people who are very well trained in IT security. But guess what? Some of the things that you're taught are the right approaches in IT security are the exact wrong approaches for OT security. Let me give you one example. In IT security we think isolate: if there's an infection, we shut the system down or we disconnect it. That's not what you do for an OT system. For an OT system, what matters is continuity of operations and high availability. And so you would make different choices; you want to make sure that patient care continues to be highly available, even if ransomware gets in. So the question is, how do we design and architect these systems to be resilient, even if ransomware gets into a system, such that the damage is self-contained, rather than having to just pull the plug on an entire country? So that's one of the more organizational risks. And I think it's difficult to predict threats and risks. Ten years ago, we were predicting malware would be a big problem. We were seeing a lot of malware breaking into hospitals, simply because hospitals had old legacy software. You could actually find Windows XP still on MRI machines, and I would not be surprised if you find some out there today. By the way, at Archimedes we're going to be hosting a jazz funeral procession for Windows 10, because Windows 10 is going out of support. I want to draw attention to this.

Paul Roberts: When is this? We gotta be there. 

Kevin Fu: Oh, you've got to. It's going to be May 1 and 2 in New Orleans. The jazz band will be marching down the street, holding signs drawing attention to: goodbye, Windows 10, thank you for your service, and it's time to patch, it's time to update. Because you want to know which healthcare systems are going to get hit with the next ransomware and the next $22 million ransom? It's the ones running the outdated software that's not only insecure, but insecurable, because it will no longer have security patches. So one of the biggest risks is really basic hygiene stuff that engineers will look at and go, why are we even talking about this? Of course you need to do that; of course you need to patch your software; of course you need a threat model. But there's a large part of healthcare infrastructure that simply hasn't done what I would consider the pre-lab: before you're allowed into the classroom to do an experiment, you actually have to do the basic homework and understand what assets are at risk, before you can even begin to ask the question, what do we do about it? And you'll see some of this in the HHS cybersecurity performance goals; it even has an acronym, of course, because it's government: CPGs. You'll see elements of this being socialized from those cybersecurity performance goals at HHS.
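
The "pre-lab" Kevin describes, knowing what assets you have before asking what to do about them, can start very simply. The sketch below flags inventory entries whose operating system is past, or approaching, end of support. The hostnames are invented, and the end-of-support dates are illustrative; they should be checked against the vendor's published lifecycle data.

```python
# Minimal sketch: flag inventory entries running software past end of support.
# Hostnames are invented; end-of-support dates are illustrative and should be
# verified against the vendor's published lifecycle information.
from datetime import date

END_OF_SUPPORT = {
    "Windows XP": date(2014, 4, 8),
    "Windows 7": date(2020, 1, 14),
    "Windows 10": date(2025, 10, 14),
}

inventory = [
    {"host": "mri-suite-01", "os": "Windows XP"},
    {"host": "infusion-ws-12", "os": "Windows 10"},
    {"host": "radiology-pacs-02", "os": "Windows 11"},
]

def flag_unsupported(assets, today=None):
    """Yield (host, os, status) for assets whose OS has a known end-of-support date."""
    today = today or date.today()
    for asset in assets:
        eos = END_OF_SUPPORT.get(asset["os"])
        if eos is None:
            continue  # no lifecycle data in our table for this OS
        if eos < today:
            status = "past end of support"
        else:
            status = f"supported until {eos.isoformat()}"
        yield asset["host"], asset["os"], status

if __name__ == "__main__":
    for host, os_name, status in flag_unsupported(inventory):
        print(f"{host}: {os_name} is {status}")
```

Even a toy report like this only works if the inventory exists in the first place, which is exactly the basic-hygiene point being made.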

Paul Roberts: You mentioned that Archimedes does training for IT folks who work in healthcare. What types of training do you offer, and what types of problems do you try to orient folks to?

Kevin Fu: We have a couple of training options on April 30 in New Orleans. One, we have Michelle Jump from MedSec, who's going to be teaching more of the regulatory affairs side. So if you're a regulatory affairs professional and you want to know about cybersecurity, she's going to talk about how to stand up a cybersecurity program within an organization, whether it be a healthcare system, a medical device manufacturer, or maybe a vendor serving a medical device manufacturer or healthcare delivery organization. And it's very related to the Joint Security Plan, which was just released today by the Health Sector Coordinating Council, the HSCC. That's a framework designed to help healthcare organizations stand up their own cybersecurity program so that they don't have to repeat all the mistakes the rest of us have made, but can learn from and follow a good template. So that's one of the courses: how to stand up a security program within a healthcare organization. And then the other training, by Adam, is going to be on threat modeling. Adam Shostack is very well known for his book on threat modeling, and he's been with us at Archimedes since before the pandemic, actually.

Paul Roberts: Been a guest on ConversingLabs as well. 

Kevin Fu: Oh, terrific. He always has a good Star Wars joke. 

Paul Roberts: He does. That's true. It's true.

Kevin Fu: He'll be teaching how to do threat modeling and really, I think, bringing it down to earth so that you don't have to fear it. And I find that for most professionals, whether they're engineers or product directors or risk managers, or even executives or vice presidents who oversee somebody organizing cybersecurity, it really teaches them how the thinking works, and it's going to help them recognize red flags in choices and threat models. So if a security director reports to you and tells you something like, we've never been hacked, we don't need to worry, that's when the executive can say: no, I learned in Adam's class that's not how it works. So he really brings it down to earth through exercises. He makes it tractable.

Paul Roberts: Yeah, he's got a great way of simplifying and really focusing what could be a very amorphous, scary conversation. When I look at healthcare environments, it seems like some of the trends that are influencing them, and influencing cybersecurity, are trends you see elsewhere as well: digital transformation, the increasing reliance on cloud-based systems and hosted applications, and just the incredible influx of smart and connected devices that, 20 or 30 years ago, even if they were electronic, probably weren't internet connected. Amazing new capabilities, and organizations definitely want to own them, and yet it's sometimes difficult to appreciate the additional risks that come along with that. And in some ways we see that with this United case, right? Having all these healthcare organizations migrate to this one provider is great, until that provider goes down and you've got no backup. Do you see that dynamic playing out within the healthcare organizations that you work with? And is there an easy solution to that problem of both taking advantage of these amazing new capabilities and also managing the cyber risk?

Kevin Fu: I would recharacterize this as: how do we manage the risk as we're moving from the old world of on-prem software to cloud services? And let me be the first, or second, to say: I think the cloud is great. The cloud can do great things, but you have to do it with the OT cybersecurity lens. If you take the IT cybersecurity lens, you're likely going to shoot yourself in the foot. For instance, perimeter-based thinking is very quaint, very 1990s. Firewalls can be useful to reduce risk, but if your security depends upon an adversary not getting through a firewall or not getting access to some resource, then that's very perimeter-based thinking, and it is not resilient. Resilience means that if an adversary breaks in, you don't have catastrophic system-wide failures and you don't have to take down the whole system. You might have a partial failure, or a small part, or maybe a small geography would be temporarily down, but not the whole country. And so I think if we can get some of the OT cybersecurity thinking into cloud services for healthcare delivery, then we're going to be able to better manage these risks. That's why at Archimedes we have executives coming to speak with our attendees from Google Cloud, as well as Amazon Web Services for the life sciences. They're going to talk about how medical device manufacturers and health information technology providers use their services, and it's going to be an open floor to ask questions about how you manage risk when it's healthcare specific rather than just accessing your email, right? Because it's very different. If somebody's breaking into your email and you get locked out after three tries on a password, that's okay. But if you have a pacemaker and a clinician mistypes a password three times, you probably don't want to lock that clinician out; it's not the right approach. The cloud is already being used for many things, from health information technology to radiology. But we did see an incident similar to Change Healthcare, at a smaller scale, when ransomware infected a radiation therapy manufacturer. This was three years ago, in fact. And they did the exact same thing as Change Healthcare: they took down their cloud. But when they did that, suddenly, for several weeks, none of the cancer radiation therapy devices that depended on that cloud could function, because the dosimetry was stored in the cloud. So it's okay to use the cloud, but we have to design it to manage that risk and understand that the product should still work, maybe in some kind of degraded form, if the cloud becomes unavailable.
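
The radiation-therapy outage Kevin describes comes down to a design choice: cache what the device needs to keep operating safely, and fall back to that cache when the cloud is unreachable. The sketch below is a hypothetical illustration of that degraded-mode pattern; the function names, file path, and "treatment plan" payload are invented and do not describe any real product.

```python
# Minimal sketch of degraded-mode operation when a cloud dependency is down.
# All names and the example payload are hypothetical illustrations of the pattern.
import json
import time
from pathlib import Path

CACHE_PATH = Path("last_known_good_plan.json")

def fetch_plan_from_cloud() -> dict:
    """Stand-in for a cloud call; raises to simulate an outage."""
    raise ConnectionError("cloud service unreachable")

def load_cached_plan():
    """Return the locally cached plan, or None if no cache exists yet."""
    if CACHE_PATH.exists():
        return json.loads(CACHE_PATH.read_text())
    return None

def get_treatment_plan() -> dict:
    """Prefer the cloud, but degrade to the local cache instead of halting operation."""
    try:
        plan = fetch_plan_from_cloud()
        CACHE_PATH.write_text(json.dumps(plan))  # refresh the cache on success
        return plan
    except ConnectionError:
        cached = load_cached_plan()
        if cached is not None:
            print("WARNING: cloud unavailable, running in degraded mode from cached plan")
            return cached
        raise RuntimeError("no cloud connection and no cached plan; clinician intervention required")

if __name__ == "__main__":
    # Seed a cache, then show the fallback path when the simulated cloud call fails.
    CACHE_PATH.write_text(json.dumps({"plan_id": "demo", "cached_at": time.time()}))
    print(get_treatment_plan())
```

Whether degraded operation is acceptable, and for how long, is a safety decision for the manufacturer and regulator; the sketch only shows that the architectural option exists.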

Paul Roberts: And these are conversations that need to happen at the very earliest stages of product design, or, if the product's already designed, you need to be reframing things. Do you see medical device makers doing that? For example, products these days might get red teaming or other forms of, let's put on an adversary's hat, look at this device, think through different scenarios, and then figure out how we address those. Is that becoming commonplace, or is that more the exception than the rule these days with medical device makers?

Kevin Fu: I would say it's becoming more common, only because it's written into the guidance documents and it's expected that you have penetration testing, for instance, which is a form of red teaming. You'll find in some of the guidance documents that it says, when you do this red teaming or penetration testing of your product, make sure the people doing it are not the same as the design team. A lot of people make this mistake without realizing it: of course you think your system's secure, because you designed it. You need to get a different set of people who have different biases, different ways of thinking. You want someone such that the designer's reaction is, wait, that's not fair, you're not allowed to do that. That's when the red team is effective. And in fact, next month on the UC San Francisco Stanford CIRCE seminar series that I host, I'll be hosting David Brumley, who will be talking about red teaming and binary analysis and things of that nature.

Paul Roberts: Do you have time for two more questions? 

Kevin Fu: Sure. 

Paul Roberts: Okay. So you worked for a while within the FDA as their acting director for medical device cybersecurity. I'm interested in that experience, what you took away from it and how that's influenced you. 

Kevin Fu: I would go back to work for FDA anytime and redo that experience. Not only was it a great learning experience, I also felt like I was able to make contributions, and my successor is running a great team, which is now an entire division of medical device cybersecurity that was just announced a couple of weeks ago. And they have a great team that's forming. People talk about government dysfunction, but the group I was in, they were stellar, and I continue to be so impressed with every individual, with every leader, and how they explain their thinking and how thoughtful they are. And I can't say that's normal, right? Most organizations don't have a good way of bringing up new leaders. It was just so wonderful to be able to interact with great role models for really good leadership, especially when they need to convey news that somebody doesn't want to hear, negative information; I really appreciate their approach. I certainly learned a lot. For instance, how a bill becomes a law: it's a little bit different from Schoolhouse Rock when you're in government.

Paul Roberts: Not just a bill.

Kevin Fu: Yeah. Let's sing along, kids. But there's a whole other set of processes, through good old OMB Circular A-19, I believe, for how you can work with Congress to create legislation with the consent of the full executive branch, and that's not something you learn in school. Today I encourage students to take internships at the FDA, especially computer science students or regulatory affairs students who maybe like computing but don't want to spend their entire careers just programming, right? If you understand computing and you want to be motivated by helping people, go work for the FDA. They're a great employer with great people.

Paul Roberts: And they actually can effect change. That is something that, in other federal agencies, I think might be harder for people working there to see: how are we really making things change? The FDA really can.

Kevin Fu: Yes, I can't agree more, and I'm just glad we have what I think are the right people in the right roles for medical device security at FDA. It didn't have to be this way, because not all government offices are, shall we say, functional. But I'm quite pleased that FDA has been able to retain such great talent and also export talent. There are people who leave the agency, go to the private sector, and continue to do great things in medical device security.

Paul Roberts: One of the things that we write a lot about here at ReversingLabs is threats to the software supply chain. We've obviously seen those with SolarWinds, 3CX, MOVEit, and so forth. And we've talked a little bit about software bills of materials. Are both device makers and end-user organizations attuned to this risk, particularly as regards their use of things like open source software, but also commercial software? They've got long software supply chains just like everybody else does. Is this something that has hit home in the medical device field, to the best of your knowledge, or not as much?

Kevin Fu: I think on the medical device manufacturing side, it is now law: you must submit an SBOM. It's not, do you feel like it, or is it a good idea? You simply have to. FDA is not even allowed to say don't; they're required to require an SBOM because of the law. So every manufacturer, if they don't already know this, probably needs to hire a different consulting group or regulatory affairs person, because this is law. It's been around for over a year.

Paul Roberts: Do you sense that it's changing practices? That's the meatier question. Yes, okay, we can comply with this; it's like any other compliance conversation, right? You can comply with the law, but is it actually having downstream effects within your organization in terms of how you make stuff?

Kevin Fu: So there's checkbox compliance, and then there's actual cybersecurity. In some sense you have to checkbox the SBOM, but while you're doing that, you're likely going to learn a lot about risks you didn't realize you were taking. Or, at the system engineering level, as systems are being integrated, you're going to realize that some of these risks become actually existential to the entire system. Imagine something like Change Healthcare, when you don't realize how much you depend upon certain pieces of software. So one of the benefits of an SBOM is that you're going to be able to identify and answer the question much more quickly: am I vulnerable to that new zero day? Right now, you might run around like a chicken with its head cut off, saying, oh my god, OpenSSL, do I run that version? Log4j and all that. The SBOM should help manufacturers much more quickly, for instance, tell the FDA: don't worry, we don't use that version; or, oops, we do use that version, here's our mitigation plan, but don't worry, because we already have a plan, and if one of our software packages gets compromised, here are the steps. And then for the healthcare delivery organization, they're not required to ingest the SBOM, but it's certainly a very valuable piece of evidence, so valuable to just know what's at risk. If you're trying to convince your cyber insurance provider that you're doing a good job, your SBOMs can help you there, because you're going to know more about the risks you're taking. And I know some of the more sophisticated healthcare systems have learned how to use SBOMs in their procurement stage, such that they can really push back on a manufacturer where the sales engineer says, don't worry, we're going to provide you updates, and then later you say, wait, you're shipping me Windows 10, and that support ends in a year, so what are you going to do after that? So the SBOM really keeps people honest.
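
Mechanically, the "am I vulnerable to that new zero day?" question is a lookup against the component lists in your SBOMs. The sketch below, with invented product names, component versions, and advisory contents, shows the shape of that check against SBOM data like the earlier example.

```python
# Minimal sketch: answer "do any of our products ship the affected component?"
# by searching SBOM component lists. Products, versions, and the advisory are invented.
advisory = {"name": "openssl", "affected_versions": {"1.1.1t", "1.1.1u", "1.1.1v"}}

sboms = {
    "ExampleInfusionPump 3.2.0": [
        {"name": "openssl", "version": "1.1.1w"},
        {"name": "sqlite", "version": "3.42.0"},
    ],
    "ExampleMonitor 7.0.1": [
        {"name": "openssl", "version": "1.1.1u"},
        {"name": "log4j-core", "version": "2.17.1"},
    ],
}

def affected_products(sboms, advisory):
    """Yield (product, version) pairs whose SBOM lists an affected component version."""
    for product, components in sboms.items():
        for comp in components:
            if comp["name"] == advisory["name"] and comp["version"] in advisory["affected_versions"]:
                yield product, comp["version"]

if __name__ == "__main__":
    hits = list(affected_products(sboms, advisory))
    if hits:
        for product, version in hits:
            print(f"{product}: ships {advisory['name']} {version}, mitigation plan needed")
    else:
        print(f"No products ship an affected version of {advisory['name']}")
```

Real tooling also has to normalize component names and handle version ranges, but the core idea is exactly this kind of inventory lookup.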

Paul Roberts: It gives you a stage on which to have what might be difficult conversations, or conversations that previously you might have only had in the wake of some adverse incident, right? Whoa, wait, that was running Windows 10, why didn't you tell us? Which is good. That's a pleasant thought, which leads me to my final question. Medical device cybersecurity is really at the forefront of what the federal government is doing to try to improve cyber resilience and set down some really clear guidelines for manufacturers and vendors to follow with regard to cybersecurity. So I'd ask you: what lessons have we learned from what has happened thus far, and from the kind of response we've seen, that might be applicable to other sectors and industries, whether other critical infrastructure sectors or more generally, in terms of what's working and what isn't?

Kevin Fu: I have a feeling not too many of your listeners are from oil, gas, and water. But if they are, come to our next live podcast, which is actually happening in just half an hour. They're going to be talking about all 16 of the CISA-defined critical sectors for cybersecurity, not just healthcare. Healthcare is only one of the 16, and although we have a lot of problems to solve in healthcare cybersecurity, I think we are at least asking a lot of the right questions, we're getting a lot of the right people in the room, and we understand what a lot of the problems are. That's not true for all sectors; there are still a lot of places where they might not even know what a threat model is. OT cybersecurity is still a foreign term in many sectors, and I think one of the things other sectors can learn is how to stand up these communities. The Health Sector Coordinating Council is a great example. They are the designated assembly place for the healthcare sector for writing shared knowledge frameworks. How do you stand up a security program and organization? HSCC has a document catering to healthcare. We don't yet see that universally in all 16 sectors. They might not be able to just cut and paste, but they can learn from what some of the effective techniques are for getting people together to solve the risks that are specific to their sector.

Paul Roberts: Kevin Fu of Northeastern University and the Archimedes Center, thank you so much for coming on and speaking to us on the ConversingLabs podcast, it was a pleasure. And I would inform our attendees that you're about to step into another meeting on fortifying cyber physical resilience, people might want to tune into that as well, but thank you so much for doing a preview here on ConversingLabs. I really appreciate it.

Kevin Fu: Yeah, you're welcome Paul. Great to be on the program, thanks for your questions. 

Paul Roberts: Yeah, always a pleasure and we'll do it again Kevin, of course.

Kevin Fu: I wish you high availability. 

Paul Roberts: Thank you. Same to you.


About Author: Paul Roberts

Content Lead at ReversingLabs. Paul is a reporter, editor and industry analyst with 20 years’ experience covering the cybersecurity space. He is the founder and editor in chief at The Security Ledger, a cybersecurity news website. His writing about cyber security has appeared in publications including Forbes, The Christian Science Monitor, MIT Technology Review, The Economist Intelligence Unit, CIO Magazine, ZDNet and Fortune Small Business. He has appeared on NPR’s Marketplace Tech Report, KPCC AirTalk, Fox News Tech Take, Al Jazeera and The Oprah Show.
