The State of Vulnerability Management

In this episode of ConversingLabs, host Paul Roberts interviews Casey John Ellis, founder of Bugcrowd, about the state of vulnerability management and bug bounties in 2025. Casey shares his insights on current changes impacting both the threat landscape and the cybersecurity industry, such as matters at the federal level and increased AI usage. Looking to the future, Casey also mentions how important it is to welcome the next generation into cybersecurity.

EPISODE TRANSCRIPT

Paul Roberts: Hey there everybody. Welcome back to another episode of the ConversingLabs podcast. My name is Paul Roberts and I am the director of editorial and content here at ReversingLabs, host of the ConversingLabs podcast, where we dig into the latest developments in areas like malware and threat analysis, software supply chain risks, cybersecurity in general.

I'm thrilled this week to have with us in the studio, Mr. Casey Ellis. Casey [00:01:00] is the co-founder of Bugcrowd and one of the leading advocates of bug bounty platforms. He's also the co-founder of disclose.io, a project to standardize vulnerability disclosure, and a leading advocate globally for greater software transparency and security. We're thrilled to have him on the show. Casey, welcome.

Casey John Ellis: Thank you for the glowing intro, Paul. Good to be here. 

Paul Roberts: It's the best I could do, man. It's great to have you, man. First of all, welcome. I don't think we've had you on the show before.

Casey John Ellis: No, I don't believe so.

Paul Roberts: Yeah.

Casey John Ellis: Yeah. It's good to be here.

Paul Roberts: So that's gonna put on you the requirement to give us a little bit of your origin story-

Casey John Ellis: Oh, geez.

Paul Roberts: And how you found your way to cybersecurity and vulnerability hunting and stuff like that. 

Casey John Ellis: Yeah, sure. Short version of it is I grew up hacking stuff as a kid.

Got introduced to technology at a pretty young age and [00:02:00] immediately started to break it. I think it was the combination of like curiosity and the fact that I'd enjoy thinking like a criminal, but don't wanna be one. So that was high school. And then pretty much coming out of high school, I like tripped over into a pentesting job.

So that was the beginning of my information security career that was in the early 2000s, right? So I'm dating myself a bit there. Went on doing that for a chunk of time. That was a really fun time to be hacking the internet. It feels a little bit like today, but back then before the trustworthy computing memo and before like default firewalls on home PCs and just all sorts of crazy fun that you could get up to at that point.

But I actually made a transition across into solution sales and architecture in 2006, just off the back of, really enjoying breaking into stuff, but actually really enjoying the business and the marketing side of it as well. It's one thing to solve a problem, but if you're not connecting that problem-solution fit to where the problem exists, the problem remains unsolved. So I got [00:03:00] deep into the sales and marketing side of it. One thing led to another and started Bugcrowd. Really started working on it in 2011 and kicked off in 2012. And the idea behind that was looking at the problem that we have to solve as defenders, which is basically outsmarting this crowd of adversaries that has like lots of different skill sets, lots of different motivations and an incentive structure that looked very different to how we were doing it at the time. We were expecting to be able to compete with that, with someone being paid by the hour or with whatever automation we had available.

And it's like the math is wrong. So that was just annoying me. But the other side of it, which really was what spawned a lot of what Disclose.io works on is having grown up as a hacker, and inside that community, it's like there's a lot of people like me that are disconnected from the problem because the world assumes that we're bad 'cause of what we can do and 'cause of what we're curious about.

That's a stupid problem for the internet to have. Let's try to fix that. So it was those two ideas floating around that kind of collided [00:04:00] at some point and yeah, I started off Bugcrowd and here we are. 

Paul Roberts: And that point in time was one in which there were a number of companies that founded with a similar approach. It was like a shift, like you said.

Casey John Ellis: Yeah.

Paul Roberts: For many years there was this assumption that if you're finding vulnerabilities in software, you're a bad guy. Which was the sort of security-through-obscurity argument, right?

Casey John Ellis: Yeah.

Paul Roberts: Instead of, no, actually they're doing good work. They're finding flaws- 

Casey John Ellis: Important work, right? Yeah.

Paul Roberts: And really important work. And should be compensated for it. Incentivized.

Casey John Ellis: They should be compensated for it. At the very least, they shouldn't be like operating under a chilling effect or at risk for prosecution. That was the big thing as a starting point.

It's let's just make this not a crime. In the same way, or at the very least, acknowledge the fact that hacking is a dual use skill. The bad guys use [00:05:00] it too. And that's why, people were so freaked out about it, but there's the good version. There's the locksmith version versus the burglar version that we're disconnecting from the problem by keeping it like this.

So yeah, it was interesting because we didn't invent bug bounty or vuln disclosure; that all existed prior, dating back to the early nineties, and before that for vuln disclosure. But we did kick off this idea of basically putting a platform in between as much of the communities as we can gather up, and then as many different problems as we can find on the defender side.

And today there's 80 companies that do the same thing, the idea-

Paul Roberts: Wow.

Casey John Ellis: Is truly out of the bag at this point.

Paul Roberts: Dude, you guys need your own conference.

Casey John Ellis: We're not gonna claim DEF CON, but-

Paul Roberts: You need an industry group!

Casey John Ellis: It was a lot smaller when we started. It's a lot bigger now. I think we played at least some role in that.

Paul Roberts: Yeah, of course. You need an industry lobbying group just for the bug bounty industry. [00:06:00] Yeah. That's what you need.

Casey John Ellis: I think that's not wrong for sure.

Paul Roberts: That's what you need. 

Casey John Ellis: Yeah. 

Paul Roberts: Take us back to 2012, around the time you set up Bugcrowd, and looking at today, how has the market for bug bounties changed during that time? What are, you know, things that Bugcrowd has helped to realize? What do you see as-

Casey John Ellis: Yeah, no, definitely. And I think I wanna draw a distinction there, like the market for vulnerabilities, like in a bug bounty context, and just this general idea that like humans write software, humans aren't perfect, therefore vulnerabilities are gonna exist and sometimes you'll need humans to find them, which I just class as almost like this top level kind of version of what we have to solve. 'Cause you're not gonna pay for them unless you accept them in the first place. I feel like there's a ton of progress that's been made. Like one of the interesting parts of the Bugcrowd story is that we actually landed in the US the same [00:07:00] month that Snowden did his thing and I think a really interesting side effect of that whole kind of situation was it was one of the first times I think the world had all collectively thought about hacking as something that potentially affects them. 

Paul Roberts: Yeah. 

Casey John Ellis: Like at the same time, like no matter whether they were lay people or industry folk or whatever else, it's oh wow, okay.

I thought that was that weird thing in the corner. But now all of a sudden it's potentially impacting them as individuals and I feel like that was a turning moment for the cybersecurity industry just in general.

Paul Roberts: Interesting, yeah.

Casey John Ellis: So we definitely took advantage of that. It was interesting from there because as you said, there was a lot of probably the first maybe two years of Bugcrowd and then seeing, some other competitors join and get pretty well funded as well at that point in time- was mostly about going out and convincing people that hackers aren't always evil. 'Cause there [00:08:00] was a lot of are you insane? This is a crazy idea, kind of response that we got at that point in time. And it was really when the Department of Defense did Hack The Pentagon.

That was to me another turning point, 'cause folk are looking at them saying, Hey, if you're, if you got all the bombs and guns and people, and like you're this sort of apex predator-

Paul Roberts: Classified information, right?

Casey John Ellis: Yeah. Classified stuff. But like you've got all this capability and if you're still tapping a 15-year-old on the shoulder to come help you out with your computers, then maybe I should do that too as a Fortune 1000 or whatever else, right?

Yeah. So that was I think, where it was. I think today we're in a position where it was getting better pretty rapidly for a big chunk of time there. In terms of internet security and safety, I feel like COVID really threw a bit of a wrench in the machine, because everyone had to do a bunch of stuff really quickly to pivot towards, like work from home and whatnot.

And speed tends to be the enemy of [00:09:00] good decision making when it comes to security. And now, in 2025, we've got AI-powered coding. We've got the restructuring of economics and labor and how people wanna operate their companies. There's all of this sort of flux and speed getting added again to the mix. So it's an interesting time. I think the role that researchers play now is working out- it's the same as it's always been. You figure out that cutting edge. But it's, how do all of these new tools and these kind of new environments that we're going into globally affect internet security at a product level, but then at a system level as well?

Paul Roberts: Yeah, no, that's a really interesting point. And when you think about it too, like it's not like we ever really got our feet beneath us after COVID. One of the things that COVID put a rocket-

Casey John Ellis: I think that's right.

Paul Roberts: Booster on was the [00:10:00] migration from on-premises to cloud infrastructure.

Just because you suddenly had to support this, perimeterless environment with workers all over the world working from home. And then shortly thereafter, right? AI, ChatGPT 3 comes out and generative AI starts to really just also do that type of curve.

And that's where we are now. 

Casey John Ellis: Yeah. And then you've got a whole lot more people doing a lot more things with a lot more capability. And I don't think that's inherently a bad thing on the builder and the defender side, but- 

Paul Roberts: Yep. 

Casey John Ellis: Going back to what I said before, it's like speed is- and to some degree, even like capitalist competition is the natural enemy of good choices when it comes to building stuff. So the faster things go logically, the more, potentially fragile they're gonna be. And we're certainly seeing evidence of that actually playing out at this point.

Paul Roberts: [00:11:00] Totally. I sometimes use the example of just imagine cities like, Manhattan or Sydney or Boston with, no building codes, right? And no, no structural engineers.

Casey John Ellis: Yeah.

Paul Roberts: But with an emphasis on like speed to construct and, architectural innovation, right?

Casey John Ellis: Yeah.

Paul Roberts: What would you have? You'd have really interesting buildings that-

Casey John Ellis: You'd have things that fall over, sometimes.

Paul Roberts: Fall over, or, small fires that just consume whole buildings and that's what we have in the software industry. I don't know. 

Casey John Ellis: Definitely, and I think that's as much a legacy problem as it is a cutting-edge one at this point.

Paul Roberts: 100%, yeah.

Casey John Ellis: It's like what COVID did is it exposed a whole bunch of stuff to the internet that shouldn't have been exposed to the internet necessarily in that way, 'cause of the priority towards remote access and different things like that.[00:12:00] 

And, you look at what the typhoons are doing and what some of the IABs are doing, all that kind of stuff, at this point in time, like that's pretty clearly being like heavily exploited and that stuff is hard to fix.

Like it basically sits out there and powers the entire internet. So back to your building metaphor, like that's the foundation that we're on. And in the meantime, we're building up as quickly as we possibly can at the same time. So it's a pretty, I think there are things that can be done to solve and address this. The list of kind of variables that are in the pot at this point in time makes me a little bit uncomfortable, 'cause there's just a lot happening all at once. 

Paul Roberts: So one of the really important things that Bugcrowd did was to take the job of setting up sort of a platform for vulnerability researchers to interact with companies rather than leaving it to the company to be like you gotta set up the portal and you've gotta manage these relationships and you've gotta create incentive programs.

I think you took that all off their back and said, we'll do all that for you, and we'll grow this community, we'll go out and find new people and pull 'em in. And I know, obviously, over the years your customer base has grown greatly. Is there still more work to be done, I guess I would ask, as more and more companies become software companies, even companies that used to be just manufacturers of stuff? Are you seeing more migration to this idea of, yeah, okay, we need to have a formal kind of security function, we need to have a front door for security researchers and be able to accept what they're giving us? Is that starting to become just a pillar of the community, or is it still a hard sell in some cases?

Casey John Ellis: I think it varies depending on which part of the community you're in. I think for companies that have started like more recently, [00:14:00] like companies that are cloud native or AI native, in general, they'll have at least some kind of influence within the organization to say, Hey, like we're not gonna be perfect at this.

We need to be able to let folk from the outside know when we've introduced risk. And they need to know that they're safe in doing that.

Paul Roberts: Doing it to us, right?

Casey John Ellis: As a bare minimum. So I think that's pretty common as a default thing that you just do. GitHub and your different kind of code repos have a push-button security.txt file piece, for example. And Disclose.io has a policy generator that gets a ton of use; that's actually boilerplate language that protects folk in both directions. So that's, I think, pretty common amongst tech and more modern organizations. The older ones are still catching on in a lot of ways.
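[For context: the "push-button" artifact referenced here is the security.txt convention (RFC 9116), a small plain-text file served at /.well-known/security.txt that tells researchers where to send vulnerability reports. A minimal example, with placeholder contact details:]

```text
Contact: mailto:security@example.com
Expires: 2026-12-31T23:59:59.000Z
Policy: https://example.com/security-policy
Preferred-Languages: en
```

[Contact and Expires are the only fields the RFC requires; the addresses and URLs above are placeholders, not any vendor's actual tooling.]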

Paul Roberts: Old economy companies that are...

Casey John Ellis: Yeah, old economy. And I think there is this sort of leap of [00:15:00] humility that you need to make before you'll actually accept that input from the outside. 

Like you've gotta go from ostrich risk management, where it's like, if I just ignore this problem, it won't exist, to saying, hey, okay, to err is human, let's get better at not doing that.

But let's also assume that we'll never be perfect at avoiding that and figure out, how else we can get help. There's a posture shift inside an organization that needs to happen before they'll do that. I think in general, older, larger organizations have a harder time with it. Just across the board.

So that's VDP and vuln disclosure. I think bug bounty is definitely, to me, the odd stepchild of vuln disclosure in some ways, because believing that vulnerabilities are valuable enough for an organization to actually offer to pay money for them, and to have that information be directed to them instead of going somewhere else-

That's a similar kind of thinking shift that we were just talking about with vuln [00:16:00] disclosure in the first place. Adoption there is growing like crazy. I think that's also driven, in some ways, by efficiencies of the labor market. Going back to the founding principle of Bugcrowd, it's like it takes a crowd.

One of the reasons attackers are successful is 'cause they're so diverse. So this idea of having a diverse kind of intelligence capability to basically compete with them, that makes a lot of sense, especially when you think about the right people being hard to hire, or it being competitive, or whatever else.

So there's a lot of different things that are driving that. But that'd be my summary.

Paul Roberts: And we've seen some evidence that, of just how hot this market is among them. I believe Apple just announced a $2 million bounty for an iOS zero-day. Or three. Is it three?

Casey John Ellis: For a full train, yeah. 

Paul Roberts: Yeah, this was for what? A no click, remote [00:17:00] code? 

Casey John Ellis: Yeah. I believe it's no click, full train. So it's not just exploiting, it's like getting to, you know, ring zero or below on the iPhone. Yeah, bugs-

Paul Roberts: And my guess is there are other darker organizations that would probably pay more for that.

Casey John Ellis: A lot more, 

Paul Roberts: Yeah. Yeah. 

Casey John Ellis: Yeah. So if you're talking about the offensive procurement market versus the defensive one, this is something that I've been adjacent to, and in some ways part of, my entire career. And the reality is it's shifted, I think, over the past, probably again since COVID, since your initial access brokers and your kind of cybercriminal riffraff have entered the chat.

Prior to that, you wouldn't really see offensive exploit procurement for anything that was hosted as a [00:18:00] platform. 'Cause the logic was partly that if we exploit that and it gets detected, it gets fixed in the one place, and then our ability to use that exploit goes away.

Paul Roberts: Goes away.

Casey John Ellis: That's to me, shifted quite a bit over the last five years. You look at the IABs I've mentioned a couple of times, like initial access brokers. You look at the tactics of the typhoons, where it's just basically opportunistically spray, like machine gun the entire internet and see what sticks.

We've definitely drifted away from stealth on the attacker side. And I think what that does is it means that there is actual economic competition for those bugs at this point, it used to just be like Adobe, Oracle, Apple- 

Paul Roberts: Microsoft.

Casey John Ellis: Android, Microsoft, maybe 10 vendors where there'd be that kind of competition around packaging that information, to drive it up to something like 3 million bucks to buy it defensively like Apple are doing.

[00:19:00] That's not true for, that's not as true for a lot of other organizations, but it's starting to become true across the board, I think. 

Paul Roberts: So you just attended a first ever conference called Offensive AI. I think it was in San Diego? 

Casey John Ellis: Yep.

Paul Roberts: Super interesting. As its name suggests, this is around using artificial intelligence to find vulnerabilities and other problems with software that could be used by an offensive actor.

Casey John Ellis: Yep.

Paul Roberts: You were one of the organizers of that or advisors?

Casey John Ellis: I was involved very early on. Oh no, I'm an advisor to a company called DreadNode, who were the originators of the conference. And given all the conferencing stuff Bugcrowd's done over the years, I gave 'em a little bit of help in the early days.

But it's definitely to the credit of the team putting it on, it was phenomenal. 

Paul Roberts: Yeah, I was gonna say, first of all, great idea for a conference. Second of all, so what was your take? What do you see any interesting [00:20:00] presentations and what do you take away from that? 

Casey John Ellis: Yeah, there was a lot. I think the biggest take that I came away with is, the interesting thing about offensive AI and offensive applications of AI is that you've got the machine learning and AI people approaching the whole thing from this probabilistic math direction. And then you've got all of the old school cyber operators basically saying PoC or GTFO. So it's like, you've either got shell or you don't, right? And just watching them slowly get closer and closer together and figure out how to help the other has been-

Paul Roberts: Interesting.

Casey John Ellis: Yeah, really interesting. Over the past three years, this was a really good example of an effort to do that. The thing that really did jump out was that that is starting to work pretty well now especially when you've got like deep tech AI folk working like hand in hand with offensive practitioners. They understand the mission at this point, which is cool. 

Paul Roberts: Right.

Casey John Ellis: [00:21:00] There's definitely a gap between the people that do that and don't. A lot of these kind of AI powered pentesting platforms that are popping up rely almost exclusively on the state of the art, like foundational models. And coupling those to tooling and prompting them and doing other kind of fairly frankly, easy to do things when it comes to AI engineering.

And as the state-of-the-art models get better at offense and better at understanding code, the differentiation across that market's gonna get blurry, I think. So there's a lot of talk about reinforcement learning, and I could get super nerdy there, but I won't. Probably the other piece is, so there's those two sides.

Paul Roberts: You can get nerdy. It's okay. 

Casey John Ellis: There's those two sides and then you've got like the policy and like the national security side of it as well. 'cause there was a lot of folk with that bent in the room.

Paul Roberts: In the room. Yeah. 

Casey John Ellis: Yeah. Policy hasn't caught up with this [00:22:00] stuff at all. Aside from a lot of the conversations coming outta DC and frankly other parts of the world as well, that we need to just collectively get better at offense in a cyber warfare context. So it's less about pentesting and more about, let's go disrupt the thing, right? 

Paul Roberts: Yeah. It's the mutually assured destruction model in some ways, right? Like- 

Casey John Ellis: There's a lot of calls for deterrence outta DC which is, I don't disagree with it, but the thing that's awkward about that is that if you talk about mutually assured destruction in a nuclear context, like the outcome of a bomb going off is basically the same no matter where it happens in the world.

Whereas with cyber, it's gonna depend on your infrastructure. It's gonna depend on how well you defend yourself, how well you can eject attackers, all those different things. So that's a bit of a weird one. But it did come back to, [00:23:00] on the NatSec side, some of what we were just talking about, where there's all this conversation around cutting-edge code and all of the new stuff that's going out, but in the meantime, we've got telcos getting popped through 15-year-old CentOS on an edge device every week. Do you know what I mean? So it's, let's not forget the kind of root-cause problems that we've got. 'Cause if you apply AI to attacking that stuff, you can do a lot of damage very quickly. And that's now in the hands of a lot of people.

Paul Roberts: And we've just seen Microsoft end support for Windows 10, which is by one count more than a billion devices globally, and that doesn't even count all the Windows 8 devices, and when you start looking into what versions of Windows are still out there and operating...

Yeah, I mean that legacy code problem is just the elephant in the living room in some ways for organizations, [00:24:00] especially older, especially more established organizations.

Casey John Ellis: Exactly. Yeah. And those below the security poverty line as well.

Paul Roberts: Yeah. That's right.

Casey John Ellis: And that was a thing that really jumped out at me, because a lot of the folk that didn't grok that concept aren't necessarily those who've been around as long. So there's this idea that you can't just go and update the Windows XP box that's driving your medical device, because the company that makes the middleware that powers it went bankrupt in 2003, which is why you're stuck on Windows XP.

Like you can't just forklift the entire thing. It's not a simple software solution in the way that I think a lot of people tend to think it is. If they've not been around to see that type of thing. So that was a really interesting takeaway from it. Just seeing how real that is.

Paul Roberts: And part of the problem, and we see this in other areas of our economy too, is that, like you said, there's a security poverty line, and you've got a lot of businesses [00:25:00] that are just trying to make ends meet, so they don't necessarily have the resources to invest in a six- or seven-figure software upgrade.

Casey John Ellis: Yeah.

Paul Roberts: Just like towns can't upgrade their sewer lines from, the lead pipes that were installed in, 1908 to, whatever.

Casey John Ellis: Yeah, it's not a snap the fingers kind of thing in the way that I think a lot of folks that have come in more recently think it might be.

Paul Roberts: Right.

Casey John Ellis: Yeah, that was a really interesting takeaway, seeing that. Offensive AI, like the stuff you can do with it, especially right out at the cutting edge, is pretty crazy. Automated exploit generation, for example: taking a big pile of code, going through it, identifying vulnerabilities, but not stopping [00:26:00] there, actually attaching capability to that vulnerability and then creating a payload that's usable at that point in time, automating that entire pipeline. That's a thing. I think when people hear stuff like that, they're like, oh, I bet it doesn't work on iOS. And you're right, it doesn't. But if you're talking about simpler, like single or two or three shot bugs, and you've got a wide enough set of things to potentially target with capability like that, then you can end up with a capability set that's actually pretty useful. So this idea of just being able to scale attacking, especially the bottom of the pyramid, which is where we've got some serious problems, right? That's actually getting really good at this point.

Paul Roberts: I've read articles that sort of ask: will AI ever be as good as people, as individual security researchers, at [00:27:00] finding bugs and hacking applications? That the sort of intuition and go-with-your-gut nature that the best security folks have is gonna be very hard to replicate in large language models, which are more algorithmic in nature. I don't know what your thoughts are on that. What do you think?

Casey John Ellis: I have many, and I try not to be like, horribly biased on this, 'cause obviously the thesis behind Bugcrowd and a lot of the work I've done over the years is that cybersecurity isn't a technical problem, it's a human one, which means it's creative. So this idea of, theoretically if automation and AI was to solve or exploit every single problem that we've got available to us right now. If I'm a criminal, I'm not at that point just packing up and going home.

Cause that predates the internet by a couple of thousand years, and I don't think it's just gonna go away overnight, so there's that [00:28:00] just as a starting point. Very practically, the thing that AI struggles with is it's twofold, it's basically specialization and context.

So the idea of it being specifically trained, with trajectories into understanding particular vuln classes and all of the different permutations and how to think that through, and then how to coordinate with other agents that are trained on other things. Like, you've gotta very deliberately build that for it to work, and most people aren't; that's not what the state-of-the-art models do. So I think there's a lot of folk expecting a fish to climb a tree when they're saying, hey, can Claude hack the Gibson? Or whatever it might be. But that stuff will improve and get deeper into the possibility of outcomes it can [00:29:00] produce over time, as the models get better, as they get trained with more data, all that kind of stuff.

So there's that. And on the context side, I think the bit where I really see AI struggling to fully replace human creativity when it comes to vulnerability discovery, but then exploit development as well, is that it's not a linear path in the way that a lot of people think it is. It's not oh, that's a candidate for IDOR, I'm gonna go pop IDOR and I'm done. If you're talking about a complex target or a network, or even like a hardware device that's been hardened, for example, what you're doing is you're accessing this like graph of different things that you know about antipatterns and potentially exploitable conditions that exist within that target space.

And what you're trying to do is to create pathways through that graph. And then permutations and [00:30:00] combinations of how those different things interact with each other to get to the outcome that you want. That's-

Paul Roberts: Very fuzzy.

Casey John Ellis: Yeah. The way that we're dealing with memory in AI at the moment in particular is just not meant for that.

So there'll be innovation around that in time, I'm sure. But, in terms of AI as it exists today, eh. It is definitely making life harder for folks that don't want to apply themselves to actually like taking advantage of this technology and leveling themselves up and like we've seen that before, right?

You think about when H.D. Moore dropped Metasploit. Prior to that, everyone was writing shellcode by hand, and all of a sudden the uniqueness of being able to do that sort of went away, 'cause you had a framework to do it for you. And you've got this new set of people that have come into the mix.

Same thing happened with attack surface discovery. When it's like, how do I find things that are left on the internet? Like to me, this is just in some ways another version of that, at that level. [00:31:00] That's my convoluted answer. 
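[The "graph of antipatterns" framing above can be sketched in a few lines: model attacker footholds as nodes, known exploitable conditions as edges, and enumerate the chains between them. This is a purely illustrative toy; every node and condition name below is made up, and it is not a model of any real tool.]

```python
# Toy "graph of antipatterns": nodes are footholds an attacker can hold,
# edges are known exploitable conditions that move between footholds.
# All names here are hypothetical, purely for illustration.
EDGES = {
    "external": [("weak-auth", "portal")],
    "portal": [("idor", "user-data"), ("ssrf", "internal-net")],
    "internal-net": [("default-creds", "admin-panel")],
    "admin-panel": [("command-injection", "shell")],
}

def attack_paths(start, goal, path=None):
    """Depth-first enumeration of every chain of conditions from start to goal."""
    path = path or []
    if start == goal:
        yield path
        return
    for condition, nxt in EDGES.get(start, []):
        if nxt not in {node for _, node in path}:  # don't revisit a foothold
            yield from attack_paths(nxt, goal, path + [(condition, nxt)])

for chain in attack_paths("external", "shell"):
    print(" -> ".join(condition for condition, _ in chain))
# prints: weak-auth -> ssrf -> default-creds -> command-injection
```

[A researcher working a hardened target is effectively searching for viable permutations of edges like these, which is exactly the combinatorial, context-heavy search that, per the discussion above, current AI memory handles poorly.]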

Paul Roberts: Yeah. We've seen over the decades new technologies or new capabilities come along, and it just changes the path of the work that people do and who comes into the industry. So one of the presentations at Offensive AI, I think, was Caleb Gross, who did this presentation on a free OSS tool to do static analysis.

Casey John Ellis: Yep.

Paul Roberts: On like open source code. One of the ideas is could we go out and find and remove all the vulns and flaws in open source code? Which is, we know, 'cause we're seeing so many supply chain attacks and so many just, attacks on vulnerabilities in open source, would be highly valuable. Do you see that as possible application? 

Casey John Ellis: Yeah, a hundred percent. We got pretty involved in [00:32:00] the DARPA AIxCC, the AI Cyber Challenge, and that was actually one of the goals there: the idea of don't just find vulnerabilities, but actually patch them and fix them, and do that in a way that doesn't break the rest of the code.

I do see that going there. Like I'm involved with a company called Corridor that's doing basically AI assisted linting and kind of detection of this type of stuff in the IDE. So it prevents it from being introduced into the code base in the first place. There's a variety of different solutions that people are trying to apply to that problem. For open source, to me, the thing that needs to be overcome as a challenge there is that finding bugs and even proposing a fix is not the hard part. The hard part is on the shoulders of the maintainer trying to figure out like, how do I test this code?

How do I make sure that it doesn't cause a [00:33:00] regression issue in some other part of the functionality of my library or whatever it might be. And if all of a sudden they're getting like jack hammered with this stuff from everyone with this type of tool across the internet, then you're gonna end up with a triage issue on their end.

So I think I do see it being a part of the solution, but I don't think- everyone's thinking about their own kind of particular little slice of the chain.

Paul Roberts: Yeah.

Casey John Ellis: And I don't feel like we've linked that all up yet. 

Paul Roberts: So if you were to say now how you thought AI is going to influence the future of vulnerability research, vulnerability management- 

Casey John Ellis: yeah. 

Paul Roberts: Bug bounties. You're probably already seeing it starting to play out already, but what are your thoughts and, if you had to look down the road two or three years and say, this is what this market's gonna look like, what do you think? 

Casey John Ellis: Yeah. [00:34:00] I trotted out, actually, we'll come back to that. Look, I think what it's doing is it's reducing the "you must be this tall to ride" height for attackers, full stop. And this is bug bounty hunters. It's vuln researchers. It's bad guys. It's-

Paul Roberts: Or even you need to be a human being to ride.

Casey John Ellis: Yeah, ultimately you still need to have a human press play on that type of thing at this point. So for as long as that's true, I'm not super concerned about that one. But yeah, that one is definitely a little scary.

But yeah it's basically, it's gotten people to a place where they can get to impact, they can get to vuln discovery or even exploitation without having to have gone through, [00:35:00] the 10 years of just being immersed in compute or the CS degree or the courses or whatever it is that they've done to get to that place.

And in some ways I think that's a good thing. I do think there's like an atrophy that we're at risk of just by being able to press easy mode and get over that hump. That's it, we'll see how that kind of all plays out. But I think in terms of how it's gonna affect bug bounty and vuln disclosure just in general over the next period of time, is that we're gonna end up with a pretty major triage issue.

Because like just what I was saying with open source, 'cause the reality is, and ask me how I know this, right? The reality is that everything is way more vulnerable than anyone really wants to admit. And just the sheer volume of vulnerabilities that you can discover is like far higher than I think most people would realize, even on the engineering side.

So if everyone's now able to discover [00:36:00] them and send it through, then all of a sudden you've gotta drink from that fire hose.

Paul Roberts: And of course here in North America and the US like the CVE system broke even when it was basically humans discovering the CVEs and reporting them, not AI.

Casey John Ellis: Yeah, I've apologized to Steve and Art and some of the other guys in there a hundred times over, 'cause we definitely played a role in that. There, there was definitely-

Paul Roberts: No apology necessary, Casey.

Casey John Ellis: It was like, sorry for the hard work, but it needed to happen in some ways, 'cause there was a linearity to what they were expecting to process before the bug bounty and vuln disclosure movement hit, and then all these people joined, and then the volume increased and it created a superlinear scaling need, which they struggled with. So this is just another version of that, I think. 

Paul Roberts: And this kind of leads to our next question. I see one of the problems is just a lack of public investment in supporting, [00:37:00] the ecosystem, right?

So when everybody started buying cars, we as a society decided we need some better roads, and we better start building those roads for all these cars. But when everyone started running software and finding vulnerabilities in that software and so on, it was, no, we've got the same team at NIST or MITRE. We don't need to grow it. We don't need to put more money into it. Just keep-

Casey John Ellis: Yeah. 

Paul Roberts: And so it broke. It broke. 

Casey John Ellis: Yeah, I think, like the highway example that you just gave, that was ultimately the Federal-Aid Highway Act at that point in time.

So there are examples historically of there being acts of Congress or acts of federal government that enable that type of thing. I don't feel like we're in a great place to do that right now when it comes to stuff that's proactive, when it comes to [00:38:00] defending the country around stuff like this.

Paul Roberts: Yeah. 

Casey John Ellis: I mean there's a bunch of different things that are feeding up into that. When you think about conflict and like the amount of money that's been diverted towards defense over the past period of time, like this does play into that, seeing how that plays out. I ultimately think we've gotta find a way to make security like more attractive for the consumer than insecurity.

So the idea of okay, I'm gonna buy this product because I believe that it's safer for me and my data and all those different things than this other product.

And try to create some positive incentives around this, 'cause I don't feel like we've done a great job at that. When you look at like Facebook, Google, some of the like larger social media companies, Microsoft did this as well. Like they pivoted [00:39:00] towards putting privacy and security first, at least in their story.

Paul Roberts: Sure.

Casey John Ellis: At a pretty important time to do that. And I think they've

Paul Roberts: The Trustworthy Computing memo, right? 

Casey John Ellis: Yeah, Trustworthy Computing in 2002. You talk about all of the Facebook stuff and even the Google bug bounty program, like they positioned that at the front to help the consumer feel confident in the safety of their data, and it worked for them. So it's like, what's the 2025 version of that? I'm not quite sure. 

Paul Roberts: Yeah. So one of the big concerns in the past few months has been funding for some of these key federal cybersecurity agencies and programs: CISA, the National Vulnerability Database, MITRE, and so on. There was a big kind of, we're gonna shut down the NVD, moment. We've seen some of those crises averted, others not. What are your thoughts on where things stand and what we might wanna do to shore up some of those systems and programs? Or, even better, grow them and evolve them. 

Casey John Ellis: [00:40:00] Yeah. No, definitely.

Paul Roberts: I think this is an opportunity to change.

Casey John Ellis: I think like NVD in particular is a ripe candidate for the actual constructive use of AI. If you talk about like data decoration and all those different things, like it's a pretty good use case for that. So there's, I think, solution approaches that, that need to be thought through at whatever level.

CVE is a really interesting one because, at this point in time, I think it used to be like the global premier list for this stuff, and I feel like it lost that spot a little bit with all of the funding stuff. And now you've got like the EUVD and all that kind of stuff popping up around the world, and the ones that are already doing it are starting to double down as well.

So yeah, that creates coordination issues and keying issues, but at the end of the day, I think we're [00:41:00] probably gonna land in that place regardless of whether we like it or not. Yeah I actually think that in North America, I don't know, there's the CVE Foundation that's popped up.

There's a bunch of different people working on this problem. And I don't wanna tread on any of their toes, but I do think that there's gonna be, just different variations of solution to this problem that'll pop up around the place and the winner will be the one that does it best, over time. Which is not the ideal way to get there. 'cause you're trying to keep it stable in the process, but I think that's probably what's gonna end up happening. 

Paul Roberts: One of the things that happened recently was Bugcrowd announced a partnership with the state of Maryland around-

Casey John Ellis: Yeah.

Paul Roberts: Their own statewide vulnerability disclosure program. So basically all state agencies can now have a way to accept disclosures of vulnerabilities in [00:42:00] software that they're using. Are you seeing more interest at the state level in stuff like this? And it was part of a larger kind of reframing for Maryland of their statewide cybersecurity policies. Really interesting. Is that something we're seeing more of, or you're seeing more of? 

Casey John Ellis: Definitely. I think there's two things there. Like, states are recognizing that they need and want the help, 'cause when you think about it, state infrastructure's the same as federal infrastructure, but generally not as well funded. Yeah. So you just end up with stuff everywhere.

Paul Roberts: Mish-mash, yeah. 

Casey John Ellis: Yeah, and it's not necessarily their fault, it's the product of it basically growing and building over time. So there's the whole idea of there being value in having as many folk as you can possibly get helping you with surfacing risk.

There's a lot of interest in that side of it. I think as well, there's a lot of folk at the state level, like the legislative work that we've done over the years [00:43:00] to try to get like hack the X, Y, Z things through Congress and the Senate, like trying to get the CFAA reformed through DOJ and just those different things like they've not-

Paul Roberts: Computer Fraud and Abuse Act just for-

Casey John Ellis: Yeah. Computer Fraud and Abuse Act, the one that puts hackers in jail if they do the wrong thing or if they look like they're doing the wrong thing, which at the time meant just hacking. So we wanted to get that changed. It's if you're hacking and then you commit a crime, then you should be prosecuted for that. But unless you do the second part, then, you're just hacking. It's fine.

Paul Roberts: The fair use exemption, you might call it.

Casey John Ellis: There you go. 

Paul Roberts: Yeah. 

Casey John Ellis: The work that's been done on those at the federal level has definitely slowed down over the past bit just because there's so much going on in DC right now.

So back to your question, like there is a lot of opportunity, I think, for folk to go to their local representatives at the state level and basically educate them. Do what we've been [00:44:00] doing with Hackers on the Hill with Congress and the Senate, but do it at the state level, and help policymakers understand the things that are important to you as a technologist, or a hacker, or a cybersecurity person, depending on who you are listening to this, 'cause they're willing to listen and they'll act.

Paul Roberts: Yeah. Yeah. 

Casey John Ellis: And I do think if there's enough inertia behind that, then you know, that starts to reflect up and put pressure back on the federal side of things to make changes at that level as well, yeah. 

Paul Roberts: Got time for one more question? 

Casey John Ellis: Yeah, sure. Go for it. We're gonna dig into that one some more, but it's good. Sorry, let me just add one last piece on the call to arms side of things, not literally "call to arms," but a call to action for folks that are passionate about this stuff. The other side of it is just seeing vulnerability disclosure adoption by as many organizations as possible, just to [00:45:00] normalize it at that level. And then trying to get the narratives out there where we're actually celebrating the fact that they've basically created this humble posture where they're receiving information from the outside, which means they're probably more mature from a security standpoint.

But also they're creating a safer environment for hackers. Like I think getting even the public to some degree at least interested in that story. I think that's the other opportunity to create that kind of bottom up pressure on your federal stuff. 

Paul Roberts: Yeah. That is, in some ways, the irony of vulnerability disclosure programs: you would look at a company like Microsoft or Google and be like, wow, they've got tens of thousands of vulnerabilities reported for their software, and this other company over here doesn't have any.

Casey John Ellis: Yeah.

Paul Roberts: So they must be much safer than Microsoft. No, it's actually the exact opposite because they're actually working on the security problem. This other organization is basically just ignoring it. [00:46:00] But unless you're steeped in this industry, you might not get that, right?

Casey John Ellis: Might not necessarily recognize that. Yeah. A hundred percent. 

Paul Roberts: Yeah. So some kind of law that would level that field and be like, no, you got a vulnerability, you gotta disclose it. You gotta be transparent about it. And that way we know, more or less looking across all these firms which are doing a better or worse job.

Casey John Ellis: Yep. Agree.

Paul Roberts: Because then we know we're looking at apples to apples. But participation in vulnerability disclosure is completely arbitrary, completely discretionary at this point.

Casey John Ellis: Yeah. That's correct.

Paul Roberts: And if you don't have a security culture in your company, then you're probably gonna be like, why would we do that? We don't wanna put our dirty laundry out there.

Casey John Ellis: It's gonna create more of a lift. And this is where it does come back to this idea of making it not just as easy as possible, but like attractive for the organization.

Paul Roberts: Attractive.

Casey John Ellis: This is a part of what we've tried to do with Disclose.io. It's like, you can tap Bugcrowd or any of the other platforms on the [00:47:00] shoulder to help you actually run the program, but this at the very least will get you started off with the policy and the core things that you need to have a legal framework for that interaction. And that's open source, that's free, as it should be from my perspective. Yep. 

Paul Roberts: Okay, final question is a generational question, which is, one of the problems that we're seeing is an aging of the open source maintainer population and a lack of interest among younger folks in getting involved in open source projects. Back in July, Bugcrowd hosted Bug Bash, which was a live hacking event focused on K-12 and ed tech software. Really interesting. And I guess you had college students who found something like 30 previously unknown vulnerabilities as part of that.

Casey John Ellis: Yeah.

Paul Roberts: What's your thought on the generational issue and how we get young folks to come into [00:48:00] this industry and get involved? 'Cause we need their energy and their smarts and their years to keep this going. 

Casey John Ellis: Yeah. Especially the last part, I actually gave the closing keynote at BSides Las Vegas this year on exactly this subject.

Paul Roberts: Yeah. 

Casey John Ellis: I've had some sort of fresh reminders of the fact that it's important, but also the older generation's not permanent and these guys are the ones that are gonna actually inherit the problem. 

Paul Roberts: Yes. You are not old, by the way, Casey, but go ahead. Anyway. Yes. 

Casey John Ellis: Appreciate that.

Paul Roberts: Let's be clear here. 

Casey John Ellis: Just older.

Paul Roberts: Yeah. Okay.

Casey John Ellis: So I think getting them involved, yeah. The thing about Gen Z, as far as I can tell, 'cause I don't want to get "Okay Boomer'ed" on this one, but there is a mission drive to it, and there's a community sense of that mission drive.

Paul Roberts: Yeah. 

Casey John Ellis: [00:49:00] Within that generation. I do feel like cybersecurity as an industry has lost that a little bit as we've commercialized so rapidly over the past decade.

Paul Roberts: Yeah.

Casey John Ellis: So finding ways to bring some of that stuff back in. One thing that's been really interesting is getting involved with a group called The Hacking Games, who are basically trying to do this, create a Gen Z talent pipeline. But the bulk of their angle is, this is to make sure that your peers don't get recruited into cyber crime. Because that's the thing that's happening, right? So it's less about the bugs and let's protect the corporations or any of that stuff. It's more like, we just don't want you guys to go to jail.

And if that's something that pulls on your heartstrings and makes you think about your peers, then you can get involved in actually diverting them into something that's productive, as a career and an industry. So I think that seems to be working fairly well. But yeah it's key that we start, [00:50:00] I think proactively talking to each other across generations, 'cause the other side of it with this younger group is that they're native to an internet that I'll never be native to. And the same's true in reverse, right? So I know things from my experience that they'll never fully understand natively or might take time to learn, and then they can see an internet that I just will never be able to see in the same way. So it's like, all right, there's value in getting those things to talk. 

Paul Roberts: And you might think, you go to DEF CON and it's such a scene. There are so many people there, and yeah, there are a lot of young people there. There are a lot of older people there too. But you also have to realize, DEF CON and Black Hat have their roots back in the 90s, when in-person events were how you did things. Young kids these days, so much of their life is about what they're experiencing online and who's contacting them online.

Casey John Ellis: Yep.

Paul Roberts: And so you gotta fish where the fish are. You gotta [00:51:00] go out and meet people where they are on Discord or whatever.

Casey John Ellis: Yeah. And actually find-

Paul Roberts: And use those platforms. Yeah.

Casey John Ellis: I think that's a huge thing. Like the DEF CON Black Hat thing has been the subject of much discussion by both conferences over the past couple of years. 'Cause it's like, we're aging out and we're not really finding ways to replace or create the same environment that we want for the younger generations.

And whenever I'm in a room where that question comes up, I'm like, who here in this room is from that generation? And can you give us the answer to that question? And no one ever puts their hand up.

Paul Roberts: Right.

Casey John Ellis: So I'm like, there's a clue, right? Like the idea of, looking at the different places that these generations are getting together, but then as well, not necessarily trying to take ownership of it as the older generation. It's like how do you identify the people that are the future leaders of that group? 'Cause they're the ones who are gonna get it. 

Paul Roberts: Yeah. 

Casey John Ellis: And they'll explain it to us over time, but they're like, [00:52:00] we're gonna take a long time to actually understand natively how to make this work compared to them. So to me that just makes sense.

Paul Roberts: Alright man. Casey Ellis always really interesting conversation and thank you so much for coming on and talking to us. We'll definitely have you on again and-

Casey John Ellis: I look forward to it.

Paul Roberts: Yeah, absolutely. And thank you for all the hard work you do. Thank you for all the advocacy you do. And-

Casey John Ellis: Appreciate it.

Paul Roberts: Yeah, absolutely. We'll talk again soon. 

Casey John Ellis: Thank you so much, Paul. Cheers.

Paul Roberts: Thank you and thanks everybody for joining us. 
