Season 4, EP 2

The Future of Bug Bounties

April 19, 2023

In this episode, host Paul Roberts chats with Katie Moussouris, CEO and Founder of Luta Security, about the history of professional hacking and bug bounty programs, as well as what their futures hold.


Welcome back to ConversingLabs podcast. I'm Paul Roberts, your host. And we're here with the amazing Katie Moussouris of Luta Security. Katie, welcome to ConversingLabs.

Thank you so much for having me. It's a pleasure to be here.

Katie, you are the CEO, and founder of Luta Security. For our listeners who might not be familiar with Luta, tell us a little bit about what you do.

Sure. Well, I founded this company about seven years ago, back in 2016, and that was right on the heels of starting Hack the Pentagon, which was the very first bug bounty program of the U.S. Department of Defense. Before that I was with Microsoft for about seven years. I started their first bug bounty program
10 years ago in June. So that is the 10 year anniversary of Microsoft's bug bounty programs. And before that, I actually was a professional hacker. So I was doing professional penetration testing. I was one of the artists formerly known as "@stake". And before that I was, you know, a systems administrator and a molecular biologist who worked on the Human Genome Project.
So in other words you know, your average pink haired old lady of cyber.

You're an OG. It's really interesting 'cause I just got done talking with Cris Thomas, Space Rogue, who has a book out on L0pht Heavy Industries and how the L0pht kind of changed the world. You weren't a member of the L0pht.
You were a little bit young, too young to be in the kind of original instantiation of that group, but you were an early employee at @stake, which purchased the L0pht. Basically, it was sort of the commercial version of the L0pht back in the early 2000s. Talk about that experience and sort of what you learned from that and from sort of proximity to those L0pht characters.

Well, you know, that is actually how I learned how to hack. Early on, I was dialing into the same bulletin board system. So I'm actually not too young to have been in the L0pht. I just am well preserved. Lots of sunscreen. But what happened was I was on the same bulletin board system with the L0pht and some other, you know, local hacking folks.
And, you know, it piqued my interest. I was already fairly technically savvy for the time. Very few people even had a modem or knew how to use it. And they started doing these physical meetups in Harvard Square originally, and those morphed into what became 2600 meetings, which were, you know, wildly popular hacker meetups on the first Friday of every month, but they originally started as meetups of this local bulletin board system.
The Works BBS. Yep. The Works. Yeah. Yeah. And that was, you know, that was sort of the origin of it. But I just, I remember, you know, making some friends that I still have to this day. So we're talking, you know, over 30 years of friendship with these folks. But that was also, you know, formative in that everyone was there to learn and to teach, you know, and it was a very conducive environment for exploring your curiosity, which is really kind of what hacking is all about.

How did you get into it as a sort of teenager? I mean, you know, this was the personal computer era, right? It wasn't so unusual that you might have a computer in your home, but still early days, and definitely early days of kind of, like, the internet. So how did you get involved in that, and get curious and exposed to that?

You know, my mom was a scientist, and she knew that computers were the future. She didn't actually use a lot of computers in her daily work at the time, because again, as you said, the internet wasn't really, you know, a thing so much for the mainstream. But she knew it was gonna be important.
So she bought me a Commodore 64. And when I thought it was for playing video games like Pac-Man, she basically said, you know, I can't afford any more video games, so you need to read the book that came with it. And it was a BASIC programming manual. So I started teaching myself how to program. And then what happened was when I hit high school, you could take BASIC, and I already knew BASIC, but you couldn't take the next class in computer science unless you took this BASIC programming class.
So I ended up getting paired with a friend of mine, his name was Tim, and he had a modem. So I'd go over to his house and help him with the programming homework, and he showed me this new thing, which was the early internet. So at that point, you know, it was like peanut butter and chocolate, and I started dialing in using a modem at my house after that and getting in trouble because I was raising the phone bill.

What was that for you? Like, what was that experience like for you back in the, you know, mid to late 90s, you know getting together, you know, in Boston, in this kind of warehouse artist space really that was just filled with like hardware and computers and folks like yourself who were, who were really curious about it.

Honestly, you know, it had a kid in a candy shop feel, you know, because you'd get to, you were pooling your hardware, right? You were bringing things that you dumpster dived for or that you bought at the MIT flea where they would sell hardware and, you know, figuring out how things worked and teaching each other.
So there was just this like collegiate environment of people who mostly weren't in college, right? Either weren't in college yet, or, you know, in my case, I decided not to study computer science. Even though I could program and everything, I wanted to use my programming skills towards you know, towards research science in the medical field.
Which is why I studied molecular biology and mathematics. And ended up, you know, kind of detouring until the industry sort of caught up with me and what my, you know, hacking skills were. Because I moved from doing bioinformatics work on the Human Genome Project to being a systems administrator.
And suddenly we were getting hacked, right? So I had to dust off the hacking skills. I had to scan our networks to make sure that, you know, we were patching things and we had things, you know, as under control as possible. And then I had to remediate. So I developed a lot of empathy for all these different roles in cybersecurity.
And I think that, you know, a lot of us are really obsessed with the hacking side, which of course I love to do. But even as a professional hacker, I got bored with that because a lot of times you come in over and over again and it's the same vulnerabilities or the same class of vulnerabilities. Yeah. And it just got boring.
So, you know I think we're, as an industry, we're still stuck in that groove a little bit. You know, where it's like attack, show the hack. Everybody celebrates the hacker for pointing it out. Great. But then what is the sustainable, you know, cybersecurity that you're gonna have to live with after that?
And, and how healthy can you keep that system over time?

It's funny, I was talking with Caleb Sima a while back about his work kind of in the same time period, and like SQL injection, and sort of realizing, like, SQL injection is a thing, and then looking for it and being like, literally every single website that I am looking at has this vulnerability.
And it's exploitable. 
Now, sort of. Right.

And how and how come we haven't solved it yet? You know?


Once you understand a vulnerability, you have a class of vuln to scan for, using automated tools. And it's actually one of my favorite ones, what I call a very "hack-ccident prone" type of bug. Mm-hmm. Right?
Mm-hmm. I used to live on O'Farrell Street, with an apostrophe in it. And so I'd accidentally hack things all the time and find SQL injection because, as you know, the apostrophe is a special character to a SQL database. So I'd get verbose SQL errors back just putting in my address, and I'm like, oh.
Better walk away. 
Right. I didn't do anything, you know?

Yeah. Just tripping over the bodies.

Tripping over. Yeah, exactly. Yeah. So we still trip over bugs and that's a problem. 'Cause, you know, I've been a retired professional hacker, you know, since 2007, technically, and yet I can still find bugs. And this doesn't bode well for the state of internet security today.
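Katie's O'Farrell Street anecdote is the classic accidental SQL injection signal. Here's a minimal sketch of why, using Python's built-in sqlite3 module; the users table and the address value are illustrative, not from any real system. A naively concatenated query chokes on the apostrophe and returns a verbose error, while a parameterized query treats it as plain data.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (address TEXT)")

address = "O'Farrell Street"  # the apostrophe is a special character in SQL

# Unsafe: splicing user input into the statement by string formatting.
# The stray quote terminates the string literal early, so the query
# fails to parse -- the verbose error an accidental hacker sees.
try:
    conn.execute(f"INSERT INTO users (address) VALUES ('{address}')")
except sqlite3.OperationalError as exc:
    print("verbose SQL error:", exc)

# Safe: a parameterized query passes the apostrophe through as plain data.
conn.execute("INSERT INTO users (address) VALUES (?)", (address,))
print(conn.execute("SELECT address FROM users").fetchone()[0])  # O'Farrell Street
```

The same stray-quote error surfacing in a live web form is what tips a tester off that input is being spliced into SQL unescaped, and from there it's usually exploitable.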

So, like you said, most of your work in the last 10, 15 years has really been focused around promoting and standing up these bug bounty programs, where companies will offer incentives to independent security researchers to come in and look at their stuff, their software, their hardware, what have you, find vulnerabilities, report those, and get compensated, or at least thanked for it.
Talk about how you've seen the notion of bug bounty programs evolve, and kind of where we are right now with bug bounty programs. I think most companies accept, like, this is a legitimate thing to do. Obviously, you know, it's an industry, with companies that'll make it kind of turnkey.
But there are some challenges as well. Talk about it.

Well, you know, we kind of evolved from, who would hire a hacker professionally, to, oh, we should hire hackers, but they should be professional and vetted. Therefore the pen test companies, you know, that early 2000s era where, you know, the L0pht and @stake, you know, made their mark in history.
Right? And then we went to, well, open source is crowdsourced development, why don't we crowdsource security and do these bug bounty programs? Right? And it all seemed very logical until you realize that, like, wait a minute, who is fixing these vulnerabilities? How are they dealing with not just the problem of duplicate reports, but what does a duplicate actually tell you?
It actually tells you that a vulnerability is fairly easy to find, meaning more than one person could find it, and probably it's telling you that it's fairly easy to exploit as well, right? So what we've kind of hit here, in terms of hitting a wall, is the logical limit of how useful bug bounty programs can actually be when organizations haven't made the investments in that sustainable cybersecurity.
So that's kind of the arc, you know, of the past 25 years or so. It's, don't hire these hackers, they could do some harm, to, oh, maybe, yeah, actually we should probably hire the hackers, but the professional ones only. And then it was like, hmm, maybe the white hats, maybe support them. And now it's like, actually, we can't deal with these reports.
Maybe we do private bug bounty programs, which are basically a variety of pen test, you know, private pen tests.

Right. So really interesting and like you said, you, you stood up the first bug bounty program at Microsoft. I don't think people remember, but certainly even when I started covering cyber back in the kind of early 2000s, Microsoft's relationship with the security community was not great.
I think these days they're much more highly thought of, seen as more cooperative generally. But back then it was a very antagonistic relationship. So when Microsoft actually kind of signed onto a bug bounty program, thanks in large part to your work behind the scenes, it was actually a really big deal.
It was an acknowledgement; it was kind of an arrival of this notion that, you know, if a company like Microsoft is doing it, then in theory everyone should be doing it. When you started that bug bounty program at Microsoft, did you have utopian thinking about bug bounty programs? Or did you kind of see the whole thing playing out more or less
like you just sketched it out to us? Like, yeah, this is a good step forward, but we're gonna hit a wall at a certain point and the wall is gonna be around the ability to process the output of these programs? Or were you sort of like, no, this is gonna fix everything?

No, I mean, lucky for me, I was very fortunate in that I was sitting at the
largest intake funnel for regular vulnerability disclosure, you know, without being paid cash, right? No cash rewards before the bug bounty programs. But, you know, the volume of cases that was coming in was between a quarter million and 350,000 non-spam email messages a year. So they already had to have a well-oiled machine to separate the wheat from the chaff and figure out what action to take.
And that was just the intake funnel from the outside. All the while, they were improving their security engineering and proactive security to try and prevent those bugs. So I was fortunate in that I got to start the biggest bug bounty programs in the industry at the time, you know, for any vendor.
They were the highest paying of any vendor at that time. But I got to do it at a place that had well-oiled machinery. Even so, we still had to tailor those bug bounty programs, because if you think about it, why would Microsoft agree to start paying when they were already receiving that volume of work to do from the public for free?
And that was where we really tailored them. You know, it's like, we know we're getting eyeballs, but when are we getting those eyeballs, and on what products are we getting those eyeballs? So can we use these incentives, these cash rewards, to focus those eyeballs on the places where it makes sense? So we did a bug bounty at the beginning of the beta period of the browser.
You know, the latest version of the browser. Because we spotted a pattern where everybody was holding onto their bugs during the beta period before that, and there'd be this giant spike of submissions to the Vuln Disclosure Program after the beta was over. So kind of the worst time possible to hear about all those bugs and your customers, you know, have finished testing.

They've deployed it.

Yeah. And they've deployed it in live production. So, we shifted the traffic to the beginning by using a bug bounty. Similarly, we wanted a headstart on, you know, big classes of vulnerabilities. So we started the mitigation bypass bounty, which is just a fancy way of saying, we wanted new exploitation techniques, not just the next 0-day, but the next exploitation technique.
And what ended up happening there was we launched it just in time. Little did we know, the Wassenaar Arrangement, which is this international, you know, regime on export control between 41 countries, now 42 countries, India has been added to it, but the Wassenaar Arrangement, just six months after I launched the bug bounty program, added a bunch of these new exploitation techniques to the export control lists of those countries. So had that timing not worked out, had they added that to the export control list before I launched that bug bounty, we probably would never have seen Microsoft's bug bounties see the light of day.
So, you know, that was serendipitous right there.

So what are some of the challenges? 'Cause with Luta you advise companies and organizations, as well as public sector organizations, on implementing programs like this. What are some of the biggest challenges you see in both rolling out and maintaining and sustaining vulnerability disclosure and bug bounty programs?

Honestly, it comes down to this. It's a very simple concept to explain to people, right? If you see something, say something. Put up a way for them to contact you, and fix the bugs, right? This is so simple to explain.
The devil is in the details of implementation, and I think that because it's relatively simple to explain, there's a gross overestimation of people's ability to just kind of figure it out, right? And I think, you know, the U.S. Government is a great example. There was a binding operational directive, the first binding operational directive issued by CISA.
And they said that all federal agencies have to have a vuln disclosure program. Not a bug bounty, but just some way for people to contact you. Seems simple enough, right? Their whole FAQ started with what if I don't have anyone to fix it? And it's like, well just figure that out, but at the very least, have a point of contact so people can tell you about it.
So even the federal government was underestimating what it was gonna take to actually implement one of these programs. And then fast forward a couple of years, you know, CISA issued another operational directive where they basically said all federal agencies need to patch all these known exploited vulnerabilities, right?
You'd think with a vuln disclosure program, there would be no such thing as, you know, unpatched known exploited vulnerabilities. And yet CISA still had to issue that directive. I'm on one of the federal advisory boards, the NIST Information Security and Privacy Advisory Board. We got a report out from CISA on how that program is going, how the known exploited vulnerabilities list program is going.
It turns out, not so well. Why? Because the federal government doesn't have the resources to sustainably keep up with even the patches that we already know about, right? Even the issues that are being exploited out there. So it's somewhere between 1.2 and 1.4 million unpatched, internet-facing federal endpoints that are still hanging out there.
So add that to the oversimplification of, well, we'll just start a vuln disclosure program or a bug bounty program and you begin to see the scope of the problem, right? Simple to explain. Very difficult to execute with any degree of regularity, safety you know, and sustainability.

One of the things we've seen, just from an evolution of threats and attacks perspective, right, is a shift left, you might say. Malicious actors are much more interested now in development environments, development pipelines, open source code or, you know, proprietary code, the software supply chain, as an avenue into organizations. So we've seen that most recently with the LastPass attack, right?
An attack on a developer system, CircleCI. And obviously, going back, SolarWinds and Codecov and places like that. Organizations are obviously struggling to address this risk. Is there a role for a bounty-type program approach to that software supply chain problem, or is that really not applicable in the context of software supply chain risk?

Well, it's tricky to use bounties in the context of software supply chains, but where we've seen them used successfully are cases where there are packages that are included, you know, in larger code bases. And, you know, essentially as a bug bounty hunter, if you can substitute your package for, you know, the intended correct package, you're demonstrating a supply chain risk, right?
You're demonstrating a dependency risk. So we have seen some of it used in that way, but there's so much more to, you know, a robust, sustainable, and secure, you know, CI/CD pipeline, that bug bounty alone is not going to be enough to help enumerate all of the places where an adversary could get you.
But I do, you know, I do admire, I guess the the efficiency of hacking software supply chain because, you know, you've gotta hack once, pwn everywhere. And I think that, you know, the efficiency there is something that we should be trying to pay attention to, you know, as defenders. But yeah, bug bounty is not gonna, is not gonna solve that entire problem.
But certainly, identifying some dependency confusion issues is a good place to start.
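One narrow, automatable defense against the package substitution Katie describes is refusing loosely specified dependencies. This is an illustrative sketch only, not Luta Security's methodology or a complete supply chain control; the requirement strings and the flag_risky_requirements helper are made up for the example. It flags any requirements.txt-style entry that isn't pinned to an exact version with a hash, since unpinned names are the easiest targets for dependency confusion.

```python
def flag_risky_requirements(lines):
    """Return the names of requirements that aren't pinned and hash-verified."""
    risky = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        pinned = "==" in line       # exact version pin
        hashed = "--hash=" in line  # pip's hash-checking mode
        if not (pinned and hashed):
            # strip any version specifier to report just the package name
            risky.append(line.split("==")[0].split(">=")[0].strip())
    return risky

reqs = [
    "requests==2.31.0 --hash=sha256:deadbeef",  # pinned and hashed: OK
    "internal-utils",                            # unpinned: substitution risk
    "numpy>=1.24",                               # range pin: still swappable
]
print(flag_risky_requirements(reqs))  # → ['internal-utils', 'numpy']
```

In pip's real hash-checking mode, any dependency whose downloaded artifact doesn't match the recorded hash aborts the install, which closes off the "swap in my package for the intended one" move for those pinned entries.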

Right. And my sense is, one of the realizations is there just isn't as much monitoring or attention paid to development environments and development pipelines. So that is an advantage for adversaries as well.
One of the things that I think is encouraging about sort of the focus on supply chain risk, and we're seeing the Biden administration talking a lot about software supply chain risk is that it does address the sort of underlying code quality issue. And this is something that you're talking about as well.
Like, we can have bug bounty programs where we're just tripping over SQL injection and cross-site scripting. All these vulnerabilities we've been talking about for two decades are still getting created. We haven't really moved the needle in terms of code quality. It would seem like, as we talk about supply chain risks, there is more of a focus on code quality.
I'm sure this is something you're thinking a lot about, right? Which is, how do we, as I said, move the needle within development organizations on getting vulnerabilities out of code, making sure they don't end up there in the first place, versus, you know, discovering them after they get created?

Well, I think, you know what, what is good about the Biden administration's, you know, National Cybersecurity Strategy is that we're finally seeing what I would call a shift in the Overton window on software liability. You know, it's been a, you know, nuclear waste kind of topic. Nobody wants to touch it.
Why? Because the software industry has very successfully lobbied against it for a long time. And something that's important to the United States is being, you know, an innovator in technology. I didn't realize how entwined with USA DNA that really was until I visited the Capitol, got a tour, and saw, you know, artwork around the rotunda that showed telegraph poles and things like that, and a plaque to, you know, the inventor of Morse code, in the Capitol, right? So it's very important, you know, to the United States to continue to lead in technology development, which has put a damper on regulation and software liability. So I think the Biden administration putting, you know, a line in the sand and saying, effectively, look, if you are a for-profit company, and you are, you know, let's say, using some open source in your for-profit products, and you have not invested in the security of that open source, then the liability is on you, 'cause they very much were saying the liability doesn't belong on the open source community.
These are, you know, mostly volunteers, right? They're not being paid. But the for-profit software industry definitely should be taking a much more aggressive role in protecting the ecosystem, especially when they're using and benefiting from, you know, open source software.
So I kind of think of it as, look, don't turn in an essay that you haven't run the spell checker on, right? Do not release software where you haven't used some basic, easy to use security tools to find the low hanging fruit bugs. And that's the same, you know, that's the same practice that I preach for bug bounties, right?
Like, don't be running a bug bounty or even a vuln disclosure program if you yourself are not regularly running the same, you know, kind of security tools that are freely available out there for everyone.

One thing I'll note also, I think, is sort of a looming problem, if it isn't already upon us, is the fact that, you know, the definition of "software company" is changing. We have the Internet of Things, right, which is growing by leaps and bounds. We have all of these companies who have maybe historically not been software makers, the hardware makers, the equipment makers, who are now writing applications and supporting infrastructure, wrestling or not wrestling with these very same issues, and in some cases, you know, making the exact same mistakes, right?
Kind of tripping over the same hurdles that companies were tripping over back in the 90s or early 2000s. In one way, you know, a lot of green fields for a company like Luta Security; on the other hand, a little scary, 'cause you start to talk about kind of cyber-physical impacts and things that are really, you know, in people's bodies or in their homes and businesses, and really have the potential to cause a lot of disruption.

Yeah, no, that's exactly where we've seen security kind of moving through different layers, I guess. I mean, a lot of stuff was concentrated mostly on servers, and there wasn't too much, you know, of interest on clients. But then personal computers got more powerful, you know, and network connectivity got faster.
And so then you started seeing much more hackable things, you know, in people's personal lives, right? So the shift from network-based vulnerabilities to application-based, you know, vulnerabilities was a big shift in the early 2000s. And then we started seeing this, you know, progression and these little, you know, these little devices, these little computers.
The mobile security space was lagging far behind the desktop security space for quite some time, and we didn't even have update clients for mobile phones for a really long time. Right. So I think that, you know, every industry unfortunately has to relearn these lessons as soon as they are touching code, which is, you know: where there is code, there are bugs.
Where there are bugs, there are hackers.

So you talk about this kind of challenge that companies that introduce bug bounty programs have, to respond to the data, to the work that they create. What is your advice to companies who either have existing bug bounty programs and are struggling with that scaling-the-response problem, or who are contemplating a bug bounty program and might hear Katie and say, aha, see, we shouldn't do this because we're not gonna be able to deal with the output of this program anyway? So what's your sage advice to these companies?

So honestly, you know, everybody knows about the role that hackers play in these bug bounty programs.
It's all these other cybersecurity jobs, being on the receiving end of these reports, and I'm not talking just triage, but actually, like, pushing the bug through to its logical conclusion to get it fixed, that kind of job. Mostly, people don't wanna be in that job for very long. Like Microsoft: I think Popular Science called the job of Microsoft security grunt one of the top 10 worst jobs in science.
It was between, like, elephant vasectomist and whale feces researcher. So it's a terrible job. So what I would give as advice is: understand that there is no glamor in that job. Once people get good at that job, they wanna move on. So where a lot of our customers actually end up using Luta Security is the fact that they don't know how to hire people who know how to do this, because most of them have run away screaming if they've done it before.
And two, you know, if you try and pile it on as yet another hat that your existing security people have to wear, that becomes the least favorite part of their jobs, and it costs you in the incident response arena, or it costs you in the preventative, you know, secure development life cycle arena, because these people are dealing with these bug bounty reports.
So, one, understand that, you know, running these programs efficiently is the toughest job people will never love. And as for training for it, just understand that you are gonna be in a perpetual state of training and onboarding for these roles, because by nature people are gonna move onward and upward and out of those roles.
And that's where we see sort of strong bug bounty programs start out: someone who's really passionate about it, and then they leave the organization. Maybe it changes hands to a few other hungry individuals who want that experience. But eventually it just kind of starts to disintegrate. And, you know, without the boring parts of security, the process, you know, and the documentation on how to run these programs consistently, you end up in this state of, you know, aqueducts collapsing.
Whereas, you know, they helped move water in Rome before, right?

In some ways it's kind of similar to almost like a retail-type environment, right? I mean, companies like, you know, McDonald's or CVS or whatever, they just kind of intuitively understand, like, yeah, a lot of these jobs, cashier or whatever, there's just gonna be a lot of churn in this job.
It's not something somebody's gonna do for 20 years, but you know, some people will move up, and some people will just move on, and that's just the nature of this job. But I think in, maybe in technology or InfoSec, there's this notion like, well, all these jobs are sought after and sexy and things that people will wanna do forever.
And you're sort of saying, yeah, don't, don't flatter yourself. Right? Like, just be prepared for the inevitability that you're gonna have to keep, you know, moving people in and out of these jobs to keep the 
program running.

Yeah. And I think a lot of people think that the bug bounty platforms will handle that for them, right?
That, well, we're outsourcing triage so we don't burn out our internal security folks. Well, the triage is pretty imperfect if you don't have that internal context, right? So, what bug bounty platforms are great at, you know, and they're varying degrees of great at it, but the idea is they de-duplicate for you.
And they also, you know, will validate: is this bug real or not? What we've seen are bug bounty triage companies, or bug bounty platform companies with triagers, shutting down valid issues, right? Like, improper triage of valid issues. We've also seen them, you know, doing things like misunderstanding the severity, saying, like, yes, this repros, but you look at it and there's no security or privacy impact. So you're like, yes, that set of steps definitely works, but to what end? You know what I mean? So it's that context-aware bit that actually can focus the attention of your internal security team.
That's the missing link that we see a lot. And unfortunately, that context-aware triage and those sustainable programs, that's the thing that people are lacking. And right now, as far as we can tell, there's nobody else who's doing this for organizations in this area except for us, right?
And we're eager to get more professionals cycled through. But we have a really interesting model where we actually allow our customers to hire our contractors off of us at the end of a staff augmentation engagement. Because we believe in labor rights and labor mobility, you know, and we don't wanna trap people in either, you know, a bad bug bounty gig economy job where they might not get paid.
Right? They found the same bug as the next person, but they were a little too slow, and so they don't get paid for the work.

Oops, sorry. Yeah, right.

You know? Or, we don't want people, you know, feeling trapped in a steady job, but that they actually hate. You know what I mean? And, and so we, we understand the nature of this labor and we just have a different labor model around it that's more realistic.

So you kind of anticipated my final question, which is, we're speaking to you on International Women's Day, and you, among your many other accomplishments, have really been an outspoken advocate for women in cyber and information security. You know, I think probably when you were starting out back at @stake, this was a small group.
It's a much bigger group today. So I wanted to just kind of get your thoughts and reflections on International Women's Day, and also, what work's left to do in information security to make it a more hospitable profession to women than it has been historically.

Yeah. I mean, I think, so my big issue is pay equity. Yeah. And I think if we don't solve the pay equity problem, you can get more women or more underrepresented people in a given industry. But if they're paid inequitably, guess what? The wages go down in that industry. We saw that happen in computer science, right.
Early computers were these hidden figures, these women who were hired to program these calculating machines, and we don't even know their names. Right. And this was wartime, doing these calculations and whatnot. We saw the NASA movie, Hidden Figures. And what happened was those were the lowest-paying jobs, because they were not considered to be, you know, essentially managerial or strategic; they were considered to be a variation of a typist job, right? And so, if we don't solve pay equity and close the pay gap between, you know, women, and especially women of color, and, you know, traditionally white males, then we're not gonna be solving anything. We're just gonna be lowering the pay rate for the profession.
So, I think, step one: Pay Equity Now, which is the name of my foundation, you know, is where you can find out more about that. But honestly, I think the more we commoditize cybersecurity jobs, the higher the risk is that we will be hiring out of these underrepresented groups, like women and other non-white people, and we'll just end up lowering the pay. So, yeah, let's not let that happen.

You're seeing that happen in medicine, right? Where professions that have, you know, come to be dominated by women, pediatrics and primary care, you're seeing wages go down in those sectors. Whereas, you know, surgery and other specialties that are still kind of dominated by men, the salary is much higher. I mean, you see it in industry after industry, case after case. It is just shocking.

Yeah, you know, what's really amazing is, as the number of educated women, women with higher degrees and multiple degrees, increases, the amount of pay that you can expect with, you know, doctorates and multiple advanced degrees is now going down. Black women are among the most educated groups of people graduating not just from college, but from graduate programs and MD programs, and we are already seeing that effect.

And some of that is structural, right? Some of that also is like penalties that women incur around things like having children and taking maternity leave, right?
That end up affecting their whole kind of, career trajectory. Right.

It's so true. And I think one of the answers to that is having gender-free parental leave, and having all parents have to take it, you know, as opposed to, oh, it's optional for, you know, the non-gestational carrier. Right.
Yeah. If we normalize, you know, dads taking off early to go to the kid's baseball game and whatnot, and normalize, you know, the fact that your family should come first, I think that will go a long way toward helping remove that penalty. But, you know, look, I'm busy hacking capitalism over here, and, you know, I'm just gonna keep hacking these systems in whatever way I can until, you know, we get them to budge.
And so far so good.

Katie Moussouris, CEO and founder of Luta Security, thank you so much for coming on and talking to us on ConversingLabs podcast. It was a really great conversation and we will absolutely do it again.


About Author: Paul Roberts

Cyber Content Lead at ReversingLabs. Paul is a reporter, editor and industry analyst with 20 years' experience covering the cybersecurity space. He is the founder and editor in chief at The Security Ledger, a cybersecurity news website. His writing about cybersecurity has appeared in publications including Forbes, The Christian Science Monitor, MIT Technology Review, The Economist Intelligence Unit, CIO Magazine, ZDNet and Fortune Small Business. He has appeared on NPR's Marketplace Tech Report, KPCC AirTalk, Fox News Tech Take, Al Jazeera and The Oprah Show. You can find Paul online on Twitter (@paulfroberts) and on LinkedIn.

