Season 2, EP 4

Déjà Vu: Uncovering Stolen Algorithms in Commercial Products

August 2022 | Paul Roberts

We chatted with Black Hat Speaker Patrick Wardle who joined us to talk about unauthorized algorithm use.

EPISODE TRANSCRIPT

PAUL ROBERTS
Hey, welcome back to ConversingLabs. This is ReversingLabs' podcast, where we talk to the top minds in threat intelligence, threat hunting, software assurance, and cybersecurity, of course. And we're so happy today to have our guest, Patrick Wardle, with us. Patrick, welcome.

PATRICK WARDLE
Thanks, Paul. Good to be chatting with you again. Yeah, really great to be on the podcast. Looking forward to talking to you today.

PAUL ROBERTS
So we're talking to you. You're actually in Spain right now, and so there's a little bit of a delay, but for the folks who aren't familiar with you and your work, Patrick, just tell us a little bit about yourself and about Objective-See.

PATRICK WARDLE
Yeah, my name is Patrick Wardle. I am a passionate Mac security researcher, and I've really kind of put this passion, this love, into a nonprofit called the Objective-See Foundation. And we currently do three main things. First and foremost, I create free, open source Mac security tools, the kind of Objective-See suite which a lot of you are probably familiar with: firewall products, tools to monitor the microphone and webcam, tools to check for persistent malware, that kind of stuff. I'm also the author of the recently published The Art of Mac Malware. It's a book describing how to analyze Mac malware. It's fully free, available online, and also published by No Starch Press. And then finally, I also organize the Objective by the Sea Mac security conference. The reason I'm actually in Spain is that's coming up in October, in a few months, here in Europe, in Spain. We bring top speakers and researchers from all over the world to a community-focused conference to talk about the latest Mac security and iOS security exploits, tools, malware, all the fun things.

PAUL ROBERTS
Yeah, you are kind of the Obi-Wan Kenobi of Mac... Malware and Mac security.

PATRICK WARDLE
Wait, isn't he dead?

PAUL ROBERTS
I was going to say, I'm not making any predictions about how this is going to turn out. But that is really kind of what you're known for, not just Mac security, but over the years, you've discovered a lot of really interesting vulnerabilities or malicious code lurking in otherwise benign-seeming applications. I think the last time we spoke was back in 2020. You had discovered a couple of really serious bugs in the Zoom application right at the onset of the pandemic, when pretty much everyone and their mother was using Zoom for everything. So this is kind of your background. And how did you get interested in working on the Mac platform? Obviously, as a threat person, most of what's out there is Windows-based. So how did you get interested in Macs?

PATRICK WARDLE
Yeah, that's a great question, Paul, and I kind of fell into it almost by accident. So, in a prior life, my younger days, I was working at the National Security Agency, the NSA, where I was a malicious code analyst, and also then working on creating, let's just say, offensive cyber capabilities. When I left, I went to help start a company, and I wanted to be able to use the same foundational skills, reverse engineering, tool writing, but not step on any toes, right? You don't want to piss off the NSA. And so I said, hey, there's this kind of like new, at least to me, platform, OS X at the time, right? Kind of not even macOS yet. And I could see its popularity was increasing. And I said, cool, if I transition to this, because at the NSA I was doing only Windows research and work, I won't be stepping on anyone's toes, but I can use the same skills: analyzing malicious code, writing tools, et cetera, et cetera. So in retrospect, it was a very auspicious decision, and I have the NSA indirectly to thank for that. But that's really how I got involved and interested in the macOS platform. Very quickly, I saw this need, this gap, for specialized security tools. The story I always like to tell is I had a friend in Hawaii. He was a surfboard shaper, and his Mac computer got hacked. Adware, there were pop-ups, really kind of standard stuff. And he said, hey, Patrick, you do computers. Fix my stuff, right? This is the struggle all of us in this field face: fixing friends' and relatives' computers anytime something goes wrong. So I said, yeah, sure. And I was like, okay, well, on Windows I would download Sysinternals' Autoruns, see what was persistently installed, and then from that very easily find malware. And so I said, okay, where's the Autoruns for macOS? And this was 2012-ish, and there wasn't one. And I said, interesting.
So I kind of whipped together this embarrassingly bad Python script that enumerated login items, launch daemons, and launch agents, brought it back, ran it on his computer, and it uncovered the adware, which wasn't that sophisticated. But that, to me, was the kind of light bulb moment that, hey, there's a need for these tools. Because at that time, too, me as a Mac user, I wanted to say, hey, what's persistently installed on my computer? And there wasn't an easy, comprehensive way to do that. So that was kind of nice. And I also got a custom surfboard out of the deal. So I was like, man, win-win. So that's kind of the long-ish answer to how I got into Mac and how the focus on Mac security tools really came to fruition.
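The kind of script Patrick describes might look something like this minimal sketch. The launch daemon and agent directories are the standard macOS locations, but the original script isn't public, so the details here are illustrative, not a reconstruction of his code:

```python
import glob
import os
import plistlib

# Directories macOS consults for persistently installed launch items.
LAUNCH_DIRS = [
    "/Library/LaunchDaemons",
    "/Library/LaunchAgents",
    os.path.expanduser("~/Library/LaunchAgents"),
]

def enumerate_launch_items(dirs=LAUNCH_DIRS):
    """Return {plist path: launched program} for each launch daemon/agent."""
    items = {}
    for d in dirs:
        for path in glob.glob(os.path.join(d, "*.plist")):
            try:
                with open(path, "rb") as f:
                    plist = plistlib.load(f)
            except Exception:
                continue  # skip unreadable or malformed plists
            # The launched binary is ProgramArguments[0] or Program.
            args = plist.get("ProgramArguments")
            program = args[0] if args else plist.get("Program", "?")
            items[path] = program
    return items

if __name__ == "__main__":
    # Review each persistent item; anything unexpected is worth a look.
    for plist_path, program in sorted(enumerate_launch_items().items()):
        print(f"{plist_path} -> {program}")
```

Even a listing this simple surfaces most commodity adware, since it has to persist somewhere to survive a reboot.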

PAUL ROBERTS
Yes, for listeners: Patrick is a resident of the beautiful state of Hawaii, so having a custom surfboard comes in handy.

PATRICK WARDLE
Key.

PAUL ROBERTS
If you're in Massachusetts, it doesn't have quite the same value.

PATRICK WARDLE
You can surf down the mountain snowboarding, right?

PAUL ROBERTS
There's some surfing here, some practice, so to say. And really interestingly, you set this up, again, the Objective-See Foundation, as a nonprofit. So instead of cynically going out there and just deciding to make a lot of money off this, you set it up as a nonprofit. Just talk about that decision.

PATRICK WARDLE
Yeah. So from the beginning, I wanted everything to be free, and not everything was open source at the beginning, which is what we'll talk about today. And the reason why is exactly because of the topic we're going to dive into. But my tools are focused on end users. And really I thought that end users, in my opinion, shouldn't pay for security tools. There have been a few interesting cases over the years that have really kind of reiterated that to me. The one I always talk about deals with this malware called FruitFly. FruitFly was around for about ten years before it was discovered. It was allegedly written by this individual who wrote some custom malware to target Macs. And allegedly, what he would do, and I say allegedly because he's awaiting trial and I've got to be a little careful here, he would hack into Mac systems, deploy it, and then he would turn on the webcam to spy on individuals. And a lot of his victims were children, which is really just gross, right? And so to me, I always thought that if these individuals had been using even basic security software, something that perhaps would alert them about the webcam coming on... Because what this individual allegedly would do is wait until the people were not in front of their computers. Then his malware would turn on the webcam. The indicator light would come on, but they'd be in the bedroom or something, and so they wouldn't see that. And he would spy on these people for over a decade. The FBI got involved. They went so far as to set up kind of like help lines with psychologists for when they alerted the victims, the parents, to say, hey, your child has been spied on for the last five years. I mean, that's emotionally just incomprehensible. So, yeah, I was like, from a security point of view, if they were running basic security tools, we could have caught this a lot earlier and maybe prevented some really damaging situations.
And again, the malware wasn't super sophisticated, so even simple security tools would have helped. Part of the reason was that a lot of people thought Macs couldn't get malware, and this is Apple marketing. So one of the things I really always like to focus on is that Macs are computers; they're going to have malware, they're going to have vulnerabilities. But then also, hey, can we make free, open source security tools so that end users can add that extra layer of protection to hopefully prevent the next FruitFly or anything in between? And so that was always the mission, and it aligned really well with the idea of a nonprofit really just focused on doing community activities. I see a huge gap: the adversaries are getting very sophisticated, and as Macs become very prevalent in the enterprise, they're targeting Macs more and more. And there really aren't as many Mac security researchers and Mac malware analysts. So if I can share my expertise or help organize conferences to bring speakers, to bring awareness, it's really a win-win for everyone in the industry, us as defenders. So the nonprofit seemed to be a natural fit. And I have a lot of really amazing companies that sponsor that, too. So I give a lot of credit and thanks to them as well, because they recognize what we're doing. And that's kind of why this all comes to fruition. So it's really a joint effort, community focused, and something I'm really proud of.

PAUL ROBERTS
So I reached out to you, Patrick, because you're doing a talk with a colleague and researcher at Johns Hopkins University, Tom Maguire, at the Black Hat conference, which is next week. And it's a little bit off your usual beat. I mean, again, you're best known for delving into Mac security issues and Mac malware. This is a little bit different. The talk is called Déjà Vu, and you're really talking about a pattern that you've seen of code theft and reuse that might be getting missed by organizations doing software development. So can you talk first, I guess, about how you got turned on to this subject?

PATRICK WARDLE
Great question, Paul. And again, thank you for giving me the opportunity to talk about this, because I think this is a systemic issue that's affecting the community in a negative way. And I think just bringing awareness to the issue will allow us to really move forward positively and kind of squash this. But again, awareness is really, I think, what was missing in the first place. My Black Hat talk is basically on the issue of corporations stealing algorithms, essentially, from Objective-See's tools and reimplementing them in their commercial products, which is obviously not okay. How I stumbled across this is kind of humorous in retrospect.

PAUL ROBERTS
I'm not just a researcher. I'm a victim, too.

PATRICK WARDLE
Actually, yeah. So this client said, hey, Patrick, we have some files that our antivirus flagged; take a look, is it something more we should be worried about? And I was looking at one of them, and it was one of these PUP products, right, this potentially unwanted software. And so I was like, eh, you know, it's good to get this out of your network, but it's not something that's super worrisome. But as I was looking through this potentially unwanted program, it was like one of these fake security products, and it basically said, hey, we can alert you when your mic or your webcam is being utilized. And I was like, oh, that's interesting. I was like, yeah, my tool does that, too. And I was like, there are many ways to do this, so I'm interested in how they're doing that. And I started looking at the code. I was like, okay, this is similar to my code. And then I was like, wow, this is exactly like my code. And we'll talk a little bit more later about what that means, because there's some gray area: just because someone is using the same approach, right, they could be equally inspired or have stumbled across it independently. But when they're using the same hard-coded constants, when their code has the same bugs as your code, it really turns into this verbatim, almost plagiarism, that really, to me, crosses the line. And so that was kind of interesting, but I kind of brushed it off. I was like, well, it's this shady company. Like, no surprises. I'll ping them and see if we can resolve this. Fast forward a few years: recently, my tool, Oversight, which monitors the mic and webcam to detect things like the FruitFly malware we talked about, broke. And as you mentioned, first and foremost, I'm a security researcher. I do write tools, but I always say I'm not like an expert software engineer. So the way I write tools, sometimes, is a little janky, very unique, and that's especially true in the context of Oversight.
This is especially true because macOS provides no simple way to answer the question: what program or application is accessing the mic or the webcam? It's very easy to get a notification that the mic or the webcam has been activated. But then determining who, what product, what application, is not; there's basically no deterministic way. Which is unfortunate, because if it's something like Zoom or FaceTime or Skype, that's fine. You want to allow that and move on. But if it's an unsigned piece of malware running in the background, that's like a big red flag and will show that your system has been infected. So that was Oversight's killer feature: the ability to uniquely identify that process or application. And as I mentioned, it did that in kind of a janky way. I'll get into that in the talk. But basically I'm looking for Mach ports, I'm sampling the candidate process and looking for strings in the stack traces, I'm running Apple utilities and parsing the output from standard out. Things that are kind of hackerish, right?
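The "sample the candidate process and look for strings" trick Patrick describes could be sketched like this. The `sample` utility does ship with macOS, but the marker strings below are illustrative guesses, not Oversight's actual heuristics, and the matching is kept in a pure function so the heuristic itself is easy to inspect:

```python
import subprocess

# Strings whose presence in a process's stack traces suggests it is
# talking to the camera/mic frameworks. Illustrative guesses only --
# not the actual markers Oversight looks for.
CAMERA_MARKERS = ("CMIOGraph::DoWork", "AVCaptureSession")

def sample_process(pid, duration=1):
    """Run Apple's `sample` utility (ships with macOS at
    /usr/bin/sample) against a PID and return its textual output."""
    return subprocess.run(
        ["/usr/bin/sample", str(pid), str(duration)],
        capture_output=True, text=True,
    ).stdout

def looks_like_camera_client(sample_output, markers=CAMERA_MARKERS):
    """Heuristic: do the sampled stack traces mention AV frameworks?"""
    return any(marker in sample_output for marker in markers)
```

The point of the sketch is the shape of the approach: there is no API that answers "who has the camera?", so you sample each candidate process and grep its stack traces, which is exactly the kind of janky-but-effective technique that breaks when Apple changes the OS.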

PAUL ROBERTS
Kind of a kitchen sink approach.

PATRICK WARDLE
Yeah, and it worked really well, but it wasn't elegant. So when Apple changed some part of the operating system, my tool broke horribly. And I got all these bug reports, and I was like, yeah, not surprising, I should fix this. So I was kind of googling around to see what Apple had changed, and I found other bug reports, and I was like, man, my tool is broken everywhere. But then I read the responses, and people pointed out that, oh, no, this is another company's product. Then I was like, this is a very specific niche bug. It's interesting that these other companies are hitting that. So I did some analysis on their tool and again found that they utilized the exact approach that my tool, Oversight, had, which at the time was closed source. So that means they had to reverse engineer it. And again, they copied it so specifically that they copied the bugs in my code. The analogy would be plagiarism where someone has plagiarized your writing so closely that they copy your spelling and grammar mistakes. And so at that point, I was like, there are others out there. I wrote some YARA signatures to check for my algorithm, ran them across the network, and found some other companies. And so the end result was that I found several completely unrelated companies that had all implemented webcam and microphone monitoring in their commercial tools in a way that was identical to Oversight.

PAUL ROBERTS
And not all of them were shady, low-profile outfits.

PATRICK WARDLE
No. The first one, sure. But with these other ones, I was like, man, these are very well respected cybersecurity companies.

PAUL ROBERTS
How did you go about determining that their code was more or less identical to your code? What was the process involved in doing that?

PATRICK WARDLE
Yeah, that's a great question, because I kind of alluded to the fact that there's some gray area here, where if a company or a product is doing something similar, that could just be coincidence, or they might have been influenced by your tool. Like, is that copying? Is that inspiration? Right? And so the process then became, and this is how my colleague Tom got involved. Tom is an instructor and professor at Johns Hopkins. He teaches reverse engineering and operating system security. He's just a top-tier, incredible human, brilliant. And so I said, hey, Tom, I have some products, and I think they're verbatim copying, but I'm a little biased and more defensive, right? It's always good to bring in someone who's maybe not as vested, especially when we're making accusations, right? And I was like, hey, can we collaborate? And basically: here's my algorithm, here's the source code for it, here's how it looks in disassembly. And then here are these other commercial products, right? We only have the compiled binary in the debugger or disassembler. So it became a very low-level exercise where we would basically look for equivalencies, which in some senses seems difficult, especially as some of these were written in Swift, different programming languages. But the equivalencies actually started to fall out really quickly. Like, for example, I would query the I/O registry on macOS for specific key-value pairs, which would contain the PID of one of a handful of processes that had accessed the webcam recently. And if you Google these same strings, there are zero hits. Similarly, how I parse the output from Apple's utility that can enumerate Mach ports. Because, again, for an application or product that's talking to the mic or the webcam, whether that's FaceTime or malware, under the hood there are going to be Mach messages sent back and forth. So if you enumerate those messages, you can see perhaps who is responsible for accessing the mic or the webcam.
Well, you can't do that directly. You don't have the correct permissions and entitlements, but there are utilities that ship with macOS that you can execute, and then you have to parse their output, right? And there are a million ways to parse output, right? Like, a real software engineer would probably use a regular expression. Patrick skips over three characters and then looks for a comma, and it's just very janky. But you then see this exact parsing approach in another tool. You're like, I mean, they're not even trying to hide this. So once you add all of that up cumulatively, right, there are these undocumented strings with zero hits on Google, this exact same approach, and then they have the same bugs, because of bugs in my code. You can make a very compelling case, especially once you're doing a side-by-side comparison. The code doesn't lie, and there's really no other explanation of how this was done so exactly the same, other than it was essentially copied.
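The "janky parsing versus regular expression" contrast Patrick describes can be sketched like this. The line format below is invented purely for illustration, not the real output of any Apple utility; the point is that an idiosyncratic, fragile parser is a fingerprint that's easy to spot verbatim in someone else's binary:

```python
import re

# Hypothetical line format, invented for illustration only.
SAMPLE_LINE = "pid 4242, name: FaceTime, ports: 17"

def parse_pid_janky(line):
    """The 'skip a fixed offset, then scan for a comma' style Patrick
    describes: fragile, but highly distinctive."""
    start = 4                      # hard-coded: skip past "pid "
    end = line.index(",", start)   # scan forward for the comma
    return int(line[start:end])

def parse_pid_regex(line):
    """What a 'real software engineer' would probably write instead."""
    m = re.match(r"pid (\d+),", line)
    return int(m.group(1)) if m else None
```

Both functions extract the same PID from the sample line; the difference is that thousands of developers would independently write the regex, while almost nobody would independently write the hard-coded offset-and-comma scan, which is why finding it in a competitor's binary is so damning.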

PAUL ROBERTS
Okay, so you found these similarities and then presumably reached out to the offending companies?

PATRICK WARDLE
Yes, and this is where it gets really interesting, because I really talk about this in my talk: it's like, now what? And it's interesting because I had very different experiences, and there's a variety of challenges. I think we'll get into this more when we talk about what corporations can do when they're approached about this. The good thing is most companies have mechanisms where you can report intellectual property issues. You can always reach out to the security team. The first important thing is I had to figure out what I wanted. Do I just want an apology? Do I want financial compensation? Do I want to name and shame them? Really, what is the goal?

PAUL ROBERTS
Is there an "all of the above" option?

PATRICK WARDLE
Maybe. But most companies are not going to want to be disparaged. So I would spend hours putting together a very detailed report describing my tool, showing my algorithm, talking about the bugs in my code and the similar bugs in theirs, showing bug reports on their website from their users, walking through the binary code side by side. And then I would present that to them. Now, the best case scenario, and this happened sometimes, was that the CEO would get involved and say, "hey, we're super sorry," put this in an email. I was like, wow. "How can we fix this? This is something that shouldn't happen. We're taking proactive steps. We're going to remove this from the code. Can we acquire a license? How can we financially compensate you?" That's the best case scenario, because it really gives the opportunity for an amicable, win-win resolution. But you run into other scenarios too, especially with larger companies who, if I put myself in their shoes, are often approached by patent trolls, and the legal team's default response is, "yeah, no, we don't see any equivalency here. We're going to do an internal investigation. We didn't find anything." And that's very frustrating. And then, as the security researcher, you really have to think about, hey, can you provide them more information? I had one where it was like, it's us versus you. And I'm like, fair. I'm saying this and you're saying that. But I'm like, this is what I do. I will stake my career on this. I'm so confident in this. And the code doesn't lie. Eventually, they kind of came around. But you do have to sometimes realize that, especially dealing with intellectual property experts, they might not understand exactly what's going on. Right? They'll have someone run, like, a code equivalency tool across the two products and be like, "it says it's 90% different."
And you're like, well, obviously, but that 10% is copied verbatim and found nowhere else. That's really what we're talking about. So you really have to do some negotiating and put on a different hat, realizing who you're talking with. And what I found is really helpful is...

PAUL ROBERTS
It's like, you stole 100% of my stuff, but it's only 10% of your application. That's the message.

PATRICK WARDLE
Yeah, that's what I said. I was like, come on. But you have to know that you're dealing with lawyers, and their goal is to deflect and mitigate. And so I found that I had to kind of step up my game. I reached out to the EFF and said, hey, I'm having some trouble here. They were incredible. They said, hey, we'll provide you pro bono legal advice, which is one of the things the EFF offers, which is incredible. And then you can go back and say, okay, hey, I'm working with the EFF. They're helping me out, too. Very quickly, the conversation changes, which is unfortunate, but sometimes, I guess, you have to play the game.

PAUL ROBERTS
That's the reality of the legal world: being right is really just part of the thing. You can be right, but it's really about how much you're willing to pay to fight it, and how much ambiguity there is.

PATRICK WARDLE
Exactly. Talking to lawyers, and this is an important thing as well: if you haven't patented an algorithm, right, it's difficult from a legal point of view, especially depending on what market it is. In the EU, there are specific directives that say if you reverse engineer a tool to create a competing product, that's illegal, right? But that's the EU; in another country that might not apply. If you haven't patented the algorithm, there are also some gray areas. And awards are often based on damages; if your tool is free, they're like, okay, what did we really steal? The angle, too, that I think is really important is optics, right? Corporations aren't going to want to essentially be called out with "hey, you guys clearly stole from a nonprofit." And so that's, I think, an angle that security researchers can play into, for better or worse. It's kind of like, if you want Apple to change something, bad press is the best way to do that. And that's just kind of the reality. That was also a lesson to learn: what angle to take. It's not like I'm going to sue you, because they're going to out-lawyer you and have more money. It's kind of like, do I want to publicly talk about this? And I will, or can we figure out a way to resolve this? So that was kind of an interesting learning experience for me, I would say.

PAUL ROBERTS
As you think about this, is it your impression that the lifting of your code was a decision by maybe an individual developer who had been tasked with developing these features, or a top-down, hey, time to market, let's just borrow some of what's already out there?

PATRICK WARDLE
Yeah, and that's a great question, because that's something I had the wrong answer to going in. Originally, the title of my talk was like, Evil Corps. I was like, these, I'm going to bleep myself, are stealing from me, an independent security researcher, from my nonprofit. Like, they're greedy. And then really, once I talked to the C-level execs, the team, the developers, it became very apparent that almost always there was an individual who had been tasked with adding a feature. And it was like, oh, this is a good feature. We want our cybersecurity product to monitor the mic and the webcam. Our Windows product has this; we want parity on macOS. How do we do this? Talking to one of the companies, they said, Oversight, your tool, was the only one that did that. And so the developer who was tasked with adding this feature reverse engineered your tool and incorporated that. And they're like, we obviously don't condone that. We're really sorry about that. It's not who we are. And that was almost the case across the board, which I think is good in a way, right? It's not like there's a grand conspiracy that these corporations are out to steal from nonprofits, at least in the cybersecurity industry, because we're kind of in the same space here. It's more that individuals maybe don't understand the ethics of it, or there are no repercussions. To be fair, putting myself in those developers' shoes, not to condone what they did, but at all the companies I've worked at, when a feature is requested, no one's ever asked, how did you come up with this? Right? I know for me, ethically, I'm not going to steal somebody else's code, but for others, maybe it's a gray area. The corporations aren't asking these questions or enforcing that, so it kind of slips through the cracks. So to answer your question, it almost always seemed that it was the case of one misguided individual who would then ultimately be responsible for that.

PAUL ROBERTS
Okay, so then the big question is, for organizations out there, how do they go about figuring out if this is happening with their own code? You mentioned you wrote some YARA rules to help you find other examples of this. I'd be interested in knowing what's in the YARA rules. But also, your situation is a little bit unique, because there aren't that many programs or applications that do what yours do, right? So it's a thin population out there to look within. How would you go about even looking? Some of the scenarios you described seem almost kind of happenstance: an OS X change creates problems, and you're googling around and finding other problems that sound like yours. How do you do this as a company? How do you operationalize that?

PATRICK WARDLE
That's a great question. I think there are two things to this, right? It's like, how do you look for this? And then also, how do I make sure that's not happening within my own organization? And so part of my talk is about both of these, so I'm really glad you brought that up. Unfortunately, I don't know of a silver bullet, a panacea approach, to finding this. So I kind of talk about the approaches that I utilize, which I think are good steps. But yeah, like you said, there's not necessarily a really good way to do this. First is identifying what's unique about your tool. Then step one is kind of identifying competitors that might advertise the same feature or capability. Right? If you add something kind of new, and then six months later you see your competitor adding the same feature, red flag. It doesn't mean they stole it, right? They could have reimplemented it. But yeah, maybe take a peek. Unfortunately, if the source code is not available, other than reverse engineering, you're really not going to know. What worked really effectively was matching bug reports. I don't think that's necessarily a best approach, right? Hopefully your product doesn't have bugs in the first place. But for me, that worked really well. And as I mentioned, and you kind of touched on, you can write a detection signature for your algorithm. My approach in Oversight, which is now open source so everyone can see it, is, like we mentioned, kind of janky, very unique. So you can basically write a detection signature for certain strings that are in your binary and then run it across VirusTotal, ReversingLabs, any site that has a corpus of binaries. I would imagine certain companies have better ways to do this, but for the individual security researcher, that was kind of my approach. But I think that can still apply if you're a company. And on the red flags you mentioned, if you see a competitor with the same approach, maybe take a peek to see how they're doing it.
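The "write a signature for your unique strings and scan a corpus" idea can be sketched without the YARA engine itself. The byte signatures below are placeholders, not Oversight's real constants, and the corpus is just a local directory of collected binaries:

```python
import os

# Byte sequences unique to your implementation -- hard-coded constants,
# odd format strings, mangled symbol names. These are placeholders.
SIGNATURES = [b"sCMIOGraph::DoWork", b"AppleCameraInterface"]

def scan_binary(path, signatures=SIGNATURES):
    """Return the list of signatures found in one compiled binary."""
    with open(path, "rb") as f:
        data = f.read()
    return [sig for sig in signatures if sig in data]

def scan_corpus(root, signatures=SIGNATURES):
    """Walk a directory of collected binaries and flag suspicious ones:
    the same idea as running a YARA rule across VirusTotal's corpus."""
    hits = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            found = scan_binary(path, signatures)
            if found:
                hits[path] = found
    return hits
```

In practice a real YARA rule adds condition logic (require several signatures together, constrain file type and size), which cuts false positives; the raw substring scan above is the minimal version of the same idea.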
So that's kind of like how do you detect it, that it's happening to you? And this is also a good time to talk about closed source versus open source. So the only reason my tools were... 

PAUL ROBERTS
You anticipated my next question. Yeah, go ahead.

PATRICK WARDLE
Closed source was to, I thought, make this a little more difficult. And I know, obviously, you can reverse engineer anything, but I thought that would be enough of a barrier that people would say, yeah, I could reverse engineer this to get the algorithm, but I'm not going to do that. When something is open source, there's a little more gray area; it's like, well, it's sitting right here on GitHub. So I learned very quickly that the idea that closed source protects you any better than open source is completely a misnomer. So now all my tools are open source, which also means, because they're GPL-licensed and such, that if a company does use this, in some ways there's maybe more legal standing to go after them, because now they're a GPL violator. I still probably wouldn't want to take that to court, but I think the lines that are drawn there are more well established. So that was a good learning experience for me.

PAUL ROBERTS
So as part of the GPL, they would need to contribute back any modifications they made to the GPL code, right?

PATRICK WARDLE
Exactly. Yeah. So it, I think, articulates a little bit more what the expectations are. But again, the takeaway for me was, well, people are stealing it anyway, so I want to make everything open source, because I believe that's better transparency anyways. And again, that was the only reason the tools were originally closed source. Now everything's open source. So this pushed me in that direction anyways.

PAUL ROBERTS
Yeah, and obviously so much of application development these days relies so heavily on open source. There is this culture of kind of just grabbing stuff and using it, cobbling together applications based on stuff other people have written. But in this case, the code wasn't there to be grabbed and reused, right? It was closed source at the time. Lessons learned for you going forward: obviously, you open sourced your applications, and that creates a clearer legal framework for maintaining the integrity of that work. Other lessons learned?

PATRICK WARDLE
Another lesson I learned, which doesn't really apply to me but I think applies to companies and corporations, is: what can you do to make sure this isn't happening, that developers within the organization aren't inadvertently doing this? Because again, the lesson learned for me was that that is what was happening, right, versus evil corps stealing from a small nonprofit. And talking to these companies, who were fairly amicable once they realized I wasn't just a patent troll, that, hey, let's work together to fix this, I'm really not trying to burn you to the ground, let's professionally resolve this, what they said was, hey, we have mechanisms in place to make sure open source software isn't inadvertently misused. Right? Like, if something's licensed, we have the lawyers review it, we're scanning for that, which is good, because they don't want to get into a position where they're using things incorrectly. But they said, hey, we didn't have any mechanism in place for this. And so there are huge opportunities for educating the developers, I think. As I mentioned, none of the companies I ever worked at sat me down and told me this was illegal or asked me where I got features from. I knew from an ethical point of view that it's stealing. Apparently not everyone knows that. But I think if corporations just added that, when a new feature comes in, especially something very unique, kind of just asking, hey, how did you come up with this? Don't steal it from other tools, and talking about the implications. Because the problem is it potentially puts these corporations in legal hot water, which they want to avoid. And from an optics point of view, it really puts them in an awkward...

PAUL ROBERTS
Reputation damage.

PATRICK WARDLE
Yeah, exactly. So if I come out and say, XYZ is stealing from a nonprofit, here's the proof, especially because I have a platform where I can talk about this, that's something they really want to avoid. And also, a lot of these companies, I believe, do like to adhere to ethics. They want to do the right thing. And so I think there's this kind of blind spot where they missed out on that. And really, one of the lessons is: what can they do? I think there are some very easy steps. Like I mentioned, talk to your developers, articulate what's acceptable and not acceptable. If you're reverse engineering closed source products to reimplement algorithms, that's a very gray area, so kind of stay away from that, because there are other options. Right? Like if they had reached out and said, hey, we want to license this, I would have been very open to that conversation. So that would have been a more professional way, and in the long run would have been more of a win-win. I think a lesson is for corporations to realize this can happen and proactively take steps to avoid it, and for their benefit, that will be the best approach as well.

PAUL ROBERTS
Yeah, it's funny. I mean, we teach kids in school, right? In middle and high school, you're learning, here's what plagiarism is. You can't take somebody else's words or ideas and represent them as your own. You've got to quote them and cite them and so on. So that's part of your education as a student on how to create new expression. But in the development community, like you said, I think a lot of developers understand it just intuitively. But my sense is it probably isn't an explicit part of their developer education, like, hey, don't steal people's code and represent their work as yours.

PATRICK WARDLE
And it is putting the corporations at risk, as I mentioned. So it does behoove the corporations to be proactive about this, both to make sure it's not happening to them, but I would say even more so, so that an individual developer who doesn't see this as clear cut, black and white, really knows, hey, this is not something we can condone or support.

PAUL ROBERTS
Patrick Wardle of Objective-See Foundation. First of all, anything I didn't ask you that you wanted to say?

PATRICK WARDLE
No, thank you again so much for giving me an opportunity. I think the fact that we're able to bring this to light is a really positive thing. I noticed some of the companies were initially a little apprehensive about talking to me about this or addressing this. Once I was like, hey, I'm giving a Black Hat talk about this, they really quickly changed their tune, which is unfortunate. I feel, though, I was in a position where I have a large audience, so they kind of have to take me seriously, where other developers, who are doing just as great work as I am but might not have that audience, don't quite have that pressure, and maybe they just get swept under the rug. So I think by bringing awareness to this problem, we can hopefully squash it, because it does seem to be fairly systemic, but as we mentioned, there are simple solutions, where even corporations can implement some safeguards.

PAUL ROBERTS
It's a really important topic, and we're focusing more and more, obviously, on software supply chain and so on. A lot of that is about malicious components working their way into your applications and your environment, of course, but not only that, right? And this is kind of part of the conversation as well, which has to do with the integrity and the provenance of the code that you're representing as your own.

PATRICK WARDLE
Yeah, and that's a really good point too, because I remember a case in some of my intellectual property classes at Johns Hopkins where they talked about publishers of phone books who put fake numbers in, so if someone copied them, they could prove that was the case. And so it's like, should developers add benign bugs that might never trigger, but act as unique fingerprints, watermarks in your code, to prove that? And that's something I think about.
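[Editor's note: the phone-book trick Patrick describes, embedding a benign, never-triggered marker in your code so copies can later be identified, could be sketched roughly like this. This is an illustrative Python sketch, not code from Objective-See; the marker string and function names are hypothetical.]

```python
# Hypothetical "fingerprint" watermark, unique to your project.
# It rides along in a dead branch, so it never affects behavior,
# but the literal survives into copies of the source or binary.
WATERMARK = "example-fingerprint-3f9a1c"  # hypothetical marker string


def scan_rule(path: str) -> bool:
    """Toy detection routine that also carries the watermark.

    The branch below is unreachable in normal use (no real path
    starts with a NUL byte), but it keeps the marker embedded.
    """
    if path == "\x00" + WATERMARK:  # never triggers
        return True
    return path.endswith(".app")


def contains_watermark(blob: bytes) -> bool:
    """Check whether a suspect artifact embeds our fingerprint."""
    return WATERMARK.encode() in blob


# Usage: check a suspect artifact's raw bytes for the marker.
suspect_blob = b"...compiled code..." + WATERMARK.encode() + b"..."
print(contains_watermark(suspect_blob))  # True
```

In practice the marker would be something unique and non-obvious, and real provenance claims would still need legal review, but the idea is the same as the fake phone-book entries: a detail that only exists because it was copied.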

PAUL ROBERTS
I see a Black Hat talk in this, or maybe this is a Defcon talk, actually.

PATRICK WARDLE
Consciously, yeah, maybe. But now after talking to you, I'm like, oh, this is a good idea, because again, proving the equivalency, especially when you're talking to a legal team, is challenging. So as a developer, again, if this happens to you, feel free to reach out. I would say, have patience, especially when dealing with legal teams. But again, I'm hoping just being able to talk about this more transparently will really move us all in a positive direction.

PAUL ROBERTS
We will include your contact information when we post the conversation, this ConversingLabs episode with Patrick Wardle of the Objective-See Foundation. The talk at Black Hat is "Déjà Vu: Uncovering Stolen Algorithms in Commercial Products." It's next Thursday, the 11th, a week from today, out in Las Vegas at Black Hat, so everybody should check it out. Patrick, thanks so much for coming on and talking to us on the ConversingLabs podcast.

About Author: Paul Roberts

Cyber Content Lead at ReversingLabs. Paul is a reporter, editor and industry analyst with 20 years’ experience covering the cybersecurity space. He is the founder and editor in chief at The Security Ledger, a cybersecurity news website. His writing about cybersecurity has appeared in publications including Forbes, The Christian Science Monitor, MIT Technology Review, The Economist Intelligence Unit, CIO Magazine, ZDNet and Fortune Small Business. He has appeared on NPR’s Marketplace Tech Report, KPCC AirTalk, Fox News Tech Take, Al Jazeera and The Oprah Show. You can find Paul online on Twitter (@paulfroberts) and on LinkedIn.
