Season 2, EP 3

Not All Developers Can Be Security Jedis

August 2022 | Paul Roberts

We chatted with Black Hat Speaker Adam Shostack about the need for better developer training.

EPISODE TRANSCRIPT

PAUL ROBERTS
Hey, welcome back to ConversingLabs. This is ReversingLabs' podcast, where we talk about the latest in threat intelligence, threat hunting, and software assurance with the best and brightest minds in the cybersecurity and information security industry. Really thrilled to be back with you, doing another one of our series of Black Hat Focus talks. And our guest today is Adam Shostack of Shostack & Associates. Adam, welcome.

ADAM SHOSTACK
Hey, great to be here.

PAUL ROBERTS
Great to have you here. And before we move on to your talk at Black Hat, maybe just tell our audience if they're not familiar with you a little bit about yourself and the work you do.

ADAM SHOSTACK
Sure. So these days I'm very focused on helping organizations do a better job of threat modeling: helping them anticipate and address questions of what can go wrong and what we're going to do about it as we're building things. I got here after helping create the CVE, many different startups, and a decade at Microsoft. I'm on the review board for Black Hat, and I'm even speaking at Black Hat this year, which I'm super excited about.

PAUL ROBERTS
You are indeed, you are indeed. And for our audience: when you talk about threat modeling, that's your area of expertise. You've written books on it, you're an expert in it. What are we talking about with threat modeling?

ADAM SHOSTACK
So what we're talking about really is anticipating the problems, the threats that are going to face our systems once we've built them and deployed them in whatever way that is. And really, I mean, obviously, I wrote a really long book on this subject. But we can boil it down to four very simple questions that we're always answering: What are we working on? What can go wrong? What are we going to do about it? Did we do a good job? An academic wrote a paper not too long ago and called them deceptively simple questions, which I just love, because we can take these questions and just ask someone, hey, you're working on this. What can go wrong? Or we can get into using a tool, something like STRIDE or a kill chain, to provide structure to how we're going to think about what can go wrong. So that if you and I and the person listening to this all independently start to model, say, the act of creating and delivering a podcast, we'll come up with similar answers. Right? If that's what we're working on, what can go wrong? We might have, I don't know, browser tabs chewing up our CPU, we might lose our bandwidth.

PAUL ROBERTS
I don't know what you're talking about Adam. A lot can go wrong, let me tell you.

ADAM SHOSTACK
And you've got specific answers based on your experience with podcasting?

PAUL ROBERTS
Yep.

ADAM SHOSTACK
And I have specific answers that are driven by my background, and our listener has specific answers driven by their background. And so that can cause all of this to look a little bit random. If instead we use STRIDE (spoofing, tampering, repudiation, information disclosure, denial of service, elevation of privilege), then we can say, hey, we've got tampering problems, and so we're using this tool that you like that's making local recordings of each side. We might have info disclosure: we might leak the big reveal before the end of the podcast. We might have denial of service. We can use this so that we get consistency, because management loves consistency, right? Again, just sticking with the podcast: how many episodes are you going to produce, Paul? Well, some. When are they going to be ready? Soon. Versus: we're going to produce two a week or two a month. Let's be realistic about our time. Right, we're going to produce two a month for the next year. Okay. That consistency is hard for us in security, because we're often interrupt-driven. That makes it hard for us to support the teams around us, and that leads to conflict.
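
To make the podcast example concrete, here is a minimal sketch in Python of how STRIDE can give structure to the "what can go wrong" question. The specific threats and the mitigation check are illustrative assumptions, beyond what Adam mentions above.

```python
# A minimal sketch: using STRIDE to structure "what can go wrong"
# for the podcast example. All entries are illustrative.

PODCAST_THREATS = {
    "Spoofing": ["Someone joins the call pretending to be the guest"],
    "Tampering": ["The recording is altered before publication"],
    "Repudiation": ["A guest later denies a statement and there is no local recording"],
    "Information disclosure": ["The big reveal leaks before the episode airs"],
    "Denial of service": ["Browser tabs chew up CPU; bandwidth drops mid-interview"],
    "Elevation of privilege": ["A guest link grants host/admin controls in the recording tool"],
}

def unaddressed(threats: dict, mitigated: set) -> dict:
    """What are we going to do about it? List categories with no mitigation yet."""
    return {cat: items for cat, items in threats.items() if cat not in mitigated}

if __name__ == "__main__":
    # Suppose only tampering is mitigated (each side keeps a local recording).
    for category, items in unaddressed(PODCAST_THREATS, {"Tampering"}).items():
        for item in items:
            print(f"[{category}] {item}")
```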

PAUL ROBERTS
Right? It all kind of floats in this sort of amorphous "make the product more secure, keep hackers out," right? As opposed to specific fears of attack and compromise. So your talk at Black Hat addresses many of these issues. And the title of your talk, let me just get my voice right here, is "A Fully Trained Jedi, You Are Not."

ADAM SHOSTACK
Well done. You did.

PAUL ROBERTS
Here we go. And from here on in, we can only use the Yoda voice in this podcast. So listeners, prepare yourselves. "A fully trained Jedi, you are not." And this is a reference, for the few people out there who may not have seen the Star Wars movies, to The Empire Strikes Back, where Luke goes to train with Yoda to become a fully trained Jedi. But then Luke, being Luke, gets impatient, wants to be able to go out and help his friends, and doesn't really want to have to go through the whole training. And so he basically bails. And Yoda is trying to convince him that he shouldn't do that, that he should stay and finish his training so that he can be a true Jedi and not kind of a half-assed, half-trained Jedi. So I guess the big question is, first of all, are you saying that Yoda is wrong? That Yoda's model for training Luke is incorrect and that Luke, in his hasty impatience, is actually right? I mean, the movies would kind of bear out that maybe Luke didn't make the wrong decision in taking off and leaving early.

ADAM SHOSTACK
So first, I love that we are doing a Star Wars geeking thing here.

PAUL ROBERTS
Let's just lay it out there.

ADAM SHOSTACK
And so I have a bunch of different answers. My first answer is that I actually wasn't planning to go that deep, but let me answer your question first.

PAUL ROBERTS
Yeah, okay.

ADAM SHOSTACK
I wasn't planning to go that deep, but I will say that Yoda is not wrong. He's correct: Luke is not a fully trained Jedi, and only a full Jedi Master at the height of their powers can actually confront Darth Vader and expect to win.

PAUL ROBERTS
Right.

ADAM SHOSTACK
But Yoda's training model is awful. It's 900 years old and it doesn't look good.

PAUL ROBERTS
Yeah, I mean, the Jedi order is kind of quasi-religious and it's all about mastery, right? There are no...

ADAM SHOSTACK
I'm all about mastery. Mastery is great.

PAUL ROBERTS
Yeah.

ADAM SHOSTACK
But standing on your head, lifting boxes in the air, how does this help you defeat a Galactic Empire?

PAUL ROBERTS
It's hard to scale that. Really?

ADAM SHOSTACK
It's hard to scale that. It's not clear how I'm going to use this in my job. Do I need to stand on my head to make a git commit?

PAUL ROBERTS
Right. And the Empire kind of got that because they were like, you know what, we're just going to throw like 1000 robots at the Jedi and like, yeah, they're great, but we're just going to kind of overwhelm them with our cheap and disposable opponents.

ADAM SHOSTACK
So my talk is not quite going there.

PAUL ROBERTS
Sorry, I couldn't resist.

ADAM SHOSTACK
It was a great question. I love your question and it's fun. My talk is more about we talk a lot about Jedi in cybersecurity. We talk about heroism...

PAUL ROBERTS
And mastery.

ADAM SHOSTACK
Let me stick to heroism for a second first. Demands for heroism are unhealthy, right? Yeah. We need engineers, not heroes who are going to go run off and save their friends and put themselves at bodily risk to secure their products.

PAUL ROBERTS
Luke

ADAM SHOSTACK
Yeah. We do need mastery from some people of some things, but not everyone wants to be a cybersecurity expert. I know people who, for example, are UI Mavens, Usability Mavens, or they are really good at performance or they're really good at scalability. And if we're going to build a system in which we require everyone to be a Jedi Knight, what are those people doing here? And if we're making that demand and they're not ready to meet that demand, that leads to hatred. Hatred leads to anger, and anger leads to the dark side.

PAUL ROBERTS
Right. You're going to have a staff of Darth Vaders on your hands. It's not going to be pretty.

ADAM SHOSTACK
Yeah.

PAUL ROBERTS
So let's talk about the reality in most development organizations today. As I wrote in my questions to you, they seem more Jawa than Jedi, if I might say that. There's a high amount of chaos and kind of loose practice within many software development organizations. Obviously the larger companies, Microsoft, Google and Apple, have huge security teams and a lot of discipline, but down there in the rank and file there's a lot less. Do you think one of the reasons for that is that development organizations are kind of throwing up their hands at the "let's turn our developers into security Jedis" problem, knowing that it's going to be too time- and resource-intensive? Sort of, we can't climb that hill, so we're just going to ignore it?

ADAM SHOSTACK
Yeah, 100%. If I tell you that in order to succeed, first you should get a PhD, and then you should do five years of additional research in static analysis, and then, padawan, you're ready to tune the static analysis tool, well, there's only twelve of those people in the world. Right? This is part of why Semgrep is getting a lot of traction today: it's designed to be usable static analysis. And if we're going to say, you've got to climb this mountain before you can do anything in software security, a lot of people will say, okay, thanks, have a nice day. What can we do in 30 minutes?

PAUL ROBERTS
Right. And in some ways, your talk at Black Hat is kind of answering that question: let me tell you what you can do in 30 minutes, or the equivalent of 30 minutes. Right. You talk a lot about, and I've heard you talk about this before, for example, Bloom's Taxonomy, which is a tool for teachers, really, that describes the progression from remembering at the bottom of the pyramid up to kind of creative thinking at the top, right?

ADAM SHOSTACK
Precisely.

PAUL ROBERTS
Talk about that and how that kind of figures into your thinking about how to address developer security education.

ADAM SHOSTACK
Sure. So when I started training, and I now do a lot of training, I had this implicit belief that everyone wanted to become a Jedi, that everyone was excited to take a multi-day training course and learn the things that I could teach them.

PAUL ROBERTS
Read a book.

ADAM SHOSTACK
Right. Why have me do your training if you don't want to learn the things that only I can teach you? Well, as it turns out, a lot of people want some skill. They want to learn how to do this. They want to learn how to do it from someone who's really good. And so I've learned. I have things like the world's shortest threat modeling course. It's a collection of one-minute videos. You can find it on YouTube; I'll send you a link to it. And I have multi-day intensive trainings, and they serve different audiences. And I use tools like Bloom's Taxonomy to help me think about what the student's learning journey is: in what order do I need to give them the facts that they remember, then teach them to apply those facts, et cetera. And the reason I'm going to talk about Bloom's Taxonomy at Black Hat is because I believe that we as a community need to have a conversation about what it is that we should expect a normal developer to know in, say, 2022 or 2025. And I don't have to reinvent the framework for that. Right. Bloom's Taxonomy has been around for 50 years. It's been around because it's useful, and there are a lot of useful tools to help you use it. So why don't we start with that and start filling it with information and saying, if you want to be a developer in 2025, you should know this. Or if you want to be a level six developer here at the company, we expect you to know these things. We expect you to be able to apply these skills in these ways.
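
As a small illustration of what that kind of ordering might look like, here is a Python sketch laying out example security learning goals against Bloom's levels. The levels are Bloom's; the goals attached to each one are illustrative assumptions, not taken from Adam's courses.

```python
# An illustrative sketch: example security learning goals laid out
# against Bloom's Taxonomy. The goals are assumptions, for illustration only.

BLOOMS_SECURITY_GOALS = [
    ("Remember",   "Mixing user-supplied data with code can go very wrong."),
    ("Understand", "Explain why dynamically built SQL or shell commands are risky."),
    ("Apply",      "Spot a method that concatenates user input into a query and fix it."),
    ("Analyze",    "Threat model a feature with the four questions and STRIDE."),
    ("Evaluate",   "Compare two mitigations and justify which one ships."),
    ("Create",     "Design secure-by-default APIs and review guidance for the team."),
]

def curriculum(up_to_level: str) -> list:
    """Return the goals up to and including the given Bloom level."""
    goals = []
    for level, goal in BLOOMS_SECURITY_GOALS:
        goals.append(f"{level}: {goal}")
        if level == up_to_level:
            break
    return goals

if __name__ == "__main__":
    # What might we expect of a developer who has reached "Apply"?
    print("\n".join(curriculum("Apply")))
```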

PAUL ROBERTS
How does that translate through to security skills? So, at the bottom of the pyramid again is sort of remember, and the next level up is, I think, understand or something.

ADAM SHOSTACK
Remember, understand, apply.

PAUL ROBERTS
Yeah. What are the kind of base things that we're trying to remember and then maybe the conceptual things we're trying to have them understand?

ADAM SHOSTACK
Well, I mean, if you look around at the training space, apparently the base things you're supposed to remember are things like how to write a cross-site scripting exploit or why SQL injection happens. Versus remember...

PAUL ROBERTS
Not to create buffer overflows or something like that.

ADAM SHOSTACK
Well, even simpler than that: remember that if you're combining user-supplied data and code, things can go very wrong.

PAUL ROBERTS
Right.

ADAM SHOSTACK
Then be able to apply that to say, hey, this function, this method takes user-supplied data and constructs a SQL statement dynamically. Maybe I should then apply that knowledge and take a look. We have done a bad job, I think, as a set of security specialists, of defining this knowledge. Right. Things that we teach people, like "sanitize your data"? No, please don't.
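
To make that concrete, here is a minimal Python sketch of the pattern being described: a function that builds a SQL statement dynamically from user-supplied data, alongside the parameterized alternative. The table, column, and function names are made up for illustration.

```python
# A minimal sketch of dynamically constructed SQL versus a parameterized
# query. Names and schema are illustrative.
import sqlite3

def find_user_unsafe(conn: sqlite3.Connection, username: str):
    # BAD: user-supplied data becomes part of the code (the SQL) itself.
    # Input like  anything' OR '1'='1  changes the meaning of the query.
    query = f"SELECT id, name FROM users WHERE name = '{username}'"
    return conn.execute(query).fetchall()

def find_user_safe(conn: sqlite3.Connection, username: str):
    # BETTER: keep data as data. The driver passes the value separately,
    # so it is never interpreted as SQL.
    query = "SELECT id, name FROM users WHERE name = ?"
    return conn.execute(query, (username,)).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    conn.executemany("INSERT INTO users (name) VALUES (?)", [("alice",), ("bob",)])
    evil = "anything' OR '1'='1"
    print("unsafe:", find_user_unsafe(conn, evil))  # returns every row
    print("safe:  ", find_user_safe(conn, evil))    # returns nothing
```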

PAUL ROBERTS
Right.

ADAM SHOSTACK
If you can't parse it, you're not going to make it better by changing the way you're not parsing it. If that crap is coming from the internet and it looks dangerous, it probably is, right? And so we ask people to learn things that are actually quite complex, and we treat it as baseline knowledge. So much of what I see out there is a little bit like taking a photography course where the instructor says, the first thing we're going to do is shift your camera to manual mode so that you really learn exposure times and f-stops. Whoa... I'm trying to figure out how to make my camera focus here.

PAUL ROBERTS
Right, or what I should be pointing at, right?

ADAM SHOSTACK
Yeah. Maybe we should leave the cameras on automatic mode for a little bit and talk about composition.

PAUL ROBERTS
So, I mean, one of the things I've heard you talk about is this idea that threats should be the fundamental unit for engineers as they contemplate code security, rather than things like risk or compliance or, whatever, national security. Could you talk about that as sort of the organizing factor? I think you hinted at it before when you were talking about SQL injection and how those types of threats get created. But talk about this need to focus really on threats when you're trying to communicate or educate developers around security concepts.

ADAM SHOSTACK
So there's a lovely little book titled Start with Why, and threats are the why. Threats are the things that can go wrong, the promises of future violence. The example I always use is: he threatened to beat me up if I didn't give him my lunch money. Right. There's a promise of future violence, and there's a thing I can do to avert it. And I believe that informs the way developers should think: why do I care about this? With compliance, what happens is we give them these checklists of do this, do this, do this, but without a why, without a reason for doing that thing, I can't judge whether or not I did it well. I could do my own taxes, probably, but I have somebody else do them because I don't actually know whether I put the right value here. What's the relationship between revenue and income? The compliance lists feel a lot like that, where I don't know why I'm putting this number here, I don't know if I'm putting the right number here, and I don't know how to judge my work. Whereas if I start with the threats, if I start by saying I'm concerned about someone tampering with the recording of this call, then when I get to mitigations, I can say, for example, I'm going to make my own recording of it. So that if we have a disagreement (not that I think I need to do this with you, of course), I've got my own tape and I can release that tape. That's a really lightweight example where, if I understand the threat, I can think better about the mitigation and know why I'm doing what I'm doing.

PAUL ROBERTS
Over the years, it's often been said, or folks have advocated for the idea, that the universities that educate software engineers should be doing more to teach secure development principles at the undergraduate level and turn out developers who are familiar with these concepts, but that to this day it still isn't a big part of the curriculum. I almost wonder if that's kind of an outdated conversation, given how many avenues there are into software development these days, most of them not through the university. But I'm interested in your thoughts. Is that part of the problem here, just the lack of general knowledge around secure software development concepts?

ADAM SHOSTACK
So I love this question. I'm an affiliate professor at the computer science school at the University of Washington, so I get to think about the difference between what I do when I'm teaching a university course and what I do when I'm teaching a training course. And I think the thing that comes out of a good university education is really the upper levels, to go back to Bloom: the reflective, the long-term, the principles. And I do believe those ought to be part of an undergraduate education. One of the challenges we face in integrating them is: what principles should we teach that are long-term? Right. Some of the things that I learned when I was in college were critical thinking skills. I learned writing skills that I still bring to bear a lot later. And if we teach people about, say, buffer overflows, that's a useful bit of knowledge if you're writing code in C, but hopefully you're not going to be writing much code in C. So what's the thing you're supposed to take away from it? Maybe it's that input is dangerous; maybe it's some of the langsec stuff where we talk about weird machines. But there are two really big challenges, actually. Big challenge number one is time, right? The undergraduate curriculum exists. There's the ACM reference curriculum, for example, and each accredited school has a curriculum which works, and that curriculum is full. There's not an extra nine course hours and six assignments just waiting for us in security to pop in and say, add this. And so there is work being done on how do we teach this and what do we teach. The other problem is what do we teach. I talked about this a bit, but what are the enduring lessons for security that we want every engineer to know and take away? Your point about not everyone coming through the university is right. There are lots of boot camp programs happening, which are great, and similarly, I think we need to think about what they need to be teaching, what the educational bit is that they should have, and obviously it's different. But the thing I'm hoping comes out of my Black Hat talk is a conversation about what these learning goals ought to be, because once we know what those are, we can apply them. We can say, hey, let's figure out where to integrate this lesson. Right? Maybe buffer overflows become part of a compilers course. We can find ways to fit them in when we have more of a shared understanding of what we in industry think every developer actually needs to know. Because today I think we don't have that. We go to the university, and here I'm putting my industry hat back on, and we say, teach security. And they say, what do you want us to teach, and what do you want us to remove from the curricula? And we don't have clear answers.

PAUL ROBERTS
You obviously advise companies; that's really what you do for your day job, advising companies and consulting with them on issues like this. And in this Black Hat talk, you're really going to be talking about this notion of security shifting left, moving into, or certainly being much more intertwined with, development, but also that this is going to force changes on development groups in ways they may not be ready for. So I guess through your consulting work, you're obviously seeing shift left in the trenches as it's happening. Talk just a little bit about what your experience has been and what some of the tensions are that develop as, again, security teams and the security focus move into development groups where it hasn't been that much of a focus.

ADAM SHOSTACK
Wow, that's a great question. Let me say that in the world we live in, there's a lot of external stress from the pandemic, from world events. And at the executive level, one of the jobs of the CEO is to figure out what change needs to happen this year. What are the important things we're doing here? For example, I can see you have your ReversingLabs shirt on. What are the important things we're doing here at ReversingLabs this year? What are the three things that matter to me that I told the board we're going to do? And when security tries to force its way left, and you used that term and I sort of twigged a little bit, it fails. What we need is the CEO, the VP of Engineering, saying things like: our customers care more about security, and therefore we're going to integrate security earlier into our development activities. Or they might say: we have so much rework and it's killing us, because instead of getting it right the first time, we get it right the third time, and it's destroying our schedules. Or they might say: I'm just tired of these last-minute escalations where developers show up the day before they're planning to ship and say, hey, Ms. CISO, can we get an exception? Because we didn't know we had to do security for this thing that's processing medical data and Social Security numbers.

PAUL ROBERTS
Right.

ADAM SHOSTACK
The first thing that successful companies do is generate a sense of urgency. Why is this change happening? And you notice I'm starting with the why again. My answer is different, but we're doing this to make this change happen. And frankly, a lot of my business is companies that didn't do this, that tried to force their way left, and that are now having a crisis because they didn't pay attention to the human side of change. And so we come in, we've got a team that does this, we listen to their problems, we help them solve those problems. The other customers I work with are thinking in this way, and they have a reason to do more secure development. They're looking for early successes. They bring us in to do some training, they send a couple of people to one of our open courses, they learn how to think about this, and then they go and do it. And then they celebrate the success: hey, on our podcaster project, we succeeded because we thought about these things, and boy, nothing blew up at the end. From a security perspective, it was nice, it was different. And so, my colleagues, you ought to give this threat modeling thing a try, because it actually led to smoother development. So when we see shifting left happening, we see it working because of the respect for the difficulty of change, and because people are really thinking about how they're going to lead change, how they're going to use threat modeling as a new language to talk about what they're working on and what can go wrong. And then the shift left project succeeds.

PAUL ROBERTS
Judging from some of your other interviews, you're a big advocate of, instead of focusing on teaching a grab bag of different skills, basing a lot of this around frameworks for security, and using those as the foundation for your approach to developer education: adopting secure frameworks and more or less following the framework. So the natural question would be: what are some frameworks that you think are useful for development organizations to look at?

ADAM SHOSTACK
The first framework I like to teach is the four-question framework for threat modeling. It's in my book, it's in the Threat Modeling Manifesto: What are we working on? What can go wrong? What are we going to do about it? Did we do a good job? Almost everything else I teach fits into that framework. So we'll talk about STRIDE; I already rattled off STRIDE. It's a way of thinking about the question of what can go wrong. But the key is that human brains are really good at pattern recognition. We put patterns onto things that don't even have patterns. And so if we give students the patterns and tell them these are the cubbyholes you're going to put the information in, they learn better.

PAUL ROBERTS
Final question: obviously one of the ways that companies try to enforce secure development and level up is through hiring, and kind of trying to set a bar at the hiring process around security knowledge. I'm interested in your feelings about that. Are there bare minimums that companies should have when hiring developers? Understanding, obviously, that the higher you set that bar, the fewer people are going to clear it, and it might take longer to fill that open req. Right?

ADAM SHOSTACK
So you might think I'm biased because I do training, and the lower the bar, the more training I get to do. But today's hiring market is really weird. There's super intense competition for really good folks, and there's a lot of people banging on the door trying to find a way in. And if we have people who are banging on the door trying to find their first job in security, and we're asking for five years of experience for that first job...

PAUL ROBERTS
This is something I've heard from folks who do placement too, that this is really common: you get these entry-level job reqs where they're asking for multiple years of work history. It's sort of like, okay, those two things don't go together. Right? But yeah. Anyway.

ADAM SHOSTACK
For me, job descriptions and promotion ladders are end states, right? Once you've trained your people, once you've made this the way we ship software here at the Tyrell Corporation, then you can add it to the ladder, and you can say, here are the sorts of things we look for. But today, the other thing that springs to mind is that there's a set of interview questions which are tantamount to "tell me what I'm thinking." I saw recently a thing from a consulting company that said threat modeling is all about "think like an attacker," and I disagree with them. People are welcome to have different opinions about what makes for good threat modeling. The reasons I don't like "think like an attacker" are that I don't know how to teach it, right? I don't know how to go up Bloom's Taxonomy for it. And many law-abiding engineers don't feel like their job should be to be a criminal, so I've heard a lot of pushback on this. But most importantly, if we want to hire, train, and apply, it's only once we're able to really apply this consistently, to work at the levels of evaluate, compare, and contrast, and to have a mature discussion about threat modeling because we've been doing it for a while (and no, there are multiple ways of doing it), that we can think about putting it in as a hiring bar.

PAUL ROBERTS
Adam, is there anything I didn't ask you that you wanted to say?

ADAM SHOSTACK
So this has been a lot of fun. You've had some of the best Star Wars questions. And if I may, a lot of my thinking about these issues has been driven by the fact that I'm working on a new book titled Threats: What Every Engineer Should Learn from Star Wars. It's coming out early next year. Thank you, I'm super excited about it. When I write books, I take a long time to write them, and this has been in the works for a while, because I'm really trying to ask the question in the subtitle: what does every engineer need to know, or what should they learn from Star Wars? The Star Wars framing is really just to make it a little bit more fun. You don't have to be a Star Wars nerd to understand it. One of my beta readers has actually never seen the movies, and they yell at me every time I make an incomprehensible reference. Some of it is so pop culture that they know about it anyway, and some of it, I'm deep in the weeds. But I don't want that to be a hiring criterion. I don't want to exclude anyone. But I do think this conversation is crucial as security has moved over the last decade from nice-to-have to mandatory.

PAUL ROBERTS
Right.

ADAM SHOSTACK
We need to change the way we teach. We need to define what we expect of people crisply and understandably. That's the thing I've been spending my intellectual energy on. And so I'm excited for my Black Hat talk, I'm excited for the book, and I'm excited for more Star Wars geekery.

PAUL ROBERTS
I would just say to your beta reader: you've just got to watch Star Wars. It's a collection of archetypes. I'm not even sure you really do need to see it, but you should probably watch it.

ADAM SHOSTACK
You know, I did a couple of practice runs for my talk, and someone said to me that they'd never seen Star Wars. And I'm like, do you watch TV at all? I mean, what were you watching recently that's going to be better than that? Honestly, there's some great stuff that's come out lately. I'm not saying it's the only good thing on earth.

PAUL ROBERTS
Yeah, you need the vocabulary of Star Wars because it's just part of our shared culture. It's part of our collective unconscious at this point. All this Luke and Darth Vader stuff is just kind of how people frame the world. So you kind of need to see it to be able to engage in those conversations.

ADAM SHOSTACK
Yes, there are certain things. Right. And there's the meme-of-the-day club, where if you haven't seen the thing, you don't know what it is. But there are enduring references that we rely on, shared experiences. Look, the reason I use Star Wars is because it's accessible to most people. I don't go deep into the Star Wars geekery, but it's a fun little add-on that helps us, because the security stuff can really be scary. It can be intimidating, like Darth Vader. And so I'm looking to make it accessible. I'm looking to say what everyone should know. And the fewer demands we put on people, the easier it is for them to hit those marks.

PAUL ROBERTS
Absolutely, I agree. A good teaching technique if nothing else. Adam Shostack of Shostack & Associates, thanks so much for coming and speaking to us on the ConversingLabs podcast. Your talk at Black Hat is "A Fully Trained Jedi, You Are Not," and that's Wednesday, August 10 at 11:20 out at Black Hat in Las Vegas. So if you're there at the show, by all means check it out.

ADAM SHOSTACK
I believe we're streaming live this year as well.

PAUL ROBERTS
Yeah, you can attend virtually, actually all the sessions at Black Hat, which is great. So if you're not in Vegas, because why not stay home if you can, you can check it out virtually. Adam, thanks so much for coming on. Really appreciate it.


About Author: Paul Roberts

Content Lead at ReversingLabs. Paul is a reporter, editor and industry analyst with 20 years’ experience covering the cybersecurity space. He is the founder and editor in chief at The Security Ledger, a cybersecurity news website. His writing about cybersecurity has appeared in publications including Forbes, The Christian Science Monitor, MIT Technology Review, The Economist Intelligence Unit, CIO Magazine, ZDNet and Fortune Small Business. He has appeared on NPR’s Marketplace Tech Report, KPCC AirTalk, Fox News Tech Take, Al Jazeera and The Oprah Show.
