
With Software, Hope is a Strategy? With Jonathan Knudsen (Part 2)


About This Episode

Joining us this week is Jonathan Knudsen, Head of Global Research at the Synopsys Cybersecurity Research Center (CyRC). To understand the vulnerability landscape in software, you first have to understand how software is made. Jonathan shares insights on software development and where vulnerabilities (or many, many vulnerabilities) can be introduced into the final product (although software is never really final, is it?).

And as we round out March Madness for 2023, he shares some sobering findings from his recent research into sports betting apps and the more than 179 vulnerabilities uncovered on average. We also dive into software composition analysis, the future of security ratings, and the notion of security as an enabler to business. This is another episode that made sense as a two-parter!


      With Software, Hope is a Strategy? With Jonathan Knudsen (Part 2)

      Jonathan Knudsen - Head of Global Research, Synopsys Inc.

       

      [0:24] Nested Russian Doll

      Rachael: This week, we welcome back Jonathan Knudsen, Head of Global Research at the Synopsys Cybersecurity Research Center (CyRC), and we pick up right where we left off.

      Jonathan: One other thing about the open-source components is the more you get into it, the worse it gets. So it's this concept of transitive dependencies. So sometimes developers will say, "Hey, I want to use this open source component or this framework," or whatever it is, and they'll pull it in.

      And then it turns out that open-source components can have their own open-source components inside them, right? Components within components, and we call those transitive dependencies. 

      Petko: I call that a nested Russian doll.

      Jonathan: Yes, exactly. And it just makes it that much harder to keep track of everything. So I just wanted to throw that out, too.
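      To make the "components within components" idea concrete, here is a minimal sketch (not from the episode) of how an SCA-style tool might flatten transitive dependencies from a declared dependency graph. The graph and package names are hypothetical, for illustration only.

```python
# Minimal sketch: flattening transitive dependencies from a declared graph.
# The graph and package names below are hypothetical, for illustration only.

# Each key declares its direct dependencies; everything reachable from the
# application is a transitive dependency the app silently inherits.
DEPENDENCY_GRAPH = {
    "my-app": ["web-framework", "json-parser"],
    "web-framework": ["http-client", "template-engine"],
    "http-client": ["tls-lib"],
    "template-engine": [],
    "json-parser": ["string-utils"],
    "string-utils": [],
    "tls-lib": [],
}

def resolve_transitive(root: str, graph: dict[str, list[str]]) -> set[str]:
    """Return every component reachable from `root`, direct or transitive."""
    seen: set[str] = set()
    stack = list(graph.get(root, []))
    while stack:
        component = stack.pop()
        if component in seen:
            continue  # skip shared or cyclic dependencies we already visited
        seen.add(component)
        stack.extend(graph.get(component, []))
    return seen

if __name__ == "__main__":
    deps = resolve_transitive("my-app", DEPENDENCY_GRAPH)
    print(f"my-app declares 2 components but actually ships {len(deps)}:")
    for name in sorted(deps):
        print(" -", name)
```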

      Rachael: Thanks, Jonathan. That's awesome.

      Jonathan: Yes. We'll all do some shots when this is over.

      Rachael: We'll pile it up. So do we ever see a time where there is any kind of accountability? I think about organizations like NATO or the United Nations. They're trying to tackle that big rock of, okay, well, all the countries around the world: how do we get any kind of accountability for things like nation-state attacks and get everyone on the same page? But accountability seems to be a real, real, real problem. Are we ever going to get to a place where it is criminalized? Should it be criminalized? What's the answer?

       

      Misconceptions About Security

      Jonathan: I don't see it happening, but I'm not a real government public policy kind of guy. But I would be surprised. 

      I think what we can say is that organizations that build software, which is basically everyone, are slowly coming to the realization that security is an enabler, not an impediment. So traditionally, security's seen as a roadblock.

      So traditionally, the dev team writes the software and they throw it over the wall to the security team.

      The security team is swamped because they're doing this for everybody. And they do some testing and they send you back some 2,000-page report that you don't understand and you're like, "We can never fix all these things in time for our release." And eventually, some vice president says, "It's okay. We'll just go ahead and do it anyway."

      But that paradigm's shifting. So people are starting to realize that when you do the software development right, when you've got the secure end-to-end process, you end up with better software. 

      It's not just that you're lowering your security risk, but you're also just making it better because you're thinking about it more at the beginning.

      So that you get more of the stuff right at the beginning. And when you're doing your implementation and test, you're finding more things that can go wrong, which means that the quality goes up at the same time that it's more secure.

      And then eventually, the people that buy the software, whether it's consumers or other businesses, are going to see that security is a competitive differentiator.

       

      The Future of Security

      Jonathan: But the thing is, how do you know how secure software is? If you have the same functionality from two different vendors, how are you going to know which one is lower risk for you? And there are people working on that too.

      Rachael: Rating system?

      Petko: Yes.

      Rachael: Like that?

      Jonathan: Yes. So we all know what a secure development life cycle should look like. We all know pretty much the types of testing that you want to do and the kinds of results that you want to see. And so I'm hopeful that over time, we'll have guidelines, a meaningful way of evaluating how secure an application is.

      Petko: Jonathan, I don't know about you. But I look forward to the day that I can walk into my local electronics store and look at the counter of all the different electronics and see not just the price but, next to it, some kind of security rating that lets me choose as a consumer: do I want to pay extra for security?

      Rachael: Yes.

      Jonathan: Like ENERGY STAR.

      Petko: Yes.

      Jonathan: Yes. That's exactly what you want: a little gauge and a number, something that means something, that helps you evaluate it. And that's a tough nut to crack. And there are a lot of people working on that, and there are a lot of different standards and stuff. We'll see what happens.

       

      Android Applications’ Security

      Rachael: A lot of people don't care though. Let's be honest, right? My Chick-fil-A example. When Chick-fil-A opened in New York, right? I'm not disparaging them, but I think they had a C grade or something. It was not an A, right? And this is chicken here, something with the handling, I don't know. But you know what? Nobody cared, because Chick-fil-A was finally in the city, and that line was around the block, rolling the dice for that C-grade chicken, because it's Chick-fil-A and it's delicious.

      Jonathan: Yes. Same thing with applications, right? You want to do stuff.

      Rachael: Exactly.

      Jonathan: And if the game's that good or your TikTok feed's that good or whatever, you're going to do it.

      Rachael: Can't help yourself. Yes.

      Jonathan: Regardless of risk.

      Petko: But Rachael, I think in the case of Chick-fil-A, the brand was so strong there, you didn't care about the rating. You're like, "That's a one-off. There's no way."

      Rachael: Exactly. It's not that bad.

      Petko: Same thing with your TikTok and same thing with all the other apps you have on that phone of yours, or phones. I don't even want to guess.

      Rachael: Yes, I have a few.

      Petko: Jonathan, I got a provocative question for you, and you can give me the lawyer answer if you want, but I noticed that all your research is on Android apps.

      Rachael: Oh, interesting. Yes.

      Jonathan: Yes.

      Petko: And my question is, well, is iOS more secure that you just don't bother looking anymore? Or is there some other reason?

       

      [6:18] The Different Sides of Security

      Jonathan: No. It's just harder to look at. So the way Android apps are distributed is unencrypted. So we can just drop them into our tool and look at them. The Apple apps are distributed encrypted.

      So of course, there's a way around it. If you have a jailbroken iOS device, you install the app, and then there's a way to pull the unencrypted app off the device, and then you can analyze it in exactly the same way. So it's just a little harder to get to, so we did the Android ones for this.

      Petko: And Apple just has a better distribution system where it's more secure and more digitally signed, it sounds like.

      Jonathan: Yes. And so as far as the security goes, I would not expect them to be any better. I think for the most part, the same development teams are going to be creating the Android and the iOS versions of the app.

      And so whatever practices they are or are not following, it's probably going to be the same for both. Just because the platforms are technically different, you might see a slightly different list of components, but I would expect the results to be similar.

      And I guess this goes back to one of the things we talked about in the report, and your question about what consumers can do. And our suggestion was, well, if we can do this analysis, app stores could do this analysis too. Maybe they could put up some kind of gates around how old the components can be or how many vulnerabilities there can be.

      So that was our best answer for consumers as well. Maybe the app store could be better.
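      As a rough illustration of the gate Jonathan describes, here is a sketch (not from the episode) of a policy check an app store could run over SCA results. The thresholds, data shapes, and component entries are assumptions for illustration only, not anything Google or Apple is known to enforce.

```python
# Sketch of a hypothetical app-store admission gate over SCA results.
# Thresholds and data shapes are illustrative assumptions, not a real store policy.
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    age_in_days: int            # time since the used version was released
    known_vulnerabilities: int  # count of published CVEs against this version

MAX_COMPONENT_AGE_DAYS = 2 * 365   # reject versions more than ~2 years old
MAX_KNOWN_VULNERABILITIES = 0      # reject anything with known CVEs

def gate(components: list[Component]) -> list[str]:
    """Return human-readable reasons to reject the submission (empty = pass)."""
    reasons = []
    for c in components:
        if c.age_in_days > MAX_COMPONENT_AGE_DAYS:
            reasons.append(f"{c.name}: version is {c.age_in_days} days old")
        if c.known_vulnerabilities > MAX_KNOWN_VULNERABILITIES:
            reasons.append(f"{c.name}: {c.known_vulnerabilities} known vulnerabilities")
    return reasons

if __name__ == "__main__":
    sca_report = [
        Component("tls-lib", age_in_days=1200, known_vulnerabilities=3),
        Component("json-parser", age_in_days=90, known_vulnerabilities=0),
    ]
    problems = gate(sca_report)
    print("REJECT" if problems else "ACCEPT")
    for reason in problems:
        print(" -", reason)
```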

       

      The Complexity of Maintaining Security

      Petko: I think Google's got a project that goes out there and scans open source anyway. It does some of this and identifies it. It's not always identified in the app, in the consumer's marketplace, though, but they've got a whole separate service that does that. And I think Apple has marketplace checks like that, as you mentioned.

      Jonathan: Oh, yes. They definitely do. Running an app store is hard to do. And props to them for doing it, because people will throw all sorts of crap at you. And it's your responsibility to take care of your consumers, but it's also in your best interest. Because if people know that one out of every four apps is full of malware, they're not going to want to be part of your ecosystem anymore.

      So Apple and Google are definitely doing some kind of vetting on apps as they come in from developers, but maybe they're doing SCA and we don't know about it. But if they are, obviously, they're not kicking back apps that have really old components in them with lots of known vulnerabilities.

      Petko: Yes. But I imagine if you're Google or Apple, or even Microsoft, you have an app store, some kind of marketplace. If you start doing this analysis, you might find out all of them are vulnerable, and you're like, "Well, I can't stop everything. So I've got to focus on maybe what they have access to and maybe limit it to certain things, like what you enter data into and where that goes." I imagine it's so complex for them.

       

      The Necessity for App Regulation

      Jonathan: Yes, definitely. They've got these competing priorities where they want to keep it as open as possible. So as many developers as possible can participate to give a wide variety of apps for the consumers. The whole thing flows together. But on the flip side, they do want to make sure that there's a certain bar for quality and security for getting into the store.

      Petko: Just flipping it around, one of the things I'm curious about: should we have some kind of regulation for apps that are public, or however we define a publicly distributed app? Should we have some kind of government regulation that says you've got to keep it up to date?

      Jonathan: Yes.

      Petko: No?

      Jonathan: That's the question, right?

      Petko: That's the question.

      Jonathan: I think, again, it's really hard to pin down. There are all sorts of corner cases. I could update all of my open-source components to the latest versions, but even those might have vulnerabilities in them. And even going deeper into the rabbit hole. If I'm a developer, I can get an open-source component, but then I can modify it and include it in my app.

      And probably the SCA tool will still flag it, but maybe I fixed a bug that was in it. So it's a tough one.

      Rachael: That is a tough one, because I have a couple of banks that I work with. And one of them is just, go to the app, go to the app. I feel like they don't want to talk to me, they don't want to deal with me. Just go on the app and take it straight out, self-service. And that makes me a little uncomfortable with my banking details.

       

      Keep Strangers Off Your WiFi

      Jonathan: Me, too. And for me at least, part of it is generational. My kids are all in their mid-20s and they do everything on the phone. And the phone just drives me nuts. I just really want to have a keyboard so I can type stuff. But similarly, I don't know, it feels flimsy in a way. Not physically flimsy, but I know exactly what you mean.

      Rachael: I don't want to be on the Starbucks wifi accessing my banking app. You know what I mean?

      Jonathan: Yes. That's just the world we live in.

      Petko: Well, that's secure, right? Starbucks is secure, isn't it?

      Jonathan: Absolutely.

      Rachael: Not a problem there ever.

      Jonathan: Especially if you connect to that access point that says, "Starbucks Super Free Wifi***." That's the best one.

      Petko: That's sarcasm for those listening to the podcast. Please do not do that.

      Jonathan: That's right. That's not advice.

      Rachael: That's funny. I've got, I guess, a neighbor on my block whose internet or their wifi's called FBI Surveillance Van.

      Jonathan: Yes. I like it.

      Rachael: I don't think that one's safe. I'm just going to go out on a limb on that one there. 

      Petko: Keeps people off your wifi, right?

      Jonathan: Maybe.

      Rachael: I would think they would have really good wifi in that van though, man. If I could get some good connectivity, I'd take it.

      Jonathan: That's right. Direct link to satellite.

      Petko: That's Starlink. I hear it's using Starlink.

      Rachael: Now, I'm really interested in your background too, Jonathan. One of the things that we love to talk about is how people found their way to this path. And your degree is in mechanical and aerospace engineering, which I love, but you've never really used it, I think you said.

       

      [12:50] Jonathan’s First Introduction to Security

      Jonathan: No, never did that. I guess the first thing that you get on your resume is the one that sticks. So while I was in college, for a summer or two, I worked at Bell Labs. My dad got me a job at Bell Labs for the summer. And I learned how to program in C. And it was all software after that.

      The first summer was so cool. They gave me two Game Boys, the original Game Boy. So this was, I don't know, 1989 or 1990. And at the time, there was a physical cable. You could plug two Game Boys together. You could play head-to-head with certain games. And they said, "Okay. Why don't you see if you can make it play over a network instead, over an ISDN network?"

      And so I took the cable apart and I hooked it up to an oscilloscope. And I found clock bits and I found data bits and I made a little circuit. I recorded the data and figured out what it was doing. And then I ended up getting it done. But it was just such a cool project. But anyway, it was all software after that. And I did some development.

      I did a lot of technical writing. So I got a job with O'Reilly as a writer at some point. So I got to write a few books about Java development. Eventually, I don't know, I did a book about cryptography and that was really interesting. And cryptography is not security, but it's definitely adjacent.

      Rachael: Exactly.

       

      Discovering the Vast Security World

      Jonathan: And it's used a lot. And so I knew that was interesting. Then around 2010, I guess I was specifically trying to get a job in security, cybersecurity because it looked really cool. And I was lucky enough to get a sales engineer job. So it's been cybersecurity ever since then.

      Rachael: It's never a dull moment in this world.

      Jonathan: One of my favorite things is a good exploit story blog because it's like a heist movie, right?

      Rachael: Yes.

      Jonathan: It's like, "Oh, we did this incredibly complicated thing that you can't even imagine." And so at its best, it's really good storytelling. We looked at this, we found this little hole in the wall, and we stuck our chisel in and wiggled it around. And it's just fun.

      Rachael: It's funny because I've been in technology a long, long time too, and started off in the laptop world back when 233 megahertz was blazing-fast speed. And then we introduced Bluetooth and 802.11, all those things. But at some point with laptops, there's only so much they could get to.

      And then you find this security world where you're mind-blown, and every few weeks it happens again. And to your point, it's this never-ending Hollywood film that just gets more and more interesting, which makes me think: you've been in it now, what, 15-ish years? What are the next 15 years going to look like?

      Jonathan: Well, we're back to ChatGPT, right?

      Rachael: That's right. Petko's favorite topic in the whole world right now. 

       

      Arms Race in the Digital World

      Jonathan: AI will figure out how to make better AI, and then it's all over for us. No, I don't know. I think we're slowly getting there. We're slowly getting this idea across. Every chance I get, I tell as many people as I can that security has to be part of software development.

      It's the same thing. You got to do it the right way to make risk lower for everybody. And I think it's happening. We're gradually making that transition. 

      And I think people are gradually realizing the kind of risks that we're up against and that, in fact, every organization works on software, and depends on software. So any software risk is really an organizational risk.

      Rachael: Exactly.

      Jonathan: I think we're getting there. But the next 15 years, probably more of the same. We're just slowly making the needle move a little more. And I think things will get better. People call it an arms race sometimes. They're like, "Okay. Attackers know how to exploit this kind of bug. So we'll change the compiler, so it's harder to make that kind of bug," and so on and so forth.

      There's some of that, but the big part is it's all about the process.

      Rachael: And you've taught, I think at Duke, and I'm curious what that was like, having access to the younger people. I'm excited to see what they're going to do, especially when they've grown up with nothing but digital at their fingertips, right? Almost since birth.

      What was that like? What was the perspective of the students when you were teaching?

       

      Never Talk About One Without the Other

      Jonathan: It was fun. 

      Rachael: Any future attackers?

      Jonathan: Yes. It's probably like any subject where you get a few people that are really interested and the rest of them are just trying to get through the class. But I felt like I was telling them a lot of stuff that they didn't know, and I felt like they should have known some of it.

      So the way I ended up teaching this course: it started out as a graduate-level course, but I don't think it should be.

      So it was originally part of engineering management, and then they started a FinTech program and it became part of that, which is fine. Probably every professor believes this, but I think it should be a requirement for the undergraduate CS curriculum.

      And as a data point, Forrester, I think a few years ago, surveyed the major CS curricula in the US and found that out of the top 40 schools, none of them required anything about security in CS.

      And that goes back to what I'm saying, you should never talk about one without the other. So I enjoyed teaching the class and I felt like at least the people that took the class got to learn something useful and important. But I feel like from the beginning, don't just tell people how to write code, tell them how to work inside this process that makes it okay.

       

      [19:41] Let the Professionals Do Their Thing

      Petko: I wonder if we should make software development like a trade where, if you think about your plumber or anyone that comes to your house, they usually have a formalized apprenticeship program. They eventually have almost a license. And the person in charge takes responsibility for the folks below them.

      So if someone doesn't build something to spec, or runs power and water together, you hold the top person responsible. The lead software developer or the chief security architect, whatever it might be, holds that responsibility there.

      Jonathan: That's the play. I guess with software development, it's more like if you were comparing the trades with the inspectors. So you want the electricians and plumbers to be able to do their thing, but you want them to live inside of a process where somebody's checking. And if it's not right, you go back and fix it. It's not so much about holding developers responsible, it's just giving them this framework to work in where the security testing happens automatically. 

      And when there are findings, they get fed back to the developers. They're already using an issue tracker to keep track of what they've got to do. So you just need to use that same existing mechanism, so that from the developer's point of view, it's hardly any different.

      They're still building things and trying to make things work. But their to-do list also involves, "Oh, you did a SQL injection here. You should really fix that up," and so on and so forth.
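      To show what that hypothetical SQL injection finding looks like in practice, here is a small Python sketch (not from the episode) contrasting a vulnerable string-built query with the parameterized fix a security check would suggest. The table and queries are made up for this example.

```python
# Illustration of the kind of SQL injection finding a security tool might flag,
# and the parameterized fix. Table and queries are made up for this example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, balance INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 100), ('bob', 50)")

def lookup_unsafe(name: str):
    # VULNERABLE: user input is pasted into the SQL string, so input like
    # "' OR '1'='1" changes the query's meaning.
    query = f"SELECT name, balance FROM users WHERE name = '{name}'"
    return conn.execute(query).fetchall()

def lookup_safe(name: str):
    # FIX: a parameterized query keeps the input as data, never as SQL.
    query = "SELECT name, balance FROM users WHERE name = ?"
    return conn.execute(query, (name,)).fetchall()

if __name__ == "__main__":
    malicious = "' OR '1'='1"
    print("unsafe:", lookup_unsafe(malicious))  # returns every row
    print("safe:  ", lookup_safe(malicious))    # returns nothing
```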

      Rachael: I like that. So how are we going to get that started, Petko?

       

      Let Developers Be Developers

      Petko: I think there are lots of companies that are shifting left and giving their developers tools. So that way, they are aware of it. When they're writing their code, they also get a list of, "Hey, here are some things we want you to be aware of. Not saying you've got to fix them. But if you don't fix these, request a waiver," let's say.

      Jonathan: Yes. That's part of it. And so part of that is just feeding back through Jira or Bugzilla or whatever they're using. Then also, there are these tools that run right in the IDE, like a spell checker. And so if you do something dangerous there, it can let you know, and that's even better, right? Because it's a closer feedback loop.

      The larger loop is: the developer writes code, commits it to the repository, testing happens, and feedback comes back through issues.

      The shorter feedback loop is: write code, the IDE plugin says, "Hey, this doesn't look right," and they fix it right then, even before it goes into the repository. Some of it's about getting developers more involved.

      But fundamentally, you got to let developers be developers and build this stuff and be creative in the way that they do. And above all else, don't waste their time because then they will ignore you.

       

      Jonathan’s Word of Wisdom

      Petko: What did Steve Ballmer say? Developers, developers, developers? So I guess that's your motto?

      Jonathan: I guess so, yes.

      Rachael: This is so much fun. I think we're just scratching the surface here on this whole topic, but I do want to be mindful of time. Jonathan, thank you so, so much for joining us today. This has been so much fun. Do you have any words of wisdom to all of our friends out there using sports betting apps for March Madness as parting words?

      Jonathan: I would say don't install an app unless you really want to. And if you're putting in credit card information or whatever, try to be mindful about where that's going. It's probably going to be okay, but it's good to be aware. There are risks.

      Rachael: Exactly. But it's not going to happen to me. It's going to happen to the next person.

      Jonathan: Right.

      Rachael: Well, to all of our listeners out there, thanks again for joining us this week. And don't forget to subscribe because you get a fresh episode every single Tuesday. So till next week, be safe.

       

      About Our Guest

      Jonathan Knudsen - Head of Global Research, Synopsys Inc.

       
