Len Noe | Nov 22, 2024

November 22, 2024 00:42:50

Hosted By

Ari Block

Show Notes

In this engaging conversation, Ari Block and Len Noe discuss the challenges of parenting, particularly with teenagers, and share humorous anecdotes. Len, a transhuman hacker, explains his unique experiences with augmented technology and how it enhances his cybersecurity skills. They delve into the evolving landscape of cybersecurity, the impact of AI on hacking, and the importance of compliance in protecting data. The discussion also covers the limitations of biometric authentication, the societal implications of data breaches, and innovative security measures in credit cards. Finally, they explore the future of cybersecurity, including the significance of software bills of materials and the potential shift back to on-premises solutions.

Len's Book [Human Hacked: My Life and Lessons as the World's First Augmented Ethical Hacker]: https://a.co/d/elkuLOc


Episode Transcript

[00:00:00] Speaker A: Welcome aboard to the show, Len. So happy to have you on today. [00:00:03] Speaker B: We. [00:00:04] Speaker A: Okay, let me stop and let you respond to that. I'm so excited to talk to you. I just jumped. Wanted to jump into the question. [00:00:10] Speaker B: It's a pleasure to be here, Ari. Thank you for having me. [00:00:13] Speaker A: Len, you have kids? Tell me how many. [00:00:17] Speaker B: I have five daughters. [00:00:19] Speaker A: Five daughters. I mean, I have one daughter and she's 10. And so I'm kind of terrified. [00:00:26] Speaker B: It gets worse. It gets so much worse. [00:00:30] Speaker A: And your kids are quite a bit older than mine, so you know what's coming. What is the most terrifying moment that you have had with your kids? [00:00:39] Speaker B: Ooh, that's a tossup between two options. [00:00:43] Speaker A: One, boyfriends. [00:00:44] Speaker B: It's either the boyfriends or trying to teach them how to drive. [00:00:49] Speaker A: Oh, my Lord. I've got to hear the driving story. I don't think we can talk about security before we touch that. [00:00:56] Speaker B: Let's just say if I had a choice, I definitely would have farmed this out. You know, I love my kids, but there is something absolutely terrifying about putting a 16 year old in, number one, my vehicle that I'm paying for. And there was one instance where we were trying to teach one of my daughters how to drive. And it was one of those things where, when you pull up to the light, the road shifts by about, you know, 20 feet to the left and you kind of have to cross the intersection at a diagonal. And we just told her. She's like, well, what do I do? I was like, well, you just keep going straight. She's like, well, if I go straight, I'm not going to hit the road. It's like, just follow the lines in the road, honey. Just follow the lines in the road. [00:01:47] Speaker A:
I gotta tell you, I was not a great driver when I was that age. Who was? I should have been far more afraid than I was. And I am surprised my driving teacher did not wet his pants. I mean, quite seriously. You know, the reason I got into computers is because I had like a hand-eye coordination issue. That's why my parents got me into computers. [00:02:10] Speaker B: If it makes you feel any better, I wasn't really all that good at sports either. [00:02:14] Speaker A: Yeah, no. Hey, I think a lot of our guests on the show can say that. When I was 8, instead of playing soccer with my friends, I was learning to code C++. So, you know, there's definitely something wrong with us. [00:02:27] Speaker B: No, I think there's something wrong with all of them. You know, I was going to say, how many of them made, you know, the World Cup or FIFA or the NBA or the NFL or baseball? I don't even know what they are, whatever. But you know, how many of us that were learning how to code actually wound up with good jobs in the end? So I think we were the smart ones. [00:02:50] Speaker A: You know that movie, which I'm sure you're thinking about now. What was it? Revenge of the Nerds or something? That was such a pivotal. I can't remember if that's the exact name. [00:02:59] Speaker B: That's it. Revenge of the Nerds. [00:03:01] Speaker A: That was such a pivotal, like, moment in my life. It's like, oh, maybe it's okay to be a, you know, full-on nerd geek kind of thing. [00:03:11] Speaker B: Yeah, I think they just couldn't understand. They didn't have enough imagination for Dungeons and Dragons. That was their problem. [00:03:16] Speaker A: Did you play? [00:03:17] Speaker B: Oh, absolutely. [00:03:18] Speaker A: You still do? Really? I love that. So I was a dungeon master. I actually wrote a post about why being a dungeon master is such a wonderful thing when it comes to storytelling.
Imagination, how much hard work it is to bloody create those. [00:03:33] Speaker B: It is. And the one thing that I remember, see, I'm going to be 50 in like two weeks. So I've been around for a while, and I remember in the beginning it was like, oh, all that's going to lead you down this terrible path and you're going to be worshiping the devil and all this other stuff. And it's like, no, what it did is it gave me an imagination and let me be able to visualize things in my head, and I don't need a screen in front of me to tell a story in my own mind. [00:04:03] Speaker A: Yes. I mean, people forget that there was a little bit of, you know, a fight around, you know, fantasy. Like 40 years before, you know, Elvis was not welcomed with open arms. Right. It's these small things in history that we kind of take for granted nowadays. [00:04:23] Speaker B: And the funny thing was, you know, if that was devil worshiping, the fact that I've got a credit card in my hand now, I'm being called out on extremist websites for having the mark of the beast. So it's just a label I haven't been able to get away from. [00:04:37] Speaker A: I appreciate that. So this is so delightful, and I appreciate you coming on the show. What made you write a book? I mean, it's such a difficult labor of love. [00:04:51] Speaker B: Well, the book was actually kind of a creation out of my life. [00:04:57] Speaker A: Let's do a name drop here so everybody can go and get the book, because it has, you know, Ari's personal stamp of approval. Human. [00:05:06] Speaker B: So the name of the book is Human Hacked: My Life and Lessons as the World's First Augmented Ethical Hacker. So essentially what that means to the layman is I am what is known as a transhuman or a grinder. So I am a human being that has augmented my body by adding 10 different subdermal implantable microchips between my elbows and my fingertips.
And I can literally hack certain people and certain devices through physical contact. I have magnets in my fingers that I can use to trace electrical wiring. It's actually granted me an additional sense. I use subdermal implants to enhance my security online through a tool called the VivoKey Apex. This is where I get my one-time passwords. It'll do HMAC-SHA1, PGP, all different kinds of encryption, and you can even put your Tesla key card on it. And so that way you don't have to worry about your keys and you can just get in your car and drive. So I am a little bit different than most cybersecurity professionals in the fact that I've actually integrated my body with the technology that I'm planning to use in order to attack people. [00:06:29] Speaker A: So there's something incredibly interesting about this, you know, sixth sense. Tell us more about this. [00:06:37] Speaker B: So what I've done is I have placed a magnet in the tip of my left pinky, and with the amount of nerve endings that are bundled at the tips of your fingers, once it heals, any movement will actually produce a nerve sensation. The closest thing I could compare it to would be kind of like the, excuse me, spidey sense. So if I get anywhere near an electromagnetic field, an electromagnetic current, even poorly shielded electrical wiring inside of a house, I get a tingling sensation in the tip of my finger. And the closer I get to that source, the stronger it gets. So I can actually feel the invisible currents and things that are around us every day. So it's been an amazing addition to the human condition as far as I'm concerned. But it does have a couple of drawbacks when you've got some really interesting friends like I do. I grew up on the west side of Detroit. It's kind of a rough place to grow up. And I'm not afraid of being shot at. I'm not afraid of people pulling knives on me. I am afraid of rare earth magnets. [00:07:50] Speaker A: What is that?
[00:07:51] Speaker B: Those really strong, powerful little magnets that you can get. I have friends who like to run around and chase me with magnets. [00:08:03] Speaker A: Oh my God. That makes you rethink the definition of friend at that moment, I would say. [00:08:10] Speaker B: Well, you know, they're my friends. I mean, I do pranks on them as well. They're just nowhere near as painful. I mean, when you think about it, if it snaps on, you're being pinched from the inside and the outside of your skin at the same time. It's a lot different than what most people have ever experienced. [00:08:31] Speaker A: Absolutely. [00:08:33] Speaker B: But you asked why, and that's the question I get a lot of the time. I've been breaking into computers for decades. I credit my first hack to a Commodore 64. My first network based hack was done on an isolated Token Ring network. So I've been doing this stuff for a really long time, and I spent most of my life as an active criminal. I was in the motorcycle clubs for the better part of my life. And it wasn't until I was given my first granddaughter that I realized that I needed to make some changes in my life, in the choices I was making and the direction I wanted my life to go. Or else, to steal a line from AA, the only opportunities I was going to get to really spend any time with my grandchildren were going to be either from behind bars, in an institution, or at a grave site. So I actually made the choice to walk away from everything about my life. I gave up riding in the clubs. I decided I wasn't going to be a black hat anymore. And I tried to just be the same type of security professional that anybody else would be. And I think it was just through circumstance and fate that my past and all of the activities I was doing when I was acting as a criminal came back to be able to give me a future that I could have never imagined as a protector. [00:10:06] Speaker A: I appreciate that.
I want to ask you a question about hacking today. [00:10:12] Speaker B: Sure. [00:10:12] Speaker A: I mean, on the one hand, you know, reverse engineering of patches, you know, there's a certain laziness to it. But on the other hand, understanding, you know, protocols. You mentioned Token Ring. Really understanding a protocol, looking at how it works and trying to find vulnerabilities is so hard. [00:10:30] Speaker B: Not anymore. [00:10:32] Speaker A: Tell us more about that. What does the landscape look like? [00:10:35] Speaker B: To be honest, I get a lot of people who ask me, you know, I want to get into cybersecurity, what should I study? I want to get into cybersecurity, what classes should I take? And honestly, for about the last year and a half, I've been telling them to take English and composition classes. [00:10:52] Speaker A: Oh, that's surprising. Why? [00:10:55] Speaker B: Because people like me, you know, who've been doing this forever, and, you know, we have the history. You know, I started in OS/2 Warp and DOS. We don't have these individuals out there anymore. And the truth is, with the current landscape and the tools that are available, there's no need. A lot of it's going to LLMs and AI. I mean, just within the last week or so, Google has two different AIs. One of them was actually able to find the first zero day in SQLite by doing code analysis. And they have another tool that is a fuzzer that is able to work exponentially better than a human being. So I really don't see the future of offensive security being people like myself. I see it more as individuals who will probably be leveraging some type of an AI or a large language model to actually do the exploitation code creation, possibly even up to automating the actual attacks. Have you ever heard of the LLM White Rabbit Neo, for example? [00:12:16] Speaker A: No.
[00:12:17] Speaker B: This is an LLM whose sole purpose is to act as an assistant for offensive security professionals. And just as a test, I went out there and I went to VulnDB, which, if you're not familiar with it or your listeners aren't, is where most of your new exploits are first, you know, put up for the world to see. It doesn't mean that there's exploit code, but it's like, hey, we discovered a remote code execution, or whatever it is. And I went out there and just said, what is a new CVE that nobody's done anything with? Went into White Rabbit Neo and said, make me proof of concept code for this new thing. It went out to VulnDB, pulled the vulnerability in, analyzed what it was, and threw out Python code. I mean, it didn't work right off the bat, but what it did do was give me a starting foundation and a framework to build off of without having to do months and months of research. [00:13:26] Speaker A: I didn't believe that this worked, this concept of asking, you know, ChatGPT or whatnot to write code, until I tried it out. I am a serious coder going back 20, 30 years. Sure, you need to tweak it, but you look at the code and think, oh, my God. This has saved me hours of looking at APIs, hours of reading documentation, hours of figuring out the ins and outs. Now I can just have a look at the structure and say, okay, I need to tweak this a little bit. It's mind boggling. I did not expect this. I did not think that this would happen. In fact, if you would have asked me if AI is a real thing, I would have told you, no, this is just sophisticated neural networks. This is all bullshit. And then ChatGPT happened and I had to eat my hat, right? I had to say, no, something in the world has changed. And it just took off in a way that quite honestly is scary. It scares the shit out of me. [00:14:27] Speaker B: I think it should scare the shit out of you. Yeah, I think it should scare the shit out of everybody.
I mean, especially when we've got the AI-enabled phishing, which is actually going out, figuring out who your CFO is, maybe your CTO. [00:14:43] Speaker A: Have you seen it do personalization yet? I haven't seen it yet. [00:14:46] Speaker B: Oh, yeah. It'll go out and it'll search all of your social media posts, figure out your writing style, and then it can basically act and speak as if it was that individual. [00:14:59] Speaker A: You know, one of the things is coming, but I haven't been exposed. [00:15:02] Speaker B: Oh, it's not coming, it's here. [00:15:04] Speaker A: Oh, my Lord. [00:15:05] Speaker B: And you heard about the deepfake phishing that happened in Hong Kong about four months ago? Yeah. You know, and that's one of my personal, you know, things that I'm up on the soapbox preaching about: the fact that we are now in a position in time here within this world that we can't believe what we see and what we hear anymore. And with the fact that biometrics, in my opinion, are one of the worst choices for an authentication method that you could ever use, if for no other reason. [00:15:37] Speaker A: Talk more about that. Why is that? What's your perspective? [00:15:41] Speaker B: Well, for starters, when you register a biometric with a device, it's not taking a snapshot of your face, it's not remembering your thumbprint. It's mapping out specific landmarks, and then it's doing a comparison when you go to authenticate. But what people don't realize is that after you register, that data is written down to a very specific file called a biometric template, which is stored in a secured location on the device in memory. But that is not untouchable. I mean, iOS actually had some malware running around a couple of months back that was actually biometric-stealing malware. And the problem with any type of biometric template is, once it's compromised, this isn't like a password that you can rotate. [00:16:31] Speaker A: Yes.
[00:16:31] Speaker B: You cannot change your face, you cannot rotate your thumb or your fingerprints. So I don't personally believe that we should be using these in any way, shape or form. I mean, it was the Chaos Computer Club out of Germany, and I can get you the exact date, but they actually were able to compromise the German defense minister, I believe her name was Ursula von der Leyen. And if you notice, you don't see a lot of politicians talking with their hands up in the air anymore, because they were actually able to lift fingerprints off of high resolution digital photos. So anything that would be considered open and public domain, in my opinion, is not something that we should be using as an authentication method. [00:17:20] Speaker A: I mean, when you just look at it from that perspective, your face is just a password that's open to everybody to see. And however you encode your face, really, that's where the secret is, correct? There's nothing secret except in the encryption. So it just becomes a, well, how did you do that? What does the algorithm look like? That's all. [00:17:39] Speaker B: Yeah, and I can simplify this even further. You know, when it comes to what we are legally given protections for here within the United States, it's passwords and passphrases. Your voice is not considered protected. Your face and your image are not considered protected. That's why we've heard of cases here. [00:18:02] Speaker A: From the PII perspective or from what perspective? [00:18:07] Speaker B: No, from a legal challenge. So let's take a look at, and I'm not trying to say that this is a good thing or a bad thing, it's just the way the law works. You know, there were quite a few people who had committed terroristic-type acts that had iPhones, if you recall. [00:18:27] Speaker A: Yeah.
[00:18:28] Speaker B: And you know, the governments were trying to get Apple to break into the encryption, break into these phones, and they were like, we can't do it. [00:18:35] Speaker A: Right. [00:18:35] Speaker B: So take those same situations. The only reason they were able to. [00:18:39] Speaker A: Do that. You know, that's bullshit, by the way. [00:18:40] Speaker B: Oh, I know that, and so do you. [00:18:44] Speaker A: I developed the hardware that they actually had and didn't use. I worked for Cellebrite. [00:18:50] Speaker B: There you go. [00:18:51] Speaker A: I was the product manager, I hired those hackers. I was the person who, you know, weeks, what, months after, they came out like, oh, we used this company, it's called Cellebrite. I was employee number 13. Nice. So, you know, anyway, I didn't want to go into that story, but that whole story is complete bullshit. [00:19:10] Speaker B: Oh yeah. Anyone who is a manufacturer of a product, they have a way to get back into it, guaranteed, 100%. But if you take that same situation and remove the password aspect, and if it was just facial recognition. There have been many cases that have been documented here within the United States where some type of an authority has just walked up to somebody while they were in a cell, held the phone up, opened it up. [00:19:39] Speaker A: Yeah. [00:19:39] Speaker B: You know, if it's considered public domain, it's very difficult to try and use it as an authentication method, especially if anybody in the world can actually take your picture. I mean, one of the things that we've been seeing a lot lately out in the wild is the use of deepfakes to try and bypass facial recognition, because it'll give you the voice, it has liveness. So, yeah, I'm just not a fan of biometrics. I would prefer to see some type of PKI, you know, a YubiKey, something along those lines, in addition to a password.
But that's just my preference. [00:20:16] Speaker A: Yeah. If I had to, like, be the king of the world for a day, I would make it legally mandatory that every single person in the world used these, and that every single piece of software in the world supported hardware tokens. And, you know, this is not a fingerprint reader, what we see here. It's just a touch to turn it on, so it's not on all the time. That's basically opening the gate to give that authentication. I mean, if we just had that, I feel like it would make life so much more difficult. Wouldn't it? [00:20:47] Speaker B: Oh, absolutely. For sure. [00:20:49] Speaker A: But it would be such a different world. What frustrates me is that it's taken a while for companies to even support physical tokens, including Amazon, which, you know, has issues around that, where it's not as easy to use them for their gov cloud solutions. So it's like, what's happening? [00:21:08] Speaker B: Well, it costs money. [00:21:11] Speaker A: Yeah. [00:21:12] Speaker B: You know, and the sad part is, when we look at a lot of things in regards to compliance, companies are typically not doing the right thing because it's the right thing. Yeah, they're doing the right thing just to make sure that they meet some type of audit requirement. And to me that's really sad, because at the end of the day, we are the consumer. If we're going to buy from these companies, I would like to have a much greater feeling that they, as my provider, are looking at the data that I'm giving them as something that's valuable and needs to be protected. Yeah, most of the time it's more along the lines of, it's easier to pay out than to fix the problem. [00:21:57] Speaker A: I was livid when I figured out this kind of truth in the industry, and it turns out people know. But basically compliance is zero. It gives you zero security. There is no connection between compliance and security.
And unfortunately, I was the person who hired this consultant. Basically, these consultants are so good that they will do whatever they need to have the discussion with the auditor, create the evidence that they need. You have this exchange between the consultant and the auditor, and you get SOC 2, ISO 27001, whatever. And did anything change in your security posture? Nope. Nope. Absolutely. [00:22:37] Speaker B: You know, because that's not the intention. The intention is to pass the audit so that way they can continue to make money. [00:22:43] Speaker A: That's right. And really, most of, you know, CIOs, CSOs, CXOs, whatever, they look at compliance as really a sales enabler. This is what you need in order for your customer to say sure. And it doesn't even save you these security questionnaires. I've answered between 100 and 300 questions, and a friend of mine told me a thousand. Thank God I didn't need to do that. Never. But oh my God. Here's my point, right? I think that compliance is level zero. [00:23:14] Speaker B: It is. To me, compliance is kind of the bare minimum in order to be able to get some type of cybersecurity insurance. It's essentially, if you want, these are the bare minimums to be able to work in this space. [00:23:31] Speaker A: Yes. [00:23:31] Speaker B: And it's the companies that will take that as a starting point and build from it that I think have their minds in the right location. But unfortunately, when it comes to, especially here within the United States, being a capitalistic society and a profit-driven system, they're more beholden to their shareholders than they are their actual. [00:23:55] Speaker A: And this is the legal requirement, right? The legal requirement of you as an officer of the company is to maximize shareholder value. That's it. There is no legal requirement for being ethical. [00:24:07] Speaker B: Like, no, you can't break the law.
[00:24:09] Speaker A: But you don't need to be ethical. [00:24:10] Speaker B: And you don't need to be moral either. [00:24:12] Speaker A: Absolutely not. Which is ridiculous. Right. [00:24:15] Speaker B: You know, and to be honest, that's one of the things that I've been bringing up a lot lately: the fact that I feel like, as a society, we have really lost the anger that we should have in regards to our information leaking. [00:24:30] Speaker A: Yes. [00:24:31] Speaker B: I mean, I did a survey at one of my last presentations where it was like, how many people here have gotten one of those letters from a company that says, we're sorry, your information has been leaked, we're going to give you a year's worth of free credit monitoring? [00:24:46] Speaker A: I have 10 of those. [00:24:48] Speaker B: But do you remember the first one? That was the point. I remember the first one of those I got. I was furious. I was fit to be tied. I was ranting and raving and screaming. And now it's just become part of doing business here in the United States. And I think we've all become almost numb to it and apathetic. Whereas it used to be, you know, you'd have some type of. I don't want to say social ramification, but, you know, there was always this big negative hit to the company, whether it was their reputation or their branding. But now it's almost like, oh, yep, this is just part of doing business in today's modern society. And I think that's a mindset that we as the customers need to start really pushing back on, fighting this narrative that this is acceptable. [00:25:49] Speaker A: Absolutely. I mean, we kind of had Enron, and that was kind of outrageous. But then we've been slowly desensitized. [00:25:57] Speaker B: Absolutely. [00:25:58] Speaker A: And it's like, okay, this is a thing that happens. I'm to the level that I'm like, okay, I'm going to put PIN codes on my SSN. Like, by the way, everybody should know this.
You can put a password on your Social Security number. [00:26:10] Speaker B: Absolutely. You can lock it down, and there's free. [00:26:14] Speaker A: It's actually. They try to sell it to you, but there's a free way to lock your. They call it the freeze, as opposed to their paid service. You can freeze your Social Security, your. Sorry, your credit. Your credit report. You can put a password on your Social Security number. I don't know, maybe I'm just, you know, overzealous. But I. [00:26:32] Speaker B: I think you're about right. I think you're right where you need to be, man. [00:26:36] Speaker A: But this is not common, right? And once a year, I call up the credit card company and say, oh, I lost my credit card. It's not true. I have it. I just tell them I want a new credit card number once a year. [00:26:47] Speaker B: Oh, have you seen what's going on down in Latin America with the credit cards? They're using numberless credit cards there. [00:26:56] Speaker A: Yeah, yeah. And I love that, by the way, too. PayPal does this to a certain degree, right? Yeah, I think that's an undersold value of PayPal, that they don't kind of explain how they work from that perspective. And then your ability to create virtual credit cards. If you're doing something dodgy, I can't remember which company it was, they have like a black logo, you can basically create on the spot a new virtual number and basically use it for shit. Or separately, have three virtual numbers and kind of say, okay, if you're. This one. [00:27:23] Speaker B: If you're going to do, like, an auto pay every month, you know, you can set up one specific number for, say, your electric bill. And if anything but that company tries to use it, it'll be denied. I've seen this starting to make its way into the US market, but they're actually selling this as a premium feature to a card, which makes no sense.
Well, this could be the potential end of credit card fraud. [00:27:55] Speaker A: The technique there. The fact that they're selling it, I agree with you. But the technique there is delightful. I mean, that concept that here are your trusted transactions and here are your untrusted transactions, and you should not have a repeating, you know, whatever on an untrusted-transaction credit card. That is absolutely delightful. I love that technique. Is this commercialized? Where is this? Europe? They're usually ahead of us. Where's this? [00:28:18] Speaker B: Actually, I've seen it more in Latin America. I've seen it all over South America. I've seen it a little bit in Europe. But I'm just looking at it more from: if I was the credit card company, why wouldn't I want this all the way around, just to stop the credit card fraud and the potential of having to repay all of these fraudulent charges? It doesn't make sense to me. But. [00:28:44] Speaker A: Well, when you look at incentive design and how the credit card companies work with the insurance companies, and how, specifically in the US as opposed to Europe, the individual consumers are actually not impacted due to their insurance, it kind of becomes this issue of incentive design and misalignment. But we're going way too deep. We're going way too deep. [00:29:05] Speaker B: Well, you said rabbit holes. [00:29:06] Speaker A: Yeah, that's true. Here's my personal pet peeve, right? You can pay for LinkedIn to authenticate your identity. We have know-your-customer, know-your-consumer requirements. If you kind of think about an expansion, and this goes into 1984, George Orwell kind of rabbit holes, where, well, should we really do this? But my point is, if you calculate the sum of evil in human trafficking and, you know, pedophilia and all these awful things, a know-your-consumer act, this future hypothetical know-your-consumer act in social media, would completely change the landscape.
It would also bring us into our Orwellian, you know, domain, which is kind of scary. [00:29:54] Speaker B: We're already there. [00:29:56] Speaker A: Yes, yes, yes, absolutely. So, I don't know, I fight with myself on this. I'm like, should we take, you know, another step towards 1984 but make it so much more difficult for pedophiles and human trafficking? Or is this a line we shouldn't cross? [00:30:15] Speaker B: I don't think there's no right answers here. [00:30:19] Speaker A: I don't know what that means. [00:30:20] Speaker B: I'm not going to say that I think that, you know, any type of abuse should be tolerated anywhere. But what I would say is, I don't think there's a way to stop it. So, I mean, even if we try and lock down certain channels, the one thing that we've discovered is the criminal element will always find a way. [00:30:37] Speaker A: Yeah. It's an arms race. [00:30:40] Speaker B: It is. We're in the largest arms race that the world has ever seen. The only difference is it's not being done with weapons and missiles and tanks. It's all being done on a cyber battlefield that no one can physically see. [00:30:56] Speaker A: Yeah. And the U.S. government just caught up to this relatively recently. There is basically one modem, right, communication chip in the world that half of us are using, and it's manufactured by one manufacturer in China. I'm oversimplifying, but so the U.S. government. [00:31:11] Speaker B: We're not going to say the name. We're not going to say the name. [00:31:14] Speaker A: You can Google it. But so the US government basically caught up to this and they're like, oh, if you're a, you know, government contractor in some way or another, or you're in this category of protected goods, you can't use these chips. So there's a list now of, you know, what is it, like 70 now or something like that, that you should not use. So now all these manufacturers are like, holy shit, we need to redesign our board, right?
And this is recent. [00:31:38] Speaker B: Yeah. But you know what? To me, that's just kind of taking it to the governmental level of what we're seeing more in a grassroots effort with SBOMs. [00:31:47] Speaker A: Yeah. Talk us through this. Right. Just bring our audience with us. [00:31:51] Speaker B: One of the newest things that we've been seeing out in the wild are what are called SBOMs, or "S-bombs," which is a software bill of materials. You know, up until now, when it comes to doing what a bill of. [00:32:02] Speaker A: Materials really is, it's all of the things that go into building something. Now this can be a physical BOM. If you're building a phone, it's every single electronic component that makes up your device. Because, you know, a manufacturer doesn't manufacture everything in there. They buy a whole bunch of things. They print it on a printed circuit board, which is that old green thing. They're not always green. Or SBOM. What is an SBOM? Software. SBOM is software. [00:32:29] Speaker B: Right, software. You know, to Ari's point, there are hundreds and hundreds of libraries that can be used in order to create functionality within an application. A lot of these libraries are open source and they're free for anyone to use. But that also means that these have their own potential supply chain style attack vectors. Let's say I'm using, I don't know, GCC as my compiler. Yeah, this is an open source tool. If somebody gets into the open source tool and says, every package that's compiled, put this line of code in, and it makes it through that particular code check, that particular GCC could be pushed down to hundreds of thousands of machines. And every single application that's built utilizing it could potentially have that same line of attack code.
And it's only been within the last two to three years that we've really started seeing people looking for accountability in the libraries and the coding techniques and the software that are being utilized to build the applications that are running our world. [00:33:43] Speaker A: So this was almost exactly three years ago, there was a massive SBOM attack. It affected everyone, including myself. And there wasn't even a fix right away. Like, we had to wait for the library to get it done. And why this is so complicated is because when you need to update your code, test it, release a new version, if APIs change or the signatures, quote unquote, like, it's not easy to replace dependency libraries in your code. That can be days or weeks of work in the best case. [00:34:15] Speaker B: And if I can throw one point in there, when most end users think about updating an application, they're thinking about something like a Windows update, where everything's already packaged for you. And that's not what we're talking about here by any stretch of the imagination. [00:34:31] Speaker A: Yeah, yeah. Just to bring this to kind of being crystal clear for everyone. You know, you think about it as, you know, you've got hundreds of thousands of lines of code, and you've got this, you know, maybe a thousand lines of code that you need to go and update. Right. And it's not as simple as replace all. It's a problem. It really is a problem. And the thing is that it's not your code, it's somebody else's code that you're using to accelerate your development. So you don't even know how it works. You can't scan that code many times, you don't have access to it. Sometimes you do. It's a really interesting problem. Len, I want to ask you a difficult question, and, you know, everybody answers this from their own personal experience and bias, so it's okay, there's no right answers here.
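The workflow Len and Ari are describing can be sketched in a few lines: walk a CycloneDX-style SBOM and flag any component that matches a known advisory. This is only an illustration, not any real tool; the component names, versions, and the advisory set below are all hypothetical.

```python
# Minimal sketch of an SBOM audit: match components in a CycloneDX-style
# software bill of materials against a (toy) known-vulnerable list.
import json

sbom_json = """
{
  "bomFormat": "CycloneDX",
  "components": [
    {"name": "log-lib",    "version": "2.14.1"},
    {"name": "http-core",  "version": "4.5.13"},
    {"name": "json-parse", "version": "1.0.0"}
  ]
}
"""

# Hypothetical advisory feed: (name, vulnerable version) pairs.
KNOWN_VULNERABLE = {("log-lib", "2.14.1"), ("xml-util", "0.9.2")}

def audit(sbom: dict) -> list[str]:
    """Return the SBOM components that match a known advisory."""
    flagged = []
    for comp in sbom.get("components", []):
        if (comp["name"], comp["version"]) in KNOWN_VULNERABLE:
            flagged.append(f'{comp["name"]}@{comp["version"]}')
    return flagged

sbom = json.loads(sbom_json)
print(audit(sbom))  # -> ['log-lib@2.14.1']
```

The hard part the transcript points out starts after this step: the SBOM tells you *that* `log-lib@2.14.1` is in your build, but swapping it out still means code changes, testing, and a new release.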
What do you think, for, you know, security officers right now, what are the biggest top three threats that they need to be thinking about? [00:35:39] Speaker B: Ooh. I'm a firm believer that I think we're going to start seeing a much reduced time between discovery of exploit and proof of concept and usable attack code. So I don't think that anybody who's in a managerial position around cyber is really prepared for what I see as the elevation of the speed of attack that we're going to start seeing over the course of 2025. [00:36:15] Speaker A: Right. And really what this is, it's automation of absolutely disassembling patches into attacks, is really what we're talking about in one aspect of it. [00:36:24] Speaker B: That's one aspect. I think that one of my big predictions for 2025 is I think we're going to see our first AI-based APT over the course of the next year. [00:36:36] Speaker A: Just explain that. Take out the acronyms. [00:36:39] Speaker B: APT in our world stands for Advanced Persistent Threat. So this would be potentially your nation-state attacker groups coming out of China, Iran, Russia. Currently most of these individuals are acting as groups that are supported by these governments. Well, with the advent of new AI, LLM technology, automation that can all be combined, I'm betting that we're going to be seeing some of these adversarial nations developing fully autonomous AI whose sole purpose is to weaponize cyber attacks. So the need for a human behind those attacks, I think, is going to go away over the course of this next year. So that would be the first thing. [00:37:27] Speaker A: And just to bring our audience with us, because I don't think it's a commonly known fact: cyber attack technologies have been productized and popularized and democratized. So you can go and buy cyber attack technologies. They have a support number.
[00:37:44] Speaker B: Oh, you can actually do ransomware as a service now, to the point, just like you could on AWS. There's an entire offensive cloud out there that's based out of the darknet. But I think nobody is really prepared for the speed at which the evolution of attacks is going to happen over 2025. I also think that we're going to see a lot of repatriation, so basically detransitioning from cloud-based services. I really do believe that the cloud bubble has finally started to burst. We've seen massive increases in costs for data at rest and compute, and I see a lot more companies bringing historical archival data back on site. And I think we're going to see the idea of a strictly cloud environment go the way of the dodo. And we're going to move predominantly into hybrid models where only cloud native services will be remaining in the cloud. [00:38:57] Speaker A: That's so interesting. Archival data, I get it. Right, but what about the. Like, do you think people are going to start, you know, getting their own server farms, just, you know, companies running? [00:39:09] Speaker B: No, I really don't think it's going to be so much server farms, but I think it's going to be more along the lines of a lot of NAS or SAN storage locally, working through some type of federation to whichever cloud platform they're using. I don't see a lot of Azure AD going away, but I think we're going to see a lot more on prem for anything that is no longer cloud native: Kubernetes, OpenShift, Docker, things of that nature. But I think we're going to start seeing more and more legacy-style applications that are still in use being brought back on prem, just to deal with the advancement in the cost of running in the cloud. [00:39:50] Speaker A: Yeah, I mean, to me it's terrifying if Git gets hacked, right? It's those kinds of things. [00:39:55] Speaker B: Let's be honest.
I think the cloud is another one of those things that has great potential, but unfortunately nobody understands IAM the same way they do Windows or standard Linux-based permissions. And that's why we're not hearing a ton about cloud being hacked so much as it's leaks or breaches. And most of that's due to misconfiguration of IAM roles or cloud-based permissions. [00:40:25] Speaker A: Right, right. I appreciate that, Len. I mean, looking at the time, we should have been done five minutes ago. This has been so much fun that we just completely lost track of time. I appreciate you. [00:40:39] Speaker B: Yeah. I mean, and we didn't even really get into the fact that I can break into your devices through physical contact. [00:40:44] Speaker A: Oh, my God. Yeah. But this idea of, you know, you connect the USB, or you even give a USB cable or your charger, you know, oh, hey, use my charger, and that's it. You're done, you're hacked. [00:40:56] Speaker B: Well, I know we're short on time, but let me give you something else to think about. And I don't know if you can see this. Yeah, yeah, I see. That is an NFC RFID chip. And if you actually put your phone in my hand, I don't need to plug in a USB. I can actually trigger the NFC and actually compromise you through that. And one of the things that I love asking people, because I think this really forces people to kind of reevaluate the way they see things, is: I'm sure at some point in time, Ari, we're going to meet up in person, just because we both travel in the same circles. If I was to just walk up to you out of the blue and go, will you hand me your wallet and let me go through it, you would probably look at me like, you're out of your mind, Len. What are you thinking? But at the same time, if I. You didn't know I had these implants. And I walked up and said, hey, Ari, let me see your phone for a second, man. I got this video I want to show you. [00:41:55] Speaker A: Yeah.
[00:41:57] Speaker B: There's more than a 50/50 chance that you'll hand it over. And if we look at the amount of personally identifiable information that's contained within wallets and purses compared to what's in those cellular devices, it's not even comparable. So if nothing else, I'm hoping that through my exposure as a transhuman with the implanted microchip technology, and the fact that I can actually compromise through physical contact, I'm hoping people will start to reevaluate the security by which they look at their mobile devices. [00:42:31] Speaker A: Absolutely. Len, thank you so much for joining the program today. You know what? We're gonna have to have you back. This has been such a delight. There's so many topics that we didn't touch upon. So, hey, I really appreciate it. [00:42:42] Speaker B: I'd be happy to come back anytime, man. I appreciate the opportunity to talk to you. I had a great time and a great conversation. [00:42:48] Speaker A: Thank you so much. You're very kind.
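To make the NFC tap Len describes concrete: the simplest thing a chip like that can carry is an NDEF URI record, which is enough to make a phone that reads it offer to open an attacker-chosen link. Below is a sketch of the record byte layout from the NFC Forum URI record definition; the URL is hypothetical, and this says nothing about what Len's actual implants do.

```python
# Minimal sketch of an NFC NDEF "URI" record: a short, well-known-type
# record whose payload is a prefix code plus the rest of the URL.
# Standard prefix codes (subset) from the URI record type definition:
URI_PREFIXES = {0x00: "", 0x01: "http://www.", 0x02: "https://www.",
                0x03: "http://", 0x04: "https://"}

def encode_uri_record(uri: str) -> bytes:
    """Encode a URI as one short NDEF record (MB|ME|SR set, TNF=0x01, type 'U')."""
    prefix_code, rest = 0x00, uri
    # Use the longest matching standard prefix to shorten the payload.
    for code, prefix in sorted(URI_PREFIXES.items(), key=lambda kv: -len(kv[1])):
        if prefix and uri.startswith(prefix):
            prefix_code, rest = code, uri[len(prefix):]
            break
    payload = bytes([prefix_code]) + rest.encode("utf-8")
    # Header 0xD1 = message-begin | message-end | short-record | TNF well-known;
    # then type length (1), payload length, and the type byte 'U' (0x55).
    return bytes([0xD1, 0x01, len(payload), 0x55]) + payload

def decode_uri_record(record: bytes) -> str:
    """Inverse of encode_uri_record for a single short URI record."""
    payload_len = record[2]
    payload = record[4:4 + payload_len]
    return URI_PREFIXES[payload[0]] + payload[1:].decode("utf-8")

record = encode_uri_record("https://example.com/proof-of-contact")  # hypothetical URL
print(record.hex())
print(decode_uri_record(record))
```

A handful of bytes like these, pushed during a handshake-length tap, is the whole delivery mechanism; whether the phone then auto-opens the link or merely prompts depends on the handset, which is exactly why handing over an unlocked phone is the riskier habit.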
