Jeff Man | Dec 4, 2024

December 04, 2024 00:36:57

Hosted By

Ari Block

Show Notes

In this conversation, Ari Block interviews Jeff Man, who discusses his involvement with Hack for Kids, a nonprofit organization aimed at introducing children to cybersecurity and technology. Jeff emphasizes the importance of understanding the human and process aspects of cybersecurity, arguing that technology alone cannot solve security issues. He shares insights on the current cybersecurity landscape, the role of compliance and regulation, and the challenges of cybersecurity audits. Jeff also highlights common weaknesses in organizations, particularly the misconception that security can be outsourced, and the critical importance of key management in encryption.


Episode Transcript

[00:00:00] Speaker A: Welcome aboard to the show, Jeff. So happy to have you here today. [00:00:04] Speaker B: Happy to be here. I think we'll find out. [00:00:07] Speaker A: I love your optimism. So, Jeff, I saw one of the things that you're involved in is called Hack for Kids. I learned how to program when I was 8. I have a confession. I'm doing a terrible job in getting my kids involved in programming and all things technology. Tell us about Hack for Kids and maybe give me a few tips on how to do a better job with my kids. [00:00:34] Speaker B: I can certainly try. And of course with kids, having raised a few of my own, I'm now on to grandchildren. You never know what you're going to get. It's kind of a crapshoot. That's my best parenting advice. Hack for Kids is a nonprofit that started, gosh, I don't know, time is weird because of the whole, you know, pause button effect of COVID. But it's been around probably for about 10 years or so. And the person that started it just sort of had this vision of doing like a mini hacker conference that's more focused, oriented towards kids, adjacent to some of the regular adult hacker conferences, with the goal of exposing kids to different aspects of cybersecurity and hacking. So, you know, it's pretty well laid out these days in terms of workshops. It sort of resembles, if you've been to a hacker or security conference. There's lock picking, there's welding for the older kids, puzzle solving, a little bit of crypto challenges, math challenges, some of these logic puzzles like MENSA puzzles that you have to try to take apart or put together. It varies in the different activities. Sometimes they'll get donations of old IT equipment and just let kids kind of rip it apart to see what it looks like on the inside, because that can be fun. All for the purposes of just exposing young people, children, to different facets of technology and security, of course.
I, being an old-timer security professional, don't like that the focus is always on technology, because I think there's more to it than that. But it's, you know, it's kind of a necessary part of it. My involvement is trying to bring diversity to Hack for Kids. I have a long-term project of trying to bring a Hack for Kids type of experience to underserved, underprivileged communities. So I have a long runway for that, because I have a day job and a few other side gigs. But my personal belief is, based on my life story, my experience, and how I got into this business, it's discovering the aptitude and the propensity or the interest in doing kind of security things, hacker things, but not necessarily through the usual, well, are you an engineer, are you a computer scientist, do you like to program? You know, the academic world sort of got that covered. But there are other people out there that have an interest and an aptitude for doing things in the security and the hacker realm that I want to find too, and at least expose them and give them the opportunity, or let them know that there's the possibility of doing something different. So that's Hack for Kids in a nutshell, and my involvement in the group. [00:03:51] Speaker A: I appreciate that you said there's more to it than the tech. What does that mean? [00:03:58] Speaker B: I was listening to some of your other episodes, and I'll forget who was saying what, but you know, somebody mentioned people, process, technology, or at least people being the weakest link of the security triad. It's part that. But the way I learned information security a generation ago, back in the late 1900s, there was very little technology involved, at least not digital technology.
I sort of came in at the very beginning of what we now call the digital age, where there was technology involved, but it was mostly mechanical and machines and phones and radios, and all the signals were over sound waves and radio waves and TV waves, not just digitized. Of course, I chuckle. Nowadays there's so much data flying around that, you know, it's possible to pick it out of the air again like we used to do a generation ago. But when I say it's not all about the technology, I'll fall on my sword on this until the time that I retire or am retired from the industry. But the whole industry approaches cybersecurity, and security in general, data security is what we used to call it, as a technology problem. And I claim it's not a technology problem. Technology is the problem. We've innovated far beyond our ability to control what it is that we're using, frankly. And we've never, and I say we as a society have never, especially people like me that came out of the government and the DoD, where data security and information security was kind of part and parcel to the Cold War and espionage and warfare or preventing warfare, national security. It wasn't about things that get talked about so much these days in terms of ransomware and protecting all these poor municipalities and school groups and organizations that, and I'm being very broad stroke here, don't really have a clue as to what they're doing or what they need to do, don't have the budget for it, don't have a reason to do it, but they're getting hit anyway, and they make the headlines, and everybody gets to thump their chest and say, oh, these organizations need this, that or the other. And then the technology, for example, the vendor machine, comes in and tries to save the day. The problem is, security fundamentally, to me, starts with an understanding of what it is that you have that's valuable, that's worth protecting. Who do you want to protect it from?
How much is it worth to you? How much do you want to invest in protecting against it? And frankly, nobody can afford to protect the data that they have that's valuable and sensitive to the degree that they need to today, to apply every technology solution that's out there. So what you're left with is people not knowing how to spend the scant resources. And I have a bias, because I've been in the private sector for 30 years and I've been a consultant advisor pretty much all that time, trying to go into organizations and teach them about the fundamentals of data security, what it means to them, how to apply it to their organization and circumstance. And what I've generally, collectively gotten over the past 30 years is, yeah, yeah, yeah, just tell us what to buy and what colors should the blinky lights be and where do we need to put it and how many do we need? Don't make us think about these things because we, you know, we can't afford to think. Don't tell us what we already know to be the problem. Just tell us how to fix it. Which generally leads to technology, but 30 years into it, I would argue that we're not necessarily solving many people's problems. You know, I used to say open up the newspaper; nowadays, you know, pop open your social media or whatever your news source is, or do a simple search on things like security breach or cybersecurity breach on the source of truth, and see how much is still happening, you know, almost on a daily, if not weekly basis. And while there are some organizations out there that have, quote, unquote, solved the problem and are, quote, unquote, secure, they really are kind of like 1% of the organizations out there. The vast majority of organizations don't have a reason to, e.g., they're not regulated. They don't have compliance or regulatory requirements that they have to meet. Nobody does the right thing just because it's the right thing to do.
They're all companies that are largely in the business of making money or saving money, and security costs money. And as I said, nobody can really afford everything. So they run around and they're subject to listening to the most convincing stories from vendors and salespeople and the most compelling buddy-buddy, partner-partner pitch. It goes round and round and round, and they buy solutions and they think they're good, they think they're done, because they don't have to think about anything. And yet breaches keep happening, organizations still get popped, and the worst thing is, you know, we're getting towards the end of the year, so you're going to start to see the annual wrap-up stories from people in the security media and the cybersecurity media, as well as the future-looking, you know, what's going to happen in 2025 stories. And somewhere in there you're going to get stories about, you know, what are the top 100 or 200 passwords that are being found, you know, from pen test vendors, pen test companies, or from the latest breach reports. And when you get down to it, companies are mostly falling because of the same problems that they were falling to 30 years ago, 40 years ago, 50 years ago. And you can point your finger and say, well, shame on you, you should be doing this, that or the other. But they largely don't, because they don't know that they have to, because they've invested in something that they think will secure them and therefore make them not have to think about it. I could go on and on for hours. Hopefully you're getting the gist of where I'm coming from. [00:10:56] Speaker A: You're painting a picture of the good guys basically slowly losing. Would you agree with that? [00:11:07] Speaker B: Yes and no. I think we've fundamentally lost and we're just not willing to admit it yet.
I remember when I was first learning sort of the pen testing craft, the breaking into computers and networks to tell people what's wrong and fix all the holes, back some over 30 years ago, back in the early 90s, there used to be a TV show on called The X-Files. Some people may remember that show. There was an episode one time where the premise of the story was that some super secret information, probably about UFOs because that was a theme of the show, was stolen somehow, or allegedly stolen, because somebody had digitized it and put it online. And I thought, well, that's really stupid for a storyline, because who would do such a thing? Fast forward to today. What do we not have online, what are we not storing in the cloud, and what are we not sharing with, supposedly, ourselves wherever we are? I mean, I still don't like using OneDrive because I don't believe I should have all my data in the cloud. I have to fight with my phone all the time to turn off all the automatic backups and stuff. It bit me earlier this year because I had a phone brick and I lost some information. But I was okay with that, because most of my information is on hard drives that I keep, you know, physically next to me in drawers and safes and things like that. But there's winners and there's losers. There are a lot of companies that are doing a lot of the right things, largely because they have to, or because they learned a hard life lesson by getting popped, you know, previously. And so they got religion, as I like to say. But even they tend to fall back on and rely on technology. I was listening to one of your other episodes where somebody was talking about risk and risk assessment and risk management and third-party risk. There's a belief that if I hire a third party, they're the experts and I don't have to think about it and I don't have to worry about it. And I can't say emphatically enough, no, that's not the case.
You've got to think. If you're going to, quote, unquote, win in this business or game of cybersecurity, however you define that, if you're going to have, as an organization or a company, any kind of safe, warm fuzzy feeling that whatever it is that's valuable to you is, quote, unquote, safe, you're going to have to think about it. You're going to have to apply what to me is the most important part. Disagreeing with one of your previous guests: people is not the weakest link. I think process is the weakest link, or more specifically, lack of process. People are not doing the things that they're doing in a way that's repeatable, in a way that's demonstrable, in a way that actually prevents, or more often than not detects, the bad things happening. Even that's a reality I don't think enough people talk about: security is not about prevention. It's about early detection and minimizing the damage. But we don't like to think about that. We like to think, no, I want security. I want it to be a state where it's impenetrable, and if something bad happens, well, then whoever it was was lying about how secure it was. [00:14:51] Speaker A: I would agree. That black and white perspective of the world, I think, is incredibly dangerous. But it gives executives the ability to sleep at night, to know that they're secure and that there's somebody to blame. But really, there's a thousand shades of gray. You talk about people versus process. Give us an example of what you mean by process. [00:15:16] Speaker B: It's very common in our industry to point fingers. Let's say somebody broke in because they guessed a password, or they stole a password, or the password was password, or it was a default password. You could argue, and maybe it's somewhat semantic, that, well, that was some person, some human, making a decision to not do something like change the password or set a good password or not use a bad password.
But I say there was a lack of process or procedure that should have forced the individual or individuals into doing the right thing. So that's one example. Another example, you know, similar, is default settings, configurations, and, you know, patching. You know, every time I see a pen test report where somebody got in, it's because they discovered a vulnerability that hadn't been patched and it was older than 30 days, which is a big no-no. But very often, you know, there are patches out there that have been available for months, if not years, and guess what, bad guys use them these days to create ransomware. So maybe you should patch more often. But everybody says patch more often. If that's the case, yeah, you could point to a person and say, why didn't you patch your system? But more often than not, there is no singular ownership of systems anymore. There are processes in place that are supposed to push out patches automatically, or promote a process where a patch becomes available, the patch gets tested in the lab, somebody makes sure that applying the patch doesn't break whatever application is running on the system, and so on and so forth. [00:17:14] Speaker A: So this is an incredible. [00:17:15] Speaker B: Does that give you a. I mean, it's semantic maybe, but I think what's always missing is lack of procedure or lack of following a documented procedure. [00:17:24] Speaker A: This is an incredibly important point that you're making now, because it's showcasing the tension between, well, why don't we just patch everything automatically the same day? Because the hackers are using these missing patches to attack. But the thing that we're kind of forgetting is that patches can break the systems that are working currently. And that's the thing that many, if not all, IT organizations and others have seen, and they're like, oh, we have to test the patch before we put it in.
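Jeff's point, that patching is a process problem rather than a person problem, can be made concrete with a small sketch. The Python below is purely illustrative (the inventory data, hostnames, and patch names are made up; only the 30-day threshold comes from his remark): a repeatable, demonstrable check that flags patches older than a policy threshold, the kind of documented procedure he's describing.

```python
from datetime import date

# Hypothetical inventory: (hostname, patch name, date the vendor released it).
# In a real shop this would come from a vulnerability scanner or CMDB,
# not a hardcoded list.
INVENTORY = [
    ("web-01", "openssl-3.0.13", date(2024, 1, 30)),
    ("db-01", "kernel-5.15.149", date(2024, 11, 20)),
]

MAX_PATCH_AGE_DAYS = 30  # the "big no-no" threshold mentioned above


def overdue_patches(inventory, today, max_age=MAX_PATCH_AGE_DAYS):
    """Return (host, patch, age_in_days) for patches available longer than max_age."""
    return [
        (host, patch, (today - released).days)
        for host, patch, released in inventory
        if (today - released).days > max_age
    ]


if __name__ == "__main__":
    # Running the check on a fixed date keeps the example reproducible.
    for host, patch, age in overdue_patches(INVENTORY, date(2024, 12, 4)):
        print(f"{host}: {patch} has been available for {age} days")
```

The value isn't the code itself but that the check runs the same way every time, regardless of who owns the system, which is exactly what "singular ownership" going away demands.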
So there's this game of kind of patch first versus test and don't break, which is really going on. [00:18:03] Speaker B: Well, it's a game, but it's also kind of a myth, because, you know, I work with lots and lots of organizations, and not all organizations are equal. Not all organizations have test labs, or at least test labs set up the same way. And of course, more often these days, the test lab is containers in a cloud environment that they can spin up or not. But the idea that there's a test lab where there's a system that's emulating production and, you know, it's configured and has everything running the way the production system does, so that you can do proper testing and kind of put the system through its paces, whether it's installing a patch or modifying the application that you're responsible for. Most organizations, I'd say, don't really have that functionally. They might claim to, but they're much more dependent on somebody else doing the job for them, or at least the perception that somebody else is doing the job for them. But it's also a convenient excuse. You know, I remember 30 years ago, coming out into the private sector: why don't you have your operating system updated to the latest version? Because the application that's running on it works on that version, and we don't know if it works on later versions. And that's an excuse, and it might be valid, but it's rarely tested, or it's rarely tested enough to see, okay, yeah, we're good. And that works both directions. Because there's also an argument for all the old unsupported software and systems that are out there, again, that are running critical business applications and functions, that if you break it, you kind of bring the business down, so nobody touches it. That flies in the face of patch early and often and keep everything up to date and everything current. So in that way, it's sort of an enigma, because most organizations are relying on things that are running on older systems.
And a generation ago, we wanted our systems to last a long time, because that was a good investment. Whereas in modern times, we're expected to roll over the technology every two or three or five years as a matter of course, which costs money. Did I mention money earlier? And money being a factor of cybersecurity that we don't talk enough about. [00:20:38] Speaker A: So this concept of cost. You said only 1% of companies, entities, whatever, can actually afford to do what they need to do. What's the answer? In lieu of unlimited budgets, right, is it just prioritization? Is it the same answer for all organizations? Is it organization dependent? As the CEO, how does one even make a decision around cybersecurity? [00:21:08] Speaker B: Well, pragmatically, most organizations will only do what they have to do. So they're only going to do what regulatory bodies, compliance standards, whatever they're subject to, are telling them to do. And even then, they're going to push back and argue and try to limit scope and reduce scope and make the burden of compliance go away. And now's probably a good time for my full disclosure that I work full time, as my day job, in the payment card industry, PCI, field. I'm what's called a qualified security assessor, or QSA. So I'm the quote, unquote, auditor that comes into a company and puts the company through its paces to see whether it's actually following all the security requirements that are in PCI. And I've been doing that for 20 years. That's half of my career, which I throw up a little in my mouth when I realize that that's how long it's been. But I do it because when PCI came along 20 years ago, it was actually writing down some pretty specific things that you should be doing in order to create a secure organization.
And the way it was originally presented was: we're going to do an inspection of your security program, your organization, to see how well it holds up to protecting a particular type of sensitive data: credit card information, credit card numbers. Why? Because credit card numbers 20, 25 years ago were being stolen by the millions, and it was a very lucrative market. So PCI has been around for 20 years, has made a significant impact, and sort of changed the way the bad guys do business. I don't think it's completely coincidental that as more organizations, kicking and screaming, voluntarily or involuntarily, have secured their environments where there's credit card data, and by the way, more often than not they figured out, oh, we don't even need to keep the data. And if they're not keeping the data, the data is more secure, because it's harder to find and harder to steal. But the paradigm shifted. In the early days, the bad guys would try to break into the database and steal all 20 years' worth of sales records that you have with all the credit card information, and do it all at once, and risk being caught and all that kind of stuff. Nowadays it's much more common for them to figure out ways to harvest cards, maybe one at a time, either physically, using what's called a skimmer device, some sort of add-on to card readers, especially at unattended payment terminals, which is a fancy term for a gas pump, where people aren't always watching. Or an ATM is another example. But that's even being done electronically. There are ways that hackers have figured out to electronically skim data from a legitimate website, from a legitimate e-commerce site, and the transaction still goes through, but they've figured out ways to engineer sort of a man-in-the-middle attack, as we used to call it. [00:24:25] Speaker A: Hijacked browser plugins, for example. [00:24:29] Speaker B: Exactly.
But even that, while it's still lucrative from the perspective of the bad guy, the bad guy being a business person and trying to find a way to make a living, even that sort of pales next to the ransomware and malware threats that are around these days, because that's easier and it's just as lucrative, if not more lucrative. And maybe it's just that the media, especially the journalistic side of media, is paying more attention to those attacks, but they do seem to be more prevalent and more common these days. I haven't read the breach report lately to see what the exact numbers are. I know they're all still happening, but there seems to be a shift more towards: I don't even have to steal the data anymore if I can just lock it up and prevent you from having access to it, and you'll pay me to unlock it. That's a pretty easy way to make a buck, rather than have to go about trying to figure out how to steal the data. And then, by the way, you have to turn around and sell it to somebody, so there has to be a market for it. That is the type of thing that most organizations are up against, that they're just not really equipped to protect against, outside of sort of the traditional target areas, being primarily financial services, banks, the places where money used to be, because money is what used to be stolen. But now anybody can get hit by a ransomware attack. And, you know, God forbid, the bad guys that supposedly used to have some sort of code of ethics, now they go after hospitals, healthcare organizations, schools, municipalities, utility companies, all companies that are largely, and we can fight over this, but I'll say largely, unregulated from a cybersecurity perspective, including healthcare. You can fight me on that if you want. I have an answer. [00:26:41] Speaker A: I don't disagree, honestly. I mean, you know, most companies, right, have no regulatory requirements.
If you're not into credit cards or health or something special, DoD-ness, those kinds of things, there's no requirements, right, beyond maybe GDPR and the 12 American states. Why would anybody fight you on this? What are the counterarguments that you've heard? [00:27:00] Speaker B: Well, I think most people would probably say, well, HIPAA, you know, HIPAA has been around forever, and healthcare has to do it. But the problem there is, and I saw this first when, gosh, how many years ago was the Anthem breach? Probably 10 years ago at least. And when I first heard about it and heard what had been stolen, which was a lot of personal data, I thought to myself, I'll bet you what happened was that the company, Anthem, was going to great pains to be HIPAA compliant and keep all the healthcare information from being identified with a person. And that's a slightly but significantly different puzzle to solve than protecting all that personal information, because that's not part of HIPAA. HIPAA doesn't say protect all your personal information. HIPAA says protect the healthcare information so that it can't be pinned on you. And it turned out, because I knew somebody that worked the Anthem post-breach, that that was pretty much the case. They were focused on keeping the healthcare data separate from the personal data, and not so much protecting the personal data. But if somebody had been thinking about it, and thinking about what they have that was valuable and worth protecting above and beyond any regulatory requirements, somebody might have said, hey, this is valuable data, maybe we should protect it just because it's the right thing to do. Nobody does that, though. [00:28:32] Speaker A: This idea of audits, and I've seen this firsthand: a really good consultant can pass an audit no matter how poor your security posture is. And to me, that is just a really bad reflection on the level of quality that the audits actually reflect. I think that's my starting point.
Would you disagree with that? [00:29:00] Speaker B: I do not disagree with that. I am very often mistaken for an auditor, and the PCI assessment process is very often called an audit. But I see an assessment and an audit as kind of slightly different exercises. An audit is, you've got this, for lack of a better term, checklist of do's and don'ts, and are you meeting certain thresholds for umpteen million controls. Whereas PCI, at least in its design, was meant for me as a QSA, a security expert, to come in and assess how well you are meeting this requirement, or the spirit of this requirement. And if you weren't doing prescriptively exactly what the requirement said to do, and there aren't that many that are terribly prescriptive, you are allowed to say, well, we're not doing X, but we are doing Y and a little Z, which in our opinion is stronger than X. And I as an assessor can say, yeah, you're right, that combined does meet the spirit of the requirement, oftentimes more than, you know, plugging in whatever vendor solution ostensibly meets the X requirement. But, you know, I have a pretty long track record of clients. Most of them ask me back year after year, because, A, I get to learn their environment. I get to, you know, have all the pitched battles about what they're doing and is it effective and is it sufficient to meet the requirement. But once we get to that point, I'm assessing as a QSA whether I think they're secure or not and whether I think they're meeting all of the collective requirements that are applicable to them, for all the systems that are within scope. And I tend to do that by, you know, gut feeling, building a rapport, asking a lot of stupid questions and ignorant questions, like, how does that work? What does that do?
Getting people to talk about their systems and their environments, especially the administrators, the developers, the DBAs, and getting to the point where I'm comfortable that they know what they're talking about, and they know what they're doing, and they can show me that they're doing what they're doing to meet a requirement. And then I'm pretty good with that. Then we can move on to the more esoteric problems, which very often is: they have a product in place that the vendor has claimed does X for them. And I look, and very often this has to do with encryption or obfuscating the data, using PCI terminology, and very often it's not doing what it claims to be doing. You know, we said we weren't going to mention NSA, but I happen to have worked for NSA, and I'm certified, you know, a lifetime ago, as a cryptanalyst. So cryptology and encryption is something that I sort of take a special interest in and know a little bit more about how it works than probably the average auditor that's out there. And I have the ability to ask little silly questions like, how does this work? And I actually want the answer, not the marketing-slick, you know, elevator pitch: well, this is proprietary and secure, and you can't do this or that or the other. I had somebody tell me years ago, he was promoting what was effectively a QR code on something, and he said, well, it can't be read. And I'm like, what do you mean it can't be read? Something's got to read it. It's like, look at it. You can't tell me what it says. And he literally thought that because it was a bunch of squiggles, that meant it was secure. Did I answer your question? [00:32:55] Speaker A: Absolutely. I had this one company, I won't say who, but they had encryption, looked great. And then I'm kind of going through the code, doing a code review, and I see the encryption key basically in the code, and I'm like, what is that? And so, yeah, something can be wonderfully encrypted, but the key's right there.
And the code was basically available because it was a web app. So I'm like, yeah, there's an issue here that we need to talk about. So I've lived through these experiences that. [00:33:32] Speaker B: You're alluding to, and you bring up a very good point, which I try to explain to a lot of my clients and customers, anybody that will listen, especially the vendors that sell these things. You can have the world's greatest algorithms, and you can have the longest key you want, but in a traditional sense, you're encrypting communication, data that's being sent from point A to point B. And in order to do that, both ends need to know the key, know the algorithm, and so you have to distribute the key. Key distribution, key management, has been the bane of cryptology for hundreds of years. In more modern terms, with public key algorithms, public key cryptography, you've solved that problem a little bit with the idea of public and private key pairs. But you also very often will store the data these days, which was not a concept that was around 50 or 100 years ago. If we had secrets to keep 30, 40, 50 years ago, it was printed on paper or microfiche and locked in a safe, in a locked room, in a bunker 100 feet below ground. Going back to that X-Files episode: who would take that data and digitize it and put it online? But we do that all the time. All that to say: it's not the algorithm, it's not the math that bites you. It's more often the key management or the implementation. And guess what? That's how certain agencies have been exploiting communication systems from their adversaries for decades: taking advantage of the fact that people don't know how to use this stuff and they don't use it right. That applies in the private sector just as much today. [00:35:26] Speaker A: I appreciate that. Believe it or not, we're out of time, so I've got barely enough time for another question. [00:35:33] Speaker B: Sure.
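The hardcoded-key story Ari tells, and Jeff's point that key management rather than the math is what bites you, can be sketched in a few lines of Python. Everything here is hypothetical (the names, the `APP_ENC_KEY` environment variable, and a toy XOR keystream standing in for a real cipher); the point is only where the key lives, not the cryptography.

```python
import os
from hashlib import sha256

# Anti-pattern from the story: the key ships with the code, so anyone who can
# read the (web-app) source has the key, no matter how strong the cipher is.
HARDCODED_KEY = b"super-secret-key"


def toy_encrypt(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR with a keystream derived from the key.
    Illustrative only; never use this in place of a vetted cipher."""
    stream = sha256(key).digest()
    return bytes(b ^ stream[i % len(stream)] for i, b in enumerate(data))


def load_key() -> bytes:
    """Better: the key travels separately from the code, e.g. via the
    environment (or a KMS/HSM in production)."""
    key = os.environ.get("APP_ENC_KEY")
    if key is None:
        raise RuntimeError("APP_ENC_KEY not set; refusing to fall back to a baked-in key")
    return key.encode()
```

Since XOR is its own inverse, `toy_encrypt(toy_encrypt(data, key), key) == data`; a real deployment would use an authenticated cipher from a vetted library and fetch the key from a secrets manager, which is exactly the key-management discipline discussed above.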
[00:35:35] Speaker A: Looking over 20-plus years of experience in the private sector, what is the. I did caution that with a plus. You see, I hedged myself. [00:35:47] Speaker B: You did? Okay. [00:35:50] Speaker A: What is the number one thing that you see at these organizations? What are the most popular weaknesses, the things that people are just not doing? [00:36:02] Speaker B: Well, I've sort of touched on it already. I would categorize it one of two ways. One is the belief that you can outsource security to somebody else. It's either, you know, the security department or somebody in IT or a third party, but it's somebody else's problem. That's in a close tie with: the stuff that I can use in my organization, the applications, the tools, the sites, everything that I'm able to use, because I'm allowed to use it, it must be secure. I think those are the two fundamental problems that I see. And of course, there are variations on each one, and they overlap at some point. [00:36:49] Speaker A: Jeff, thank you so much for coming on the show today. I appreciate you. [00:36:53] Speaker B: Thank you. Thanks for having me.
