Episode Transcript
[00:00:00] Speaker A: Brianna, welcome to the show. So happy to have you here today.
[00:00:03] Speaker B: Oh, thank you so much. I appreciate the invite.
[00:00:06] Speaker A: Brianna, I want to start with something pretty basic. What does your day look like? What does day to day look like? And explain to us what you do. What is a SOC, a blue team? What is that actually?
[00:00:18] Speaker B: Well, I always say, do you want my corporate America answer, or do you want my I'm-not-in-the-office answer?
[00:00:25] Speaker A: Oh, my God. Wow. Probably a mix between the two.
[00:00:29] Speaker B: If I were to describe my job with a pop culture reference, there's this dog sitting in a room that's on fire, and he has a coffee mug and he's like, oh, it's fine. Everything's fine. It's fine. You know, my corporate America response is essentially that the security operations center is your first line of defenders, right? They're essentially your first responders for the business. These people are the first eyes and ears on any security risk or threat that's happening. So this can be anything from a user escalation to something happening at the EDR or the firewall. These people are so involved, hands on keyboard 24/7, prepared to jump on any type of risk or threat. I also call them the adrenaline junkies of the business. They're probably the people you'll see walking around chugging two energy drinks and popping Tylenol and multivitamins.
[00:01:23] Speaker A: This is true.
[00:01:24] Speaker B: But, you know, it's a pretty fun day. You know, I always say I walk into my job not expecting anything, just because every day is a new day. That's probably honestly why I really, really enjoy the cyber defense side of the field.
[00:01:39] Speaker A: So the dog saying, it's fine, it's fine. That's because the vast majority are false alarms, right? That's the reference there.
[00:01:45] Speaker B: I would say false alarms or the complete opposite, where everyone has a problem that day and it's just like, oh, okay. It's just a typical Monday.
[00:01:54] Speaker A: When it rains, it really does pour. I mean, that's how it works. So. So let me ask you this. Okay.
[00:02:00] Speaker B: Yeah.
[00:02:01] Speaker A: Once in a while it really does hit the fan.
[00:02:03] Speaker B: Oh, yeah.
[00:02:04] Speaker A: When a day like that happens, what does it look like? How does it start? What does it feel like? How do you know, like, oh my God, things just got real?
[00:02:12] Speaker B: It's. I would say it's the most euphoric experience ever. Everything is just great. No, honestly, it just depends. I've worked in a lot of different security roles at a lot of different companies and sizes, and I will say, once stuff hits the fan, you need to be with a team that you can all collaborate and work with. I've been in security incidents before where people don't talk to each other; it's just completely silent. And I always like to use the phrase, hey, be like Gordon Ramsay on Kitchen Nightmares, right? You have to talk, you have to communicate. I have to know what you're doing, what I'm doing, what's going on. Because the moment we stop talking, especially in hybrid environments these days, that's where you go blind. As a leader in these roles, or even as an incident commander, people literally have their fingers in so many different soups at the moment that you don't know what's being overlooked, what's not being looked at. And you'll have your people that are hands on keyboard, right? They're trying to tell that full story as to, hey, how did this happen? Why did it happen? And being able to provide those high level summaries. And then if you go on the leadership side, this is where you deal with the fun politics of, hey, this is a confident answer of what is happening. This is the confident analysis. This is the data loss, the potential reputation loss. And this is what we need to do right now to remediate it, this is what's currently in progress, and I'll give you an update in 30 minutes. And these are sometimes very, very, very difficult conversations, especially if people in the business have never had to deal with cyber incidents before or anything like that. It's scary. You think about, like, the boogeyman in your closet, and then you see it for the first time. It's like Monsters, Inc. I don't know if you've ever seen that movie. And like three kids. Yeah, right, exactly. And, like, the monster comes out of the closet, they're like, oh, you know. I mean, sometimes that's what execs and senior managers in a business are like: oh, these are real. And you're like, yeah, this is great.
[00:04:16] Speaker A: I got to say that, you know, from what I hear from executives, it's like a top three fear, right, that they're going to get hacked.
It's definitely a real fear, and it's getting complicated in today's landscape. I mean, if you find an issue, you can't necessarily solve it yourself; it might be a different team. If you think about dependency issues, right, you might need to release a version, change lines of code. How do you prepare yourself to be able to act fast in an environment that you don't necessarily have control over?
[00:04:45] Speaker B: Yeah. And that's such, such a hard thing. I mean, it doesn't matter what the organization is, there's always going to be that specific area that's just going to be a little different. Whether it's, hey, it's the C-suite's computers, you know, and they're set up completely differently because MFA is an inconvenience, or whatever that case is, you're just going to have to work with that team. And I always say you can be as technical as you want, but at the end of the day you have to have those soft skills, because again, essentially security, whether it's defense or red team or even vulnerability and compliance management, you have to build these relationships with the different business segments, because you're selling security to these people. And especially if it is a real situation where everything's on fire and everything's fine, essentially, you know, we want to have those difficult conversations, and it's so much easier when it's with someone that already understands. Right. And you can have those transparent conversations and they understand, okay, this team's got it, we're good. And especially if it's an area that you don't typically touch or even have that critical access to, it does happen. Cyber doesn't always get all the cool permissions, as much as we'd want them. But again, right, having those very transparent conversations, those very strong business relationships, can absolutely go a very long way, especially in a very high risk situation.
[00:06:15] Speaker A: If you can share what is one of the, you know, recent or not so recent kind of red alert situations that you've been through. What was that? What did that look like?
[00:06:24] Speaker B: Well, okay, I won't use my own experience, just because of, you know, contracts and, you know, secrecy and all of those, and I don't feel like getting grilled for it. But I feel like in the community there was this one specific incident that a lot of people could relate to, and this was CrowdStrike. People updated, especially if you had automatic updates on, and they tell you, right, it's just easier to have automatic updates on than to worry about updating your EDR agents yourself. One bad push and there you go, you have the blue screen of death for the rest of the day, and billions and billions of dollars were essentially lost across the globe. And if we take a step back, especially looking at, hey, this is what's going on in the community, which is something we all should be involved in, and then think about: what am I doing in my current business? Right. EDR is a good one. Are we test piloting these agents? Do we have a specific group that has these auto updates on, like our IT and security team, where if they were impacted by a bad update, we're able to go through those configuration checks and then make adjustments accordingly before we just push it out business-wide? Especially with a big business, right? The bigger the business, probably the bigger the risk of a bad agent push.
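A minimal sketch of that pilot-ring idea in Python. The host inventory, version strings, and health signal below are hypothetical placeholders, not any specific EDR vendor's API; the point is simply that the business-wide push waits on the pilot group staying healthy.

```python
# Gate a broad agent rollout on the health of a pilot ring (hypothetical data).
from dataclasses import dataclass

@dataclass
class Host:
    name: str
    ring: str            # "pilot" (IT / security machines) or "production"
    agent_version: str
    healthy: bool        # e.g., clean heartbeat since the update landed

def safe_to_promote(hosts: list[Host], new_version: str, min_healthy: float = 0.98) -> bool:
    """True only if pilot hosts running the new agent version are (almost) all healthy."""
    pilot = [h for h in hosts if h.ring == "pilot" and h.agent_version == new_version]
    if not pilot:
        return False  # no pilot data yet, so don't push business-wide
    return sum(h.healthy for h in pilot) / len(pilot) >= min_healthy

fleet = [
    Host("soc-wks-01", "pilot", "7.19.0", True),
    Host("it-laptop-02", "pilot", "7.19.0", True),
    Host("finance-wks-17", "production", "7.18.2", True),
]
print(safe_to_promote(fleet, "7.19.0"))  # True -> safe to widen the rollout
```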
[00:07:45] Speaker A: So yeah, I mean, this idea of consolidated risk seems to be getting worse and worse. We used to be in a world of security through obscurity: everybody had something weird going on, and if you wanted to hack somebody, you had to hack them specifically. And now we're in a world where you can just hack Amazon, find some obscure thing, or just look at a patch that came out and hope they haven't patched their servers yet. You get so much reward from that hacking method that I'm kind of wondering, does it make any sense to be on these platforms where you are a prime target? Should we go back to, you know, everybody doing their own thing?
[00:08:29] Speaker B: Yeah, again, right. It just depends on the environment a lot. And I like to use words that used to be technical words but are now sales buzzwords, and two of them are defense in depth and zero trust. Right. What exactly does that mean in an environment? You know, zero trust: we can look at, do we truly inventory and know what our assets are in our network, the people that make up the business? Does everyone have admin privileges? Right. These are things that we should probably be thinking about. And then if you go into the defense in depth layer: hey, do we have firewalls? Do we have network segmentation? Do we have EDR? Do we have MFA? Do we have password rotations, password complexity? Right. We can go all the way down to the GPO level of a Windows computer, but if we don't have those layers, it's going to be a broader risk. Your attack surface is going to be very, very large in scope. And really understanding who's in your business, what makes up your business, and what the security architecture is can really help identify those potential risks as well. Even if you don't have a red team in the business, just having an idea of the lay of the land goes such a long way.
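A minimal sketch of the "does everyone have admin privileges?" inventory question. The asset records and account names here are hypothetical; in practice they would come from AD/Entra, an MDM, or a CMDB export rather than a hard-coded list.

```python
# Flag local admin rights that fall outside an approved list (hypothetical inventory).
assets = [
    {"host": "hr-wks-03",  "owner": "jsmith", "local_admins": ["helpdesk"]},
    {"host": "dev-wks-11", "owner": "akhan",  "local_admins": ["akhan", "helpdesk"]},
    {"host": "exec-lt-01", "owner": "ceo",    "local_admins": ["ceo", "old-vendor-acct"]},
]
approved_admins = {"helpdesk"}  # accounts allowed local admin everywhere

for asset in assets:
    # Anything not on the approved list (including the machine's own user) gets reviewed.
    findings = [a for a in asset["local_admins"] if a not in approved_admins]
    if findings:
        print(f'{asset["host"]}: review local admin rights for {findings}')
```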
[00:09:37] Speaker A: Yeah. Would you say that this kind of links to your argument around, you know, start with NIST? Talk more about that.
[00:09:46] Speaker B: NIST is, I mean, especially if you're in a smaller business, right, and you're like, I just got a cyber team, I don't know what that means, I don't know where to go. A lot of people will go start reading about NIST. And this is a great, great resource to have, because this is going to give you that foundation of what a standard is. And not all the time, especially if you're not on the government side, is it a mandate; NIST is also a good suggestion book for you. Hey, this is essentially what logging should look like for your environment. Hey, this is how long you should keep logs. Followed by, hey, this is essentially how you get introduced to MITRE: what are you detecting on, what are you alerting on? Okay, if you do pen tests, these are our suggestions, right? And again, not all the time will this be applicable to corporate whatever, just because this is a standard across the globe, and you can take those pieces and apply them back to your business. And then that's how you might also be able to evaluate your potential gaps, especially if you're on the management side and you're making decisions to do millions of dollars of purchasing in equipment and hardware and software. This is a good justification to have, because again, this is a security standard that is used all over the world. And if you're like, hey, they're saying we should have logging, and we don't have a SIEM...
Oh, if we have an incident, we literally do not have data to support whether we were breached or not. Okay, well, if you have cyber insurance, now you don't have data to give to your cyber insurance. And now your insurance probably went up, because now you don't have that mechanism in your environment.
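A minimal sketch of checking log retention against a written baseline, in the spirit of the "we don't have a SIEM, we don't have data" scenario above. The retention targets are placeholders for whatever your own NIST-derived policy actually requires, not quoted values.

```python
# Compare actual log retention against a policy baseline (all numbers hypothetical).
retention_policy_days = {"authentication": 365, "firewall": 365, "edr": 180}

# What is actually being kept today, e.g., pulled from a SIEM inventory (or nothing at all).
actual_retention_days = {"authentication": 0, "firewall": 30, "edr": 180}

for source, required in retention_policy_days.items():
    actual = actual_retention_days.get(source, 0)
    if actual < required:
        print(f"{source}: keeping {actual} days, policy requires {required} -> gap to report")
```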
[00:11:28] Speaker A: I mean, this whole topic of, you know, SIEMs and cyber insurance working together, I feel like there's such a huge potential here. And not only SIEMs but also our compliance, though not from a compliance perspective: what are we really doing? Like, why aren't we getting cyber insurance discounts based on our security posture? I'm just putting it out there, right?
[00:11:48] Speaker B: I mean, seriously, someone made the joke to me the other day, they're like, cyber insurance is turning into a pyramid scheme. And I was like, yes, yes.
I was all like, well, you know, if you know a guy, let me know.
[00:12:04] Speaker A: I mean that, to me, that just says that it's ripe for innovation, right? Some company is going to, going to do something here. I love that.
So, okay, so let's say we don't have the money for professional red blue teams, small companies, small and medium. How do we even approach cybersecurity?
[00:12:23] Speaker B: What do we do?
[00:12:24] Speaker A: How do we look at it?
[00:12:26] Speaker B: It's a lot. Again, I always say cyber costs the company money; we don't make the company revenue. And so we tend to ask. Yes, we tend to ask and ask and ask and ask. And eventually stakeholders and people like that are going to be like, are you nuts? We just gave you a couple of zeros and commas for this one thing and you're turning around asking for more? No, that doesn't make sense. And then next thing you know, there are layoffs, and the people that don't make the company revenue, sometimes HR, you know, new salespeople, or even your IT and security folks, right, especially in today's world, they're going to be the first on that layoff list. It's sad, but it's true. So now it's, hey, how can we do more with less? Especially if you're a smaller company, right, you don't have a big budget. And if you really think about what cyber is, there are so many different corners. Bigger companies tend to have specialized teams that do specific skills, like pen testing and vulnerability and risk and you name it. And then if you go back down to the smaller companies, whose cyber team is not very mature, I would say, right, they just started, a lot of people are going to be wearing multiple hats just to compensate for that smaller revenue and that smaller growth. And this is where people start going to, I would say, MSSPs. They start contracting out, because they're not paying for the benefits of another person; it's typically, you know, a third party service. And I always tell people there's nothing wrong with going outside your network, especially since having something is better than nothing, in my opinion. But then also really train your users to identify these potential risks and threats. You know, it's free to tell new hires to not click ads. It's free to tell new hires how to spot phishing. Right. There's a lot you can do that's budget friendly that you can bring back to your users. Now, since we're talking about MSSPs and stuff like that, I always tell people: read your contract. Have a legal person spend 30 minutes with you and read this contract, because there are a lot, a lot of gaps that people overlook. Like, for example, what if that MSSP gets hacked? What's going to happen to your environment? People don't really think about that. Is this outlined in your contract? Or, hey, what's the escalation process?
[00:14:56] Speaker A: I'll tell our audience that it's not going to be their fault.
[00:14:59] Speaker B: Oh no, no, it's not your fault.
But these are questions that people really don't think about. And people just sign off on these billion dollar contracts for security providers and they don't even know what their contract says. So, you know, really work with a legal person. Read the contract, go through it; it's okay to ask questions, right? It's still a service at the end of the day. I probably call Google Fiber about my router all the time, and I'm like, hey, I see you guys offer this package. What is this? What's the difference? You know, nothing wrong with that.
[00:15:31] Speaker A: So I have a question for you. Okay, this is a difficult question, so I want to frame it as pros and cons.
Pen testing, penetration testing can be very expensive. Now if we talk about the small to medium businesses, it's a tough pill to swallow.
[00:15:48] Speaker B: It is.
[00:15:49] Speaker A: I've heard people, and when I say I've heard people, I mean me, who have been talking against pen testing. I want to ask you the pros and cons of pen testing. And I want to put the pros and cons of pen testing up against periodic white box testing, like once or twice a month. White box architectural testing.
[00:16:10] Speaker B: Yeah. False.
It's a loaded question.
So again, I always go back to: it depends on your environment, right? If you're someone that has very detailed compliance or insurance guidelines and outlines, and it says, hey, you have to do XYZ type of annual test, and pen testing falls in that, then absolutely, right, there's no way out of it. But again, why are we doing the things that we're doing? For example, I went to a conference recently, and I won't name the company, but this user came up to me and they're like, it is so frustrating to work with the business. And I'm like, well, what do you mean, you know?
And they're like, well, for example, I own all of our vulnerability management process. I also oversee our pen testers. I oversee that whole area. And I said, okay. And there is a group in the business that refuses to patch their computers because, quote unquote, it's going to break what they're doing. And I said, okay, a story as old as time. Yeah. And the person goes, well, here's the thing. I was so sick of them saying no, I made sure they were on our next pen test. And I was like, what do you mean? And he goes, I sat down with that team in a room and I did a live demonstration of this, because I was so annoyed with it. And I was like, okay.
And he took this team, it's a smaller company, and he even brought the CEO in, and they sat in a conference room, and he pulled up their website and he goes, this is why you guys need to patch. And he just went ahead and exploited that entire website, starting with the default password of admin/admin.
And oh yeah, I know, so many questions. And he went ahead and just kept on going. And the CEO at the very end, right, he didn't understand security, he just knew he had to have security. And the CEO goes, did you guys know that he could do this? And the team's like, yeah. And the person doing the pen test pulled up the email they had sent, like, two weeks before, that said, hey, letting you know you still have XYZ vulnerabilities. Which he then exploited. Not making any friends.
[00:18:42] Speaker A: Is this true?
[00:18:43] Speaker B: This guy. This guy woke up and chose violence today.
[00:18:47] Speaker A: This is weapons of mass destruction. Having the email ready to be like, yep, these didn't do their job.
[00:18:54] Speaker B: Had evidence ready to go for this. And the CEO goes, okay, I want this fixed by the end of the week. No excuses, I don't care. And they ended up being fully compliant by the end of the week and everything. And then on top of it, I was like, well, I commend you for doing this, man. I was like, good for you and everything. And he goes, yeah, it kind of felt dirty, but at the end of the day, I got what I wanted.
I was like, that's all it is. But again, right, going back, I feel.
[00:19:27] Speaker A: Like there's a better way, right? Like, when you think about building relationships and communication, if you had to get to that level...
[00:19:33] Speaker B: Here preaching, you know, in the beginning of this, hey, make sure you have relationships with different people and everything. No, out the window, right?
[00:19:41] Speaker A: Let's both agree that our recommendation to all the listeners is that this is the last resort. You should try everything, exhaust all options, before you basically put a knife to the throat of, you know, a fellow colleague in your business.
[00:19:55] Speaker B: He was, like, I'm gonna get it or I'm not, you know, last straw. And then I was like, were you job hunting around this time? He goes, oh, I was about to be. And I was like...
[00:20:08] Speaker A: Scorched earth. Yeah, right.
[00:20:10] Speaker B: Well, you know, now he and the CEO are on a first name basis, so good for him.
[00:20:13] Speaker A: Really.
That's a dangerous tactic. Oh, my God.
[00:20:21] Speaker B: I guess to fall back to the question, right, is it beneficial or not? It depends on your scope going in. The point of that story was that person had drive, they had purpose, they had an outline and objective of what they were trying to do. And a lot of people are like, oh, I just want to see what the pen testers can find and exploit, and not all the time is that going to be beneficial. Right. It's as if you're walking around your neighborhood and you're trying every single door, every single car, every window just to see if something's unlocked, so you can go in and steal something and then leave, just because you can. Right. And if we think about pen testing in that scenario, I'm not a pen tester, so if I offend anyone that is a pen tester, I'm sorry, but having scope even helps the pen testers in some way, right? Because if you're spending money and time and resources on these people, they should really understand what you're trying to achieve. Am I looking at this specific website because it's my customer facing website and it's hard for us to patch, right? And here's the scope of just this one area. And now you can really, truly understand the impact or risk with that as well. I don't think it's value-add, in my opinion, if you're just like, yeah, here's a bunch of IPs, good luck, or, hey, here's our domains, good luck. That's not really going to be beneficial to anyone, because if you're the person on the other side, you know what can probably be exploited, right? You know the skeletons in the closet, especially if you're the security person or the IT person that is discussing with these pen testers what the scope is as well. Now, sometimes, I would say, if it's a very small mom and pop shop type of company and they use a lot of third party services, I really suggest to these people, maybe not a pen test, but, hey, what is your contract in relation to these third parties? Because sometimes people will have third party IT come in and do, like, remote desktop support and stuff like that, just because they can't afford that IT service in house. Hey, what does that look like? Are they patching their stuff? What if they get exploited? We've seen time and time and time again third party libraries and services constantly getting hit. Are they getting pen tested? What are they doing security-wise? Because that's going to be super expensive for you if that third party introduces a threat into your network. So again, it really depends what you're doing currently, what your scope is, and whether it's something that's required for your business or not.
[00:22:56] Speaker A: Yeah, here's my point, right? If you're doing pen testing once a year, and you don't have a team that's working on it, then, you know, if they're good, they're going to find something.
[00:23:06] Speaker B: Oh yeah, yeah, for sure.
[00:23:08] Speaker A: Right. And really the easiest way to break in is to find some vulnerability that you just haven't patched yet and then use the exploit. So, you know, you've now paid $25,000, $50,000 to do what? Right, just to be hacked. And my point is, well, there are alternative methods of continuous security posture, where you're really saying, okay, let's have all these vulnerability scanners working all the time, let's have the actual code that we're producing being scanned for third-party libraries. And if you're doing pen testing, my argument, me getting angry here for a moment: if you're doing pen testing but you're not doing all that, I'm saying that doesn't make sense. Right. You should start with all those other things.
[00:23:51] Speaker B: And again, right, if this is your first time having a pen test, this is great data to have to set your roadmaps and stuff like that. I always ask people, are you even collaborating with your security team or MSSP? Are they getting the alerts from the pen testers trying to do this activity? If the answer is no, you should probably go back and see what gaps are potentially missing, because that's how you're going to start getting into that detection generation and that threat hunting side of the defense area. It's super nerdy over there.
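A minimal sketch of the continuous third-party-library scanning idea raised above, using the public OSV (osv.dev) vulnerability API. The pinned versions below are examples only; a real pipeline would read your lockfile or SBOM on every build.

```python
# Query the OSV API for known advisories against pinned dependency versions.
import requests

pinned_dependencies = {"requests": "2.19.0", "urllib3": "1.24.1"}  # example pins

for name, version in pinned_dependencies.items():
    resp = requests.post(
        "https://api.osv.dev/v1/query",
        json={"package": {"name": name, "ecosystem": "PyPI"}, "version": version},
        timeout=10,
    )
    resp.raise_for_status()
    vulns = resp.json().get("vulns", [])
    if vulns:
        ids = ", ".join(v["id"] for v in vulns)
        print(f"{name}=={version}: known advisories -> {ids}")
    else:
        print(f"{name}=={version}: no known advisories")
```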
[00:24:25] Speaker A: So we're going to do a pivot now.
There are a lot of things that we can control and automate and scan, and quite honestly, it's getting a lot better. There are a lot of cheap solutions out there. So, you know, it's paradise compared to, let's say, 10 or 20 years ago. However, there is one thing that isn't getting better and maybe is getting worse, and that's human beings.
[00:24:49] Speaker B: Oh, yeah.
[00:24:51] Speaker A: Not only are we not getting better; with the introduction of AI, it's getting way harder.
Should we just, you know, give up right now?
How do we look at this?
[00:25:03] Speaker B: What's the future look like?
Vacuum. Right now?
[00:25:07] Speaker A: Like, social engineering, phishing attacks. This is getting scary. I heard about the government, you know, they got on a call, and it took them 10 minutes to figure out they were talking to some AI, you know, deep fake.
I need you to leave, please. Go upstairs.
[00:25:26] Speaker B: I try to find out.
[00:25:28] Speaker A: I might actually leave that.
[00:25:29] Speaker B: What is in his hand?
[00:25:37] Speaker A: Yeah.
Covid. Thank you, Covid. I appreciate that.
[00:25:44] Speaker B: That's awesome.
[00:25:47] Speaker A: Trying to find Emma. Where's my sister?
[00:25:50] Speaker B: Not here.
[00:25:52] Speaker A: Not yet.
[00:25:55] Speaker B: We can. We can pick it back up from there if you need to.
[00:25:59] Speaker A: Where were we?
[00:26:00] Speaker B: AI. Social engineering.
[00:26:02] Speaker A: AI, yes, Social engineering. I got. Okay, so I got this phishing attack.
[00:26:07] Speaker B: Yeah.
[00:26:08] Speaker A: And not only that, it came from an actual PayPal email, right? Because your ABCs of phishing: is this a fake email? No, it was an actual verified PayPal email. Everything looked great. So this was actually PayPal sending me a phishing attack. Now, how did they do it? They used the request payment function in order to do a phishing attack. So in the text, in the message, what did they write? Now, what's the nastiest thing they could have written in that message?
[00:26:37] Speaker B: Could have been.
[00:26:38] Speaker A: Take a guess.
[00:26:39] Speaker B: Fraud.
[00:26:40] Speaker A: Yes, you got it.
[00:26:41] Speaker B: Yes.
[00:26:41] Speaker A: So they said, yes, you got it. Good job. So they said, like, oh, if this looks like a fraudulent message, call this number.
[00:26:49] Speaker B: So you call the number.
[00:26:50] Speaker A: No, of course not. No, I didn't. But the brilliance of it, right? They're attacking you, A, through the psychology of fear that this might be phishing, if you are smart enough, and then it gets to their phishing hotline. And if you say, okay, no, it's not phishing, then you just pay the request. So, like, multiple ways for you to go down that rabbit hole and be susceptible to their attack. And I'm like, this is evil. And no deep fake even involved, right? No personalization involved. They didn't even get to that level, and it was still really good. I was stuck on this for a minute or so, like, oh my God, before I reacted.
[00:27:32] Speaker B: Yeah, the thinking. Yeah, yeah.
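For context on why the usual "check the sender" advice falls short here: a quick way to see what a domain publishes for sender authentication is to query its SPF and DMARC records, as in this minimal sketch using the dnspython package. In the story above those checks would pass, since the message genuinely came through PayPal's infrastructure, which is exactly what makes feature abuse nastier than ordinary spoofing.

```python
# Look up published SPF and DMARC TXT records for a domain (requires dnspython).
import dns.resolver

def published_policies(domain: str) -> dict:
    records = {"spf": [], "dmarc": []}
    for label, query in [("spf", domain), ("dmarc", f"_dmarc.{domain}")]:
        try:
            answers = dns.resolver.resolve(query, "TXT")
        except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
            continue  # nothing published at this name
        txts = [b"".join(r.strings).decode() for r in answers]
        records[label] = [t for t in txts if t.lower().startswith(("v=spf1", "v=dmarc1"))]
    return records

print(published_policies("paypal.com"))  # shows the SPF / DMARC records the domain publishes
```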
[00:27:35] Speaker A: Have you seen any kind of deep fake, hyper personalized AI attacks yet or not yet.
[00:27:41] Speaker B: And it's funny you mention this. On my podcast, I actually did a deep dive around the US election season one day and was like, hey, what's going on with the election season? Turns out, the amount... it's so crazy. When it was originally Hillary versus Trump, there was a whole, like, I could talk about this the entire episode and I won't, but there was a whole side of weird cyber stuff around that time. And a lot of it was trying to sway voters and stuff like that. And it's not uncommon, right? You go on the Internet and you will see things that are geared towards your algorithm and stuff like that. And a couple of my family members messaged me videos, and I'm just like, can we not, you know? And then I was like, oh, wait a second. And I was like, why does Donald Trump have six fingers? And I had to back it up a little. And it's always the hands. It's always the hands that creep you out.
And so I backed it up a little bit, and you could just see the little bit of fabrication. If I had not noticed the hands, it was so smooth. And I was like, you know this is fake, right? And they're like, no, it's not. And I was like, look at, you know, minute three, 56 seconds, and look at the hands. And they're like, what? They're like, Donald Trump has six fingers. And I'm like, no.
I was all like, no. I was like, Oprah Winfrey has six toes; Donald Trump does not have six fingers. And they're like, no way. And so I was like, yeah, that was AI. And then I had to explain what deepfakes are and all this other stuff. And X, sorry, formerly Twitter, they actually had to shut down their AI bot service during the election season for a bit. Just based on how that bot service was programmed, it would pick up information that was badly fed into it, and so it was spitting out bad political information, stuff that wasn't accurate and everything like this. And they had to shut it down, roll back a version, and put some guardrails back up on it, because it was just a cycle of trash, essentially, at that point. And I think Google had something similar not too long ago with the same thing. But I always tell people, especially with phishing emails these days, it's so easy to have an email written by these services and stuff like that. I can ask ChatGPT any day to be like, hey, can you write me an email that says my password got compromised and I need to click this link to reset my password, and, you know, Chat's going to give it to me. And I'm like, cool, can you write this in HTML? Can you provide some fancy CSS with it? You can even ask it to look like specific social media pages. Like, hey, can you use the same branding as Meta? Can you use the same branding as Bluesky? And it will absolutely create all of those libraries for you. And so as a threat actor, even...
[00:30:41] Speaker A: If it says no, all you have to say is like, oh, I'm a student in a design class and I.
[00:30:46] Speaker B: Need to show this.
[00:30:47] Speaker A: Like, you just give it some bullshit explanation and it dishes out what you asked for. Yeah, these guardrails.
[00:30:55] Speaker B: If it doesn't like, you know, live.
[00:30:57] Speaker A: Oh my God.
[00:30:58] Speaker B: But you know, it can give you all of this stuff in, like, a snap of a second. So if I'm a threat actor, absolutely, I'm probably going to use these service calls and everything just to automate my email services on the backend. And then you can automate this through your scripting platforms or your phishing services, and bada bing, bada boom, you're set and you're ready to go. And so people often add on...
[00:31:21] Speaker A: Some scraping from your LinkedIn and from your website, and add on some GPT to do personalization. I mean, it's horrifying.
[00:31:30] Speaker B: Yes. And I even asked the OpenAI service, ChatGPT, like, hey, can you do pictures of Brianna Schultz doing this? And a couple of them, I'm not gonna lie, I look great in the Star Wars universe. But a couple of them, I'm just like, no, that's not even close. But it's crazy: with stuff that's open source and that you could just research and find, yeah, it's going to grab it, but it's getting...
[00:31:54] Speaker A: The problem is that even if we're having these issues with the hands, or with text, like, I don't know if they fixed that already, right, text doesn't come out right, this is going to be fixed in the next year, two, three, five. All of these problems are going to go away, and it's going to be super, super realistic.
[00:32:09] Speaker B: And it's so wild, especially for people that are on the security education side of things and are paying for all of this security education.
It is going to be so hard for these education parties to keep up with these new types of threats and everything. And so people are like, how are we going to train our users against these new social engineering tactics? And one of them is, first off, just like when we were told as kids, you cannot believe everything you see on TV, it's the same thing with the Internet these days: you cannot believe everything you see on the Internet. I posted this on my LinkedIn and my X recently, and I was like, hey, I verify everything with three sources, because to me, three is a pattern, two is a coincidence. And if three different unique sources are saying this, reputable sources, by the way, not just random X posts, reputable sources, okay, I feel confident sharing this with my friends, my family, my colleagues. Because otherwise I'm going to turn into my family members who send me political videos of six-fingered Donald Trump, and I'm just like, wait a second, that's false information we're passing around. And then the other thing is ChatGPT. Yeah, it can fix grammar, it can fix all of this, but at the end of the day, it is going to be very hard for it to talk like a person, and it's hard for it to be authentic. And so even now, with businesses cracking down on, hey, you probably shouldn't be submitting company emails to these gen AI services because they have company information in them, hopefully a lot of that is going to help on the authenticity front, in terms of tone and direction and stuff like that. It's very, very hard for these services to act like a human, because it's not one; at the end of the day, it doesn't have the same emotions and personality traits that we do. And so when you receive these emails or even phone calls now: what does it sound like? What's the tone like? I actually had a gal, she reached out to me about this story, and it blew my mind. Her son was out, and she was out grocery shopping one day, and all of a sudden she gets a phone call from an unknown number, but the area code was local to her area. And so she's like, oh, okay. So she picked it up and answered it, and it was like, hi, I'm so-and-so from that area's county sheriff's office. I am calling because your son got into trouble. We currently have him posted for bail at $2,000, and it needs to be done in the next 24 hours. You can pay the bail at this website, or we can do it over the phone. And so she's, like, juggling groceries.
[00:34:53] Speaker A: That's brilliant, though, that they got the local area code to match up. Because when I see a California number calling me, I know.
[00:35:00] Speaker B: Like, I don't know anyone in Cali. No, like, yeah, but that's.
[00:35:04] Speaker A: That's really impressive. I mean, the level of detail that they put into that is. Is. Wow.
[00:35:10] Speaker B: So.
[00:35:10] Speaker A: So did she fall or what happened?
[00:35:12] Speaker B: So she was freaking out, and she goes, I need to think about this, you know? Like, what did he do? And it was something vague, like, well, speeding or whatever it was. And of course, right, she just got back from the store, she's literally carrying groceries in. And she goes, I need to... I'll call you back, you know, whatever.
And she was smart enough because she had seen this exact same scam recently on Meta. And she goes, I'm just gonna text his friends, right? Because I know he's with his friends. And right as she's typing that message, her son calls her and goes, hey, Mom, I'm just leaving so-and-so's house. I'll be home in 10 minutes. Do you care if they come over for dinner? And she goes, you're not... you were... She goes, what? You know, she was just bamboozled. And he goes, yeah, I'll be home soon. And she goes, okay, yeah, we'll see you soon. And she ended up actually googling the number, and it wasn't the sheriff's number at all. It wasn't even a county number or anything. And she reported it to, you know, the FTC, which is what you're supposed to do, and she ended up blocking it. And she was like, it was so real. It sounded like an American accent. It sounded like something that was tailored towards her local area, right? Because everyone has their own slang a bit. And she goes, it just blew my mind. You see these YouTube videos of people making fun of scammers and everything. She goes, this scared me so, so bad, you know, that I was about to drop money to get my son out of jail, even though he wasn't in jail. And I was like, well, you did the right thing, and I appreciate you sharing the story and everything. But how absolutely wild is that story?
[00:37:07] Speaker A: There's a level, obviously, if you're a scammer, you're evil. But there's an additional level of evil if you are tackling these, like, hypersensitive psychological issues right around the safety of your family. I feel like that's just pure evil.
[00:37:24] Speaker B: Yeah. I couldn't even imagine getting a phone call and it was like, oh, hey, so-and-so was in a car accident, you know, and the insurance isn't valid. Like, there are so many things that you can start tugging those emotion strings with. And that's really what social engineering is at the end of the day, right? It is making sure they have that psychological connection with whoever. I mean, don't get me wrong, there's the random, I would call it, crap mail that goes out. But the true threat actors that are out there, the ones that are really sophisticated, you know, they're putting money towards their threat services and stuff like that; they 1000% don't care about your emotions. They're out there to target you, and they really hone in on the fact that, hey, you're a person at the end of the day that's providing for their family, and they will absolutely abuse that fact. And it's so crazy how it's evolving. I mean, as a person that works on the defense side, I think it's a little bit cool how it's evolved. But as a person with a family, absolutely not. You know, my parents, they're in their 60s. There's no way, to me, that I feel confident they're gonna be able to pause and think, right? No, they're gonna react, because they're good people. But yeah.
[00:38:38] Speaker A: Yeah, I feel like it's getting so good that I am unsure at this stage that education is the answer. I feel like we're putting a lot of faith in education, but technology is just moving so fast that we may need to, you know, as an industry, agree that there are certain things that everybody is going to start doing. And, you know, it's everything from the fact that every single person needs to use 2FA, right, and physical tokens, not text messages or apps, and that has its own set of problems, right? There's a company that came out with a ring. A 2FA... sorry, 2FA...
[00:39:19] Speaker B: Ring. Yes. Yes.
[00:39:21] Speaker A: So I thought that was pretty cool. But then, on the other hand of this, there's the stuff of, like, should we rethink our protocols? Should we rethink our communication? Like, how is it possible to buy a domain without authenticating your ID? Can somebody explain to me how you can buy a domain? Well, maybe in America that's a ridiculous question, because we can vote without an ID, but let's not get political. Let's at least need an ID to buy a domain. Because if you can't buy a domain, you can't send all these spam emails, you can't do all this stuff. Why is there no know-your-customer for Snapchat, right? So much sexual harassment and pedophilia going on on Snapchat. Just a know-your-customer regulation would make at least half of that go away overnight. It's ridiculous.
[00:40:07] Speaker B: It's crazy. Like, even think about it. I always say you need an actual ID to sign up for a banking account, whether it's business or personal. These days online, you have to have a valid ID. And there's a lot of this sensitive stuff that can absolutely be abused.
It's crazy. You should need to have something to validate when you're purchasing a domain, when you're doing all of these things. Because again, they can turn around and use it for, hey, here's a fake website that looks like this, I'm impersonating it. You know, I'm not Amazon, I'm "Amaz0n" with a zero, you know, or whatever. People that don't understand the difference, it's gonna look legit to them at face value. So, yeah, good point.
That's an interesting one.
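A minimal sketch of that "I'm not Amazon, I'm Amaz0n" lookalike check, using only the Python standard library. The brand list, candidate domains, substitution map, and threshold are made-up examples.

```python
# Flag domains that resemble known brands after undoing common character swaps.
from difflib import SequenceMatcher

brands = ["amazon.com", "paypal.com", "microsoft.com"]
candidates = ["amaz0n.com", "paypa1-support.com", "example.org"]

# Map a few common character substitutions back to the letters they imitate.
HOMOGLYPHS = str.maketrans({"0": "o", "1": "l", "3": "e", "5": "s", "@": "a"})

for domain in candidates:
    normalized = domain.lower().translate(HOMOGLYPHS)
    for brand in brands:
        if domain.lower() == brand:
            continue  # it genuinely is the brand's own domain
        brand_name = brand.split(".")[0]
        close_match = SequenceMatcher(None, normalized, brand).ratio() >= 0.85
        if brand_name in normalized or close_match:
            print(f"{domain}: resembles {brand} -> treat as suspicious")
```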
[00:40:59] Speaker A: I spoke to this really interesting gentleman. He had a completely different solution in mind. But I think the premise that he had was really, really interesting. What he said is basically this. It should be like a license plate.
[00:41:10] Speaker B: Yes.
[00:41:11] Speaker A: Like, why shouldn't we all have, like, our social license plates? It doesn't tell anyone who you are, doesn't tell where you live, doesn't even tell what your name is. So you still get that anonymity online. But if we had, instead of a Social Security number, sorry, in addition to a Social Security number, we add a social cybersecurity number, whatever. And it's just a license plate. It means nothing. It can't be tied back to you unless there's a warrant, right? A legal warrant by, you know, a lawyer, the police, I don't know who the hell gives those. And then that's the number you have to use in order to get a Snapchat account. That's it. And then it will go and cross-verify with an API with the government, and that's it. We know that you are not a pedophile with 50 accounts.
[00:41:51] Speaker B: Yep.
[00:41:51] Speaker A: Right. And if you are, then the government knows how to get you.
I don't know, it just seems like we're skipping some really fundamentally simple things and we're just not doing them. And if we put on our tinfoil hats, it's like, well, how much of GoDaddy's revenue would disappear overnight?
[00:42:07] Speaker B: Oh, my gosh.
It would just crash.
[00:42:11] Speaker A: Like, there's a revenue model here that, like, this crime is paying a lot of money to a lot of companies who are supporting the infrastructure to do so.
[00:42:19] Speaker B: Yeah.
[00:42:20] Speaker A: So, I mean, I can see why a lot of people wouldn't want to do it, but. Yeah, that's a whole different.
[00:42:25] Speaker B: That's a whole different conversation. Like, what budget's going to get cut? You know?
Oh, Lord, no, we don't need an education system anymore.
[00:42:37] Speaker A: Brianna, we could keep going for hours. We're basically, you know, like 10 minutes past when I usually cut, so...
[00:42:43] Speaker B: Oh, sorry.
[00:42:45] Speaker A: No, on the contrary, this is a compliment. It means we've been having a lot of fun. Thank you so much for coming on the show today. For the last show of the day, being exhausted, it's nice to have a good time. Appreciate you.
[00:42:56] Speaker B: Oh, thank you. It was a great time.
[00:42:58] Speaker A: Wonderful.