Episode Transcript
[00:00:00] Speaker A: Ed, welcome to the show. I appreciate having you here today.
[00:00:04] Speaker B: Thanks, Ari.
[00:00:05] Speaker A: You guys are doing something incredibly difficult. We're segmenting security companies right now, and one of the questions that came up is, well, first of all, figure out who your company is, what they're focused on, whether they even have security risk, and at what level. You guys have decided to focus on, I would say, the worst-case companies possible: the ones that have healthcare information. Right. That is terrifying to me, to think that my healthcare information might get out there into the world. I want to start with a very simple question. Why did you even decide to focus on healthcare?
[00:00:38] Speaker B: Oh, yeah, well, this will be my 11th company, and I've done software all my career. And what I found is that I love to find a problem that requires technology to improve a process.
And when I was coming out of retirement, my wife cordially commanded me to go back to work because I was causing too much trouble.
[00:01:04] Speaker A: That's the story we need to come back to.
[00:01:06] Speaker B: Yeah, too much trouble. Well, I don't know how much I could share there, but I was thinking about problems I faced at a previous organization. My team had to marshal the process for third-party risk. So we'd get this security risk assessment come in from a provider that we were trying to transact with, and before we could do anything, we had to fill this out. And I used to joke, if you've seen one, you've seen one. They were all different, everyone had different questions. Even if they were the same questions, there was too much semantic difference in the way they asked the question. Some were multiple choice, some were binary, yes or no. Some were open ended. So it was a real challenge to get any leverage internally. At the company I worked with, we tried building a knowledge base, we tried to hire people to solve problems. It still didn't give us any leverage. So this notion of leverage was sort of front and center of my thesis for third-party risk. And this was back in 2017. And when I looked at the alternatives that were out there, we were using things that were more than a decade old. We were using these spreadsheets that were questionnaire driven. We were using things like HITRUST, which isn't a bad thing, but again, it was a very manual, costly process that was done once a year, and you were lucky if you ever did another one. And from a vendor's perspective, those were time intensive.
By the time they did get published, they were sort of out of date, especially as the technology stack was changing from an on-prem world to a cloud-based world. How do you keep that data up to date? And how do you continually monitor the risk of a vendor and the products that you rely on over time? Because you have a relationship with the vendor and a relationship with the product, and things change: the product changes, but your usage and the scope of that usage change over time as well. And so you need to consider all of those things when you think about risk as it relates to third parties. And that was really the impetus behind starting Censinet.
[00:03:30] Speaker A: And I have to confess, I was head of product. There was no security function per se, but I was the guy getting these questionnaires, and they can have between 100 and 300 questions. Absolutely.
[00:03:41] Speaker B: Thousands. A thousand. I've seen questionnaires with over a thousand questions.
[00:03:45] Speaker A: Thank God I never had to do one with a thousand. But it was mind boggling. And sometimes we got these questionnaires when, let's say, the step or process in sales wasn't even that serious.
[00:03:56] Speaker B: That's right.
[00:03:57] Speaker A: I would go back and ask the sales guy, look, this is going to take me hours. Can we ask them to delay this until they actually want to buy? Yeah, it was horrifying, I've got to admit.
[00:04:08] Speaker B: And not to mention that the questionnaires may have been cobbled together to include other types of technology. So if you were a cloud vendor, you were getting questions for on prem vendors and vice versa. And so this notion of curation, where everybody was doing their own curation was an important element within our solution and a problem to solve as well.
[00:04:30] Speaker A: Oh, absolutely.
This is a question that is incredibly interesting. I kind of have an answer, but I want to see your perspective. If I just become compliant, right, there's a whole slew of compliance frameworks, from ISO 27001 to SOC 2 to healthcare-specific and privacy ones, can I just say, hey, I'm compliant with these standards, and I don't need to fill out these questionnaires anymore?
[00:04:55] Speaker B: Yeah. The challenge with that is it's a point-in-time compliance. Right. So as I said earlier, you could go through that process and be compliant, and then you have to ask: what is the degree of compliance and what's the scope of compliance? That's really important, because some of these certifications or compliance requirements will require you to assess, and give you flexibility to assess, up to a certain level, a certain threshold. So you can't just take the certificate at face value. You have to look at it in context for how you're going to use that application.
[00:05:33] Speaker C: Right.
[00:05:34] Speaker B: And so that's really important. It's that combination of the out-of-the-box security of the product, but also how am I going to configure it and technically implement it? Is there PHI? Maybe not today, but over time, is there PHI? And that's happened to vendors: initially they were deployed with no concept of PHI being used, and over time the scope changed in the product.
[00:05:59] Speaker C: Right.
[00:06:00] Speaker B: You look at products today, they're introducing AI into these provider organizations, and the providers aren't even aware of it.
[00:06:07] Speaker C: Right.
[00:06:08] Speaker B: So you've got this situation: I have a product I've been using for several years. I have a whole AI governance track where I'm looking at new products that are introducing AI, and I have the vigilance and the diligence associated with that. But what about those existing products that are updating their stack with AI capabilities that I may or may not be aware of?
[00:06:29] Speaker C: Right.
[00:06:29] Speaker B: So this is a big issue and it continues to evolve as technology evolves.
[00:06:35] Speaker A: Yeah, that's so spot on. I mean, this topic of point-in-time compliance is incredibly important. And really what you're alluding to, and keep me honest here, is continuous security. Is that a fair statement? Would you agree with that?
[00:06:53] Speaker B: Well, it's continuous risk management.
[00:06:54] Speaker C: Right.
[00:06:54] Speaker B: Just think about us as humans. We're constantly either consciously or subconsciously managing risk decisions. I get up in the morning, I take a shower, I leave the house, I get in a car, I drive to work, I get on a plane. I'm constantly dealing with risk situations whether I know it or not.
[00:07:12] Speaker C: Right.
[00:07:13] Speaker B: So of course you continually have to look at risk from that context. You can't do everything.
[00:07:18] Speaker C: Right.
[00:07:18] Speaker B: There's obviously not unlimited budget, so you have to do it in the context of the business. Risk is a business decision, not a technology one.
[00:07:27] Speaker A: So Ed, let me ask you this.
In addition to this concept of certifications being a once a year thing and risks happening all the time, I want to ask a different question. If I am compliant, am I secure?
[00:07:44] Speaker B: That's a great question.
"Maybe" is probably the best answer, because, and I've learned this through my career, no one wants to be the most compliant.
[00:07:54] Speaker C: Right.
[00:07:55] Speaker B: So years ago I had a product that basically was a very elegant way to secure data. We attached the policy and the control, the encryption, with the data object. The problem was, and there were issues with the implementation that we had to get through, but the problem was, along comes a regulation, California's security breach law, SB 1386. It was passed, I think, the day before my birthday, if I recall.
And this thing gets published, and the way the market interpreted compliance was: we just have to deploy full disk encryption and we're done. Check.
[00:08:30] Speaker C: Right.
[00:08:31] Speaker B: And sure it met the compliance requirement. Were you more secure?
[00:08:37] Speaker C: Maybe, maybe.
[00:08:38] Speaker B: Depends on the uses of the data and how you were sending it out and what else was happening.
[00:08:42] Speaker C: Right.
[00:08:43] Speaker B: So I think that compliance is a double-edged sword. In some ways it's good, because it does create a standard level or threshold of security.
However, if it's not written in a way that is logical and focused on the right things, it may create the wrong behaviors.
[00:09:08] Speaker A: That's such an important point. From my experience, there can be a disconnect between compliance and security. And the reason for that is that there are these amazing third-party consultants that can help you become compliant, but they're not really looking at your security. They're just writing processes and text and verbiage that they then work through with the auditor. But in fact, if you look at your security posture from before and after the audit, nothing has changed.
So to me it's a little scary that this whole thing governments have put together to help us become more secure has, in some cases, really just created a compliance industry. In other cases that is not true, so it's definitely not a broad statement. Thinking about compliance separately from security for a moment, what are the activities that companies need to do not to become compliant but to become secure?
[00:10:06] Speaker B: Well, you obviously need to take a step back and identify those things, from an asset perspective, that need to be secured, and those things, from a business process perspective, that are critical to running your business.
[00:10:22] Speaker C: Right.
[00:10:23] Speaker B: And sometimes it's more important to focus on those things than everything.
[00:10:30] Speaker C: Right.
[00:10:31] Speaker B: However, unfortunately, security is not that binary.
[00:10:34] Speaker C: Right.
[00:10:35] Speaker B: And it can be insidious. It can find its way in through the, you know, the long tail, if you will.
[00:10:42] Speaker C: Right.
[00:10:42] Speaker B: And so we're not worried about this vendor because it's small, it's not critical. But that could be the springboard, you know, to an attack.
[00:10:52] Speaker C: Right.
[00:10:52] Speaker B: And so you have to think about it holistically, but you also have to really have a thesis for where you're going to spend your resources, your limited budget.
And I would posit that you have to start with the overall asset inventory at a business process level. What are my critical business processes, the ones that, if they go down, mean we can't run the business?
[00:11:19] Speaker C: Right.
[00:11:20] Speaker B: Let's start there because not everything's created equal.
[00:11:22] Speaker C: Right.
[00:11:23] Speaker B: And so focus the time and energy on those things that matter to the business and ensure that you have the right level of coverage and the right level of diligence. Again, there's no such thing as 100% security, otherwise we wouldn't be able to operate as a business.
[00:11:39] Speaker C: Right.
[00:11:39] Speaker B: I can implement 100% security, but then I can't get out of my house.
[00:11:45] Speaker C: Right.
[00:11:46] Speaker B: So we need to be thoughtful about that balance between security and usability.
[00:11:52] Speaker A: Can you tell us a story of, you know, a customer that kind of thought through this? And usually, and keep me honest here, many times it's after they've had a security incident that customers just wake up and go, oh, I need to really get my head around this. What is one of your customer stories where they've experienced this problem firsthand?
[00:12:13] Speaker B: Well, every customer we have has had some type of challenge or experienced some type of problem in this area, if you will. And, you know, oftentimes we find that people are doing the right thing. And if you think about the world prior to a company like Censinet, right, they were using what was available, and unfortunately they couldn't get the leverage they needed because the business was changing and technology was changing.
[00:12:43] Speaker C: Right.
[00:12:43] Speaker B: And so the only way to do it was to throw more people at it. And those budgets, like I said earlier, are not unlimited. You don't have unlimited resources. So I can tell a story about an early customer. I won't name any names here, but they came to us and they basically said: we think we have the most advanced security program as it relates to third-party risk, but we also recognize that the disparity between the amount of money we're spending and the actual results is huge, and we have to look at that in the context of other things we need to do in the business. Let's just pick a number: we have 20 people focused on this problem. We need to take half of those people and redeploy them in other areas where we need them. We don't necessarily want to fire them.
[00:13:32] Speaker C: Right.
[00:13:32] Speaker B: But we need to actually do other things that are equally important to the business.
[00:13:36] Speaker C: Right?
[00:13:37] Speaker B: Help us. Can you help us?
[00:13:38] Speaker C: Right.
[00:13:38] Speaker B: So that's an example where technology, if used appropriately, can really create leverage in the overall business.
[00:13:47] Speaker C: Right.
[00:13:47] Speaker B: If I can still do what I'm doing today and actually do it even better, faster, cheaper, across the board, why wouldn't I deploy the technology? I think that's sort of the promise of technology. However, people have to think about it. And I know everyone says this: well, it's a people, process, technology problem.
But then you have to think about it that way as well. And you have to be willing to actually transform your business regardless of what you're doing today.
Case in point: if you take a bad process and apply it to a great product, you're going to get bad outcomes, and vice versa. If you have a great process and a bad product, you're going to get marginal outcomes, bad outcomes.
[00:14:32] Speaker C: Right.
[00:14:32] Speaker B: And so there is a balanced approach and a little bit of give and take, where you've got to let the product, if it's a good product, inform the process, and you have to be willing to be open to evolving that process accordingly so you can get the best out of the solution you just invested in. And we see that as the biggest challenge in security. It's not necessarily a product problem or a people problem or a process problem. It's the combination of those things working in concert to deliver better outcomes on the security side. Does that make sense, Ari?
[00:15:10] Speaker A: Absolutely. You know, I think we've seen this. I managed a consulting group for Siemens for many years, and we would come in and just see these technologies completely being abused by a process that doesn't make sense, and vice versa. So I really appreciate your comments.
[00:15:30] Speaker B: I'll just add one more thing to that. I've been a product person all my career, but I also consider myself a process wonk, and I've built processes based on capability maturity models.
[00:15:45] Speaker C: Right.
[00:15:45] Speaker B: So that's sort of the framework I use. When I started the company, I didn't start by writing, you know, a line of code.
[00:15:51] Speaker A: Let me jump in here. I want to bring our audience with us. I don't think everybody is familiar with the CMMI concepts and all that, so give us an introduction to what that means.
[00:15:59] Speaker B: Yeah. A capability maturity model is a way to think about and manage progress from an as-is state to a to-be state. It's usually structured across levels, typically five. You start with an ad hoc approach and become adaptable over time; you become flexible and resilient over time based on that process.
[00:16:19] Speaker C: Right.
[00:16:20] Speaker B: And it basically is a roadmap, if you will, for technology adoption in a way that takes into account people, process, and technology. And so when I started the company, I wrote a capability maturity model for third-party risk, and that informed my product decisions. And that's how we approach it with customers. And I think if you approach it that way, you at least have an honest conversation about what's happening.
[00:16:51] Speaker C: Right.
[00:16:52] Speaker B: With the product or the overall process. You have to be able to look in the mirror, from a process perspective and a product perspective, and say, are we doing the right things for the customers? And we also have to challenge customers when they're not thinking about it that way, because if we don't, they're going to get bad results and they're not going to renew with us. That's not a good way to build relationships that are lasting.
[00:17:17] Speaker C: Right.
[00:17:18] Speaker B: And we also want to learn from them by having those conversations. We're not perfect. We don't have all the answers to everything. But what we do have is this unbelievable base of customers going through the same problem.
[00:17:33] Speaker C: Right, Right.
[00:17:33] Speaker B: Trying to solve the same problem. And we capture those as best practices, and then we measure best-in-class customers. So then I can go to a customer that may be resisting process changes and say: you can continue to resist, but you're going to continue to get lackluster results. Here's what best in class looks like. We measure this, we measure you. Here's where you are. Look at this gap. If you did these things, if you changed and applied these best practices, you would close that gap. Now some of them are easy, some of them may be policy changes.
[00:18:05] Speaker C: Right.
[00:18:06] Speaker B: But we know they work because we measure them, and we believe that's the way you get that balanced approach with process, technology, and people at the end of the day.
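To make the maturity-model idea concrete for readers, here is a minimal Python sketch of a five-level capability maturity model for third-party risk, with a simple gap measurement against a best-in-class benchmark. The level names, practice areas, and scores are illustrative assumptions for this sketch, not Censinet's actual model.

# A minimal sketch of a five-level capability maturity model for third-party
# risk, with a simple gap measurement against a best-in-class profile.
from dataclasses import dataclass

LEVELS = ["Ad hoc", "Repeatable", "Defined", "Measured", "Adaptive"]  # levels 1..5 (assumed names)

@dataclass
class PracticeArea:
    name: str
    current_level: int   # where the organization is today (1-5)
    target_level: int    # the "to-be" state it is working toward

def maturity_gaps(areas, best_in_class):
    """Compare each practice area to a best-in-class benchmark level."""
    report = []
    for area in areas:
        benchmark = best_in_class.get(area.name, area.target_level)
        report.append({
            "area": area.name,
            "current": LEVELS[area.current_level - 1],
            "benchmark": LEVELS[benchmark - 1],
            "gap": max(0, benchmark - area.current_level),
        })
    # Biggest gaps first, so limited resources go where they matter most.
    return sorted(report, key=lambda r: r["gap"], reverse=True)

areas = [
    PracticeArea("Vendor inventory", current_level=2, target_level=4),
    PracticeArea("Assessment process", current_level=1, target_level=3),
    PracticeArea("Continuous monitoring", current_level=1, target_level=4),
]
best_in_class = {"Vendor inventory": 4, "Assessment process": 4, "Continuous monitoring": 5}

for row in maturity_gaps(areas, best_in_class):
    print(f'{row["area"]}: {row["current"]} -> benchmark {row["benchmark"]} (gap {row["gap"]})')

The point of the sketch is the conversation it enables: measuring where you are against a documented benchmark turns "resisting process change" into a visible, prioritized gap.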
[00:18:17] Speaker A: So I want to test some thoughts or approaches that people have in this topic of building a policy. Can I just define a perfect policy and then just jump to it, regardless of what I'm doing today?
[00:18:32] Speaker B: Whoa. A policy typically is just a documented set of requirements, if you will, that theoretically need to be followed at some level.
[00:18:47] Speaker C: Right.
[00:18:48] Speaker B: And so policies in and of themselves are necessary, but they're never sufficient. You also have to have those procedures, if you will, that support those policies. And you also need to make sure that you're not creating the wrong behaviors.
[00:19:02] Speaker C: Right.
[00:19:03] Speaker B: And you also want to make sure there isn't enough gray in there to invite interpretation.
[00:19:11] Speaker C: Right.
[00:19:12] Speaker B: And so I think mostly people want to do the right thing if they're given the right set of tools. But we're also human beings. We're not perfect. And so some people will interpret it to the letter of the law, as they believe, and some will sort of say, well, it doesn't really say this, so I'm going to do it this way. Right? So I think that when we write policies and when we review policies and procedures, we always have to keep that in the front of our mind, right? Are we getting the right behaviors based on this policy?
It's just like shadow IT, right? No shadow IT. What does that mean? Does that mean we're not going to be able to do the business?
[00:19:52] Speaker C: Right.
[00:19:52] Speaker B: Well, then, okay, I won't do shadow IT. I'll buy it myself and get it deployed. Well, that's shadow IT. Well, not by the letter of the law. So there's the grayness.
[00:20:00] Speaker C: Right.
[00:20:00] Speaker B: And people do things like that, especially providers that may or may not be employed by the hospital or the health system.
[00:20:10] Speaker A: When we look at implementing processes, does that help us with measuring them? My assumption is that if we have a somewhat standardized process across the organization, that would help us to measure and then also to improve.
[00:20:25] Speaker B: You have to have all three of those things. You have to be able to implement, monitor, measure, and then improve. It has to be a life cycle, right? And so that's why, to your point earlier, the other problem with these certificates and older approaches is that no one goes back and reassesses, right? Because they don't have enough time to even do the number of products and vendors they already have in their current inventory, or the ones that they're adding over the next 12 months. Imagine now I've got to reassess everything. The only way to do that is to leverage technology as part of your overall.
[00:20:59] Speaker A: Okay, so break this down for us. When we talk about technology as part of our risk management, what does that look like? What are the building blocks?
[00:21:08] Speaker B: Well, you have to take a threat-model approach, right? So you have to understand the threats in your business. You have to identify your assets and your inventory of vendors and products, and understand what you're leveraging from those relationships, right? Is it a consulting relationship? Is it a supplier relationship where they don't have any technology? From a supplier perspective, let's take a laundry service, right? Most people will think of them as a supplier: we don't need to assess them, they're a laundry service, they have no technology that we use, so we don't worry about them. However, if you're a large hospital and you have one laundry service that can service you and they get hit with a cyber attack, guess what? That's a risk. Because now you can't operate your hospital without laundry, right? Without linens and scrubs and everything else, right?
So what's important is really to take that holistic view of not just your ecosystem, but also your internal systems, your internal products and services, those things, again, that you rely on to run your business. Sometimes they're tech enabled and sometimes they're not. And so it's really important to have that holistic view of what's critical to the business. You obviously want to be able to identify, you want to be able to detect, and you want to be able to protect with controls and things of that nature to mitigate or remediate risk. But sometimes it may be more important to put the limited available funds into resiliency plans, right? Because of this notion of it's not a matter of if, it's a matter of when, which is on everyone's lips these days, right? We know we're going to get hit, we just don't know when.
You'd better be resilient. You'd better have your clinical continuity, your disaster recovery, and your business continuity plans documented and tested.
[00:23:13] Speaker C: Right.
[00:23:15] Speaker B: And updated appropriately to deal with some type of disruption that could shut your system down for hours, days, weeks, months, years.
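As a rough illustration of the holistic inventory Ed describes, the sketch below scores every vendor and internal dependency by business criticality, whether or not it is technology-enabled, and flags single points of failure such as the lone laundry service. The field names, example dependencies, and scoring rule are assumptions made for this illustration.

# A minimal sketch of a holistic dependency inventory: score vendors and
# internal dependencies by business criticality and flag single points of failure.
from dataclasses import dataclass

@dataclass
class Dependency:
    name: str
    business_process: str     # the critical process it supports
    handles_phi: bool
    tech_enabled: bool
    alternatives: int         # how many substitute vendors/products exist

def criticality(dep: Dependency) -> int:
    """Crude 0-10 score: PHI exposure and lack of alternatives dominate."""
    score = 3
    if dep.handles_phi:
        score += 3
    if dep.alternatives == 0:   # single point of failure
        score += 4
    return min(score, 10)

inventory = [
    Dependency("EHR platform", "clinical documentation", True, True, 0),
    Dependency("Claims clearinghouse", "revenue cycle", True, True, 0),
    Dependency("Laundry service", "linens and scrubs", False, False, 0),
    Dependency("Office catering", "staff amenities", False, False, 5),
]

# Rank the inventory so limited assessment and resiliency budget goes to the
# dependencies the business cannot operate without.
for dep in sorted(inventory, key=criticality, reverse=True):
    spof = " [single point of failure]" if dep.alternatives == 0 else ""
    print(f"{criticality(dep):>2}  {dep.name} ({dep.business_process}){spof}")

Note that in this kind of view the non-technical laundry service ranks above the catering vendor purely because there is no alternative, which is exactly the point Ed is making.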
[00:23:27] Speaker A: This is an incredibly important point, and I want to latch onto it. You didn't say it explicitly in this way, but really what you said is: find your single points of failure.
And what you also said is that you can try to secure things, but security is not the only risk. By not having multiple vendors, maybe the vendor just shuts down, right? That's not a security risk. Maybe the vendor increases their prices to the level that you're dependent on them and now you're not profitable. And this is coming from painful examples in my own experience. So this is an incredibly important part. You can secure yourself and reduce risk by adding a vendor, as simple as that. It's not always about technology.
Understanding your processes, as you're talking about, Ed, is incredibly important.
[00:24:18] Speaker B: And understanding the reliance on a particular vendor or product. And this is where people got in trouble with Change Healthcare, right? They didn't realize that over time, Change Healthcare was acquiring all these other technologies and services that the community relied on. And so when the announcement was made in February of last year that there was an issue, some people went, whew, we don't use them. When in fact they realized, wait, we do use them. They acquired our technology vendor that we rely on.
[00:24:56] Speaker C: Right.
[00:24:57] Speaker B: So there was this aha moment that was a little bit of an oh boy, the fan has been hit, and we've got to now deal with something that we thought we were okay with. And I think that risk is multidimensional, and if you think about it as, again, a point in time and only one product, one vendor at a time, you lose the forest for the trees. You lose the fact that, from a critical business process perspective, we rely on clearinghouses to generate revenue and we have a consolidation problem with one vendor. We don't have alternative strategies. So if they get hit, we're in trouble and our system could shut down.
And if it doesn't get back online quickly, we may not be able to generate revenue for a certain period of time and we may not have the amount of cash on hand we need to withstand that type of event.
[00:25:53] Speaker C: Right.
[00:25:53] Speaker B: And I think that's the type of thinking we all took away from that Change Healthcare experience. Systemic risk is also a consideration. I know we all thought about it, but people thought about it from the perspective of an earthquake or some natural event happening, right? It's much greater than that these days.
[00:26:12] Speaker A: I would make an even stronger point, and this is completely non-trivial: teams don't understand what they're actually doing. And I'll give you a short story on this.
A CEO came to me before one of these company get-togethers and said, hey Ari, I'd like you to give a session. I said, okay, what I'd like to do is journey mapping.
And the CEO said, well, okay, I don't see the value in that, but I trust you, everything you do has generated value. We got into the session, and the way I structured it is that the different teams, head of marketing, head of sales, et cetera, and their employees basically presented their processes. The amount of shock and disagreement that came out: well, no, that's not what we do. Suddenly we figured out that different people were using different processes. People from other divisions were exhilarated to learn what the other division was actually doing. That created a few things. One, there were so many spin-off meetings: oh, I can help you with this, I can make this process more effective for you because I have this data, I have this thing, I can put this touch in my communication with the customer. It was a revolutionary process. People were amazed. In fact, I had people come out and say, this was the best session I have ever had in all of my jobs.
[00:27:28] Speaker B: Yeah.
[00:27:28] Speaker A: And to me, that came from just doing the simple thing: admitting you don't know what you're doing and bringing people into a room.
[00:27:36] Speaker B: Yeah.
[00:27:36] Speaker A: So first of all, and you said this at the beginning, map out where you are right now, validate that with the whole team, and you'll figure out that there are a lot of surprises. And my guess is that many of those surprises are opportunities for you to improve. Maybe some of them don't have the benefit; you do the cost-effectiveness analysis on that and you're like, well, the discrepancy here is okay. But in other cases you might find that, oh, there is huge money left on the table here. So this is such an important point that you're bringing forward, and I just really wanted to crystallize that.
[00:28:12] Speaker B: Yeah, yeah. And to your point, that perception of what's risky and what's not risky is really at the heart of the matter and how people operationalize around that data.
[00:28:29] Speaker C: Right.
[00:28:29] Speaker B: And so I was just at CHIME this past week and I was in a meeting and it dawned on me people are conflating the requirements around risk management from the government, from the OCR's perspective.
[00:28:46] Speaker C: Right.
[00:28:46] Speaker B: So what happens if a health system or a business associate partner gets breached and there's a data breach or ransomware?
You know, OCR typically will come in, they'll conduct an audit.
[00:28:59] Speaker C: Right.
[00:29:00] Speaker B: And if you read those audits, you read their findings, which I do, believe it or not, in my spare time, you see patterns. And the pattern is that 90% of them basically get in trouble because they haven't done a risk management assessment.
[00:29:17] Speaker C: Right.
[00:29:17] Speaker B: And we were talking about that, and several folks would say, yeah, but I don't have the people or the time. Because what they were doing is thinking about the risk management process as: I have to identify and fix at the same time. And I would posit, no, you actually have to identify first, because why would you apply your resources to fix something if it may or may not be the right thing to fix?
[00:29:43] Speaker A: Oh, perfect.
[00:29:44] Speaker B: Given your limited resources.
[00:29:46] Speaker C: Right.
[00:29:47] Speaker B: Spend the time, first and foremost, to get a critical view of the organization, those things that matter in the organization, and build out your risk assessment from that perspective. And if, God forbid, something happens and you're doing the right things and thinking about it logically and you get audited, I believe you probably would not be in trouble, because you're in process: you've identified the risks in your organization, you've identified the fact that you have a plan to mitigate and remediate, and you're focused on the critical ones first.
[00:30:23] Speaker A: Exactly, exactly.
[00:30:24] Speaker C: Right.
[00:30:24] Speaker B: And risk management is about taking risk at some level.
You can't reduce all risks. So therefore, inherent in your risk management program there's a notion of appetite and tolerance.
[00:30:37] Speaker C: Right.
[00:30:37] Speaker B: And you have to document that.
[00:30:39] Speaker C: Right.
[00:30:39] Speaker B: And that's what the OCR wants to see. Do you have a program? Is it logically sound? Do you have it documented? Are you making progress on it?
[00:30:48] Speaker C: Right.
[00:30:49] Speaker B: Those are the things they want to understand.
[00:30:52] Speaker A: I mean, you know, I think a lot of the pitfall is that people come in, they see, oh, 100, 300 controls, I need to be perfect on everything, and they get completely overwhelmed.
[00:31:03] Speaker C: Yeah.
[00:31:03] Speaker A: That is not the expectation. And to boil it down: identify, prioritize, which is incredibly important, and then just decide you're going to work on, whatever, 5 or 10 of the highest priorities over the next month or two or three. As long as you have that plan and you're actually executing, you will not be found at fault. This has been my experience as well. I appreciate that. And I think there is a discrepancy in the market on this exact topic.
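Here is a minimal sketch of the "identify first, then prioritize" approach the two are describing: build the full risk register, record a documented risk appetite, and plan only the top handful of remediations for the next cycle. The scoring scale, appetite threshold, and example risks are assumptions for the sketch, not a prescribed methodology.

# A minimal sketch: full register first, documented appetite, then a small,
# prioritized remediation plan for the current cycle.
from dataclasses import dataclass

@dataclass
class Risk:
    description: str
    likelihood: int   # 1-5
    impact: int       # 1-5
    def score(self) -> int:
        return self.likelihood * self.impact

RISK_APPETITE = 6   # documented tolerance: scores at or below this are accepted (assumed threshold)

register = [
    Risk("Unpatched internet-facing portal", 4, 5),
    Risk("No tested downtime procedures for clearinghouse outage", 3, 5),
    Risk("Vendor contract missing breach-notification clause", 2, 3),
    Risk("Shared credentials in a low-value internal tool", 3, 2),
]

accepted = [r for r in register if r.score() <= RISK_APPETITE]
to_treat = sorted((r for r in register if r.score() > RISK_APPETITE),
                  key=lambda r: r.score(), reverse=True)

# Work only the top N this cycle; everything else stays identified and tracked.
N = 2
print("This cycle's remediation plan:")
for r in to_treat[:N]:
    print(f"  {r.score():>2}  {r.description}")
print("Accepted within appetite:", [r.description for r in accepted])

The value of writing it down this way is that the full register, the appetite threshold, and the in-progress plan are exactly the artifacts an auditor would ask to see.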
[00:31:30] Speaker B: So, so important. Even the most secure organizations still get breached.
[00:31:35] Speaker A: Absolutely, 100%. Ed, what do you do with the human element? I mean, you know, five years ago I looked at the phishing attacks and I was like, oh yeah, this is clearly a phishing attack and I can train people on how to avoid these. Today I look at things and I'm like, shit, I don't even know what's happening here, and I'm a self-proclaimed expert. So it's getting tough. And with the introduction of ChatGPT, and now these models that you can just own yourself and run yourself, it's getting scary.
[00:32:11] Speaker B: You have to teach. And it's funny you mention this, because people say humans are the weakest link in the process. Right.
I don't believe in that per se. I believe that we have an opportunity to figure out how to educate people in a way that enables them to do the right thing.
And it's hard because it is a moving target. The technology and the approaches are getting more sophisticated, so things you did three years ago are sort of out of date at this point. So therefore you almost have to think about why some organizations are better than others. Why are some individuals better than others?
[00:32:51] Speaker C: Right.
[00:32:52] Speaker B: And I think it comes down to, you have to teach people to be paranoid.
Right. That's my take on it. I was thinking about it this week. I'm like, why am I able to take telemetry and process it differently than most of my organization?
[00:33:08] Speaker C: Right.
[00:33:09] Speaker B: And I think it's because you need a level of healthy paranoia, and you need to understand the signals that are coming in and how to process them. Obviously you don't want to be too paranoid. But you do need to be able to constantly take the data, evaluate it, and have a healthy level of skepticism about what you're seeing. I always tell my team: if something doesn't look right, your gut will tell you. The problem is people don't listen to their gut as often as they should.
[00:33:44] Speaker C: Right.
[00:33:44] Speaker B: So if you think something's wrong, it most likely is.
[00:33:47] Speaker A: You know, I feel like the best security advice I've ever gotten has been from my pharmacist mother.
And what she said to me is, son, if it's too good to be true.
[00:34:01] Speaker B: Yeah, there you go.
[00:34:02] Speaker A: It ain't.
I just had one of our guests tell me a story about an investor. Now, you would expect an investor in technology to be super savvy. He sends our guest the following email: oh, I can't get on the call today, I'm helping somebody transfer a barrel of gold from point A to point B. And our guest was like, I need to call this guy right away, he's being scammed. It was incredible to me. I mean, even people who are, quote unquote, at the top of their game.
[00:34:35] Speaker B: Yeah.
[00:34:35] Speaker A: Can be hit by these things.
[00:34:37] Speaker B: That's right.
[00:34:38] Speaker A: I have a question.
If this is getting so sophisticated, with AI personalizing these attacks based on our LinkedIn, our Facebook, our company profile, it's getting so good.
Is there a place to look at it from a different perspective? Meaning the actual processes in which we communicate and the steps in those processes. Maybe taking a different approach to it: not just educating the people, but controlling the methods of communication.
[00:35:08] Speaker B: Yeah.
Yes. In fact, I wrote a patent a couple of years ago on quorum-based authentication, this notion of having a quorum, like you see in these bank heist movies, where you've got the eyeball, and somebody else comes in with something, and then the third person comes in to open up the vault.
[00:35:27] Speaker C: Right.
[00:35:28] Speaker B: This notion of having a quorum to make a decision is kind of an interesting approach. And I've thought about it recently in terms of leveraging the telemetry and the data that you're getting when you go to take an action.
[00:35:47] Speaker C: Right?
[00:35:48] Speaker B: We almost have, like, an "I will text you if I ever send you something like this," or "I will call you if I'm going to ask you to do this thing."
[00:35:58] Speaker C: Right?
[00:35:58] Speaker B: And even that is delicate, because now you've got AI manipulating voices, right? And so I almost say that, by policy, I'll never ask you to do this, right? Like, we'll be on a video, we'll have a conversation, I might text you. But this is a critical business process: I'm never going to ask you to transfer money via email or a text message. It's just not going to happen.
[00:36:27] Speaker C: Right.
[00:36:28] Speaker B: So I think you have to think about it through that quorum lens, if you will. What are the things I need to see aligned perfectly for me to make a decision that's critical to my organization? I think there's something there. And I've been thinking about how you create some type of process or product around that. I think it would be kind of interesting.
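As a rough illustration of the quorum idea, the sketch below only lets a critical action (say, a wire transfer) proceed when enough independent, trusted channels have confirmed it. This illustrates the concept only; it is not the patented design, and the channel names and the two-channel threshold are assumptions.

# A minimal sketch of quorum-style confirmation for critical actions:
# an action proceeds only when enough distinct trusted channels confirm it.
APPROVED_CHANNELS = {"video_call", "phone_callback", "in_person"}  # email/SMS alone never count (assumed policy)

def quorum_met(confirmations: set, required: int = 2) -> bool:
    """Require confirmations from at least `required` distinct trusted channels."""
    return len(confirmations & APPROVED_CHANNELS) >= required

# An emailed request plus a spoofable text message does not clear the bar...
print(quorum_met({"email", "sms"}))                      # False
# ...but a phone callback to a known number plus a live video call does.
print(quorum_met({"phone_callback", "video_call"}))      # True

The design choice is the same one behind the bank-vault analogy: no single channel, however convincing, is sufficient on its own.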
[00:36:49] Speaker A: That's so insightful.
The way I describe this is: is there anything odd about what I just got?
[00:36:57] Speaker B: Yeah.
[00:36:58] Speaker A: Is the CEO or CFO or whoever contacting me in a way that is not typical for them?
[00:37:04] Speaker B: Yes.
[00:37:05] Speaker A: Is the request itself untypical in any way?
[00:37:08] Speaker C: Yeah.
[00:37:08] Speaker A: So you're really looking. And this goes back to what you said about being paranoid. Really what you're looking for is red flags.
[00:37:14] Speaker B: Yes.
[00:37:14] Speaker A: Is there any red flag in this communication, request, or task? You know, should this be going to somebody else? And all you need to do is pick up the phone to that person, with the number that you know is actually theirs, not the one in the email, because that could be fake, and just say, hey, Ed, I don't want to make a mistake here, I just want to make sure what I understand is 100% correct. It takes two minutes, and you might save the company millions. In fact, we had this story not so long ago: an accounting department just paying bills that didn't exist. They were fake bills. They just got the bills and paid them. I think this was actually Google, keep me honest here if you remember, an accounting department just paid, and it totaled, I think, almost $100 million in bills paid to a supplier that did not exist, for services that were not rendered. All you have to do is call up the department that was supposed to receive the services and ask, hey, did you get this? Is this real?
[00:38:12] Speaker B: Yeah, that's it. I mean, I've got a couple of examples of that happening right now. Literally, if somebody calls me and they get through and they say it's my bank, I'm like, give me the number and I'll call you back. What's it regarding? All right, I'm going to call you back. I'm not going to take this message.
[00:38:30] Speaker C: Right.
[00:38:31] Speaker B: So I think we have to be much more vigilant. And I always say on my podcast: if you're on the front lines delivering patient care and protecting patient safety, remember to stay vigilant, because risk never sleeps.
[00:38:45] Speaker C: Right.
[00:38:46] Speaker B: And so this notion of risk is always there. You have to be vigilant. You've got to stay vigilant. You've got to stay paranoid. And it's getting harder and harder and it's going to get worse and worse, not better.
[00:38:57] Speaker C: Right.
[00:38:58] Speaker B: So we have to, as humans, leverage whatever capacity we have to think critically, and combine that with some paranoia and a healthy level of skepticism about everything.
[00:39:13] Speaker C: Right.
[00:39:13] Speaker B: Because otherwise it's just going to continue to be an arms race.
[00:39:18] Speaker A: Absolutely. Ed, what a delightful conversation. I appreciate you coming on the show. We talked before and you confessed that you ask your guests the same question I do, which I thought was a nice full-circle moment, but nonetheless, I'm going to ask you it.
[00:39:32] Speaker B: Okay.
[00:39:33] Speaker A: What would you advise 20-something-year-old Ed?
[00:39:38] Speaker B: Yeah. You know, I ask that question too, and I love the varying responses I get.
[00:39:44] Speaker C: Right.
[00:39:44] Speaker B: And so, you know, 20 years ago, if you had asked me that question, and I wouldn't have been 20, by the way, I would have said something like, oh, buy Microsoft. Because I'm an investor, I should have invested in Microsoft.
[00:39:59] Speaker A: But that's the hack answer.
[00:40:01] Speaker B: Yeah, that's the hack answer. But the thoughtful answers I often get, too, are along the lines of not being so hard on myself.
[00:40:10] Speaker C: Right.
[00:40:11] Speaker B: And, you know, I don't have any regrets. I made a number of mistakes.
I was a train wreck at 20. And I don't mean that figuratively; I mean I was a train wreck. And people listening to this that know me will say, yup, he was a train wreck at 20. And I've been married 34 years. My wife dealt with me then and she stayed with me. And I think I've gotten better over time. Right. But I'm an extremist and I probably have an addictive personality.
I'm an alcoholic. I quit drinking three years ago.
And I feel great. I wish I had done that 20, 30 years ago. So I think if I could go back in time and tell my 20-year-old self one thing, it would be: stop drinking. You're going to stop drinking at some point in your life; do it now, because you're going to benefit so greatly from it. You just don't know it.
[00:41:13] Speaker A: Ed, thank you for your humility and vulnerability. What an absolute delight. I appreciate you coming on the show today.
[00:41:20] Speaker B: Thank you, Ari.