Michael Taylor | Nov 20, 2024

November 20, 2024 00:43:05

Hosted By

Ari Block

Show Notes

Michael Taylor discusses the intricacies of decision-making, emphasizing the importance of understanding the entire system involved in any decision. He highlights the pitfalls of group dynamics, the significance of quality inputs, and the need for continuous learning and monitoring of decisions over time. The discussion also touches on the cultural aspects of decision-making and the necessity of aligning assumptions within teams to avoid false positives.

https://schellingpoint.com/


Episode Transcript

[00:00:00] Speaker A: Michael, welcome aboard to the show today. So happy to have you on. [00:00:03] Speaker B: Thank you, Ari. Looking forward to it. [00:00:06] Speaker A: I want to start with a really simple question. Give me an example, your own or somebody else's, of how decisions can go horribly wrong. [00:00:20] Speaker B: Whenever we make a decision, unless we decide not to do anything, the decision is generally going to cause us to do something, so we're going to take action. Whenever we take action, we're taking action on some sort of system. There's a system in place that we're going to act on. If we don't consider the entire system, we only consider the parts that we can see. The pieces that we don't see often react negatively, and that's where we get terms such as blowback and backfiring. In the UK in the 80s, we decided that it was appropriate to require passengers in the rear of vehicles to wear seat belts. And that made absolute sense. But for the next three years, the mortality rate in the UK increased, and people asked, well, why is that? When you ask people why, they say, well, maybe because people were strapped in more, the driver felt safer going faster, and that led to more accidents. So you get these rational thoughts. In actual fact, what it was down to was that, unfortunately, when somebody passes away in the rear of a vehicle, you get multiple donor organs, seven or eight donor organs. So there were actually fewer fatalities among rear passengers in car crashes, but that meant a drop in the supply of donor organs, which meant the queues in the hospitals didn't have the supply of donor organs the system was used to. So one life saved on the road led, unfortunately, to multiple lives lost because people weren't getting organs. [00:02:06] Speaker A: What strikes me with that example is that on the one hand there was the original decision making, which, keep me honest here, seems to have been less of a mistake. It seems like the big pitfall is actually how to draw conclusions about what actually happened. [00:02:27] Speaker B: People ask, how do I make sure I make a good decision, or the best decision? Generally there are three elements to it. The first is the decision making process: how did I come up with the decision? The second is the implementation of the decision: how well did I implement it? The third one is just luck and chance. [00:02:42] Speaker A: Hold on, how many people even ask themselves that question: how should we make the decision? [00:02:47] Speaker B: That's an interesting point. Few do, few do, because we've been making decisions since before we even realized we were making decisions. Great question. Decision making is something we've been doing since we were weeks old, before we're conscious of it, so that when we get to be two, three, four years old, we're doing it in groups in the playground, doing it with family members. So it's just something we all know how to do. Normally what will trigger the question is a bad decision; somebody will say, well, how do we start to make better decisions? It's normally a trigger of something bad occurring. You're right. Yeah.
[00:03:29] Speaker A: So if we just stop there for a second and we agree that thinking about how to make decisions is actually an important question, let's explore that for a moment. If I'm trying to make a decision, could there potentially be different ways or different methods to make the same decision, different decision processes? [00:03:52] Speaker B: Well, absolutely. If you go back through time, we've been making decisions since the first cavemen stepped out, looked onto the plain, saw a woolly mammoth and said, we're going to go after that, we're hungry. So we'd be making decisions without realizing it. And then of course we developed techniques: consensus building techniques, nominal group technique and various other mechanisms. So yes, depending upon the particular decision, there can be multiple ways to go about making it. And in general, if you think about a leadership team in an organization, it's rare today that the same decision process is used twice. This group is meeting about this subject, and perhaps it's a strategy, so we're going to use this way of forming the strategy; then we're going into the next meeting and it's going to be a policy, and we're going to use something different there. And generally there's not even a conscious choice of which process or method we're going to use. [00:05:10] Speaker A: And when we look historically at decisions that have gone awfully wrong, for example Challenger, we see issues with groupthink, with overuse of authority, that can lead to horrific outcomes: people dying, spaceships exploding. I want to dive into this a little bit. What are the dangers, the things that can go wrong, in group decision making? [00:05:42] Speaker B: How many hours have we got? [00:05:45] Speaker A: Yeah, pick a card, any card. Just choose your favorite. [00:05:49] Speaker B: How many hours have your viewers got? Well, Schelling Point started off as a two year research program; it's now an applied research organization. And currently we've identified 33 quality leaks. So like I say, where do you want to start? We can take any one of those. There are multiple quality leaks. [00:06:17] Speaker A: Wait, just define that. Quality leak. What does that mean? [00:06:20] Speaker B: Okay, sorry, yeah, I apologize. That's me just assuming everyone understands what I'm talking about. [00:06:26] Speaker A: So, expert bias. [00:06:29] Speaker B: Thank you. So if you look at a decision and realize that there is a process for making a decision, we call it the signal to response process. There's some signal, something occurs. Well, let me just back up to the beginning. There are three types of decisions: unilateral, informed and collaborative. So the first thing is, I've been triggered by something and I decide that it's something I want to do collaboratively with others; I'm going to get one or more others, possibly thousands, to make this decision together.
You've got the signal that triggers the start, and then you're going to go through some activities to arrive at a response, which is the decision. So if you look at it as a signal to response process and ask, what should the steps be, what should their sequence be, what should we do and not do, then, using Six Sigma process language, you look for what are called quality leaks. You look for where the tap is dripping, and you particularly go to the front end: rather than designing out how to better install a wood floor when it's been damaged by the leak, you go to the beginning and say, how do we stop the tap leaking and damaging the wood floor in the first place? So there's this notion of quality leaks. [00:07:53] Speaker A: And this comes from the manufacturing world, right? Really, Six Sigma is a discipline for designing the process that produces the highest quality output. That's what it is. And the amazing thing is you're using manufacturing techniques, or philosophies, to look at the process of decision making. That's such an unexpected approach or perspective. [00:08:16] Speaker B: It is. In general, decision making advice will say one should make a decision by having two ears and one mouth, one should invite inquiry. But if you look at a decision as a manufacturing process, and this is a very grey way of looking at it, data is coming in and you're gathering opinions: that's your raw material. Rather than having work centers and robots, you have humans who are the machine; you're feeding in raw materials, they're doing the transformation of the raw material, and the product that comes out is agreements, goals and actions. It is actually a manufacturing process. Well, it's called making a decision. [00:09:01] Speaker A: I love that. I don't think you hear people thinking about it in that way. It's quite an insightful perspective. So if we disentangle this analogy, I think there's a lot of truth to it. If we start at the beginning, the inputs that come into the process matter. And if I try to disassemble that simply: maybe we are missing inputs that we don't even know we need, the unknown unknowns, so to speak, attributed to Rumsfeld, I don't know if that's true. Then there's the quality of the inputs that are coming in; maybe we're getting data that is untrue, incorrect, or only partially true. How do we even attack this? What is the thinking we need to have around deciding the inputs into our decision? What's the approach? [00:09:51] Speaker B: The initial input is either pure data, facts and figures, or people's opinions. And there's a key phrase here: data informs decisions, opinions become decisions. When I read that market growth is X, or the state of something is 2.3%, that's data, and from it I form a view that I then use in the decision. But I also have many opinions that are not informed directly by data we gather: I think the regulators are going to do something in the next couple of years, I think the competition will do such and such. And if you take the manufacturing metaphor, when we have a car, we know that we need four wheels and four tires and a steering wheel and an engine.
And we can look at a bill of materials and say, right, if we're going to get this product out at the end, we can ripple back through and say, at the beginning we need to start with this. You can do it with computers, with any manufactured product. So we did the same with decisions. Conventional decision making treats a decision like an onion unfolding: you step in and go deeper, and then you say, enough, let's come out. Using process thinking, we instead went to the end and said, well, if I want to have the most valuable, most viable, most endorsed decision, what do we need at the beginning? So take opinions for a moment. We found that if you take down the sticky notes off the walls and the whiteboards and the flip charts and the online whiteboards that people collaborate with these days, and you actually count the number of individual opinions on them, you find on average about 50 to 90. And what I mean by that is not percent, but individual unique opinions: the customer wants this, we are good at this, Palo Alto doesn't have enough resources for this, the goal should be to grow this by 12%. If you look at those types of declarative statements that are opinions, you get about 50 to 90. And what we've found in recent years, using a particular methodology, we now know from hundreds of groups, is that groups average 130 to 290. So there are twice as many opinions in a group as have actually been used as the raw material. Straight away that's a problem. So the question is, what's the gap between the two? And pause me here when you want to, but we've found there are four reasons for the gap, one of which we've already spoken to. I do an exercise on MBA programs and at business schools where I take a group of people and say, if you're going to make a decision about this and come out with a set of actions that aren't just rational and logical and compelling, but are actually complete, what would you discuss? What would you cover? What territory on that decision? And in running hundreds of groups we've found that they'll come up with about 50 to 80% of the system of the decision. In other words, groups say, right, we're going to make a decision about X, and they're not starting by covering everything. So they don't have it. That's one reason why they don't have a full set of opinions, and there are three other reasons as well. So basically we've been building decisions that look great, but not off all the raw material. [00:13:40] Speaker A: So have you seen a difference between groups that are making a decision, with all their thoughts and their whiteboards and what happens, versus a group that started with the question of how to make the decision, and then what their whiteboard looks like? Is there a discrepancy there? [00:13:57] Speaker B: Oh yeah. If you really want to geek out on this, I can actually show Sankey diagrams and flowcharts with different lengths and sizes; we have it all. We've got over 9 million data points in the database from groups around the world, and the actual decision, if you look at it as a manufacturing process, the visuals look extremely different. Yeah, that's amazing.
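As a rough way to picture the gap Michael describes between the opinions captured on the whiteboard (roughly 50 to 90) and the opinions the group actually holds (130 to 290), here is a minimal sketch in Python. It is purely illustrative, not Schelling Point's methodology or software; every function name and sample statement in it is hypothetical.

# Illustrative sketch only: the "raw material" gap between the unique opinions
# a group actually worked with (whiteboard) and the opinions its members hold
# individually. All names and sample statements are hypothetical.

def unique_opinions(statements):
    """Collapse raw declarative statements into unique opinions
    (naive normalisation: trim whitespace and lower-case)."""
    return {s.strip().lower() for s in statements if s.strip()}

def opinion_coverage(whiteboard, privately_elicited):
    """Return (coverage share, opinions that never surfaced in the room)."""
    used = unique_opinions(whiteboard)          # ~50-90 per the episode
    held = unique_opinions(privately_elicited)  # ~130-290 per the episode
    missing = held - used
    coverage = len(used & held) / len(held) if held else 0.0
    return coverage, missing

if __name__ == "__main__":
    board = ["Customers want faster onboarding", "We are strong in EMEA"]
    private = board + ["The regulator will tighten rules next year",
                       "Our pricing model confuses partners"]
    coverage, missing = opinion_coverage(board, private)
    print(f"Coverage: {coverage:.0%}")        # -> Coverage: 50%
    print(f"Unused raw material: {missing}")

In this toy example, half the group's raw material never reaches the decision, which is the kind of gap the methodology is designed to surface before the decision is made rather than after.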
[00:14:22] Speaker A: I mean, just that insight, that asking yourself how to make the decision and what my inputs should be completely changes the way you would make the decision. That is, I think, such an important takeaway, so I really appreciate it. But you said something very interesting about this idea of opinions, and I don't want to put words into your mouth, so let me make a statement and keep me honest here if you agree or not. It sounds like you were saying: take your opinions and convert them into hypotheses. Say this is what we think would happen, this is what we believe, and separate those hypotheses or opinions from the data and the facts, and that will build a better process. Did I understand that correctly? [00:15:04] Speaker B: That is absolutely part of it. Because if you look at the end of the day at a decision, our decision is that we're going to focus on the South American market for the next two years while Europe sorts itself out with Brexit and Ukraine, and we're going to do the following. If you look at that, you can trail it back exactly, directly and indirectly, to the original source opinions. The reason why I say directly and indirectly is because you bring these opinions in: my opinion is this is going to happen, this shouldn't be tried, this is possible. And you find that they either end up one to one directly in the decision, or they go through the form of becoming a hypothesis that gets tested, or they become a non-alignment that gets reasoned and converged. So there's a direct correlation between them. [00:16:02] Speaker A: Yep. That's so interesting, because what you're saying is that if you understand how you make your decisions, and maybe you even documented it, you've written it down: what were my assumptions, what were my hypotheses? And then, okay, six months have passed, stuff has happened. Does that become a tool to actually improve your decision making? Does that become a tool to learn from your mistakes? Because there's psychology involved: we remember things differently than they actually happened, a whole bunch of biases around how our memory gets distorted. What does that look like? How do we learn from our decisions? How do we do that? [00:16:45] Speaker B: Honestly, Ari, it's a great question. In the field of research on understanding decisions, one simple model to think about is that there's real time research and there's post decision research. The first is looking at what's going on, what's been said and how it's been processed. The second is asking people after the fact: what were you thinking at the time, and why? The post one is valuable, but of course it has issues with memory and recall and biases. At Schelling Point we are real time; everything is real time. So it's: what were people thinking, why were they thinking that, why were they agreeing and disagreeing, et cetera. So let me say the following. You've hit on a very important subject. Imagine that you and I and some others were putting together a policy or a strategy or a business relationship or a change or transformation, and we went through our process and we end up with our decision, which is: we're going to aim for these goals by doing this, et cetera.
Now, hopefully at that point we have the highest degree of alignment. Okay, just a tangent: no group is ever fully aligned and never can be, so it's about whether we have sufficient alignment. So imagine we've got to the end and we have sufficient alignment, the highest point. We've found there are nine reasons why, over time, that will decline. It's like putting the key in a brand new car: the depreciation starts and you can't stop it. So the process we advocate, and I'll come back to your point about learning, says: come up with the decision. It was based upon a set of assumptions and thought processes and beliefs. You come up with your decision and you start to implement it. In a merger, for example, every four months for two years there's what we call the delta step. In the delta step you go back and you ask: we had these assumptions back here, are they still valid? We had these goals back here, do you still believe in them? And you basically figure out and surface which of the nine reasons for drifting have triggered. So, like I say, in mergers we advise groups to do that every four months for, say, two years. So first of all, longitudinally on a specific decision, that's how you catch the drift that occurs in the time between the decision and the implementation and achievement of the outcomes. And we've got one organization, for example, that has used a piece of software, because we're an applied research organization and there's a piece of software that enables the methods to be used repeatedly. Right now they've got 44 business decisions that they've made. After the first 17, we were able to look at all the data and say: look, when your executives say this, it's helping, and when your executives say this, it's hurting. And at 43 we were able to look at it again. With one group, after the first five decisions we were able to look at them and say: you have an issue in your global decision making mechanisms that's damaging you. So there's a longitudinal element over time on a specific decision, and then there's a big data element to cycle back round. [00:20:31] Speaker A: This is so important, because I think what we see most often is that these decisions are made, they made sense at the time, and two years later everybody on the ground is saying, this is stupid, how did we even get here? And what you're advocating, which I think is brilliant, is that you just revisit your decisions every four months, or whatever the right timeframe is for that type of decision. You're not going to say, oh, let's backtrack; that's not the idea. The idea is that if something in your assumptions and your data has changed, you might need to tweak the decision a little bit, or tweak the implementation. It could be that something fundamental in your world has changed and you need to backtrack. But at least in that case you might not decide to make an immediate change; you might decide that you've identified a risk and you want to keep your finger on the pulse, so to speak. So there is a beauty in tracking these decisions and how they progress in baby steps, as opposed to waiting until it slaps you in the face, explodes, and your team is absolutely fed up. [00:21:36] Speaker B: And this is where theory and practice come in.
So it's theoretically the right thing to do; you're absolutely correct. And we call it a delta because it's just a delta: we're looking for the difference, for where we spot difference. One CEO described it this way: he said, this way I get to determine where to move the goalposts to proactively, rather than being accused of moving the goalposts to hide a problem. [00:22:04] Speaker A: The moving targets problem. Yes. [00:22:06] Speaker B: Right. So he's moving the goalposts as needed, based upon new conditions. I think one of the things you deal with with decisions goes back to the very beginning. If you talk to business leaders about decisions, they'll say, yeah, I'm making decisions all day long, it's 50 to 90% of what I do. So they think of it as something they do. And then they say, well, every decision I make triggers everything else that goes on in the business, and you look at the importance of it. And if you get them to step away, when they go off to universities and other places for executive education and you ask them where decision making fits into their roles as executives, they'll tell you, every year, it's in the top three. But day to day there's this mindset of: right, we made the decision, good, now go and implement. And this notion of, okay, things will change and therefore let's watch and adjust the decision as we need to, it's not an automatic, natural behavior. [00:23:16] Speaker A: No, it's quite ironic, right? Executives are self-reporting that it's 80, 90% of what they're doing, and yet they're spending no time thinking about what the process to do it should be. It's kind of mind boggling from that perspective. [00:23:32] Speaker B: Well, every executive is overloaded, right? They could use 72 hours in a day, not 24. And if you don't give them a system for doing it, a process, if you don't make it a process, it becomes too easy to think: I'll deal with it when there's a problem. [00:23:48] Speaker A: That's right. I mean, the problem with that is that the earlier you detect an issue, the less it costs you to solve it. That's obvious, but how much are we all suffering from this? I'll give you an example, and this was pure luck. I was part of an executive team. Basically, nobody was using our software. Nobody. It was half by luck that we got that data; the data did not exist, and we decided to collect it and put in a reporting system. When we got the data and saw it, we decided to dig in, because I was 100% sure it was a bug. I was thinking, the software doesn't work properly; obviously people are using this. Our assumption was: somebody just paid us a million dollars to get 600 of these hardware units, obviously they're using it. My system reports back: nobody's using it, less than 5%. So I fly over to Australia to figure out what the hell is happening. I actually watched these telco salespeople work for hours, and I was like, holy shit, nobody's using it. Really. Now, what happened is that I came back to report to the executive team, we decided to pivot the company, and we sold the company three years later for almost $80 million.
Now, if we had not made that decision right at the time that we made it: what actually happened two to three years later is that the industry changed and the need for our product no longer existed, because Apple and Google came along with automation and all that stuff. So my point is that the time you gained, the additional time to act, actually saved the company. [00:25:31] Speaker B: And that, if I can, and I might actually borrow that story if you don't mind, that is the kind of story we try to take and turn into helping people systematically apply the thinking. What was going on is, another term we use, that there are facts and there are inferences masquerading as facts. So you had an inference. You might not have called it an inference; presumption, supposition, belief, but the technical term is an inference. You had an inference that there was a bug, right? [00:26:15] Speaker A: Yeah. I did spend three weeks debugging the software, which got me nowhere, by the way. That was how strong my belief was that I had to be wrong. I was so strongly rooted in my belief that I spent three weeks proving I was right, and I wasn't. [00:26:33] Speaker B: I'll give you an example. If you ask a group, why are you discussing this subject, the leader will explain to the group the three to five reasons why we're discussing it. If you go to everybody privately and ask them, why are you discussing this subject, there'll be 30. [00:26:52] Speaker A: Oh, wow. [00:26:53] Speaker B: If you then share those reasons back with everybody, there'll be six that they all agree with, and 24 where it's: we need to discuss this for this reason; no, we don't. Now, what we've found is that when you take those non-like-minded ones and ask, do we need to discuss it for this reason or not, is it not the case that we need everybody not just to agree that we need to make the decision, but to make it for the same reasons? Our data is that when those views are put in front of everybody, and then when we ask what causes you to say this is a valid reason for doing it and what causes you to say it's not, when those are reconciled, 41% of the original declarations turn out to be changed. So just think about that. That means there's a CEO, there are C-level executives, SVPs, managers, people that know their business, and they're saying, well, I think we need to do this because of this. And in a completely anonymous fashion, when they are put in front of peers and tested, and rejection comes in, and then they're reconciled, 41% become something else that the original author agrees with. [00:28:21] Speaker A: Oh wow. So we're getting it wrong to a certain degree. Right? We're getting it wrong. [00:28:27] Speaker B: Yeah. I mean, the end conclusion of this is that most decisions that groups make, from our research, were rational, logical and often compelling, but inaccurate and incomplete, and most were false positives. [00:28:44] Speaker A: Wait, explain that false positive aspect; it's incredibly important. [00:28:49] Speaker B: Well, I'll go back to what you said at the very beginning. If you ask a group of people, why is that the best decision, they'll tell you the attributes that made it the best decision.
They'll explain the assets. They'll say: we had the right people with the right knowledge, we had the right consultant with the right methodology, we had the right data, we had the right people in the room. So they'll describe how they had these assets, and therefore you couldn't have come up with a better decision. [00:29:21] Speaker A: Right. [00:29:21] Speaker B: Now, you brought up earlier things like groupthink and the prisoner's dilemma. If you ask them a different question, tell me how that decision does not possess any of the liabilities or inhibitors of success. Don't just tell me that you're aware of them. You can quote me the prisoner's dilemma and you can quote groupthink, but tell me how your process prevented their presence in the decision. Well, most decisions actually look like a piece of Swiss cheese. They get handed over to implementation as if the knowledge were complete. And again, as a former CEO and business person and consultant, I've done this as well, because I didn't know better; it's only the research that showed it. We basically hand it over to implementation and say: we, this group of people that know the subject and used the right approach, are handing you something; now you go implement, and any issues in the outcomes must be down to luck, chance, project management, change management. In actual fact, generally we're handing over these false positives. They look right, but they've got a lot of holes in them on the inside. And the problem is that in implementation you can't track it back to the decision making process; it's too hard. [00:30:48] Speaker A: Right. But that's also very important. Why would you even want to track it back? What's the dynamic there that's valuable? [00:30:57] Speaker B: Well, first of all, you don't, because you're busy; most people don't look back at decisions and do post mortems on them anyway. It's a theoretical notion that few practice. And the other thing is: are we really going to go back? I've got a piece of data, a chart, showing the degree of alignment of executive teams. Of 147 executive teams, only six had sufficient like-mindedness around the things they were responsible for. When you don't have a strong degree of like-mindedness, it means you're not giving the same message down through the implementation to everybody. Well, do you think you want to be the project manager or the change leader, the person that goes back and says: I'm sorry, this project we're implementing is not producing all the benefits, it's not on time, it's requiring extra cost, and the root issue is that you, the leadership team that put it together, actually had an insufficient degree of alignment when you handed it over to us? I don't think anybody's going to do that. [00:32:03] Speaker A: That sounds like career suicide to me. [00:32:06] Speaker B: That's career suicide. Yeah. And you can't see it in the implementation; you get signals of it, when two executives give different messages, but generally it's undetectable. So the idea, back to what we said at the beginning in terms of process thinking, is to fix it at the beginning. So instead, what we're doing is, right at the very beginning, we're measuring the degree of alignment of a group and saying: your degree of alignment is 74, right there. [00:32:39] Speaker A:
Is there not an opportunity for the executives, and maybe I'm going off the reservation here, but is there not an opportunity for the executives to say: these are our assumptions, these are our decisions, here's our process of thinking, and actually hand that over to the execution team and say, we're giving you an open check to evaluate whether any of our assumptions have changed; maybe they weren't wrong, maybe they changed over time. If at any stage you find a misalignment or a gap, come back to us. To me, instead of structuring it as career suicide, because that's how I think it mostly is, this seems like structuring it as: we give you the opportunity to come back and reevaluate changes in the direction of the market or of the problem, or deeper information that has come to light. Let's take the metrics of what we thought and what you measured and see whether there is a gap. It seems like there's a cultural opportunity here. [00:33:39] Speaker B: Oh, no, absolutely. When people use the prescription that we provide for how to make a group decision, that is exactly what they are doing. All the assets are there, because it's in software as well. Imagine onboarding a new executive: you onboard somebody and they ask, where did this come from? Because it's in software, you can trace something all the way back and say, it came from this, then this, then this. So you have that. But I want to go back to something you said: handing over all the assumptions. In the research, we went through business strategy documents and project and program charters to identify assumptive statements, statements in the tense of "it is this today," "it has been this," "it's going to be this"; not goals and barriers, but assumptions and predictions. Part of this was triggered because some of the research we were born from was a group out of Harvard who went to Fortune 500 executives in the late 90s. They would sit down with executives and ask them to discuss decision implementations. And one of the things that came up was people saying: it turned out that when we were in the room making the decision, I guess I had a different assumption to some of my colleagues. That was one of the triggers behind the original research. So let me ask you this: when you say hand over all the assumptions, where is that list of all the assumptions? That's right. What we found was that if you go and look at these program charters that are handed over, and strategy documents and relationship plans and others, you can go through all the paragraphs, find the assumption declarations and pull them all together. So, to your point, in the approach that we prescribe there is now an artifact called the foundation document. And I can show you examples and examples: I have them from countries in conflict, to churches trying to grow congregations, to Fortune 50 enterprise strategies, to global disease eradication. For all these subjects there is now a thing called a foundation document, and it is every single assumption upon which the rest of the decision is based. It comprises those assumptions where there was natural like-mindedness. [00:36:17] Speaker A: Yeah.
[00:36:18] Speaker B: And those where there wasn't natural like-mindedness but the group then converged. Just for what it's worth, 17% of the assumptions had natural like-mindedness and 83% had to be reconciled because they weren't like-minded. So when a group first comes together to make a decision, the data we have says 17% of their assumptions are like-minded and 83% are not. But you can now end up with this thing called a foundation document, which, to your point, you can give to a group and say: we made this decision based upon all of these assumptions. [00:36:58] Speaker A: And that really opens things up to an almost factual discussion about making changes, as opposed to feeling that you're contradicting your boss in some way. I think there's something really interesting about that. [00:37:15] Speaker B: There's a great example where an organization was looking at their product strategy, and the statement was that the competition is struggling financially and that presents us an opportunity in the market. So imagine the chief marketing officer had said that the primary competition is struggling in the market, which presents us an opportunity. Now, you can look at that and go: well, the chief marketing officer said it, it must be right. I've just mentioned that 41% of these turn out not to be accurate. In that case, he said it and very much believed it. When it went in front of the group, there was rejection of it. And, to get to the headline, when he was asked, what's the evidence, what have you seen or heard that led you to conclude this is happening with the competition, what he said was: well, look, everybody presents at the main trade show, and they've not been at the main trade show for the last two trade shows. Everybody does X in marketing, and they've not been doing the following. And you know what goes on: as soon as you've got financial trouble, the first thing you stop is your marketing expenditure. [00:38:37] Speaker A: Right. [00:38:38] Speaker B: A very rational inference. [00:38:39] Speaker A: Right? [00:38:40] Speaker B: Well, no. They were hoarding their cash to develop a killer app they were bringing out nine months later. So if you can have a group that's open to the fact that the things I think I know and believe are inferences, and open to having them, I'll say, validated, you can have much more accurate data going into your decision. Wow. [00:39:11] Speaker A: Oh, my God. Michael, this is dangerous, because we could probably go on forever, and we're almost at the end of the hour. [00:39:21] Speaker B: Wow. [00:39:21] Speaker A: Okay. So I have an incredibly selfish question to ask you. I've got three kids. I've come to the conclusion that I've taught them how to work hard, and I've made them smart in certain areas. But I feel like my big challenge is really to teach them how to make good decisions. I don't know if I can talk to them about systems and manufacturing and Six Sigma. What's the easy hack? What are the things you do when you think about your own kids? Because they're going to leave the nest at one stage, and I will no longer be able to guide them, and that terrifies me. How do we do this?
[00:39:59] Speaker B: I can tell you the first thing not to do: don't teach them about inferences. The reason is that you'll suffer. What my three children did with me when they were at home was, I would say something at the dinner table and one of them would say, Dad, have you just made an inference there? And there is nothing worse than having the student check the teacher. But on a very serious note, we are actually developing a program for high schoolers. [00:40:35] Speaker A: Oh, wow. [00:40:37] Speaker B: So my thought is, the biggest thing about decisions, again, is: do I actually know everything I need to know? Most of us make decisions with partial information. So ask: what could be going on that I don't know? What is out there that I'm not understanding? And frankly, the greatest one I've seen is this one around inferences. The more someone says to me, I know why she's doing that, the more they're presenting an inference as if it were data. You only have data when that person says, I'm doing it because. [00:41:21] Speaker A: Right. [00:41:22] Speaker B: And the thing we've found over the years is that people will say, I know why she ignored me in the playground, I know why he did that. And the interesting thing, if I can share this, is that in business as well as socially, if you listen to people talking about what somebody else is doing that they don't like, 95% of the time it is a negative judgment of the other person's intent. [00:41:52] Speaker A: Yes. Yes. [00:41:55] Speaker B: Right. Because the brain has to rationalize why they are doing something I don't like. And so what we do is we go: well, I know why he's saying that, because he's going to try and sell me something next week. So I think that's the main thing, and that's why we, unfortunately, made the mistake of teaching our children about inferences and inference accuracy. [00:42:15] Speaker A: You know, I've taught my kids to negotiate from a very young age. I'm ex military, and the culture I come from is known for being very good at negotiating. I regret that. I regret that. But, you know, obviously it's a good thing; it's just that, as a parent, I do not enjoy my kids using the tools I teach them against me. Go do that somewhere else. [00:42:46] Speaker B: Yeah. But humility is good though, isn't it? [00:42:49] Speaker A: That's very fair. Michael, what an absolute pleasure. I deeply, deeply appreciate you. Thank you for coming on the show today. [00:42:57] Speaker B: I appreciate the on-target questions, Ari, so thank you. Given you cover so many subjects, thank you. Appreciate it. Wonderful.
