Don’t solve the wrong problems precisely. Type 3 Errors and Type 4 Errors, by Ian Mitroff, extending the Design of Inquiring Systems.
How can we tell whether others are deliberately steering us down faulty paths? Mitroff delves into how organizations and interest groups lure us into solving the “wrong problems” with intricate but inaccurate solutions based on faulty and erroneous assumptions – and offers strategies and solutions.
Video of 9m57s (with slides) on “Book TV: Ian Mitroff & Abraham Silvers, Dirty Rotten Strategies” at https://www.youtube.com/watch?v=CjgJVp9f_1k
Video of 59m47s (with slides) on “Book Discussion on Dirty Rotten Strategies” at http://www.c-span.org/video/?292366-1/book-discussion-dirty-rotten-strategies
Audio podcast 1h41s downloadable at http://www.commonwealthclub.org/events/archive/podcast/ian-mitroff-dirty-rotten-strategies-how-we-trick-ourselves-and-others-solving
[This digest started with the Youtube transcript, and therefore initially uses that time code to 09m57s. The version on c-span.org has a 14 second header and then runs to 59m47s.]
[00:00] If I had to sum up the book in a single statement, it would be: don’t solve the wrong problems precisely, because if you do, it is not only a waste of precious resources, time and energy, but it leads to cynicism and despair, and puts off the true problems such that they build up into a crisis.
[00:19] Also, if I had to summarize in a single saying, it would be from the celebrated author Thomas Pynchon: if they can get you asking the wrong questions, then they don’t have to worry about the answers.
- Thomas Pynchon, Gravity’s Rainbow: “Proverbs for Paranoids, 3: If they can get you asking the wrong questions, they don’t have to worry about answers.” https://books.google.com/books?id=GGPm4I3BbxAC&q=wrong%20questions
[00:40] What’s worse, the wrong solution to the right problem, or, the right solution to the wrong problem?
[00:50] Well, the right solution to the wrong problem is worse, because if you get the “right solution to the wrong problem” you convince yourself that you’ve solved the right problem, and you don’t go back to the start of the tunnel, to the point where all of the different branches split off.
[01:07] You say, I’ve gone down the right path. But if you keep getting the wrong solution to the right problem you say ok, I’ve made an error and hopefully the error will be self-correcting or I will eventually come to the right solution.
[01:20] Why is solving the wrong problem precisely, as I said, a waste of time?
[01:25] In every case, whether you solve the right or wrong problem, it’s due to a set of assumptions. Solving the wrong problem precisely is due to faulty assumptions, which leads to having to know your assumptions.
[Ally Bank, “Pony”, see http://www.adweek.com/video/ally-bank-pony-121402]
[03:16] Let me give you an overview of the talk and what’s in the book. So, we’re gonna talk about something called E3 and E4: Error of the Third Kind, Error of the Fourth Kind. They’re central to solving the right or the wrong problem.
E3: Trick Ourselves
E4: Trick Others
[03:54] Let me start with E3 and E4.
[03:56] If you take a course in statistics, just about every course talks about two types of error, type 1 and type 2 error. Everybody who’s taken a course knows about that.
[04:07] And the easiest way to understand it is: you’re a drugmaker. You have a new drug, and an old drug. And what you do is go out and test them on a sample, and hopefully the new drug is better than your old drug.
[04:17] But there are two types of errors you can make.
[04:19] One error is to say the new drug is better than the old drug when it really isn’t.
[04:24] And vice versa, the old drug is better than the new drug when it really isn’t.
[04:27] And those are type 1 and type 2 errors.
[04:29] And those have to do with the bell-shaped curve and whether you’ve got the right samples.
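Mitroff’s drug-trial illustration of Type 1 and Type 2 errors can be sketched as a small simulation. This is a hypothetical sketch, not anything from the talk: the cure rates, sample size, and crude decision rule are all assumed for illustration.

```python
import random

random.seed(0)  # make the simulation repeatable

def trial(p_new, p_old, n=500):
    """One simulated trial: does a naive comparison declare the new drug better?

    p_new, p_old -- true cure probabilities (assumed values, for illustration)
    n            -- patients per arm
    """
    cured_new = sum(random.random() < p_new for _ in range(n))
    cured_old = sum(random.random() < p_old for _ in range(n))
    # Crude decision rule (an assumption): declare "new is better" if the
    # observed cure rate beats the old drug by more than 3 percentage points.
    return (cured_new - cured_old) / n > 0.03

TRIALS = 2000

# Type 1 error: the drugs are equally effective (both 60%),
# yet we declare the new one better.
type1_rate = sum(trial(0.60, 0.60) for _ in range(TRIALS)) / TRIALS

# Type 2 error: the new drug really is better (70% vs 60%),
# yet we fail to detect it.
type2_rate = sum(not trial(0.70, 0.60) for _ in range(TRIALS)) / TRIALS

print(f"Estimated Type 1 error rate: {type1_rate:.3f}")
print(f"Estimated Type 2 error rate: {type2_rate:.3f}")
```

Note that no simulation like this can catch an E3: if the hypothesis being tested is the wrong one to begin with, both error rates can look fine while the wrong problem is being solved precisely.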
[04:36] E3 is very different. Have I tested the right hypothesis to begin with? Am I asking the right question?
[04:43] So whether it’s the cost or the efficacy of a drug or of health care, E3 has to do with how we define the problem in the first place.
[04:52] And so E3 is when we trick ourselves. Not necessarily anybody else, but we trick ourselves. Okay, we fall in love with our pet hypothesis.
[05:00] E4 is more deceptive and potentially more harmful.
[05:04] It’s when I try to convince you, that the formulation that I and my company, my organization or industry has come up [with] is the right formulation of the problem. And that you ought to accept it. And there is no other way to formulate the problem.
[05:18] So it largely starts, fundamentally, with miseducation.
[05:22] … in the book …
Starts with Mis-education
X + 6 = 11 is an exercise.
Exercises ≠ Problems
Problems ≠ Messes
[05:25] I’m not a proponent of textbooks. Most of us start learning things from textbooks. So the first thing we learned was X + 6, for example, equals 11. What’s X? That’s not a problem. It’s an exercise. The reason why it’s not a problem: it’s already preformulated. There’s one and only one right answer. But you can usually convert it into a problem.
[05:46] Billy has six dollars and needs eleven dollars to buy a video game. But Billy is in a poor family. He has to give his money to help his mother and father. Then it becomes a problem.
[05:55] Because the context is all-important. Exercises remove all the context and description.
[06:01] Now the problem with exercises: you give students, you know, 12 or 20 years, whatever it is, of education with exercises. You turn them into certainty junkies, and they balk like mad if you give them a real problem where they have to formulate the problem.
[06:15] Real problems have more than one way to be formulated. There’s not just one formulation.
[06:19] So you get into problem negotiation. But you don’t get that, as you go through typical education. Exercises don’t equal problems. And problems don’t equal messes.
[06:30] A mess is a whole system. A set of problems that are dynamically interconnected and change all the time. This is Russ Ackoff, who died recently, one of my mentors.
[06:40] But managers don’t solve problems, they manage messes.
[06:44] And that’s what President Obama certainly has to do. It’s not a single well-defined problem, but how all these things are interconnected so the health care problem is not separate from the financial recovery and jobs recovery and all the rest.
[06:57] In fact, if you have a mess, and I’ll show you an example, and you take any of the elements or problems out of the mess that constitute it, you distort the problem. You distort the mess, because you have to look at the interactions. Problems are not separable.
[07:13] Health care. Let me give an example of how we get off and solve the wrong problem.
Technically, the US has the best Medical System.
But, Technology ≠ Best Health Care System.
US has a poor Sick Care System.
Solves which problem?
[07:17] Technically, the U.S. has the best medical system in the world. No question about that, from a technical standpoint.
[07:24] But technology does not actually equal the best delivery of health care as we want it.
[07:29] They’re not the same. So solving the medical problem is not the same as solving the health care problem.
[07:36] In fact, the U.S. has a poor sick care system …
[08:04] The health care system — and we’ll talk about the current health care bill — is founded on three primary assumptions. (1) Government is the problem. (2) Healthcare is a business like any other business. And (3) cost-cutting is the primary aim.
Three Wrong Assumptions
1. Government is the problem
2. Health care is a business
3. Cost-cutting is the primary aim
[09:29] Now, it’s not that you have to accept my formulation or my statements. That’s not the point. But I put my assertions as strongly as possible, so you know what I’m saying. If you disagree, you hopefully then have better clarity on what you do agree with.
[Switch to c-span.org timecode]
[10:40 slide] The Critical Role of Critical Assumptions
[10:42] Everything is dependent upon assumptions.
[10:55] What happens in a crisis is principally this: it’s not, yeah, that people die, which they do, that it costs a lot of money, that the organization loses money.
[11:06] One of the primary things that happens, that most people aren’t aware of: a crisis literally demolishes all, or nearly all, of the principal assumptions that we use to give meaning to our life, to our reality. That’s why I give an existential definition to a crisis.
1. Mental health professionals
2. One of our own
[11:50] When I listen to crises, I take them in a different way, because I’ve been so tuned to crises for 25 years. In virtually every case, a crisis undermines a primary set of beliefs that we use to make sense of reality. And that’s why they’re existential crises.
Twittering in Operations
Normalization of the bizarre
How can we determine if we are committing an E3 or E4 error?
[16:40] You can’t really determine whether you’re committing a Type 3 or Type 4 Error, if you’re only using models 1 and 2, the first two ways. Because they typically only produce one view of a problem, what they take as a “correct one”.
[16:55] It’s only when you get to multiple ways of defining the problem that you can begin to get a handle on “what is truth” or “what is false”. Otherwise you can’t do it. Not that it’s perfect. I’m not saying that. It’s only when you get to, then, 4 or 5.
[17:10] When you get down to 5, it’s the rarest type of knowledge system of all. We don’t train people how to think systemically. And that’s really the only way out of these horrific problems we face. They can’t be defined by one discipline, one profession. In fact, when I hear people come up with — boom — one definition of any problem, I want to run like mad, because I know you’ll have to accept their assumptions. It’s very rare that people make their assumptions clear.
Solutions to the social problems of 2000-5000 years ago
Rational reasons for God
Not the wrong solution to the wrong problem
[Karen Armstrong, The Case for God; contrary to Richard Dawkins]
[18:50] In fact, one of the first books I did was, The Subjective Side of Science. I studied the Apollo moon scientists, not the astronauts. And if you think a scientist worth his or her salt is going to give up his or her pet hypothesis, particularly for the origin of the moon, just because the first round of rocks are returned from the moon, you’ve got to be crazy. They’re going to do everything they can in the world to defend it. Ultimately, they’re going to give it up. But only after they’ve defended it to the death. When I interviewed 42 of the most prestigious scientists, they said that was rational, that a scientist shouldn’t give up his or her pet hypothesis, too soon, lest they give up something worth exploring.
Messes cannot be managed by the mindsets that created them.
[A paraphrase of Albert Einstein]
[20:50] The fundamental purpose of a university, to me, is to teach critical thinking. Yes, teach technology, and theories, and all the rest of that. Knowledge. Of course, all of that is important. But the fundamental job is critical thinking. And critical thinking involves knowledge of assumptions, to be able to criticize your assumptions, to be able to replace them, to think about alternate assumptions, and to be able to appreciate complex messes, not simple-minded problems, in their entirety. And to bring to bear on them multiple ways of looking at them, from multiple disciplines, from multiple points of view. To say, by looking at the mess, maybe now I have a better idea of which parts of the mess I want to concentrate on for the time being. But in order to know that, I have to see the entire mess.
[21:40] Is there any way to definitely say that you understand the mess fully? Of course not, it’s a starter. [….]
[22:10 Slide modified from George Patton: “If everybody is thinking alike, then NOBODY IS thinking” (Mitroff & Silvers)]
[24:25 Los Angeles Police Department]
[25:20 Ford Firestone]
[26:45 Bill Clinton]
[27:10] What I’ve said to my clients, the people I’ve consulted with, is that if you have only one thing to do in a crisis, my recommendation is: hire an ex-investigative reporter to dig around all of the dirt of the corporation, and make a mock newspaper or a mock tv interview, to show your corporation in the worst light. Because I can guarantee you that’s what will happen. Now why doesn’t that happen? Denial is so powerful.
[28:00 Defence mechanisms]
[29:00 We don’t have learning organizations]
[30:00 Environmental organization. False choices, that lead to false policies]
[31:00 Five inquiry methods are abstract. Have turned them into planning methods. An example: Myers-Briggs Type Indicator. Put all of the people of one personality type into the same group. When you do that, it intensifies the way of looking. Ask them to define the problem. List the major stakeholders that affect or are affected by the solutions, and what assumptions they make. A systematic way to get a constructive debate]
[32:30 If you can’t get a Myers-Briggs, here’s another way to do it. One group to argue status quo, whether they believe it or not. Put people who are in moderate opposition, then more, then radical. Then list the major stakeholders.]
[33:20 The biggest problem on which I’ve worked. The U.S. Census Bureau, 1980, 1990. Undercount. We set up a week-long debate.]
[34:30 If you have a small organization, could be hard. Professional management.]
[35:10 More systemic methods. Where are we cultivating?]
[35:45 Book, in chapter on religion. Ken Wilber. The power of human development. Once an idea is unleashed, it gains currency, and can take off]
[36:50] That’s one of the Type 3 errors that I talk about on the chapter on religion. Here it is: confusing one state of development for the lack thereof in another state. And that’s one of our principles. We try to solve problems at one level, by a level one or two steps down, and they can’t be solved. That’s the whole point. And that’s why I’m talking about systemic. Because the problems that we have cannot really be solved, unless, there’s a systemic perspective. [….]
[37:40] I have to thank my editor at Stanford University Press for taking a radical manuscript, like this, with this kind of a message, to say: the ability to challenge our assumptions, to rethink our assumptions, to think expansively, to think beyond the confines of a single narrow discipline or profession. That’s the way out. If we’re mired in one set of assumptions, or one organization, we can’t do it. [….]
[38:20 15% of organizations can think proactively, systemically. 85% can’t.]
[38:50 Global warming]
[39:15 Comment. Set up 20% of the time up front, defining the problem. Then we can work on solving the problem.]
[40:00 Agree. John Dewey said problems don’t start in disinterest, they start in moral outrage. Wellpoint.]
[40:50 Four steps of scientific problem solving. 1. Defining the problem. Conceptual model of the problem. Broad variables. Single explanation, no! Each profession will define it differently. Advanced medical students, psychology students.]
[41:50 2. Build a scientific model. The first stage is semantic. The second stage is syntactic].
[42:00 The third stage is to derive a solution, not to the solution, but to the model.]
[42:10 The fourth stage is pragmatic. Take the solution and see if it solves the problem.]
[42:20 The Type 3, Type 4 errors primarily happen in the first stage, defining the problem wrongly. If you don’t see all of the stages, you have defined the problem incorrectly. Different people focus on different branches. They don’t see the scientific problem systemically].
[43:20 Initially, doing dialectic doubles the time. In reality, the more you do it, it doesn’t double the time. You can’t define one, without the other].
[44:10 Government, messes everywhere.]
[44:25 Broader than that. Type 3, Type 4 in many areas. Governments. Corporations. Point of strategic planning is not thinking about isolated problems, and to anticipate problems. Don’t see one as better than others. General Motors bureaucracy rivals federal government.]
[46:30 Los Angeles Police Department]
[46:40 Salt Lake Winter Olympics. Problem was Russian ice dancers downgraded. Didn’t think of all of the crises that could hurt the Olympic committee. Could show families and groupings of crises. It’s not the case that there are no good organizations to learn from.]
[48:30 Ken Wilber. Ability to think more complexly. Challenges are greater. Hope.]
[49:00 International news 15 minutes every evening, now 24 hour news. Exacerbates problems?]
[49:40 More is not better. May not lead to more insightful. PBS 6 or 10 minutes more insightful. Facebook and social media hasn’t led to better coverage. Can now manipulate and merge images. People can’t tell the difference, don’t care about the difference. Ally Bank commercial]
[51:10 Michael Vick, football player. Moral devaluation. Someone auctioned off notes. Expect more from the human society.]
[52:30 Media, merging images, real and non-real. Young people hooked on texting while driving, a problem. Technologically advanced is not the same as socially advanced. An engineer, but not solely an engineer.]
[53:50 Tiger Woods. Got so big, the rules didn’t pertain to him. The first billion dollar athlete. Shows how rapidly an icon can crash. No secrets. Horrible stitched-together videos. That’s what will happen. Who will hire someone who will put them in the worst light? Primary thing in crisis: you don’t own the clock. The only way to gain control is to fess up, and hopefully the American public will accept it.]
[57:20 Crisis. You can’t solve the right problem. Assumptions have a half life, and decay over time. As circumstances change, your assumptions have to change. As Ackoff said, plan or be planned for. Have a real learning organization and a real learning society].