Part-7: Epistemology: Grappling with Uncertainty and Risk

By H. William Dettmer

Most people make decisions based on emotional rather than logical reasons. Only after they’ve decided emotionally,
perhaps even unconsciously, do they look for rational support for their decisions.
                                                                                                 —Unknown

W. Edwards Deming maintained that real, lasting improvement was not possible without “profound knowledge” of one’s system. In order to achieve that level of knowledge, Deming said, it was critical to have a thorough understanding in four major areas: [2:96]

  • Systems
  • Theory of Knowledge
  • Variation
  • Psychology

This is a well-known taxonomy. Over the past two decades or more, the quality community has leaned very heavily on the variation aspect. Deming himself in later years emphasized appreciation for a system and psychology. But beyond saying “bring data,” few practitioners of continuous improvement have indicated a real understanding of the second bullet: the theory of knowledge.

Epistemology

Another term for theory of knowledge is epistemology. Even people who are familiar with the word often aren’t clear on its meaning. Epistemology is the branch of philosophy that studies the nature of knowledge, its presuppositions and foundations, and its extent and validity. [6] In other words, how do we know what we know about the world we live in, our part in it, and the interactions between ourselves and others?

Why is the issue of epistemology important? Because our understanding of what we know and how we know it is vital—crucial, even—to the quality of our decisions in life. And everybody makes decisions daily, sometimes very important ones.

The Three Decision-Making Conditions

Anybody making a decision generally does so under one of three mutually exclusive conditions: certainty, uncertainty, or risk. If you know that a particular outcome of your decision is a sure thing, you’ll be making your decision under a condition of absolute certainty. For example, if you’re contemplating jumping off the top of a 12-story building, you can be absolutely certain that, barring divine intervention, the sudden stop at the end will kill you. Decisions like this are usually very easy to make, because you have extremely high confidence in the inevitability of the outcome.

Some kinds of decisions can be evaluated by mathematical probability. These are considered decisions under a condition of risk. The typical example of decision under risk is gambling, say poker or blackjack. (“Gambling” is really a misnomer. It’s no gamble at all for the house, which is playing statistical probability over a large number of individual instances.)

Decision under risk depends on being able to assign, with confidence, a mathematical probability to every possible outcome. Obviously, it’s possible to determine that probability for, say, hitting or standing on sixteen in blackjack. (Statistically, seventeen is the “magic number,” the point at which your odds of winning are better when you stand on what you already have rather than taking an additional card.)
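
To make “decision under risk” concrete, here is a minimal sketch in Python that assigns a probability to one outcome of the blackjack example above: the chance of busting when you take another card on a hard sixteen. It assumes an infinite deck, so every rank is equally likely on each draw; real shoe odds drift slightly as cards are dealt.

```python
# A minimal sketch, assuming an infinite deck: the probability that one
# more card busts a hard 16. Real shoe odds drift as cards are dealt.

from fractions import Fraction

# Rank values drawn from an infinite shoe. The ace is counted as 1, since
# counting it as 11 would bust a hard 16 anyway; 10, J, Q, K all count as 10.
RANK_VALUES = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 10, 10, 10]

def bust_probability(hand_total: int) -> Fraction:
    """Probability that the next card pushes the hand past 21."""
    busting = sum(1 for v in RANK_VALUES if hand_total + v > 21)
    return Fraction(busting, len(RANK_VALUES))

if __name__ == "__main__":
    p = bust_probability(16)
    print(f"P(bust | hit on hard 16) = {p} = {float(p):.3f}")  # 8/13 = 0.615
```

With roughly a 62 percent (8 in 13) chance of busting, the decision can be computed rather than guessed, and that computability is exactly what distinguishes risk from uncertainty.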

Decision under uncertainty is the tough nut to crack. Unfortunately, this is the situation in which most people find themselves most often—certainly in the decisions that matter most: a person’s career, a major financial investment, whether to marry a particular person, or a business decision. Making decisions under conditions of uncertainty means that there are serious doubts about the outcome and that you can’t effectively assign discrete mathematical probabilities to the possible outcomes. In other words, it’s worse than a crap shoot. If most of our decisions are made under uncertainty, how can we possibly make them, particularly the critical ones, with any degree of confidence that we’re doing the right thing?

Intuition

One way is to use intuition. Everybody has heard of intuition, and many people depend on it absolutely to make the right decisions. Intuition is the act or faculty of knowing or sensing something without the use of rational processes: immediate cognition, or a perceptive insight. It can also be considered a sense of something not evident or deducible, an impression. [6] When the hair on the back of your neck stands up, it may well be your intuition telling you that something is not quite right. Intuition serves some people very well. But unless you’re reliably clairvoyant, it can lead you wrong as often as right.

Mathematical Models or Simulations

One of the most seductive aids people (and businesses) use to help them make decisions is models or simulations. A number of companies (particularly software companies) have made a lot of money selling the idea that a computer is better at making complicated decisions than humans are. For fairly routine, repetitive decisions, this is certainly true: computers can usually make such decisions much faster and more reliably over time than humans can.

But models and computer simulations have a major shortcoming: they require that all the relevant variables in any decision be mathematically quantifiable, or at the very least that mathematical probabilities be assignable to them. Deming once said, “…the most important figures that one needs for management are unknown or unknowable.” [1:121] He recognized that many important things that must be managed can’t be measured. In other words, you can’t measure everything of importance to management, but you must still manage it all.

Computers and simulations are capable of great precision, but as Goldratt once observed, it’s often better to be approximately correct than precisely wrong.
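
A short sketch makes this limitation visible. The Monte Carlo model below reports a project’s mean cost to the cent, but only after every input has been given a probability distribution. All the numbers in it (the cost spread, the delay penalty, and especially the probability of delay) are hypothetical assumptions, not data from any real project.

```python
# A minimal Monte Carlo sketch. Every input below is an assumed,
# hypothetical distribution -- the model cannot run without them.

import random

def simulate_project_cost(p_delay: float, trials: int = 100_000) -> float:
    """Mean total cost under an *assumed* probability of delay."""
    total = 0.0
    for _ in range(trials):
        cost = random.triangular(90, 140, 110)  # assumed base-cost spread
        if random.random() < p_delay:           # the "unknowable" input
            cost += 25                          # assumed delay penalty
        total += cost
    return total / trials

if __name__ == "__main__":
    random.seed(42)
    # The model prints answers to the cent, but they swing with the guess:
    for p in (0.1, 0.3, 0.5):
        print(f"assumed P(delay) = {p:.1f} -> mean cost {simulate_project_cost(p):.2f}")
```

This is Goldratt’s warning in miniature: the output looks precise, yet it is no more correct than the assumed probabilities feeding it.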

Making Better Decisions Under Uncertainty

Decisions under uncertainty will always be…well, uncertain. But there are ways to reduce uncertainty to a reasonable level. One approach is to combine verifiable facts or evidence, to the extent that they are available, with logically verifiable causality. Most people are comfortable with the idea of basing decisions on facts or evidence, but they’re less certain about how logic and facts combine to provide the best available basis for decision making.

In the mid-1950s, Luft and Ingham conceived the Johari window to help explain how members of groups learn to interact. [7] With a little adaptation, the Johari window can be used to explain the state of our knowledge about the world around us. (Figure 1) [5:85]

The upper left pane of the window (A) represents the domain of our certain knowledge—what we know, and know that we know. The lower left pane (B) represents identified gaps in our knowledge—what we are aware of that we don’t know. The upper right pane (C) represents knowledge that we possess but whose presence or significance we are unaware of. And the lower right pane (D) is the domain of knowledge that we don’t know, where we are ignorant of our own ignorance.

Our search for facts or evidence seeks to move the contents of B to A. “Blue sky” research seeks to reduce the size of D and move some of it to A as well. Our search for relationships among known facts seeks to move some of the contents of C to A. But facts alone are not of much use unless and until they are connected by causal relationships.
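
For readers who think in code, the adapted window can be expressed as a simple two-axis classification. The axes used here (whether we possess the knowledge, and whether we are aware of it) are my reading of Figure 1, offered as an illustration rather than as Luft and Ingham’s original formulation.

```python
# A minimal sketch of the adapted Johari window as a 2x2 classification.
# The axis names are assumptions drawn from the description above.

def johari_pane(possessed: bool, aware: bool) -> str:
    """Classify a piece of knowledge into one of the four panes."""
    if possessed and aware:
        return "A: known knowns (certain knowledge)"
    if not possessed and aware:
        return "B: known unknowns (identified gaps)"
    if possessed and not aware:
        return "C: unknown knowns (unrecognized knowledge)"
    return "D: unknown unknowns (ignorance of our ignorance)"

# Fact-finding moves items from B to A; "blue sky" research probes D;
# connecting facts we already hold moves items from C to A.
if __name__ == "__main__":
    for possessed in (True, False):
        for aware in (True, False):
            print(johari_pane(possessed, aware))
```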

Rules of Logical Causality: The Categories of Legitimate Reservation

The validity (or invalidity) of the causal connections among facts, hypotheses, and conclusions is governed by a finite set of rules called the Categories of Legitimate Reservation (CLR). [4:34-56] Most of these have their roots in Aristotle’s logical fallacies. [3:57-58] There are eight of these rules:

  1. Clarity (the complete understanding of a word, idea, or causal connection)
  2. Entity Existence (the verifiability of a fact or statement)
  3. Causality Existence (the direct-and-unavoidable connection between a proposed cause and a particular effect)
  4. Cause Sufficiency (complete accountability for all contributing, dependent causes in producing an effect)
  5. Additional Cause (existence of a completely separate and independent cause of a particular effect)
  6. Cause-Effect Reversal (misalignment of cause and effect)
  7. Predicted Effect Existence (additional expected and verifiable effect of a particular cause)
  8. Tautology (circular logic, or existence of effect offered as rationale for a proposed cause)

The conscientious application of these eight rules of logic to the facts that we know, and the facts we can uncover, permits us to make causal connections between them, to conclude truths about larger, more complex relationships—to convert mere data to information, and to make sense of aggregated information…knowledge! (Refer to Dettmer, The Logical Thinking Process (2006) for a more comprehensive explanation of the CLR.)
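
As an illustration only (the data model and names below are assumptions, not part of Dettmer’s formal method), here is how the CLR might operate as a review checklist in code: each proposed cause-effect link collects whatever reservations are raised against it, and the link is accepted only while no reservations stand against it.

```python
# An illustrative sketch only: the CLR as a review checklist over a
# proposed cause-effect link. The data model and names are assumptions,
# not part of Dettmer's formal method.

from dataclasses import dataclass, field

CLR = [
    "Clarity", "Entity Existence", "Causality Existence",
    "Cause Sufficiency", "Additional Cause", "Cause-Effect Reversal",
    "Predicted Effect Existence", "Tautology",
]

@dataclass
class CausalLink:
    cause: str
    effect: str
    reservations: list = field(default_factory=list)  # open CLR challenges

    def challenge(self, category: str, note: str) -> None:
        """Raise a reservation; only the eight CLR categories are legitimate."""
        if category not in CLR:
            raise ValueError(f"not a legitimate reservation: {category}")
        self.reservations.append((category, note))

    @property
    def valid(self) -> bool:
        """The link stands only while no reservations remain open."""
        return not self.reservations

if __name__ == "__main__":
    link = CausalLink("sales training was cut", "quarterly revenue fell")
    link.challenge("Additional Cause", "a competitor also cut prices this quarter")
    print(f"valid: {link.valid}; open reservations: {link.reservations}")
```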

Summary and Conclusion

In summary, what we truly know about our system, the interactions among its components, and its interaction with the external environment is a function of two factors: the volume of verifiable facts or evidence we have concerning the world around us, and the quantity and quality of the logical causal connections we can establish and verify. Our ability to orient ourselves in our environment (the second step of Boyd’s O-O-D-A loop) depends on these two factors.

In our next installment, we’ll examine the premier tool in the world for making these causal connections: the Logical Thinking Process.

Endnotes
1. Deming, W. E. Out of the Crisis. Cambridge, MA: MIT Center for Advanced Engineering Study, 1986.
2. _______. The New Economics for Industry, Government, Education. Cambridge, MA: MIT Center for Advanced Engineering Study, 1993.
3. Dettmer, H. W. Breaking the Constraints to World-Class Performance. Milwaukee, WI: ASQ Quality Press, 1998.
4. Dettmer, H. W. The Logical Thinking Process: A Systems Approach to Complex Problem Solving. Milwaukee, WI: ASQ Quality Press, 2006.
5. Dettmer, H. W. Strategic Navigation: A Systems Approach to Business Strategy. Milwaukee, WI: ASQ Quality Press, 2003.
6. http://dictionary.reference.com/
7. Luft, J. Group Processes: An Introduction to Group Dynamics, 2nd ed. Palo Alto, CA: National Press Books, 1970.
