Decision analysis^{1} is a systematic approach to solving decision problems optimally under conditions of uncertainty. It *does not describe* how or why an individual makes a decision; it *prescribes* a decision procedure for the individual that is *consistent with his or her preferences and attitudes toward risk*. You might be asking yourself: "Why do I need to approach decision making with a scientific method, when I know that most decisions (business and otherwise) are made on an intuitive level?" The answer is threefold:

- It is true that the vast majority of decisions made in business do not require, and are made without, formal analysis. But for that one crucial decision upon which "everything depends", it is very helpful to have a systematic, logical decision procedure to follow.
- Most of us have little experience in intuitive processing of the probabilistic and sample information that may confront us in a complex decision-making problem. Consequently, it is frequently more profitable to rely on the mechanical approaches of decision analysis than on the less reliable information-processing capabilities of our intuition.
- To use decision analysis, we are forced to consider carefully and logically all possible courses of action and the outcomes that could result from each. By doing so, we may see a side of the problem not seen before, or we may even discover that we have been addressing the wrong problem. Thus, the information obtained from decision analysis may more than compensate for the effort expended in the analysis.

One of the alternatives we face in making a decision is whether the decision should be made *now* – using information we already have on the problem (we will refer to this as *prior information*) – or *postponed* until we have gathered additional (sample) information. Although all decision problems involve the selection of a course of action from among two or more alternatives, we can classify them into one of three categories:

- Decision making under certainty
- Decision making under uncertainty and risk
- Decision making under conflict

**Decision making under certainty** entails the selection of a course of action when we know the result to which each alternative action will lead. If the number of alternatives being considered is small, such decisions may be easy to make. However, if the number of alternatives is large, the optimal decision may be difficult – if not impossible – to obtain: it may take too much time and/or be too costly to evaluate all the alternatives and select the one with the most favourable result. Many problems of this type can be solved using operations research methods (linear programming, queuing theory, …) and will not be addressed in this text. In another class of problems we must evaluate the alternatives according to multiple, often conflicting, criteria; some methods for solving such multi-criteria decision-making problems are introduced later in this module. It is important to note that "certainty" is often only a theoretical concept and is very rare in practical problems; it is therefore important to analyze the sensitivity of results to changes in inputs and to evaluate potential risks.

**Decision making under uncertainty and risk** entails the selection of a course of action when we do not know with certainty the result to which each alternative action will lead. Furthermore, we assume that the outcome of whatever course of action we select is affected only by chance, not by an opponent or competitor.

**Decision making under conflict** is similar to decision making under uncertainty and risk, in that we *do not know* with certainty the result of each alternative action. However, the reason for this uncertainty is different: we are "playing against" one or more opponents or competitors, and the outcome of our chosen course of action depends on the decisions they make. Decision problems of this type fall under the discipline known as game theory and are not discussed in this text.

**Decision making under uncertainty and risk**

Four basic elements common to decision problems involving uncertainty are:

- **Actions:** the set of two or more alternatives the decision maker has chosen to consider. The decision maker's problem is to choose one action from this set. Example: to start development of a new product, or not.
- **States of nature:** the set of two or more chance events upon which the outcome of the decision maker's chosen action depends. The states of nature should not overlap, so that they are mutually exclusive and the probability of any union of states equals the sum of their probabilities. They should also be complete, in the sense that they cover the whole spectrum of possibilities; completeness ensures that the probabilities of the states of nature sum to 1. Example: the intervals of expected market share for the new product, based on market research (0-5%, 5-10%, 10-20%, more than 20%).
- **Outcomes:** the set of consequences resulting from all possible action/state-of-nature combinations. In general, outcomes reflect the actual reward to the decision maker in terms of payoff. Alternatively, outcomes can be expressed in terms of opportunity loss, i.e. the difference between the payoff the decision maker receives for a chosen action and the maximum he or she could have received by choosing the action yielding the highest payoff for the state of nature that occurred. Example: the expected payoff if we choose to develop the new product and the market share turns out to be 5-10%.
- **Objective variable:** the quantity used to measure and express the outcomes of a decision problem.
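The four elements can be captured in a small payoff-table structure. The sketch below is purely illustrative – the actions, states and payoff values are invented for the new-product example, not taken from any table in this text – and it also shows how opportunity loss is derived from the payoffs:

```python
# Sketch of the four basic elements (all names and values hypothetical).
actions = ["develop product", "do not develop"]      # actions
states = ["0-5%", "5-10%", "10-20%", ">20%"]         # states of nature
# payoffs[i][j]: the outcome (objective variable, e.g. profit in M EUR)
# for action i under state of nature j
payoffs = [
    [-2.0, 1.0, 4.0, 9.0],   # develop product
    [ 0.0, 0.0, 0.0, 0.0],   # do not develop
]

# Opportunity loss: best payoff achievable in each state, minus the
# payoff actually received for the chosen action in that state.
best_per_state = [max(row[j] for row in payoffs) for j in range(len(states))]
opportunity_loss = [
    [best_per_state[j] - payoffs[i][j] for j in range(len(states))]
    for i in range(len(payoffs))
]
print(opportunity_loss[1])  # loss of "do not develop" in each state
```

Storing the outcomes as a matrix indexed by action and state of nature is exactly the payoff-table formulation used below for decision making under risk.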

**Decision making under risk**

Whenever possible, we attempt to determine (or estimate) the probabilities of the states of nature. In this case, we are talking about decision making under risk. The problem can be formulated in terms of a payoff table: the rows of the table represent the actions, the columns represent the states of nature, and the elements of the matrix are the outcomes for each action/state-of-nature combination. Let us illustrate the model with an example:

**Example 1:**

A beer producer intends to expand and build a new brewery. The problem is to determine how large a brewery to construct. It has been decided that the size should be based on the planned gross profits for the fifth year of operation. The marketing department forecasts that the company cannot obtain more than a 15% market share during the fifth year of operation, and that the probabilities of the market shares are 0.4 for 0-5%, 0.5 for 5-10% and 0.1 for 10-15%. The analysis resulted in the following payoff matrix (payoffs in millions of €):

The most commonly used criterion is the **expected payoff**: the decision maker selects the action that produces the maximum expected payoff. In terms of probability theory, the random variable x is the payoff and the associated probability is the probability of occurrence of the corresponding state of nature. The expected (mean) value of x is then

E(x) = Σ x p(x)

The expected payoffs are calculated in the last column of the payoff table, and according to the expected-payoff criterion the decision maker selects action a_{2}, i.e. recommends building the medium-sized brewery.
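The expected-payoff computation can be sketched as follows. Since the payoff matrix itself is not reproduced here, the payoff values below are hypothetical, chosen only so that the medium-sized brewery a_{2} comes out best, as in the example; the probabilities are those given above:

```python
# Expected-payoff criterion: for each action, compute E(x) = sum of x * p(x)
# over the states of nature, then pick the action with the largest value.
# The payoff values are hypothetical (the original table is not reproduced).
probs = [0.4, 0.5, 0.1]            # P(market share 0-5%, 5-10%, 10-15%)
payoffs = {                        # gross profit in millions of EUR (invented)
    "a1 small":  [10, 10, 10],
    "a2 medium": [ 5, 15, 15],
    "a3 large":  [-10, 10, 40],
}

expected = {a: sum(x * p for x, p in zip(row, probs))
            for a, row in payoffs.items()}
best = max(expected, key=expected.get)
print(best)  # -> a2 medium
```

With these invented numbers the expected payoffs are 10, 11 and 5 million €, so a_{2} wins even though a_{3} offers the single largest payoff.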

Please stop and think:

**Decision making under uncertainty**

If the decision maker is unwilling or unable to determine the probabilities of the states of nature, he or she faces decision making under uncertainty. There are several criteria that do not require the assignment of probabilities in order to reach a decision. Two common nonprobabilistic criteria are the **maximax** and **maximin** decision rules.

To use the **maximax** rule, determine the maximum payoff associated with each action and choose the action that corresponds to the **maxi**mum of these **max**imum payoffs. This criterion ignores all information except the maximum value; it is optimistic, as it is based on the assumption that the most favourable state of nature will occur. In our brewery example, the maximax criterion would select action a_{3} – the large brewery.

To use the **maximin** rule, determine the minimum payoff associated with each action and choose the action that corresponds to the **maxi**mum of these **min**imum payoffs. This criterion ignores all information except the minimum payoff for each action, and is therefore pessimistic. In our brewery example, the maximin criterion would select action a_{1} – the small brewery.
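Both nonprobabilistic rules reduce each action to a single number – its best or its worst payoff. A minimal sketch, using hypothetical payoff values chosen so that the rules pick the large and small brewery respectively, as in the example:

```python
# Maximax and maximin on a hypothetical payoff matrix (values invented,
# shaped like the brewery example: rows = actions, columns = states).
payoffs = {
    "a1 small":  [10, 10, 10],
    "a2 medium": [ 5, 15, 15],
    "a3 large":  [-10, 10, 40],
}

# maximax: best of the row maxima (optimistic)
maximax = max(payoffs, key=lambda a: max(payoffs[a]))
# maximin: best of the row minima (pessimistic)
maximin = max(payoffs, key=lambda a: min(payoffs[a]))
print(maximax, maximin)  # a3 large a1 small
```

Note how each rule throws away all but one entry per row, which is why the two can recommend opposite extremes for the same table.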

**Utility, risk attitude**

The criteria mentioned above sometimes fail to provide decisions compatible with the decision maker's attitude toward risk. An objective variable reflecting this attitude is called the utility function, and the values it assigns to the outcomes are referred to as utilities. The utility function maps payoff values to utilities, usually from the interval [0, 1]. As a detailed explanation of this often non-intuitive approach is beyond the scope of this introductory text, we refer the interested reader to the corresponding literature^{2}.
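As a small illustration of the idea only (not the method of the cited literature), the sketch below uses an exponential utility function rescaled to [0, 1]; the payoff range and the risk-aversion parameter are assumptions:

```python
import math

# Hypothetical exponential utility rescaled to [0, 1] over an assumed payoff
# range; r > 0 gives a concave curve, i.e. a risk-averse decision maker.
LO, HI = -10.0, 40.0               # assumed payoff range (millions of EUR)

def utility(x, r=0.05):
    scale = lambda v: 1.0 - math.exp(-r * (v - LO))
    return scale(x) / scale(HI)    # utility(LO) == 0, utility(HI) == 1

# A risk-averse decision maker prefers a sure payoff of 10 to a 50/50
# gamble on -10 or 40, although the gamble's expected payoff (15) is higher.
eu_sure = utility(10.0)
eu_gamble = 0.5 * utility(-10.0) + 0.5 * utility(40.0)
print(eu_sure > eu_gamble)  # True
```

Replacing payoffs by utilities and then maximizing *expected utility* is what makes the criterion sensitive to the decision maker's risk attitude.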

Please stop and think:

**Decision trees**

Decision trees are very useful tools for decision making under risk. In principle, any payoff matrix can be converted to a decision tree; decision trees, however, can be applied to a much broader range of problems, especially multi-phase projects. These are typical of R&D and innovation, where work is usually performed within the framework of a multi-stage process (a typical representative being the stage-gate process of R. Cooper^{3}), each stage having some probability of success. Again, a detailed explanation of this method goes beyond the scope of this introductory text, and we refer the interested reader to the corresponding literature^{4}.
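To give the flavour of the method, here is a minimal rollback (backward-induction) sketch: chance nodes are averaged by probability, decision nodes take the best branch. The tree structure and all numbers are hypothetical:

```python
# Minimal decision-tree rollback: evaluate the tree from the leaves upward.

def rollback(node):
    kind = node["type"]
    if kind == "payoff":                       # leaf: a concrete outcome
        return node["value"]
    if kind == "chance":                       # average branches by probability
        return sum(p * rollback(child) for p, child in node["branches"])
    if kind == "decision":                     # pick the best branch
        return max(rollback(child) for _, child in node["branches"])
    raise ValueError(f"unknown node type: {kind}")

# Invented one-stage R&D decision: develop (with a chance of technical
# success) or abandon; development cost is folded into the leaf payoffs.
tree = {"type": "decision", "branches": [
    ("develop", {"type": "chance", "branches": [
        (0.6, {"type": "payoff", "value": 30.0}),   # success
        (0.4, {"type": "payoff", "value": -10.0}),  # failure
    ]}),
    ("abandon", {"type": "payoff", "value": 0.0}),
]}
print(rollback(tree))  # expected value of the best strategy
```

Multi-stage (stage-gate) projects are modelled the same way, simply by nesting further decision and chance nodes under the success branches.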

**Decision making using sample (a posteriori) information**

Sometimes it may be possible and useful to postpone the decision until we have gathered further information that can be used to revise the prior probabilities. Using this approach, it is possible to estimate the **maximum expected worth** of the sample information before actually obtaining the sample, i.e. we can determine the maximum amount we should be willing to pay for such information. As in the paragraphs above, we refer the interested reader to the specific literature^{5}.
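A closely related, easily computed quantity is the expected value of perfect information (EVPI), which bounds from above what any sample information can be worth. The sketch below uses hypothetical payoffs together with the probabilities from the brewery example:

```python
# EVPI: expected payoff with perfect foreknowledge of the state of nature,
# minus the best expected payoff using prior information only.
# Payoff values are hypothetical; probabilities follow the brewery example.
probs = [0.4, 0.5, 0.1]
payoffs = [
    [10, 10, 10],    # a1 small
    [ 5, 15, 15],    # a2 medium
    [-10, 10, 40],   # a3 large
]

# Prior information only: commit to the single best action up front.
best_prior = max(sum(x * p for x, p in zip(row, probs)) for row in payoffs)
# Perfect information: pick the best action separately for each state.
with_perfect = sum(p * max(row[j] for row in payoffs)
                   for j, p in enumerate(probs))
evpi = with_perfect - best_prior
print(evpi)  # the most any information source could be worth
```

No sample information can be worth more than EVPI, so it gives a quick ceiling on the price of any market study before the more detailed analysis in the cited literature is attempted.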

^{1} Part of this section (p. 8-11) adapted from: McClave J.T., Benson P.G.: Statistics for Business and Economics, p. 1041-1052.

^{2} Baker D. et al.: Guidebook to Decision-Making Methods; McClave J.T., Benson P.G.: Statistics for Business and Economics, chap. 18.

^{3} Cooper R.G.: Winning at New Products.

^{4} Kirkwood C.W.: Decision Tree Primer.

^{5} McClave J.T., Benson P.G.: Statistics for Business and Economics, chap. 19.
