Decision Trees

Decision trees are diagrammatic tools that support business planning and decision-making by illustrating options, probabilities, and outcomes. Their main components are decision nodes (squares), probability nodes (circles), and branches representing options and outcomes. Constructing a decision tree involves identifying the decision, labeling costs, assigning probabilities, and estimating revenues. The net expected value of each option helps determine the most beneficial choice, with weaker options being rejected. While decision trees provide logical clarity, objectivity, and faster decision-making, they rely on estimated probabilities that may be biased or outdated. Overall, decision trees balance risks against rewards, offering businesses a systematic approach to strategic decision-making.

Revision Notes – Decision Trees

Decision Trees as a Planning and Decision-Making Tool
Decision trees are a structured, visual method used by businesses to make decisions under uncertainty. They allow managers to identify possible courses of action, assign probabilities to outcomes, and calculate financial consequences before making a final decision. Unlike instinctive or purely qualitative approaches, decision trees combine logic and probability, creating a more systematic framework for decision-making. For example, a company may face a choice between investing in new technology or upgrading its existing infrastructure. A decision tree maps out both alternatives, their costs, and potential revenue outcomes, providing clarity on which path offers the greatest financial benefit.

Decision trees are particularly valuable in complex business environments where uncertainty is high and outcomes are not guaranteed. They encourage managers to consider carefully not only what decisions are available but also what risks and rewards each decision might bring.
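To make this concrete, consider the technology example above with purely hypothetical figures (none of these numbers appear in the notes). Suppose the new technology costs $500,000 and carries a 0.7 chance of generating $900,000 in revenue and a 0.3 chance of generating $400,000, while the infrastructure upgrade costs $300,000 with a 0.6 chance of $600,000 and a 0.4 chance of $350,000:

  • New technology: (0.7 × $900,000) + (0.3 × $400,000) = $750,000; $750,000 – $500,000 = $250,000 net
  • Upgrade: (0.6 × $600,000) + (0.4 × $350,000) = $500,000; $500,000 – $300,000 = $200,000 net

On these assumed figures the technology investment offers the greater net benefit. The construction steps below formalize exactly this calculation.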

Components of a Decision Tree
The effectiveness of a decision tree lies in its basic components, which transform abstract decision-making into a clear visual process:

1. Decision Nodes (Squares):
Represent moments where managers must choose between different options. For instance, a garment factory may have to decide whether to refurbish old machines or purchase new ones. Each option branches out from the decision node, leading to further outcomes.

2. Probability Nodes (Circles):
Represent situations where outcomes are uncertain. At these nodes, managers estimate the likelihood of various scenarios such as high, average, or low performance. Probabilities must always add up to 1 to reflect the certainty that one of the outcomes will occur.

3. Branches (Lines):
These connect nodes and illustrate the different decisions and outcomes. Branches from decision nodes show alternative strategies, while branches from probability nodes show the range of possible results along with their probabilities.

4. Probabilities:
Assigned to each possible outcome. For example, purchasing new machinery might carry a 65% probability of high productivity, 25% of average productivity, and 10% of poor performance. These estimates allow businesses to quantify risks and weigh them against rewards.

Together, these components create a visual map that simplifies complex choices into a sequence of clear possibilities.
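For readers who prefer to see these components outside the diagram, the sketch below expresses them as simple Python data structures. The class names (Outcome, ProbabilityNode, Option, DecisionNode), the validate method, and the revenue figures are illustrative assumptions; only the square/circle roles, the 0.65/0.25/0.10 probabilities, and the rule that probabilities sum to 1 come from the notes.

  # A minimal sketch of the components above as Python data structures.
  # Class names and revenue figures are hypothetical; the node shapes,
  # the probabilities, and the sum-to-1 rule come from the notes.
  from dataclasses import dataclass

  @dataclass
  class Outcome:
      label: str           # e.g. "high productivity"
      probability: float   # likelihood of this outcome
      revenue: float       # estimated revenue if it occurs

  @dataclass
  class ProbabilityNode:   # drawn as a circle on the diagram
      outcomes: list[Outcome]

      def validate(self):
          # Probabilities at a probability node must add up to 1.
          assert abs(sum(o.probability for o in self.outcomes) - 1.0) < 1e-9

  @dataclass
  class Option:            # a branch leaving a decision node, labeled with its cost
      name: str
      cost: float
      result: ProbabilityNode

  @dataclass
  class DecisionNode:      # drawn as a square on the diagram
      options: list[Option]

  # The garment-factory choice from the notes (revenue figures are made up).
  decision = DecisionNode(options=[
      Option("Purchase new machinery", cost=600_000, result=ProbabilityNode([
          Outcome("high productivity", 0.65, 1_200_000),
          Outcome("average productivity", 0.25, 800_000),
          Outcome("poor performance", 0.10, 400_000),
      ])),
  ])
  decision.options[0].result.validate()   # 0.65 + 0.25 + 0.10 == 1

Modeled this way, each square, circle, branch, and probability label on the diagram corresponds to exactly one object or field in the structure.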

Constructing Decision Trees
The construction of a decision tree involves several steps, each designed to clarify the decision-making process:

1. Identify the Decision to be Made:
Begin by outlining the strategic question. For example, McCorkell Ltd. may ask whether to refurbish existing sewing machines or invest in new technology.

2. Map the Options:
From the initial decision node, branches represent each alternative. Each branch is labeled with its associated cost. For instance, refurbishing machines may cost $590,000, while purchasing new ones may cost $600,000 plus training expenses.

3. Assign Probabilities:
For each option, identify potential outcomes and assign probabilities that reflect the likelihood of success, moderate results, or failure. For example, new machines might have a 65% chance of high revenue, a 25% chance of average revenue, and a 10% chance of low revenue.

4. Estimate Revenues:
For each possible outcome, estimate the expected revenue. These figures are multiplied by their probabilities to calculate expected values.

  • Formula: Expected Revenue × Probability = Expected Value

5. Calculate Net Expected Value (NEV):
Subtract the cost of each option from its total expected value to determine which option is most financially beneficial; a worked sketch of this calculation follows these steps.

  • Formula: Total Expected Value – Cost of Option = Net Expected Value

6. Reject Weaker Options:
Once expected values are calculated, options with lower NEVs are crossed out. This is represented visually on the decision tree using two parallel lines. A key should also be included to explain decision nodes, probability nodes, and rejected options.

This process ensures that decisions are based on a systematic evaluation of costs, risks, and benefits, rather than intuition alone.
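The sketch below works through steps 4 to 6 for the McCorkell Ltd. example in Python. The option costs ($590,000 and $600,000) and the 0.65/0.25/0.10 probabilities come from the notes; every revenue figure, the training cost, and the refurbishment probabilities are hypothetical values included only to show the arithmetic.

  # Expected value and NEV for the two options, following the formulas above.
  # Figures marked "hypothetical" are not from the notes.

  def expected_value(outcomes):
      """Sum of Expected Revenue x Probability across all outcomes."""
      return sum(probability * revenue for probability, revenue in outcomes)

  options = {
      # option name: (cost, [(probability, estimated revenue), ...])
      "Refurbish existing machines": (
          590_000,
          [(0.50, 900_000), (0.30, 700_000), (0.20, 500_000)],   # all hypothetical
      ),
      "Purchase new machines": (
          600_000 + 10_000,   # purchase price plus a hypothetical training cost
          [(0.65, 1_200_000), (0.25, 800_000), (0.10, 400_000)], # revenues hypothetical
      ),
  }

  for name, (cost, outcome_list) in options.items():
      ev = expected_value(outcome_list)
      nev = ev - cost                  # Total Expected Value - Cost of Option
      print(f"{name}: expected value = ${ev:,.0f}, NEV = ${nev:,.0f}")

  # Refurbish: EV $760,000, NEV $170,000.  New machines: EV $1,020,000,
  # NEV $410,000.  On these assumed figures the refurbishment branch would
  # be rejected (shown with two parallel lines on the diagram).

The comparison only holds for these assumed numbers; with different revenue estimates or probabilities, the refurbishment option could just as easily come out ahead, which is why the quality of the estimates matters so much.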

Evaluation of Decision Trees

Advantages

  • Clarity and Logic: Decision trees present complex decisions in a simple and logical format, allowing managers to see the full picture.

  • Structured Decision-Making: They encourage a methodical approach by considering both risks and rewards at each stage.

  • Speed of Decision-Making: With information laid out clearly, managers can make quicker and more confident choices.

  • Objectivity: By relying on probabilities and financial outcomes, decision trees minimize subjective bias and allow for more scientific decision-making.

  • Visual Representation: The diagram provides a tangible, easy-to-understand overview that can be communicated effectively across teams or to stakeholders.

Disadvantages

  • Reliance on Estimates: Probabilities and revenues used in decision trees are often based on forecasts or managerial judgment, which may be inaccurate.

  • Potential Bias: Managers may unintentionally manipulate probabilities to favor their preferred option.

  • Time Sensitivity: Data used in decision trees can become outdated quickly, especially in industries with rapid technological or market changes.

  • Exclusion of Qualitative Factors: Decision trees focus heavily on numerical data, ignoring qualitative elements such as employee morale, brand reputation, or customer satisfaction.

  • Risk Not Eliminated: Although they help identify and assess risks, decision trees cannot remove the inherent uncertainty of business decisions.

Overall Importance
Decision trees provide a structured, largely quantitative framework for decision-making. While they are not flawless, their systematic approach allows businesses to evaluate complex situations methodically, provided the numbers are weighed alongside qualitative judgment. By integrating costs, probabilities, and revenues, they ensure that managers base their choices on rational analysis rather than instinct alone.

Decision Trees Quiz

1. In a decision tree, what does a square represent?

2. What is a key feature of probability nodes?

3. What is the formula for calculating expected value in a decision tree?

4. State one advantage of decision trees.

5. State one disadvantage of decision trees.

6. What is the final step in constructing a decision tree?

7. How is Net Expected Value (NEV) calculated?

8. Which component of a decision tree represents uncertain outcomes?

9. Why must probabilities in a decision tree add up to 1?

10. Name one feature that is NOT a strength of decision trees.