Decision Making Under Uncertainty

Apply statistical thinking to real-world decisions: expected value, risk vs uncertainty, decision trees, cost-benefit analysis, and value of information.

23 min read
Intermediate

From Data to Decisions

Statistics isn't just about understanding data — it's about making better decisions when you don't have complete information.

Should you launch the product? Approve the drug? Hire the candidate? Every real-world decision involves uncertainty. Statistical thinking provides a framework for quantifying that uncertainty and choosing wisely.

Expected Value: The Core Concept

The expected value is the long-run average outcome if you repeated a decision many times.

E[X] = Σ (outcome × probability)

It's the probability-weighted average of all possible outcomes.

Simple Gamble

A game costs $10 to play:

  • 60% chance: win $20
  • 40% chance: win $0

Expected value: E[winnings] = 0.6($20) + 0.4($0) = $12

Expected profit = $12 - $10 cost = $2

On average, you win $2 per play. Over many games, you come out ahead.
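
The gamble above can be checked in a few lines (a minimal sketch; `expected_value` is our own helper, not a library function):

```python
def expected_value(outcomes):
    """Probability-weighted average: E[X] = sum of outcome * probability."""
    return sum(prob * value for value, prob in outcomes)

# The $10 game: 60% chance of winning $20, 40% chance of winning $0
winnings = expected_value([(20, 0.60), (0, 0.40)])
profit = winnings - 10  # subtract the $10 cost to play
print(winnings, profit)  # 12.0 2.0
```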

When to use expected value:

  • Repeated decisions (insurance, investing)
  • Large-scale operations (casinos, factories)
  • Risk-neutral situations (small stakes)

When NOT to use expected value alone:

  • One-time decisions with catastrophic downside
  • Risk preferences matter (see below)

Risk vs Uncertainty

Risk: You know the probabilities (rolling dice, coin flips)

Uncertainty: You don't know the probabilities (will this startup succeed? will there be a recession?)

Much of statistics deals with risk — we can quantify probabilities. But real-world decisions often involve uncertainty — we must estimate or guess probabilities.

Example:

  • Insurance: Risk (actuaries have data on accident rates)
  • Starting a company: Uncertainty (no historical data on YOUR idea)

Under uncertainty, sensitivity analysis helps: how much does your decision change if probabilities are off by 10%? 50%?
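
A sensitivity sweep is easy to sketch. Using the earlier $10 gamble as an illustration, vary the estimated win probability and watch where the decision flips:

```python
# The 60% win probability is only an estimate; check how far it can be
# off before the "play" decision flips ($20 prize, $10 cost to play)
for p_win in (0.40, 0.50, 0.60, 0.70):
    expected_profit = p_win * 20 - 10
    verdict = "play" if expected_profit > 0 else "skip"
    print(f"p_win = {p_win:.0%}: expected profit {expected_profit:+.2f} -> {verdict}")
```

The break-even point is 50%: the decision is robust to moderate overestimates but flips if the true probability is below one half.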

Decision Trees

Decision trees map out:

  • Decisions (squares)
  • Uncertain outcomes (circles)
  • Probabilities and payoffs

You work backwards from outcomes to choose the best decision.

New Product Launch

Decision: Launch or don't launch?

If launch:

  • 40% chance: Success → $1M profit
  • 60% chance: Failure → -$200K loss

Expected value if launch: 0.4($1M) + 0.6(-$200K) = $400K - $120K = $280K

If don't launch: $0

Decision: Launch! (Positive expected value)

But: If you can't afford a $200K loss, the expected value might not matter — risk aversion comes in.
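
Working backwards through the tree is mechanical. A minimal sketch of the launch decision (the `ev` helper is ours):

```python
def ev(branches):
    """Expected value at a chance node: sum of probability * payoff."""
    return sum(prob * payoff for prob, payoff in branches)

launch = ev([(0.40, 1_000_000), (0.60, -200_000)])  # success / failure
dont_launch = 0.0

# At the decision node, take the branch with the higher expected value
decision = "launch" if launch > dont_launch else "don't launch"
print(decision, launch)
```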

Risk Aversion and Utility

Expected value assumes risk neutrality — you only care about average outcomes. But humans are risk averse: we prefer certainty to gambles with the same expected value.

Insurance

Your house (worth $300K) has:

  • 1% chance of burning down → lose $300K
  • 99% chance of no fire → lose $0

Expected loss: 0.01($300K) = $3,000

Insurance costs $4,000/year. This is negative expected value!

Yet most people buy insurance. Why? Risk aversion — the pain of losing $300K far outweighs the joy of saving $1,000 on premiums.

Insurance transfers risk from risk-averse individuals to risk-neutral (or diversified) companies.

Utility theory formalizes this: decisions maximize expected utility (satisfaction), not expected monetary value.

For risk-averse people, utility of money is concave: each additional dollar provides less satisfaction. Losing $1,000 hurts more than gaining $1,000 helps.
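
A concave utility function makes the insurance purchase rational even though the premium has negative expected monetary value. A sketch with log utility, assuming (illustratively) $350K total wealth including the $300K house:

```python
import math

def expected_utility(lottery, utility=math.log):
    """E[u(W)]: probability-weighted utility of final wealth levels."""
    return sum(prob * utility(wealth) for prob, wealth in lottery)

wealth = 350_000  # assumed total wealth, including the $300K house
premium = 4_000

no_insurance = expected_utility([(0.99, wealth), (0.01, wealth - 300_000)])
with_insurance = expected_utility([(1.00, wealth - premium)])

# Despite the premium exceeding the $3,000 expected loss, the concave
# (log) utility makes insuring the better choice
print(with_insurance > no_insurance)  # True
```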

Cost-Benefit Analysis

Systematic framework for decisions:

1. List all options

2. For each option, estimate:

  • Costs (monetary, time, opportunity cost)
  • Benefits (direct, indirect, long-term)
  • Probabilities

3. Calculate expected net benefit: E[benefit] - E[cost]

4. Choose the option with highest expected net benefit

5. Sensitivity analysis: How robust is your choice to changes in assumptions?

Going to Grad School

Option A: Work (no grad school)

  • Income: $50K/year for 30 years = $1.5M
  • Cost: $0
  • Net: $1.5M

Option B: Grad school

  • Cost: $100K tuition + $100K foregone income = $200K
  • Income after: $80K/year for 28 years = $2.24M
  • Net: $2.24M - $200K = $2.04M

Difference: $540K in favor of grad school

But: This ignores discounting (money now > money later), probability of completion, job satisfaction, career uncertainty, etc.

Conclusion: Grad school looks good on expected value, but individual circumstances matter.
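
Discounting, the biggest omission above, is easy to add. A sketch assuming a 5% discount rate (our choice, purely illustrative):

```python
def npv(cash_flows, rate=0.05):
    """Net present value: discount the year-t cash flow by (1 + rate)**t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Option A: work 30 years at $50K/year
work = npv([50_000] * 30)

# Option B: two years of grad school ($50K/year tuition, no income),
# then 28 years at $80K/year
grad = npv([-50_000] * 2 + [80_000] * 28)

print(f"Work: ${work:,.0f}   Grad school: ${grad:,.0f}")
```

Grad school still comes out ahead at 5%, but the gap shrinks well below the nominal $540K — discounting penalizes the option whose payoffs arrive later.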

The Value of Information

Sometimes you can pay to reduce uncertainty before deciding. How much is that information worth?

Oil Drilling

You can drill for oil (costs $1M):

  • 30% chance: Find oil → $5M revenue → $4M profit
  • 70% chance: No oil → -$1M

Expected value: 0.3($4M) + 0.7(-$1M) = $1.2M - $0.7M = $0.5M

You could pay for a geological survey that's 80% accurate. How much is it worth?

With perfect information:

  • If survey says "oil": drill (expected profit $4M)
  • If survey says "no oil": don't drill (save $1M loss)

Value of perfect information = $0.7M. Perfect information eliminates the 70% chance of a $1M loss (0.7 × $1M = $0.7M), raising the expected value from $0.5M to $1.2M.

Even imperfect information has value, just less than $0.7M. Statistical decision theory quantifies this precisely.

Experiments and pilots are valuable because they provide information before full commitment. Spend $10K on a pilot to avoid a $1M mistake.
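
The standard expected-value-of-perfect-information (EVPI) calculation for the drilling example, as a sketch:

```python
# Decision without a survey: drill blind or walk away
ev_drill = 0.30 * 4_000_000 + 0.70 * (-1_000_000)  # $0.5M
ev_without_info = max(ev_drill, 0)

# With a perfect survey: drill only when the oil is actually there
ev_with_perfect_info = 0.30 * 4_000_000 + 0.70 * 0  # $1.2M

evpi = ev_with_perfect_info - ev_without_info
print(f"EVPI = ${evpi:,.0f}")  # EVPI = $700,000
```

EVPI is the ceiling: no survey, however accurate, is worth more than $700K here.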

Minimax and Maximin Strategies

When you don't know probabilities (deep uncertainty), two conservative strategies:

Minimax: Minimize the maximum possible loss
Choose the option with the best worst-case scenario.

Maximin: Maximize the minimum gain
Choose the option with the highest guaranteed outcome.

These are very risk-averse — appropriate for catastrophic risks or when probabilities are truly unknown.

Business Strategy

Three strategies under economic uncertainty:

| Strategy | Best Case | Worst Case |
| --- | --- | --- |
| Aggressive | $10M | -$2M |
| Moderate | $5M | $1M |
| Conservative | $2M | $1.5M |

Minimax: Conservative (worst case = $1.5M, best among the worst cases)
Maximin: Conservative (minimum gain = $1.5M)
Expected value: Depends on probabilities

Use minimax/maximin when you can't afford the worst case, even if it's unlikely.
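
The table above reduces to one comparison under a maximin rule (minimax over losses picks the same option here); a minimal sketch:

```python
# Payoff table from above: best / worst case per strategy, in dollars
strategies = {
    "Aggressive":   {"best": 10_000_000, "worst": -2_000_000},
    "Moderate":     {"best": 5_000_000,  "worst": 1_000_000},
    "Conservative": {"best": 2_000_000,  "worst": 1_500_000},
}

# Maximin: choose the strategy whose worst case is the least bad
maximin_choice = max(strategies, key=lambda s: strategies[s]["worst"])
print(maximin_choice)  # Conservative
```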

Common Decision Biases

Humans are predictably irrational:

1. Overconfidence: We overestimate our knowledge and the precision of our estimates.

2. Sunk cost fallacy: Continuing because of past investment, ignoring future costs/benefits.

3. Loss aversion: Losses hurt ~2× more than equal gains feel good.

4. Anchoring: First number you see biases estimates (why negotiators go first).

5. Availability bias: Recent/vivid events feel more likely (shark attacks after news coverage).

6. Base rate neglect: Ignoring prior probabilities (Bayes' theorem violations).

Statistical thinking provides tools to overcome these biases through explicit probability reasoning.

Test your knowledge


A game costs $5. You win $20 with 20% probability, $0 otherwise. What is the expected profit?