How Calibration Quizzes Aid in Forecasting

Mount Kilimanjaro is higher than Denali. True or false?
Wilbur Wright made the first successful powered flight. True or false?
The world’s tallest waterfall is in Venezuela. True or false?

How confident are you in your answers? 100%? 70%? 50%?

Forecasting uncertainty, whether through probability estimates or numerical ranges, is a core skill in Decision Analysis. Like any skill, it improves with deliberate practice. Research shows that calibration exercises help individuals assess uncertainty more realistically and improve accuracy over time.

That’s why Decision Frameworks recently introduced a cloud-based calibration quiz tool. Attendees at the Society of Decision Professionals (SDP) conference in Vancouver had the opportunity to try it firsthand.

What Is a Calibration Quiz and Why Does It Matter?

Calibration quizzes come in two formats:

  • Binary: Users answer a series of true/false questions and rate their confidence in each answer

  • Range: Users provide P10 and P90 bounds for a series of numerical questions

Here’s an example of a range quiz from the Decision Frameworks Tool Suite:

[Image: a range calibration quiz from the Decision Frameworks Tool Suite]

The goal of both formats is the same: to strengthen our ability to assess uncertainty. Binary questions help develop skills in assigning accurate probabilities to outcomes, such as the likelihood of regulatory approval. Range questions improve our estimates of spread, particularly when forecasting values like capital costs for a new facility.
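
To make the range format concrete, here is a minimal sketch in Python of how P10/P90 answers can be scored. The ranges below are invented for illustration; only the true values come from the quiz questions above. A well-calibrated forecaster should capture the truth inside the P10–P90 range about 80% of the time.

```python
# Score a set of range-quiz answers: each entry is (P10, P90, true value).
# The P10/P90 bounds are hypothetical; the true values are real.
answers = [
    (4000, 7000, 5895),  # height of Mount Kilimanjaro, meters
    (500, 1500, 979),    # height of Angel Falls in Venezuela, meters
    (1890, 1900, 1903),  # year of the first successful powered flight
]

hits = sum(1 for p10, p90, actual in answers if p10 <= actual <= p90)
hit_rate = hits / len(answers)

print(f"{hits}/{len(answers)} true values fell inside the stated ranges "
      f"({hit_rate:.0%}; well-calibrated ranges capture about 80%)")
```

Ranges that are too narrow show up immediately as a low hit rate, which is exactly the overconfidence pattern the quiz is designed to expose.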

For project teams, calibration builds stronger forecasting habits. It improves how uncertainty is modeled, which leads to better evaluations of risk and more informed strategies.

How Did the SDP Conference Participants Do?

Over three days at the conference, attendees participated in a binary calibration quiz. The results were plotted on a chart comparing actual accuracy to stated confidence:

[Image: binary calibration quiz results, plotting percent correct against stated confidence]

Perfect calibration would follow a straight diagonal line where the percentage of correct answers matches the stated confidence. Any deviation from that line shows a mismatch between confidence and accuracy.
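
As an illustration, here is a small Python sketch of how such a chart is built from hypothetical (confidence, correct) pairs: answers are grouped by stated confidence, and each group's accuracy becomes one point plotted against the diagonal.

```python
from collections import defaultdict

# Hypothetical binary-quiz results: (stated confidence, answered correctly).
results = [
    (0.5, True), (0.5, False),
    (0.7, True), (0.7, False), (0.7, False),
    (0.9, True), (0.9, True), (0.9, False),
    (1.0, True), (1.0, False),
]

# Group outcomes by stated confidence level.
buckets = defaultdict(list)
for confidence, correct in results:
    buckets[confidence].append(correct)

# Each (confidence, accuracy) pair is one point on the calibration chart;
# accuracy below confidence means the point falls under the diagonal.
for confidence in sorted(buckets):
    outcomes = buckets[confidence]
    accuracy = sum(outcomes) / len(outcomes)
    status = "overconfident" if accuracy < confidence else "calibrated or better"
    print(f"stated {confidence:.0%} -> actual {accuracy:.0%} ({status})")
```

Run on real quiz data, this same grouping produces the points on the chart above; the further a point sits below the diagonal, the larger the overconfidence gap.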

Most participants fell below the green line, meaning they were more confident than correct. That pattern is common and well documented: humans are overconfident by nature. Fortunately, calibration exercises like this one can help reduce that tendency over time.

Our new Tool Suite allows users to track progress and revisit past attempts. The software is currently in beta, and we’re welcoming new participants to test it. If you’d like to request access, we’d be glad to include you.

Looking Ahead

Calibration is not a one-time exercise. It is a habit that sharpens judgment over time, especially when decisions depend on both experience and uncertain information. If you took the quiz at SDP, we encourage you to keep going.

Share the tool with your team, revisit it regularly, and use it to strengthen how you think under uncertainty. With continued use, calibration improves awareness of what you know, what you don’t, and how to work within that gap.

Decision Frameworks