How Quantifying Probability Assessments Influences Analysis and Decision Making: Experimental Evidence from National Security Professionals
- Author
Jennifer S. Lerner, Jeffrey A. Friedman, and Richard J. Zeckhauser
- Subjects
Actuarial science, National security, Homeland security, Public relations, Imprecise probability, Numeral system, Action (philosophy), Political science, Terrorism, Overconfidence effect, Skepticism
- Abstract
National security is one of many fields where public officials offer imprecise probability assessments when evaluating high-stakes decisions. This practice is often justified with arguments about how quantifying subjective judgments would bias analysts and decision makers toward overconfident action. We translate these arguments into testable hypotheses, and evaluate their validity through survey experiments involving national security professionals. Results reveal that when decision makers receive numerals (as opposed to words) for probability assessments, they are less likely to support risky actions and more receptive to gathering additional information, disconfirming the idea of a bias toward action. Yet when respondents generate probabilities themselves, using numbers (as opposed to words) magnifies overconfidence, especially among low-performing assessors. These results hone directions for research among both proponents and skeptics of quantifying probability estimates in national security and other fields. Given that uncertainty surrounds virtually all intelligence reports, military plans, and national security decisions, understanding how national security officials form and interpret probability assessments has wide-ranging implications for theory and practice.
- Published
- 2016