Statistical thinking is an approach to analysing processes and phenomena in relatively simple terms while also quantifying the uncertainty surrounding the conclusions drawn.[1] It is worth noting that "statistical thinking" is not the same as "quantitative literacy", although the two overlap in the interpretation of numbers and data visualizations.
Statistical thinking relates processes and statistics, and is based on the following principles:
1. All work occurs in a system of interconnected processes.
2. Variation exists in all processes.
3. Understanding and reducing variation are keys to success.
W. Edwards Deming promoted the concepts of statistical thinking, using two powerful experiments:
1. The Red Bead experiment,[2] in which workers are tasked with running a more or less random procedure, yet the lowest "performing" workers are fired. The experiment demonstrates how the natural variability in a process can dwarf the contribution of individual workers' talent.
2. The Funnel experiment, which demonstrates that adjusting a process in reaction to its own natural variation ("tampering") tends to increase that variation rather than reduce it. (A simulation sketch of both experiments follows this list.)
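The following Python sketch gives a minimal Monte Carlo illustration of the two experiments. The parameter choices (six workers, 50-bead scoops, 20% red beads, a unit-variance funnel) are hypothetical and chosen only for illustration; they do not reproduce Deming's exact procedures.

```python
import random
import statistics

random.seed(42)

# --- Red Bead experiment ----------------------------------------------------
# Each "worker" scoops 50 beads from a bucket in which 20% of beads are red
# (defective). Every worker follows the identical procedure, so differences
# in red-bead counts reflect only the common-cause variation of the process.
def red_bead_round(n_workers=6, beads_per_scoop=50, p_red=0.20):
    return {f"worker_{i + 1}": sum(random.random() < p_red
                                   for _ in range(beads_per_scoop))
            for i in range(n_workers)}

counts = red_bead_round()
worst = max(counts, key=counts.get)   # "fired" for the most defects
print("Red beads per worker:", counts)
print("Worst performer (by chance alone):", worst)

# --- Funnel experiment ------------------------------------------------------
# Drop a marble through a funnel aimed at a target at position 0.
# Without adjustment the funnel never moves; with "tampering" the funnel is
# moved after each drop to compensate for the last error.
def funnel_stdev(n_drops=10_000, adjust=False, sigma=1.0):
    results, aim = [], 0.0
    for _ in range(n_drops):
        hit = aim + random.gauss(0.0, sigma)   # drop lands near the aim point
        results.append(hit)
        if adjust:
            aim -= hit                         # compensate for the last error
    return statistics.pstdev(results)

print("Funnel std. dev., no adjustment:", round(funnel_stdev(adjust=False), 3))
print("Funnel std. dev., with tampering:", round(funnel_stdev(adjust=True), 3))
```

Run repeatedly, the sketch shows that the "worst" worker changes from round to round even though no worker has any influence on the outcome, and that the tampering rule roughly doubles the variance of the funnel results (standard deviation larger by a factor of about √2) compared with leaving the process alone.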
The take-home message from the experiments is that, before management adjusts a process (for example, by firing seemingly underperforming employees or by making physical changes to an apparatus), it should consider all sources of variation in the process that led to the observed performance.
The statistician Nigel Marriott has described the historical evolution of statistical thinking.[3]
Statistical thinking is thought to be helpful in a variety of contexts, such as the courtroom,[4] the biology laboratory, and the education of children growing up surrounded by data.
The American Statistical Association (ASA) has laid out what it means to be "statistically educated", including a set of concepts that students are expected to understand.[5]
Statistical thinking is a recognized method used as part of Six Sigma methodologies.