In statistics, Pyrrho's lemma is the result that if one adds just one extra variable as a regressor, chosen from a suitable set, to a linear regression model, one can obtain any desired outcome for the coefficients (signs and sizes), as well as for the predictions, the R-squared, the t-statistics, and the prediction and confidence intervals. The argument for the coefficients was advanced by Herman Wold and Lars Juréen,[1] but the result was named, extended to cover the other statistics, and explained more fully by Theo Dijkstra.[2] Dijkstra named it after the sceptic philosopher Pyrrho and concludes his article by noting that the lemma provides "some ground for a wide-spread scepticism concerning products of extensive datamining". One can only show that a model 'works' by testing it on data different from the data that gave it birth.[3]
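The coefficient part of the lemma can be illustrated with a simple construction (a minimal numerical sketch, not Dijkstra's general statement; the variable names and the use of NumPy here are illustrative assumptions): for any response y, design matrix X, desired coefficient vector b, and nonzero scalar c, the extra regressor z = (y − Xb)/c satisfies y = Xb + cz exactly, so ordinary least squares on the augmented design [X, z] returns precisely (b, c) with a perfect fit.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
# Arbitrary design matrix (intercept plus two random covariates)
# and an arbitrary response with no real relation to X.
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = rng.normal(size=n)

# Pick any coefficients we would like the regression to report.
b_desired = np.array([2.0, -3.0, 5.0])
c = 1.0  # coefficient the constructed regressor will receive

# The constructed extra variable: by design, y = X @ b_desired + c * z.
z = (y - X @ b_desired) / c

# OLS on the augmented design recovers exactly the chosen coefficients,
# with zero residual (R-squared of 1).
Xz = np.column_stack([X, z])
beta_hat, *_ = np.linalg.lstsq(Xz, y, rcond=None)
```

Because the augmented model fits y exactly, the residuals vanish and the reported fit is perfect, however unrelated X and y really are; this is the sense in which one rogue regressor can manufacture any desired outcome.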
The result has been discussed in the context of econometrics.[4]