More evidence that nudges may not be as effective as often assumed.
A recent second-order meta-analysis synthesizing 14 meta-analyses, 1,600+ primary studies, and ~30 million participants finds that nudges show a small positive average effect (see the corresponding forest plot below).

Forest plot of the effects of nudging interventions.
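To make the "small positive average effect" concrete: meta-analyses typically pool study-level effect sizes with a random-effects model. The sketch below is a minimal illustration of the common DerSimonian-Laird estimator on invented Cohen's d values; it is not the authors' analysis, and the numbers are hypothetical.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate via the DerSimonian-Laird method."""
    effects, variances = np.asarray(effects), np.asarray(variances)
    w_fixed = 1.0 / variances
    fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)
    q = np.sum(w_fixed * (effects - fixed) ** 2)   # Cochran's Q (heterogeneity)
    df = len(effects) - 1
    c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
    tau2 = max(0.0, (q - df) / c)                  # between-study variance
    w = 1.0 / (variances + tau2)                   # random-effects weights
    pooled = np.sum(w * effects) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return pooled, se, tau2

# Invented study-level effects (Cohen's d) and sampling variances
effects = [0.45, 0.30, 0.25, 0.10, 0.55, 0.05]
variances = [0.02, 0.03, 0.01, 0.04, 0.02, 0.05]
d, se, tau2 = dersimonian_laird(effects, variances)
print(f"pooled d = {d:.2f} +/- {1.96 * se:.2f} (tau^2 = {tau2:.3f})")
```

A forest plot is essentially this computation drawn out: each study's effect with its confidence interval, plus the pooled diamond at the bottom.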
However, after correcting for publication bias (see the corresponding funnel plot below), the pooled effect is statistically indistinguishable from zero.

Funnel plot of the nudging interventions.
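The funnel-plot logic can be illustrated with Egger's regression, one standard asymmetry test: if small (imprecise) studies systematically report larger effects, the regression intercept drifts away from zero, and the slope gives a precision-adjusted effect in the spirit of PET-style corrections. This is a generic sketch on fabricated data, not the specific bias-correction method used in the study.

```python
import numpy as np

def egger_test(effects, ses):
    """Egger's regression: standardized effects (z) regressed on precision (1/se).
    An intercept far from 0 signals funnel-plot asymmetry (a publication-bias
    marker); the slope serves as a precision-adjusted effect estimate."""
    effects, ses = np.asarray(effects), np.asarray(ses)
    z, precision = effects / ses, 1.0 / ses
    X = np.column_stack([np.ones_like(precision), precision])
    intercept, slope = np.linalg.lstsq(X, z, rcond=None)[0]
    return intercept, slope

# Fabricated studies where the observed effect inflates with the standard
# error, mimicking small-study bias: observed d = 0.2 (true) + 1.5 * se.
ses = np.array([0.05, 0.10, 0.15, 0.20, 0.30])
effects = 0.2 + 1.5 * ses
intercept, slope = egger_test(effects, ses)
print(round(intercept, 2), round(slope, 2))  # -> 1.5 0.2
```

In this constructed example the asymmetry (intercept 1.5) is large while the adjusted effect (slope 0.2) is modest; the paper's finding is the starker version of this pattern, with the adjusted effect shrinking to roughly zero.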
A common critique is that "nudging" is too broad a category (like meta-analyzing all medicines at once) and that we should instead focus on the specific interventions that do work.
This study challenges that optimism. The authors found no robust moderators to identify successful contexts. This is likely because the foundation itself is shaky: the majority of existing meta-analyses were rated as low or critically low in methodological quality. The signal is simply lost in the noise of low-quality methods.
The implication is stark: We cannot currently distinguish the “winners” from the statistical artifacts.
For policy and practice, we must stop assuming “what works” and demand fundamental rigor. We need valid measurement before we can even begin to identify effective interventions.
For attribution, please cite this work as
Stehlík (2025, Dec. 15). Ludek's Blog About People Analytics: Signal vs. Noise: Why we can’t yet identify effective nudges. Retrieved from https://blog-about-people-analytics.netlify.app/posts/2025-12-15-nudge-effectiveness/
BibTeX citation
@misc{stehlik2025signal,
author = {Stehlík, Luděk},
title = {Ludek's Blog About People Analytics: Signal vs. Noise: Why we can’t yet identify effective nudges},
url = {https://blog-about-people-analytics.netlify.app/posts/2025-12-15-nudge-effectiveness/},
year = {2025}
}