The author reviews mortality‑salience studies using several bias‑correction tools and finds they point in different directions—from pro‑TMT to anti‑TMT—depending on the method. Synthesizing across tools yields a modest but non‑zero effect (about r = 0.18), and a public ShinyApp lets readers probe how sensitive the estimate is to analytic choices. Meta‑analytic conclusions should be presented as ranges across an ensemble of methods, not as a single 'definitive' number.
— Treating meta‑analysis as an ensemble problem would improve evidence standards in psychology and other policy‑relevant fields by curbing cherry‑picking and clarifying uncertainty.
2025.10.07
82% relevant
The article shows funnel‑plot asymmetry and uses trim‑and‑fill corrections to demonstrate how meta‑analytic estimates can be biased upward. This reinforces the existing idea that meta‑results should be audited across multiple methods rather than treated as a single definitive number.
Michael Inzlicht
2025.07.30
100% relevant
Heine applies p‑curve, z‑curve, WAAP‑WLS, selection models, and PET‑PEESE, alongside a ShinyApp that demonstrates how results vary by tool.
Michael Inzlicht
2025.07.23
82% relevant
The article references Steve Heine’s comprehensive meta-analysis using 'forensic' tools on terror management theory; this maps directly onto the ensemble‑methods argument that mortality‑salience effects shrink or vanish depending on analytic choices, demanding robustness maps rather than single‑number claims.
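The ensemble idea running through these entries can be made concrete with a small sketch. The snippet below is not Heine's implementation; it is a minimal illustration, on simulated study data, of how three simple estimators (a plain inverse‑variance average, plus PET and PEESE, two of the corrections named above) can disagree on the same data, motivating reporting a range rather than one number. The data‑generating parameters are arbitrary assumptions for demonstration.

```python
import numpy as np

def ensemble_estimates(effects, ses):
    """Pooled effect under several simple models.

    Illustrates the 'ensemble' argument: the same studies yield
    different estimates depending on the bias correction applied.
    """
    effects = np.asarray(effects, dtype=float)
    ses = np.asarray(ses, dtype=float)
    w = 1.0 / ses**2  # inverse-variance weights

    # Fixed-effect (inverse-variance) average: no bias correction.
    fixed = np.sum(w * effects) / np.sum(w)

    # PET: weighted regression of effect on standard error;
    # the intercept estimates the effect of a hypothetical SE = 0 study.
    W = np.diag(w)
    X = np.column_stack([np.ones_like(ses), ses])
    pet = np.linalg.solve(X.T @ W @ X, X.T @ W @ effects)[0]

    # PEESE: same idea, but with the variance (SE^2) as the predictor.
    X2 = np.column_stack([np.ones_like(ses), ses**2])
    peese = np.linalg.solve(X2.T @ W @ X2, X2.T @ W @ effects)[0]

    return {"fixed": fixed, "PET": pet, "PEESE": peese}

# Simulated literature with small-study inflation: observed effects
# grow with the standard error, mimicking publication bias.
rng = np.random.default_rng(0)
ses = rng.uniform(0.05, 0.30, size=40)
effects = 0.15 + 1.0 * ses + rng.normal(0.0, ses)

est = ensemble_estimates(effects, ses)
lo, hi = min(est.values()), max(est.values())
print(f"range across methods: r = {lo:.2f} to {hi:.2f}")
```

Because the simulated bias grows with the standard error, the uncorrected average lands above the SE‑adjusted intercepts, and an honest summary is the printed range, not any single entry of the dictionary.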