Researchers can market routine or weak methods as 'rigorous' to legitimize striking claims in sensitive domains like sexism in hiring. The Moss‑Racusin case, as described here, used unvalidated measures and a single explanatory model, yet became widely cited; close replications reportedly flip the effect to bias against men.
— If 'rigor' branding masks fragile findings, media, funders, and universities risk building DEI policy on unreliable evidence.
Lee Jussim
2025.10.16
84% relevant
Jussim and McNally contend that microaggression research does not measure the construct itself and infers 'impacts' from correlations without causal identification—precisely the pattern of weak methods dressed as rigor in sensitive DEI domains highlighted by this idea.
Gregory Brown
2025.09.30
70% relevant
The article argues an IOC‑funded paper used statistical 'sleight of hand'—adjusting outcomes by body size—to claim transwomen did not outperform women, and notes published critiques (BJSM rapid responses) exposing design flaws. This mirrors the pattern where weak methods are branded as rigorous to support sensitive policy claims.
Michael Inzlicht
2025.09.10
78% relevant
The essay recounts how speeded reaction-time tests like the IAT were branded as objective 'bona fide pipelines' to hidden prejudice and rapidly adopted, despite methodological limits—mirroring how weak methods get marketed as 'rigorous' to legitimize sweeping claims in sensitive domains.
Lee Jussim
2025.09.09
82% relevant
The author describes Nature reviewers discouraging a registered replication of Moss‑Racusin (2012) and reports his team’s larger, preregistered studies reverse the original gender‑bias finding—directly reinforcing the claim that influential DEI‑aligned results can rest on weak methods and resist replication.
Lee Jussim
2025.08.26
90% relevant
Jussim reports a close methodological replication of Moss‑Racusin (2012) flipping the result to bias against men and critiques how the original was shielded by journal review—mirroring the claim that headline DEI findings often rest on weak methods yet are institutionally protected.
Lee Jussim
2025.07.01
100% relevant
Jussim reports his replication of Moss‑Racusin et al. (2012) and audits the original's methods (no adversarial collaboration, unvalidated measures, single‑model testing), noting that it generalized from a single lab‑manager vignette.
Lee Jussim
2025.06.27
95% relevant
The article reports a registered replication report that reverses the Moss-Racusin (2012) faculty-bias study—an emblematic DEI-cited paper lauded by the White House and APA—supporting the claim that headline-grabbing 'rigorous' DEI findings can rest on fragile foundations.