Survey reports should routinely publish cumulative response rates (the product of the recruitment, recruitment follow‑up, and panel-retention rates) alongside margins of error and design weights so readers can judge representativeness. Doing so makes clear when apparently precise estimates rest on thin recruitment and heavy weighting rather than broad participation.
— Mandating this disclosure would change how journalists, scholars, and the public evaluate and cite survey results, especially on politically or culturally sensitive topics.
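The arithmetic behind a cumulative response rate is simply the product of the per-stage rates. A minimal sketch, using hypothetical stage rates for illustration (not any specific panel's published figures):

```python
def cumulative_response_rate(stage_rates):
    """Multiply per-stage rates (given as fractions) into one cumulative rate."""
    rate = 1.0
    for r in stage_rates:
        rate *= r
    return rate

# Hypothetical stages: 12% recruitment contact, 50% agreeing to join the panel,
# 50% retained over time, 92% responding to the specific wave.
stages = [0.12, 0.50, 0.50, 0.92]
print(f"{cumulative_response_rate(stages):.1%}")  # → 2.8%
```

This is why a wave-level response rate in the 90s can coexist with a cumulative rate near 3%: each stage looks healthy in isolation, but their product is small.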
Sara Atske
2026.04.15
90% relevant
The article reports a study‑level AAPOR RR1 of 45% and a cumulative response rate of 1.2%; this directly exemplifies the existing idea that researchers should publish cumulative response rates so readers can judge panel fragility and representativeness.
Jcoleman
2026.04.07
92% relevant
The article reports the ATP survey’s cumulative response rate (3%), directly exemplifying the matched idea that pollsters should publish cumulative response rates so readers can judge panel fragility and representativeness.
Reem Nadeem
2026.04.03
90% relevant
The methodology explicitly reports a cumulative response rate of 3% for the American Trends Panel wave — directly exemplifying the practice of publishing cumulative recruitment/attrition metrics that bear on poll credibility and nonresponse bias.
Reem Nadeem
2026.03.12
90% relevant
The article explicitly reports a 3% cumulative response rate for the American Trends Panel Wave 185 and a survey-level response rate of 92% (8,512 respondents of 9,302 sampled), directly exemplifying the existing idea that pollsters should disclose cumulative response rates so readers can judge representativeness and nonresponse risk.
Janakee Chavda
2026.03.05
100% relevant
Pew’s ATP methodology states a cumulative response rate of 3% for the panel, illustrating how low cumulative participation can underlie headline survey claims.