Create a public, auditable meta‑registry that collects near‑term AI capability predictions, records their exact operational definitions and pre‑specified prompts/tests, and publishes retrospective calibration scores. The registry would standardize how forecasts are framed (what 'AGI' concretely means), enforce prompt and evaluation provenance, and maintain a running error‑rate metric for different predictor classes (founders, academics, pundits).
— A standard calibration registry turns noisy, attention‑driven claims about AI timelines into accountable evidence that policymakers, investors and the public can use to set graduated governance and industrial triggers.
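The core of such a registry is a structured prediction record plus a retrospective scoring rule. The sketch below is a minimal illustration, assuming a simple record schema and the Brier score as the error metric; the field names, class labels, and scoring choice are hypothetical and not part of the proposal itself.

```python
# Minimal sketch: a registry record and a per-class calibration score.
# All field names and the Brier-score choice are illustrative assumptions,
# not a specification of any existing registry.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Prediction:
    predictor: str            # who made the forecast
    predictor_class: str      # e.g. "founder", "academic", "pundit"
    claim: str                # operational definition of the capability
    eval_spec: str            # pre-registered prompt / test provenance
    probability: float        # forecast probability the claim resolves true
    resolved: Optional[bool]  # outcome once the resolution date passes

def brier_by_class(records: list[Prediction]) -> dict[str, float]:
    """Mean Brier score per predictor class (lower is better),
    computed over resolved predictions only."""
    errors: dict[str, list[float]] = {}
    for r in records:
        if r.resolved is None:
            continue
        err = (r.probability - float(r.resolved)) ** 2
        errors.setdefault(r.predictor_class, []).append(err)
    return {cls: sum(errs) / len(errs) for cls, errs in errors.items()}
```

The Brier score is only one possible choice; a real registry could also publish calibration curves or log scores, but any such metric depends on the same two ingredients the proposal emphasizes: a pre-specified resolution criterion and a recorded probability.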
jessicata
2026.01.16
100% relevant
The LessWrong post itself assembles and retrospectively scores many 2025 predictions (Musk, Marcus, Taelin, Jack Gallagher) and shows selection effects and systematic overestimation, demonstrating the need for a formal calibration registry.