Click‑through arbitration clauses can shunt AI harm claims into closed forums, cap liability at trivial sums, and keep evidence out of public view. In child‑safety cases, firms can even compel vulnerable minors to testify, compounding trauma and deterring broader scrutiny.
If forced arbitration becomes standard for AI platforms, it will neuter public oversight and slow needed safety reforms for products used by children.
EditorDavid
2025.09.21
70% relevant
Meta is using UK arbitration to enforce a non‑disparagement agreement against former executive Sarah Wynn‑Williams, with threatened $50,000 per‑breach fines and an order to stop promoting her book—an instance of forced arbitration operating as a private speech‑control mechanism that chills public‑interest disclosures.
BeauHD
2025.09.18
100% relevant
Character.AI allegedly forced a grieving mother into arbitration over her autistic son's chatbot‑linked self‑harm, under terms that capped liability at $100 and compelled the boy's deposition while he was institutionalized.