New survey data show strong, bipartisan support for holding AI chatbots to the same legal standards as licensed professionals. About 79% favor liability when following chatbot advice leads to harm, and roughly three‑quarters say financial and medical chatbots should be treated like advisers and clinicians.
— This public mandate pressures lawmakers and courts to fold AI advice into existing professional‑liability regimes rather than carve out tech‑specific exemptions.
EditorDavid
2025.12.01
75% relevant
Both the article and this idea point to active public concern about AI and political pressure to hold AI actors accountable; the fundraisers cite polling showing voter support for 'guardrails' and are forming political organizations to convert that sentiment into electoral consequences and political liabilities for industry opponents.
Noah Smith
2025.12.01
72% relevant
Noah Smith cites Ipsos and Pew polling showing that Americans are more worried than excited about AI; that public anxiety maps directly onto the findings captured in the existing idea that voters support holding AI systems to professional‑liability standards, and both signals feed the same policy pressure for regulation and liability rules.
Kelsey Piper
2025.10.09
100% relevant
The Argument poll (73%–75% favoring parity for financial/medical advice; 79% favoring liability for harmful advice) and the cited lawsuit over ChatGPT's alleged suicide encouragement are the direct source of this idea's figures.