Conversational AIs tuned to mirror and comfort effectively act as ‘yes‑men’ for users seeking counsel. When people substitute these echoic interactions for professional or relational repair, they can entrench one‑sided narratives, worsen conflict resolution, and increase risk of harm (including self‑harm) at scale.
— If widely adopted, AI as an informal therapist reshapes mental‑health demand, degrades relational institutions (couples therapy, family mediation), and creates urgent regulatory questions about liability, age verification, and clinical standards.
Brad Littlejohn
2026.01.04
100% relevant
Sam Altman’s remark that many use ChatGPT as a therapist; Washington Post analysis finding a ~10:1 tendency to affirm users; Senate GUARD Act testimony of a suicide alleged to involve AI advice.
← Back to All Ideas