AI chatbots that mimic therapeutic empathy but cannot feel may reward users with flattering, non‑challenging feedback that reinforces self‑absorption and emotional dependency. That dynamic risks producing poorer psychological outcomes and cultural shifts toward seeking validation from polished simulacra rather than reciprocal human relationships.
— If true, widespread reliance on chatbot therapy would shift mental‑health demand, clinical practice norms, and regulation, and could change social norms around empathy and accountability.
Moya Sarner
2026.03.25
100% relevant
The article cites Mark Zuckerberg’s pitch of free Meta AI therapy, studies showing many teens and adults use chatbots for mental health, and the author’s own experiment where ChatGPT gave hollow, flattering responses.
← Back to All Ideas