AI Ghostwriting In Therapy

Updated: 2025.09.03
Reports of therapists copy-pasting client disclosures into ChatGPT and relaying its output back, sometimes exposed by accidental screenshares, show that AI is already embedded in clinical encounters without patient consent. This raises Health Insurance Portability and Accountability Act (HIPAA)-style privacy risks from sending protected health information to third-party models, informed-consent gaps, and unclear liability when machine-generated counsel harms patients. It pressures regulators and licensing boards to set disclosure, data-handling, and liability rules for AI-assisted care, while challenging assumptions about the distinct value of human talk therapy.

Sources

Wednesday: Three Morning Takes
PW Daily, 2025.09.03
An MIT report and Reddit anecdotes describe one therapist screensharing ChatGPT during a session and another leaving AI prompts in an email response.