Platforms should require named experts to explicitly opt in before AI features present suggestions "in the voice of" a real writer or credited to one. Controls should include clear labeling, revenue and representation options for experts, and an easy opt‑out, so that individuals cannot be presented as endorsing AI outputs without their permission.
— Establishing expert consent norms would shape platform design, creator rights, misinformation risk, and potential legal standards for AI impersonation.
BeauHD
2026.03.11
A direct example: Shishir Mehrotra announced that Grammarly (rebranded as Superhuman) disabled its Expert Review agent — which surfaced suggestions "inspired by" influential voices — after experts complained about misrepresentation.