Visible AI watermarks are trivially removed, often within hours of a model's release, making them unreliable as a primary provenance tool. Effective authenticity will require platform-side scanning and labeling at upload time, backed by partnerships between AI labs and social networks.
This shifts authenticity policy from cosmetic generator marks to enforceable platform workflows that can actually limit the spread of deceptive content.
BeauHD
2026.01.10
57% relevant
The article raises provenance and incentive questions: web publishers are altering their formatting to be cited more readily by LLMs. This echoes the provenance problem of watermarks and upload-side enforcement, because Google's statement underlines that platform ingestion signals matter, and that formatting or provenance hacks can distort downstream uses unless platforms set enforceable norms.
BeauHD
2026.01.07
82% relevant
The NWS map error is a provenance failure: a generative system produced a visual artifact with fabricated labels. This aligns with the existing idea that visible generator marks are insufficient, and that platforms (and institutions) must enforce provenance at upload or publication time rather than rely on generator watermarks alone. Here, the weather office published an AI base map without adequate upload-side checks.
BeauHD
2025.10.07
100% relevant
404 Media verified that several public sites remove Sora 2's watermark in seconds; Hany Farid and Rachel Tobac urge platform-level detection and labeling instead.