AI Translation Hallucinations Threaten Wikipedia

Updated: 2026-03-06
Paid translation programs that use generative models (e.g., Google Gemini, ChatGPT) to speed up cross-language expansion are introducing factual errors, missing citations, and irrelevant sources into Wikipedia articles. Volunteer editors are responding with ad hoc restrictions on specific contributors and tightened review policies to protect article integrity. The episode reveals a current failure mode of generative AI that threatens the reliability of a key global knowledge infrastructure and forces governance choices about labor, tooling, and cross-language verification.

Sources

AI Translations Are Adding 'Hallucinations' To Wikipedia Articles
BeauHD, 2026-03-06
Contractor translations commissioned by the Open Knowledge Association and produced with Google Gemini and ChatGPT were found to contain hallucinations, prompting Wikipedia editors to restrict repeat offenders.