Beliefs are often chosen to fit incentives, not truth. Where personal costs for error are low (e.g., an individual vote, a viral post) and rewards favor tribal alignment or outrage, epistemic irrationality can be instrumentally rational. That makes public 'stupidity' and gullibility predictable outputs of today’s incentive structures rather than mere cognitive failure.
— This framing shifts misinformation and polarization debates from 'educate people more' toward redesigning the incentives that currently reward confident error and low-cost delusion.
Dan Williams
2025.06.13
100% relevant
The article declares 'Sometimes stupidity… isn’t a mistake. It’s a strategy,' invokes 'skin in the game,' and uses Elon Musk’s conspiratorial posting and political whiplash to illustrate incentive-aligned irrationality.
David Pinsof
2024.12.09
70% relevant
Pinsof argues that arguing is less about truth or persuasion than about status and tribal signaling, which aligns with the claim that beliefs are chosen to fit incentives (outrage, alignment, audience rewards) rather than accuracy.
David Pinsof
2024.10.15
78% relevant
Pinsof argues that 'credences' (abstract, political, or spiritual stances) carry little behavioral cost when wrong and thus function mainly as social signals, mirroring the core claim that beliefs are chosen to fit incentives when personal costs for error are low.