Low-Trust Society
Monday 29 December 2025
Kyle Saunders, “The United States Has Become a Low-Trust Society — and That Changes Everything” link
The Netherlands is heading the same way...
In high-trust environments, institutional breakdowns are treated as problems to be fixed. In low-trust environments, they are treated as proof that the system is fundamentally corrupt. The same event produces radically different reactions depending on the baseline level of legitimacy.
This is why fraud scandals, administrative failures, or enforcement lapses now carry outsized symbolic weight. They are no longer isolated incidents; they become narrative accelerants.
Recent high-profile welfare fraud cases illustrate this dynamic. While such cases are not representative of broader communities or programs, they nonetheless erode confidence precisely because trust is already thin. In a low-trust society, oversight failures are interpreted as intent, not error. The damage is social as much as fiscal.
Public health provides an even clearer example. Vaccine hesitancy during COVID was not simply about misinformation. It was about credibility deficits that predated the pandemic. Once trust was gone, expertise alone could not compensate.
Why do we suffer?
Friday 26 December 2025
Matt Ball, “What is the adaptive value of suffering?” link
We rarely distinguish between the raw act of sensing and the subjective experiences that ensue. But that’s not because such distinctions don’t exist.
Think about the evolutionary benefits and costs of pain [subjective suffering]. Evolution has pushed the nervous systems of insects toward minimalism and efficiency, cramming as much processing power as possible into small heads and bodies. Any extra mental ability – say, consciousness – requires more neurons, which would sap their already tight energy budget. They should pay that cost only if they reaped an important benefit. And what would they gain from pain?
The evolutionary benefit of nociception [sensing negative stimuli / bodily damage] is abundantly clear. It’s an alarm system that allows animals to detect things that might harm or kill them, and take steps to protect themselves. But the origin of pain [suffering], on top of that, is less obvious. What is the adaptive value of suffering? Why should nociception suck? Animals can learn to avoid dangers perfectly well without needing subjective experiences. After all, look at what robots can do.
Engineers have designed robots that can behave as if they're in pain, learn from negative experiences, or avoid artificial discomfort. These behaviors, when performed by animals, have been interpreted as indicators of pain. But robots can perform them without subjective experiences.
Insect nervous systems have evolved to pull off complex behaviors in the simplest possible ways, and robots show us how simple it is possible to be. If we can program them to accomplish all the adaptive actions that pain supposedly enables without also programming them with consciousness, then evolution – a far superior innovator that works over a much longer time frame – would surely have pushed minimalist insect brains in the same direction. For that reason, Adamo thinks it's unlikely that insects feel pain. ...
Insects often do alarming things that seem like they should be excruciating. Rather than limping, they'll carry on putting pressure on a crushed limb. Male praying mantises will continue mating with females that are devouring them. Caterpillars will continue munching on a leaf while parasitic wasp larvae eat them from the inside out. Cockroaches will cannibalize their own guts if given a chance.
I would be very comfortable betting my life that insects do not have subjective conscious experiences. Not to get into the weeds, but I believe people too easily conflate behavior with consciousness. The ability to sense things – "sentience" in its broadest meaning – exists all the way down to single-cell organisms (Carl Sagan once told me this is why "sentience" per se couldn't be the basis of morality). But the ability to sense things is not the same as being conscious.
To me, all the evidence indicates that the ability to have conscious, subjective experiences – to be able to actually suffer – derives from and requires significant neural complexity (sorry Chalmers). And once the appropriate level of complexity is reached, further complexity can lead to a greater capacity for consciousness. That is, the ability to feel feelings is not binary, but analog.