Recently we've been exploring moral philosophy with our series on Moral Licensing, Andrew Tane Glen's Why Cooperate?, and a workshop I ran with my daughter's class on the strategies of cooperation and defection. One phenomenon that has come up throughout these explorations is that defectors gain a short-term, relative advantage, while cooperators benefit from a sustained, long-term absolute advantage, which got me thinking about a simulation.
What if we took a group of individuals, all interacting at random, some defectors and some cooperators, and set up a Prisoner's Dilemma-style payoff matrix? Could we observe this phenomenon in real time?
To explain what's going on here: we have 100 "agents", each starting with 20 health points (health is reflected in their size). 15 of the agents are defectors; the other 85 are cooperators. When two agents bump into each other, there is an interaction.
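The original simulation isn't published with this post, so here's a minimal sketch of that setup in Python. The names (Agent, strategy, grudges and so on) are my own assumptions, and the grudges set is used by the avoidance rule described further down.

```python
import random
from dataclasses import dataclass, field

N_AGENTS = 100       # total population
N_DEFECTORS = 15     # defectors in the starting mix
START_HEALTH = 20    # every agent begins with 20 health points

@dataclass(eq=False)  # eq=False keeps identity hashing, so agents can be stored in sets
class Agent:
    strategy: str                               # "cooperate" or "defect"
    health: int = START_HEALTH
    grudges: set = field(default_factory=set)   # agents who have defected against me

# 15 defectors and 85 cooperators, shuffled so encounters are random
agents = [Agent("defect") for _ in range(N_DEFECTORS)] + \
         [Agent("cooperate") for _ in range(N_AGENTS - N_DEFECTORS)]
random.shuffle(agents)
```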
If two cooperators interact, they each get +1 health; if two defectors interact, they each get -1; and if a defector and a cooperator interact, the defector gets +2 and the cooperator gets -2.
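In code, the payoff rule is just a lookup on the two strategies. A sketch, continuing the assumed names from above:

```python
# Health changes keyed by (strategy_a, strategy_b) -> (delta_a, delta_b)
PAYOFFS = {
    ("cooperate", "cooperate"): (+1, +1),
    ("defect",    "defect"):    (-1, -1),
    ("defect",    "cooperate"): (+2, -2),
    ("cooperate", "defect"):    (-2, +2),
}

def payoff(a: Agent, b: Agent) -> tuple[int, int]:
    """Return the health change each agent receives from one encounter."""
    return PAYOFFS[(a.strategy, b.strategy)]
```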
As you can imagine, this initially benefits the defectors, but there's a catch for those greedy grubs: each agent remembers who has defected against them and physically avoids them from then on. Avoidance is indicated with a white dot in the body of the agent, on the side facing the defector (and of course the agent moves away).
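The grudge memory is easy to sketch: each agent keeps a set of known defectors, and a future encounter with any of them simply doesn't happen. In the real simulation the avoidance is physical (the agent steers away, with the white dot showing the direction); here I've reduced it to skipping the interaction.

```python
def remember_defection(victim: Agent, offender: Agent) -> None:
    """If the other party is a defector, hold a grudge and avoid them from now on."""
    if offender.strategy == "defect":
        victim.grudges.add(offender)

def will_engage(a: Agent, b: Agent) -> bool:
    """Skip the encounter if either agent is avoiding the other."""
    return b not in a.grudges and a not in b.grudges
```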
An additional rule: when an agent's health gets very low, they can no longer give health to anyone, defector or cooperator, but any cooperator with adequate health of their own that interacts with them will "gift" them +2, restoring their ability to interact normally. It's sort of like a social safety net combined with some charitable behaviour.
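Putting the pieces together, one encounter might look like the sketch below. The post doesn't say exactly what counts as "very low" or "adequate" health, so LOW_HEALTH and ADEQUATE_HEALTH are placeholders, and my reading is that an exhausted agent exchanges no normal payoff at all until it has been gifted back up.

```python
LOW_HEALTH = 2        # assumed threshold for "very low" health; no number is given in the post
ADEQUATE_HEALTH = 5   # assumed health a cooperator needs before it can gift

def interact(a: Agent, b: Agent) -> None:
    """Resolve one encounter: avoidance, the safety-net rule, then the normal payoff."""
    if not will_engage(a, b):
        return

    # Safety net: an exhausted agent gives nothing, but a healthy cooperator
    # gifts them +2 (whether the gift costs the giver isn't stated; here it doesn't).
    for low, other in ((a, b), (b, a)):
        if low.health <= LOW_HEALTH:
            if other.strategy == "cooperate" and other.health >= ADEQUATE_HEALTH:
                low.health += 2
            return  # no normal payoff is exchanged this encounter

    # Normal payoff, then both sides remember any defection against them
    da, db = payoff(a, b)
    a.health += da
    b.health += db
    remember_defection(a, b)
    remember_defection(b, a)
```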
We can see that over time, cooperation is the dominant strategy, even if it doesn't appear to be in the short term. This is because of the presence of trust, whether that's interpersonal trust or trust in a system. It also shows that a system can still thrive despite the persistent presence of defectors. That said, there is a limit: when the number of defectors goes over 20%, you end up with a system-wide crash, a negative-sum game where no one has anything left to give, making defection an absolutely dominated strategy. Cooperation, on the other hand, breeds trust, and trust breeds cooperation, a virtuous cycle that can be capitalised on sustainably and leads to positive-sum outcomes.
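If you want to poke at these dynamics yourself, a toy driver over the sketches above pairs agents off at random each tick and tallies health by strategy. It ignores movement, size and the physics of avoidance, so don't expect it to reproduce the 20% tipping point exactly; it's just enough to watch relative versus absolute advantage play out.

```python
def run(agents: list[Agent], ticks: int = 500) -> None:
    """Pair agents off at random each tick and report total health per strategy."""
    for _ in range(ticks):
        random.shuffle(agents)
        for a, b in zip(agents[0::2], agents[1::2]):
            interact(a, b)
    coop = sum(x.health for x in agents if x.strategy == "cooperate")
    dfct = sum(x.health for x in agents if x.strategy == "defect")
    print(f"cooperators: {coop} total health, defectors: {dfct} total health")

run(agents)
```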