How Markets Teach Us to Care about Strangers

By Glen Spiteri (with contributions from Maximilian Maier and Falk Lieder)

Published April 28, 2026

Glen Spiteri’s and Falk Lieder’s BKI project examines how learning from experience can enhance people’s moral concern for distant others, thereby promoting kindness, compassion, and altruism. Our experiments focus on how people learn from the interpersonal consequences of past decisions. We hypothesize that people’s moral circle can expand when they learn from how their actions have affected distant others, provided they attend to those consequences and empathize with the people who were affected by them.

Introduction 

Every time you buy a coffee from a barista you might see again but will never quite know, hire a plumber through an app, or sell something to someone on Craigslist, you complete a small cooperative exchange with a stranger. These interactions seem unremarkable, perhaps even insignificant, to most people; yet they are, in actuality, quite profound. My co-authors, Maximilian Maier and Falk Lieder, and I hold that these everyday market interactions are gradually expanding the circle of people you are willing to care about.

Everyday market interactions don’t just move goods and money around. Over time, they reshape who matters, morally, in your mind. This may sound speculative, but it corresponds to a striking empirical pattern: people who live in societies that rely more heavily on market exchange tend to behave more generously toward strangers (i.e., anonymous people in general), and they do so even in situations where there is nothing to gain. The pattern appears across very different economic systems (Henrich et al., 2010), across regions within a single country (Baldassarri, 2020), and in controlled laboratory experiments (Peysakhovich & Rand, 2016).

Why? In a recent preprint (Spiteri, Maier, & Lieder, 2026), my co-authors and I propose an answer grounded in how people learn. The core idea is simple. When you take an action that gives special weight to another person’s interests, and the interaction turns out better than you expected, your mind registers a positive reward that quietly prompts you to increase what we might call the moral weight you place on that kind of person going forward (by moral weight, we mean the importance of others’ welfare relative to your own). In other words, you become more likely to see that type of person’s value and to feel care for them. Conversely, when the same kind of action goes worse than you expected, the moral weight assigned to that type of person decreases. Markets are environments that, on average, generate more of these small positive surprises from interactions with people you would otherwise have no reason to cooperate with. That, we argue, is how moral concern for strangers is acquired and updated, one ordinary transaction at a time.

The Evidence: From Small-Scale Societies to Italian Mailboxes

In a landmark study, Joseph Henrich and colleagues traveled to 15 small-scale societies around the world — from whale hunters in the Pacific to horticulturalists in the Amazon — and asked participants to play a simple game that behavioral game theorists call a “dictator game” (Camerer, 1997). In the game, participants were given money and could share as much or as little of it as they wanted with an anonymous stranger, with nothing to gain in return. What they found was striking: compared with people in societies that depend on foraging, hunting, fishing, or small-scale farming for subsistence, people in societies more dependent on market exchange (e.g., wage work) gave significantly more to strangers.

Was this simply a quirk of laboratory economic games? Not quite. Delia Baldassarri tested the idea in the real world by “accidentally” dropping nearly 1,000 stamped letters across 188 Italian towns and tracking which ones were mailed back. In towns with higher labor-force participation (i.e., places with more people in employment) — a proxy for market activity — strangers were more likely to pick up and mail the letters. Crucially, this pattern held regardless of whether the letter was addressed to someone with a typical Italian name or to someone with a non-Italian name, such as Mohamed Hassan. Market integration did not just make people kinder to their own group; it was associated with greater kindness toward strangers in general, as measured by the likelihood of returning lost letters.

A third line of evidence comes from experiments by Alexander Peysakhovich and David Rand, who examined how cooperative norms emerge in the laboratory. Some participants were placed in environments where cooperating with strangers paid off, while others were placed in environments where cooperation was disadvantageous and defection paid. Over time, participants in the cooperative environments became more generous, while those in the defection environments became more selfish and less willing to cooperate. Importantly, these tendencies carried over into new situations involving new people.

How can we understand this? 

Our theory is simple but powerful: people learn ‘moral weights’ — the importance they assign to others’ welfare relative to their own — through reinforcement learning (see Sutton & Barto, 2018, for a comprehensive introduction). Every time you interact with a stranger and things go well, your brain registers a small positive signal: considering this person’s welfare led to a good outcome. Over time, those signals accumulate, and the moral weight you assign to strangers quietly increases. You start caring more about people you don’t know. Not because anyone told you to, but because experience itself taught you that doing so is worthwhile.
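To make the mechanism concrete, here is a minimal sketch of such a reinforcement-learning update, in the spirit of a standard prediction-error rule. The function name, the learning rate, and the [0, 1] bounds are illustrative assumptions on my part, not the model specified in our preprint:

```python
# Illustrative sketch: a moral weight nudged by reward prediction errors.
# All names and parameter values are hypothetical, chosen for intuition.

def update_moral_weight(weight, expected, outcome, learning_rate=0.1):
    """Adjust the moral weight assigned to a type of person in
    proportion to how much better (or worse) an interaction went
    than expected."""
    prediction_error = outcome - expected            # positive surprise raises the weight
    new_weight = weight + learning_rate * prediction_error
    return max(0.0, min(1.0, new_weight))            # keep the weight in [0, 1]

# A stranger interaction that goes better than expected:
w = update_moral_weight(weight=0.2, expected=0.5, outcome=1.0)
# w is now 0.25: a small positive surprise slightly increased the weight.
```

The same rule handles the negative case: an outcome below expectation produces a negative prediction error, and the weight shrinks.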

Markets provide exactly the kind of environment where this learning thrives. When you buy coffee, hire a plumber, or sell something online, you’re engaging in a mutually beneficial exchange with someone you may never see again. Each successful transaction becomes a data point that your brain uses to update how much a stranger’s welfare matters, given your positive experience with them. Environments where trust is punished — where cooperation is exploited rather than rewarded — produce the opposite effect. Imagine paying for a used bicycle advertised online, only for the seller to take your money and never ship the item, leaving you with no recourse. Here, our theory predicts that the moral weight assigned to that kind of stranger is revised downward. Over many such encounters, the moral circle contracts: people become more wary of others, less generous, less willing to extend concern beyond their immediate group, and less cooperative as partners.
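The contrast between the two kinds of environments can be simulated in a few lines. This is a toy sketch: the environment probabilities, learning rate, and the simplifying assumption that the agent's expectation equals its current weight are all mine, not parameters from the paper:

```python
import random

def simulate(prob_good, n_interactions=500, lr=0.05, seed=0):
    """Simulate repeated stranger interactions. Each outcome is good (1)
    with probability prob_good, else bad (0). The agent expects an outcome
    equal to its current moral weight, so the weight gradually drifts
    toward the environment's actual reliability."""
    rng = random.Random(seed)
    weight = 0.5                                   # start with moderate concern
    for _ in range(n_interactions):
        outcome = 1.0 if rng.random() < prob_good else 0.0
        weight += lr * (outcome - weight)          # prediction-error update
        weight = max(0.0, min(1.0, weight))
    return weight

market_like = simulate(prob_good=0.9)    # exchanges are mostly rewarding
exploitative = simulate(prob_good=0.2)   # trust is usually punished
# The first weight ends up high, the second low: the moral circle
# expands or contracts with the environment's track record.
```

The point of the sketch is only that the same learner, run in different environments, ends up with very different levels of concern for strangers.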

Why This Matters

This is not just an academic exercise. Understanding the mechanism behind moral change has real implications. It suggests that the institutions we build — the rules governing our economic and social interactions — do more than affect our wallets. They shape our moral psychology. Societies that create conditions for beneficial interactions between strangers may be quietly laying the foundations for broader moral concern.

It also offers a hopeful message: our capacity for caring about others is not fixed. It’s learned. If the right experiences can expand our moral circle to include people we have never met, then the design of our institutions — from marketplaces to schools to online platforms — becomes a powerful lever for moral progress.

The next time you complete an unremarkable transaction with a stranger, consider this: that small exchange may be doing more than moving goods around — it may be making strangers matter more to you.

*Glen Spiteri is a PhD candidate in the Department of Psychology at UCLA, where he studies moral cognition. This work was generously supported by the Institute for Humane Studies at George Mason University.*

References

Baldassarri, D. (2020). Market integration accounts for local variation in generalized altruism in a nationwide lost-letter experiment. Proceedings of the National Academy of Sciences, 117(6), 2858–2863.

Camerer, C. F. (1997). Progress in behavioral game theory. Journal of Economic Perspectives, 11(4), 167–188. https://pubs.aeaweb.org/doi/pdfplus/10.1257/jep.11.4.167

Henrich, J., Ensminger, J., McElreath, R., Barr, A., Barrett, C., Bolyanatz, A., . . . Ziker, J. (2010). Markets, religion, community size, and the evolution of fairness and punishment. Science, 327(5972), 1480–1484.

Peysakhovich, A., & Rand, D. G. (2016). Habits of virtue: Creating norms of cooperation and defection in the laboratory. Management Science, 62(3), 631–647.

Singer, P. (2011). The Expanding Circle: Ethics, Evolution, and Moral Progress. Princeton University Press.

Spiteri, G., Maier, M., & Lieder, F. (2026). A computational theory of learning moral weights. PsyArXiv. https://doi.org/10.31234/osf.io/rehta_v1 

Sutton, R. S., & Barto, A. G. (2018). Reinforcement learning: An introduction (2nd ed.). MIT Press.