
Strong reciprocity


Strong reciprocity is an area of research in behavioral economics, evolutionary psychology, and evolutionary anthropology on the predisposition to cooperate even when there is no apparent benefit in doing so. The topic is of particular interest to those studying the evolution of cooperation, because such behavior appears to contradict the predictions of many models of cooperation.[1] In response, current work on strong reciprocity focuses on developing evolutionary models that can account for this behavior.[2][3] Critics of strong reciprocity argue that it is an artifact of lab experiments and does not reflect cooperative behavior in the real world.[4]

Evidence for strong reciprocity


Experimental evidence


A variety of studies from experimental economics provide evidence for strong reciprocity, either by demonstrating people's willingness to cooperate with others, or by demonstrating their willingness to incur costs in order to punish those who do not cooperate.

Evidence for cooperation


One experimental game used to measure levels of cooperation is the dictator game. In the standard form of the dictator game, there are two anonymous, unrelated participants. One participant is assigned the role of the allocator and the other the role of the recipient. The allocator is given some amount of money, which they can divide in any way they choose. If a participant is trying to maximize their payoff, the rational solution (the Nash equilibrium) is for the allocator to assign nothing to the recipient. In a 2011 meta-study of 616 dictator game studies, Engel found an average allocation of 28.3%, with 36% of participants giving nothing, 17% choosing the equal split, and 5.44% giving the recipient everything.[5] The trust game, an extension of the dictator game, provides additional evidence for strong reciprocity. The trust game extends the dictator game by multiplying the amount given by the allocator to the recipient by some value greater than one, and then allowing the recipient to give some amount back to the allocator. Again, if participants are trying to maximize their payoff, the recipient should give nothing back to the allocator, and the allocator should therefore assign nothing to the recipient. A 2009 meta-analysis of 84 trust game studies found that allocators gave an average of 51% and that recipients returned an average of 37%.[6]
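The payoff logic behind these predictions can be made concrete with a short sketch. The endowment of 10 and the multiplier of 3 below are illustrative assumptions (the multiplier is not specified above); the 51% and 37% transfer fractions are the meta-analytic averages just cited.

```python
def trust_game(endowment, sent_fraction, returned_fraction, multiplier=3):
    """Payoffs in a one-shot trust game: the allocator sends a fraction of the
    endowment, the amount sent is multiplied, and the recipient returns a
    fraction of what they received. Parameters here are illustrative."""
    sent = endowment * sent_fraction
    received = sent * multiplier
    returned = received * returned_fraction
    return endowment - sent + returned, received - returned  # (allocator, recipient)

# Backward-induction prediction for payoff maximizers: the recipient returns
# nothing, so the allocator sends nothing.
print(trust_game(10, sent_fraction=0.0, returned_fraction=0.0))    # (10.0, 0.0)

# Behavior close to the meta-analytic averages (send ~51%, return ~37%):
print(trust_game(10, sent_fraction=0.51, returned_fraction=0.37))  # roughly (10.6, 9.6)
```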

A third experiment commonly used to demonstrate strong reciprocity preferences is the public goods game. In a public goods game, some number of participants are placed in a group and each is given some amount of money. They may then contribute any portion of their allocation to a common pool. The common pool is multiplied by some amount greater than one and then evenly redistributed to each participant, regardless of how much they contributed. In this game, the rational Nash equilibrium strategy for anyone trying to maximize their payoff is to contribute nothing. However, in a 2001 study, Fischbacher and colleagues observed average contributions of 33.5%.[7]
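The free-riding prediction follows from simple arithmetic: each unit contributed returns only multiplier/n units to the contributor. A minimal sketch, assuming an illustrative endowment of 20, a multiplier of 1.6, and four players (values not specified above):

```python
def public_goods_payoffs(contributions, endowment=20, multiplier=1.6):
    """Each player keeps what they did not contribute, plus an equal share of
    the multiplied common pool. Parameter values are illustrative."""
    share = sum(contributions) * multiplier / len(contributions)
    return [endowment - c + share for c in contributions]

# With four players, each contributed unit returns only 1.6 / 4 = 0.4 units to
# the contributor, so a payoff maximizer contributes nothing.
print(public_goods_payoffs([0, 0, 0, 0]))      # [20.0, 20.0, 20.0, 20.0]
print(public_goods_payoffs([20, 20, 20, 20]))  # [32.0, 32.0, 32.0, 32.0]
print(public_goods_payoffs([0, 20, 20, 20]))   # the lone free rider earns 44.0
```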

Evidence for punishing non-cooperators


The second component of strong reciprocity is that people are willing to punish those who fail to cooperate, even when punishment is costly. There are two types of punishment: second party and third party punishment. In second party punishment, the person who was hurt by the other party's failure to cooperate has the opportunity to punish the non-cooperator. In third party punishment, an uninvolved third party has the opportunity to punish the non-cooperator.

A common game used to measure willingness to engage in second party punishment is the ultimatum game. This game is very similar to the dictator game described above, in which the allocator divides a sum of money between himself and a recipient. In the ultimatum game, however, the recipient can either accept the offer or reject it, in which case both players receive nothing. If recipients are payoff maximizers, the Nash equilibrium is for them to accept any offer, and it is therefore in the allocator's interest to offer as close to zero as possible.[8] Experimental results, however, show that allocators usually offer over 40%, and that recipients reject offers 16% of the time. Recipients are more likely to reject low offers than high offers.[9] Another example of second party punishment arises in the public goods game described earlier when a second stage is added in which participants can pay to punish other participants. In this game, a payoff maximizer's rational strategy in Nash equilibrium is neither to punish nor to contribute. However, experimental results show that participants are willing to pay to punish those who deviate from the average level of contribution – so much so that it becomes disadvantageous to give a lower amount, which allows cooperation to be sustained.[10][11]
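A minimal sketch of the ultimatum game's payoff structure, with an illustrative stake of 10 (not a value taken from the studies above), shows why rejection is a form of costly punishment:

```python
def ultimatum_payoffs(total, offer, accepted):
    """The allocator proposes `offer` out of `total`; if the recipient rejects,
    both players receive nothing. The stake of 10 below is illustrative."""
    return (total - offer, offer) if accepted else (0, 0)

# A payoff-maximizing recipient accepts any positive offer...
print(ultimatum_payoffs(10, offer=1, accepted=True))   # (9, 1)
# ...so rejecting a low offer sacrifices the recipient's own payoff
# in order to punish the allocator:
print(ultimatum_payoffs(10, offer=1, accepted=False))  # (0, 0)
```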

Modifications of the dictator game and the prisoner's dilemma provide support for the willingness to engage in costly third party punishment. The modified dictator game is identical to the traditional dictator game but with a third party observing. After the allocator makes their decision, the third party has the opportunity to pay to punish the allocator. A payoff-maximizing third party would choose not to punish, and a similarly rational allocator would choose to keep the entire sum for himself. However, experimental results show that a majority of third parties punish allocations of less than 50%.[12] In the prisoner's dilemma with third party punishment, two of the participants play a prisoner's dilemma, in which each must choose either to cooperate or to defect. The game is set up such that, regardless of what the other player does, it is rational for an income maximizer always to defect, even though both players cooperating yields a higher payoff than both players defecting. A third player observes this exchange and can then pay to punish either player. An income-maximizing third party's rational response would be not to punish, and income-maximizing players would choose to defect. A 2004 study demonstrated that a near majority of participants (46%) were willing to pay to punish if one participant defected. If both parties defected, 21% were still willing to punish.[12]
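The structure of the prisoner's dilemma variant can be sketched as follows; the payoff matrix, the punishment fee, and the fine are illustrative assumptions rather than the parameters used in the cited experiments:

```python
# Illustrative prisoner's dilemma payoffs: defection ('D') dominates
# cooperation ('C') for both players.
PD = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
      ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def with_third_party(move1, move2, punish1=False, punish2=False,
                     endowment=4, fee=1, fine=3):
    """Two players play the dilemma; an observing third party may pay `fee`
    to impose `fine` on either player. Parameter values are illustrative."""
    p1, p2 = PD[(move1, move2)]
    p3 = endowment
    if punish1:
        p1, p3 = p1 - fine, p3 - fee
    if punish2:
        p2, p3 = p2 - fine, p3 - fee
    return p1, p2, p3

# A payoff-maximizing third party never punishes...
print(with_third_party('D', 'C'))                # (5, 0, 4)
# ...yet many experimental participants pay to sanction the defector:
print(with_third_party('D', 'C', punish1=True))  # (2, 0, 3)
```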

Field evidence

Other researchers have investigated the extent to which these behavioral economic lab experiments on social preferences can be generalized to behavior in the field. In a 2011 study, Fehr and Leibbrandt examined the relationship between contributions in public goods games and participation in a real public good among shrimpers in Brazil. These shrimpers cut a hole in the bottom of their fishing buckets in order to allow immature shrimp to escape, thereby investing in the public good of the shared shrimp population. The size of the hole can be seen as the degree to which a fisher cooperates, since larger holes allow more shrimp to escape. Controlling for a number of other possible influences, Fehr and Leibbrandt demonstrated a positive relationship between hole size and contributions in the public goods game experiment.[13]

Rustagi and colleagues demonstrated a similar effect with 49 groups of Bale Oromo herders participating in forest management in Ethiopia. Results from public goods game experiments revealed that more than one third of the participating herders were conditional cooperators, meaning that they cooperate with other cooperators. Rustagi et al. showed that groups with larger shares of conditional cooperators planted larger numbers of trees.[14]

Ethnographic field evidence


In addition to experimental results, ethnographic accounts collected by anthropologists describe strong reciprocity observed in the field.

Records of the Turkana, an acephalous African pastoralist group, demonstrate strong reciprocity behavior. If someone acts cowardly in combat or commits some other free-riding violation, the group confers and decides whether a violation has occurred. If it decides that one has, corporal punishment is administered by the age cohort of the violator. Importantly, the cohort members taking these risks are not necessarily those who were harmed, making this costly third party punishment.[15]

The Walbiri of Australia also exhibit costly third party punishment. The local community determines whether an act of homicide, adultery, theft, etc. was an offense. The community then appoints someone to carry out the punishment, and others to protect that person against retaliation.[16] Data from the Aranda foragers of the Central Desert of Australia suggest that this punishment can be very costly, as it carries the risk of retaliation from family members of the punished, which can be as severe as homicide.[17]

Evolutionary models of cooperation which account for strong reciprocity


A number of evolutionary models have been proposed to account for the existence of strong reciprocity. This section briefly describes a small but important subset of these models.

The first model of strong reciprocity, proposed by Herbert Gintis in 2000, contained a number of simplifying assumptions that were addressed in later models.[2] In 2004, Samuel Bowles and Gintis presented a follow-up model in which they incorporated cognitive, linguistic, and other capacities unique to humans in order to demonstrate how these might be harnessed to strengthen the power of social norms in large-scale public goods games.[3] In a 2001 model, Joe Henrich and Robert Boyd also built on Gintis' model by incorporating conformist transmission of cultural information, demonstrating that this too can stabilize cooperative group norms.[18]

Boyd, Gintis, Bowles, and Peter Richerson's 2003 model of the evolution of third party punishment demonstrates how even though the logic underlying altruistic giving and altruistic punishment may be the same, the evolutionary dynamics are not. This model is the first to employ cultural group selection in order to select for better performing groups, while using norms to stabilize behavior within groups.[19]

Although punishment in many of the earlier models was both costly and uncoordinated, a 2010 model by Boyd, Gintis, and Bowles presents a mechanism for coordinated costly punishment. In this quorum-sensing model, each agent chooses whether or not they are willing to engage in punishment. If a sufficient number of agents are willing, the group acts collectively to administer the punishment.[20] An important feature of this model is that strong reciprocity is self-regarding when rare in the population, but may be altruistic when common within a group.
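A minimal simulation sketch of the quorum idea follows; the group composition, quorum threshold, and cost parameters are illustrative assumptions, not the calibration used in the 2010 model:

```python
# Sketch of coordinated ("quorum") punishment in a one-shot public goods game.
ENDOWMENT, MULTIPLIER = 10, 2.0
QUORUM = 4          # punishers required before punishment is carried out
PUNISH_COST = 2     # cost each punisher pays per punished defector
PUNISH_HARM = 12    # harm inflicted on each punished defector

def play_round(strategies):
    """strategies: 'cooperate', 'defect', or 'punisher' (punishers also
    cooperate). Returns each agent's payoff."""
    contributes = [s != 'defect' for s in strategies]
    share = ENDOWMENT * sum(contributes) * MULTIPLIER / len(strategies)
    payoffs = [share if c else share + ENDOWMENT for c in contributes]
    punishers = [i for i, s in enumerate(strategies) if s == 'punisher']
    defectors = [i for i, c in enumerate(contributes) if not c]
    if len(punishers) >= QUORUM:      # punishment happens only if the quorum is met
        for d in defectors:
            payoffs[d] -= PUNISH_HARM
        for p in punishers:
            payoffs[p] -= PUNISH_COST * len(defectors)
    return payoffs

def mean_payoff_by_type(strategies):
    payoffs = play_round(strategies)
    return {s: round(sum(p for p, t in zip(payoffs, strategies) if t == s)
                     / strategies.count(s), 1) for s in set(strategies)}

# Rare punishers never reach the quorum, so they bear no punishment costs;
# when punishers are common, defectors end up worse off than cooperators.
print(mean_payoff_by_type(['punisher'] * 2 + ['defect'] * 18))
print(mean_payoff_by_type(['punisher'] * 10 + ['cooperate'] * 5 + ['defect'] * 5))
```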

Cross-cultural variation


Significant cross-cultural variation has been observed in strong reciprocity behavior. In 2001, ultimatum game experiments were run in 15 small-scale societies across the world. The results showed dramatic variation, with mean offers as low as 26% in some groups and as high as 58% in others. The pattern of recipient behavior was also notable, with participants in some cultures rejecting offers above 50%. Henrich and colleagues determined that the best predictors of offers were group size (smaller groups giving less) and market integration (the more involved with markets, the more participants gave).[21] The study was later repeated with a different set of 15 small-scale societies and with better measures of market integration, finding a similar pattern of results.[22] These results are consistent with the culture-gene coevolution hypothesis.[22] A later paper by the same researchers identified religion as a third major contributor: people who participate in a world religion were more likely to exhibit strong reciprocity behavior.[23]

Criticisms


A particularly prominent criticism of strong reciprocity theory is that it does not correspond to behavior observed outside the laboratory. In particular, the existence of third party punishment in the field has been called into question.[4] Some have responded to this criticism by pointing out that, if effective, third party punishment will rarely be used and will therefore be difficult to observe.[24][25] Others have suggested that there is evidence of costly third party punishment in the field.[26] Critics have countered that it is inconsistent for proponents to treat both observations of costly third party punishment and its absence as evidence of its existence. They also question whether the ethnographic evidence presented actually constitutes costly third party punishment, and call for additional analysis of the costs and benefits of the punishment.[27] Other research has shown that an individual's propensity for one type of strong reciprocity does not predict their propensity for other types.[28]

Implications


The existence of strong reciprocity implies that systems designed purely around material self-interest may miss important motivators in the marketplace. This section gives two examples of possible implications. One area of application is the design of incentive schemes. For example, standard contract theory has difficulty explaining the degree of incompleteness observed in real contracts and the fact that performance measures often go unused even when they are cheap to implement. Strong reciprocity, and models based on it, suggest that this can be explained by people's willingness to act fairly even when doing so is against their material self-interest. Experimental results suggest that this is indeed the case, with participants preferring less complete contracts and workers willing to contribute a fair amount beyond what would be in their own self-interest.[29]

Another application of strong reciprocity is in allocating property rights and designing ownership structures. Joint ownership of property can be very similar to the public goods game: owners can independently contribute to a common pool, which yields a return that is evenly distributed to all parties. This ownership structure is subject to the tragedy of the commons, in which, if all parties are purely self-interested, no one invests. Alternatively, property could be allocated through an owner-employee relationship, in which an employee is hired by the owner and paid a specified wage for a specified level of investment. Experimental studies show that participants generally prefer joint ownership, and do better under joint ownership than under the owner-employee arrangement.[30]


References

  1. ^ Fehr, E.; Fischbacher, U. (2003). "The nature of human altruism". Nature. 425 (6960): 785–791. Bibcode:2003Natur.425..785F. doi:10.1038/nature02043. PMID 14574401. S2CID 4305295.
  2. ^ a b Gintis, H. (2000). "Strong Reciprocity and Human Sociality". Journal of Theoretical Biology. 206 (2): 169–179. Bibcode:2000JThBi.206..169G. CiteSeerX 10.1.1.335.7226. doi:10.1006/jtbi.2000.2111. PMID 10966755. S2CID 9260305.
  3. ^ a b Bowles, S.; Gintis, H. (2004). "The evolution of strong reciprocity: Cooperation in heterogeneous populations". Theoretical Population Biology. 65 (1): 17–28. Bibcode:2004TPBio..65...17B. doi:10.1016/j.tpb.2003.07.001. PMID 14642341.
  4. ^ a b Guala, F. (2012). "Reciprocity: Weak or strong? What punishment experiments do (and do not) demonstrate" (PDF). Behavioral and Brain Sciences. 35 (1): 1–15. doi:10.1017/S0140525X11000069. hdl:10535/6259. PMID 22289303. S2CID 54599837.
  5. ^ Engel, C. (2011). "Dictator games: A meta study". Experimental Economics. 14 (4): 583–610. doi:10.1007/s10683-011-9283-7. hdl:11858/00-001M-0000-0028-6DAA-8. S2CID 195330301.
  6. ^ Johnson, N.D. and Mislin, A. (2009) Cultures of kindness: A meta-analysis of trust game experiments. Working paper, George Mason University.
  7. ^ Fischbacher, U.; Gächter, S.; Fehr, E. (2001). "Are people conditionally cooperative? Evidence from a public goods experiment" (PDF). Economics Letters. 71 (3): 397. doi:10.1016/S0165-1765(01)00394-9.
  8. ^ Camerer, Colin (2003). Behavioral game theory: experiments in strategic interaction. New York: Russell Sage Foundation. ISBN 978-0-691-09039-9.
  9. ^ Oosterbeek, H.; Sloof, R.; Van De Kuilen, G. (2004). "Cultural Differences in Ultimatum Game Experiments: Evidence from a Meta-Analysis". Experimental Economics. 7 (2): 171. doi:10.1023/B:EXEC.0000026978.14316.74. S2CID 17659329.
  10. ^ Fehr, E.; Gächter, S. (2000). "Cooperation and Punishment in Public Goods Experiments". American Economic Review. 90 (4): 980. CiteSeerX 10.1.1.23.2534. doi:10.1257/aer.90.4.980. S2CID 11045470.
  11. ^ Fehr, E.; Gächter, S. (2002). "Altruistic punishment in humans". Nature. 415 (6868): 137–40. Bibcode:2002Natur.415..137F. doi:10.1038/415137a. PMID 11805825. S2CID 4310962.
  12. ^ a b Fehr, E.; Fischbacher, U. (2004). "Third-party punishment and social norms". Evolution and Human Behavior. 25 (2): 63. Bibcode:2004EHumB..25...63F. CiteSeerX 10.1.1.120.5059. doi:10.1016/S1090-5138(04)00005-4.
  13. ^ Fehr, E.; Leibbrandt, A. (2011). "A field study on cooperativeness and impatience in the Tragedy of the Commons". Journal of Public Economics. 95 (9–10): 1144. CiteSeerX 10.1.1.643.5187. doi:10.1016/j.jpubeco.2011.05.013.
  14. ^ Rustagi, D.; Engel, S.; Kosfeld, M. (2010). "Conditional Cooperation and Costly Monitoring Explain Success in Forest Commons Management". Science. 330 (6006): 961–965. Bibcode:2010Sci...330..961R. doi:10.1126/science.1193649. PMID 21071668. S2CID 206528249.
  15. ^ Mathew, S.; Boyd, R. (2011). "Punishment sustains large-scale cooperation in prestate warfare". Proceedings of the National Academy of Sciences. 108 (28): 11375–80. Bibcode:2011PNAS..10811375M. doi:10.1073/pnas.1105604108. PMC 3136302. PMID 21670285.
  16. ^ Meggitt, Mervyn J. (1965). Desert people: a study of the Walbiri Aborigines of Central Australia. Chicago: University of Chicago Press. ISBN 978-0-226-51822-0.
  17. ^ Strehlow, T. G. H. (1970) Geography and the totemic landscape in Central Australia: A functional study. In: Australian aboriginal anthology: Modern studies in the social anthropology of the Australian aborigines, ed. R. M. Berndt, pp. 92–140. University of Western Australia Press.
  18. ^ Henrich, J.; Boyd, R. (2001). "Why People Punish Defectors". Journal of Theoretical Biology. 208 (1): 79–89. doi:10.1006/jtbi.2000.2202. PMID 11162054.
  19. ^ Boyd, R.; Gintis, H.; Bowles, S.; Richerson, P. J. (2003). "The evolution of altruistic punishment". Proceedings of the National Academy of Sciences. 100 (6): 3531–3535. Bibcode:2003PNAS..100.3531B. doi:10.1073/pnas.0630443100. PMC 152327. PMID 12631700.
  20. ^ Boyd, R.; Gintis, H.; Bowles, S. (2010). "Coordinated Punishment of Defectors Sustains Cooperation and Can Proliferate when Rare". Science. 328 (5978): 617–620. Bibcode:2010Sci...328..617B. doi:10.1126/science.1183665. PMID 20431013. S2CID 1567228.
  21. ^ Henrich, J.; Boyd, R.; Bowles, S.; Camerer, C.; Fehr, E.; Gintis, H.; McElreath, R. (2001). "In Search of Homo Economicus: Behavioral Experiments in 15 Small-Scale Societies" (PDF). American Economic Review. 91 (2): 73. doi:10.1257/aer.91.2.73.
  22. ^ a b Henrich, J.; McElreath, R.; Barr, A.; Ensminger, J.; Barrett, C.; Bolyanatz, A.; Cardenas, J. C.; Gurven, M.; Gwako, E.; Henrich, N.; Lesorogol, C.; Marlowe, F.; Tracer, D.; Ziker, J. (2006). "Costly Punishment Across Human Societies". Science. 312 (5781): 1767–1770. Bibcode:2006Sci...312.1767H. CiteSeerX 10.1.1.714.70. doi:10.1126/science.1127333. PMID 16794075. S2CID 4793548.
  23. ^ Henrich, J.; Ensminger, J.; McElreath, R.; Barr, A.; Barrett, C.; Bolyanatz, A.; Cardenas, J. C.; Gurven, M.; Gwako, E.; Henrich, N.; Lesorogol, C.; Marlowe, F.; Tracer, D.; Ziker, J. (2010). "Markets, Religion, Community Size, and the Evolution of Fairness and Punishment". Science. 327 (5972): 1480–1484. Bibcode:2010Sci...327.1480H. CiteSeerX 10.1.1.714.7830. doi:10.1126/science.1182238. PMID 20299588. S2CID 4803905.
  24. ^ Gintis, H.; Fehr, E. (2012). "The social structure of cooperation and punishment" (PDF). Behavioral and Brain Sciences. 35 (1): 28–29. doi:10.1017/S0140525X11000914. PMID 22289317. S2CID 1477246.
  25. ^ Gächter, S. (2012). "In the lab and the field: Punishment is rare in equilibrium". Behavioral and Brain Sciences. 35 (1): 26–28. doi:10.1017/S0140525X11001415. PMID 22289316. S2CID 29556488.
  26. ^ Bowles, S.; Boyd, R.; Mathew, S.; Richerson, P. J. (2012). "The punishment that sustains cooperation is often coordinated and costly". Behavioral and Brain Sciences. 35 (1): 20–21. doi:10.1017/S0140525X1100118X. PMID 22289309. S2CID 23731570.
  27. ^ Guala, F. (2012). "Strong reciprocity is real, but there is no evidence that uncoordinated costly punishment sustains cooperation in the wild". Behavioral and Brain Sciences. 35 (1): 45–59. doi:10.1017/S0140525X1100166X. hdl:2434/818589. PMID 22393582. S2CID 42924633.
  28. ^ Yamagishi, T.; Horita, Y.; Mifune, N.; Hashimoto, H.; Li, Y.; Shinada, M.; Miura, A.; Inukai, K.; Takagishi, H.; Simunovic, D. (2012). "Rejection of unfair offers in the ultimatum game is no evidence of strong reciprocity". Proceedings of the National Academy of Sciences. 109 (50): 20364–8. Bibcode:2012PNAS..10920364Y. doi:10.1073/pnas.1212126109. PMC 3528519. PMID 23188801.
  29. ^ Fehr, E., Klein, A. and Schmidt, K. M., 2001. Fairness, Incentives and Contractual Incompleteness. Institute for Empirical Research in Economics, Working Paper No. 72, University of Zurich.
  30. ^ Fehr, E.; Kremhelmer, S.; Schmidt, K. M. (2008). "Fairness and the Optimal Allocation of Ownership Rights*" (PDF). The Economic Journal. 118 (531): 1262. doi:10.1111/j.1468-0297.2008.02169.x. S2CID 560069.

Further reading

  • Gintis, Herbert; Bowles, Samuel (2011). A Cooperative Species: Human Reciprocity and Its Evolution. Princeton, N.J: Princeton University Press. ISBN 978-0-691-15125-0.
  • Gintis, Herbert (2009). The Bounds of Reason: Game Theory and the Unification of the Behavioral Sciences. Princeton, N.J: Princeton University Press. ISBN 978-0691140520.
  • Natalie Henrich; Joseph Henrich (2007). Why humans cooperate: a cultural and evolutionary explanation. Oxford [Oxfordshire]: Oxford University Press. ISBN 978-0-19-531423-6.
  • Boyd, Robert W.; Richerson, Peter J. (2005). Not by genes alone: how culture transformed human evolution. Chicago: University of Chicago Press. ISBN 978-0-226-71284-0.
  • Ernst Fehr; Gintis, Herbert; Bowles, Samuel; Boyd, Robert W. (2004). Moral sentiments and material interests: the foundations of cooperation in economic life. Cambridge, Mass: MIT Press. ISBN 978-0-262-07252-6.