Moral blindness

Moral blindness, also known as ethical blindness, is defined as a person's temporary inability to see the ethical aspect of a decision they are making. It is often caused by external factors that prevent the individual from seeing the immoral aspect of their behaviour in that particular situation.[1]

While the concept of moral blindness (and, more broadly, of immorality) has its roots in ancient philosophy,[2][3] the idea gained prominence after the events of World War II, particularly the Holocaust.[4] This prompted further research by psychologists, notably Stanley Milgram and Philip Zimbardo, whose studies of obedience and authority bias yielded surprising findings about human behaviour.[1]

Moral blindness has been identified as a concern in areas such as business organisations and legal systems.[5][6]

Overview

Moral blindness is a phenomenon in which people with otherwise sufficient moral reasoning abilities temporarily fail to see the ethical dimension of a decision, which leads them to behave in ways counter to their actual moral values. This behaviour can be due to situational or other factors. The concept usually requires the following: people deviate from their intrinsic moral beliefs, and this deviation is temporary and unconscious, i.e. people are unaware of their unethical behaviour at the time.[1][7]

Interest in the idea of moral blindness increased after Hannah Arendt's Eichmann in Jerusalem: A Report on the Banality of Evil,[4] which focused on Adolf Eichmann, a German-Austrian SS officer who organised the deportation of Jews to extermination camps and thus played a major role in the Holocaust.[8]

The ideas of moral blindness and the "banality of evil" also influenced the field of psychology and led to notable studies in the 1960s and 1970s, such as the obedience studies by Stanley Milgram and the Stanford Prison Experiment by Philip Zimbardo. These studies examined the impact of authority on obedience and individual behaviour.[1]

Subsequent research has looked at moral blindness in contexts beyond war crimes and genocide, extending to areas as diverse as organisational behaviour and mental health.[5][9][10]

Origins and early theories

Roots in philosophy

The origins of the concept of moral blindness lie in philosophy and can be traced to ancient Greek philosophers such as Socrates, who spoke of moral intellectualism; Plato, who spoke about emotions clouding moral judgements; and Aristotle, who first used the term "ethics" for the field of moral philosophy.[2] Early spiritual leaders such as the Buddha and Confucius also spoke about moral behaviour in their discourses, although these were more prescriptive in nature.[3] Modern contributions to moral judgement came from Western philosophers of the 17th and 18th centuries such as Descartes, Locke, Hume, and Kant,[11][12][13] and from more contemporary philosophers such as G. E. Moore, whose Principia Ethica discusses the "indefinability of good".[14]

Normative ethics seeks to define the rightness or wrongness of an action. Two opposing views that have developed in this area are deontology, in which the morality of an action depends on its conformity to rules, and consequentialism, in which the morality of an action depends on its results. These views are often reflected in responses to the trolley problem, as studied by Joshua Greene.[15]

In psychology

Moral blindness has been studied jointly across philosophy and psychology, with empirical studies of morality going back to the 1890s. An early emphasis on normative approaches to moral behaviour led to research focused on its cognitive and developmental context. Piaget put forth his prominent theory of cognitive development in 1936, which Kohlberg built upon to propose his stages of moral development in 1958.[16] Later, in 1982, James Rest published his influential Four Component Model of Morality (FCM), in which he identified four distinct components underlying moral behaviour: moral sensitivity, moral judgment, moral motivation, and moral implementation.[15] The model was meant to convey the complexity behind moral behaviour: competence in one component does not imply competence in another, so immoral behaviour can result from a failure at any of them.[17] This cognitive focus, however, contrasted with some observed behaviour, and the field of behavioural ethics eventually emerged to study how people actually react to moral dilemmas.[15]

Theoretical and experimental research in psychology

A major driver of modern research on moral blindness is said to have been post-World War II sentiment towards people such as Adolf Eichmann, who was responsible for organising genocide under the Nazi regime during the Holocaust. His capture and subsequent trial in 1961 led many observers to comment on his ordinary nature and appearance, which seemed at odds with his "evil" behaviour. Hannah Arendt, who was covering the trial for The New Yorker, coined the phrase "banality of evil" in reference to Eichmann because, during the trial, he showed no remorse and did not accept responsibility; he claimed to have merely done what he was told. This is believed to have influenced researchers such as Milgram to study individual behaviour in response to obedience to authority.[1][18][19]

In his obedience studies of 1961–62, Milgram led subjects to believe they were administering electric shocks to another participant, who was in fact a confederate of the experimenters. The studies were designed to answer questions such as: "Could it be that Eichmann and his million accomplices in the Holocaust were just following orders? Could we call them all accomplices?"[20] To most people's surprise, 65% of the subjects in the original study proceeded to administer what they believed was the maximum shock of 450 volts.[21]

Later, in 1971, Zimbardo's Stanford Prison Experiment showed how "good people behave in pathological ways that are alien to their nature".[1] Male undergraduate students at Stanford were assigned to be guards or prisoners in a simulated prison setting. The experiment was designed to see how far subjects would go in internalising their roles and obeying external orders, and it later raised ethical concerns about the nature of the study itself.[22]

Following these findings, researchers began to study moral agency, its exercise, and the drivers of moral blindness. In his research, Bandura argued that moral disengagement could arise from various forces (individual, situational, or institutional), and that mechanisms such as diffusion of responsibility and a disconnected division of tasks could lead to immoral behaviour.[23][1][24]

More recent research has led to the concept of "bounded ethicality": the idea that people can be unintentionally unethical both in their own behaviour and in judging the behaviour of others, something they may realise only on further reflection.[25][26] Studies of individual unethicality have also looked at the role of social norms and at how people view others' unethical behaviour.[27][28]

Applications

Moral blindness has been studied and applied in a range of domains beyond war crimes, politics, and administration. A major area of application has been management and organisational behaviour, with research covering topics such as corporate transgressions, business ethics, and moral disengagement at work.[9][5] Law and justice is another area where moral blindness, particularly among lawyers, is seen as a concern.[29][6] Some research has also described psychopathy as a specific kind of moral blindness, although the findings are not conclusive.[10]

The field has also been expanded to study broader ideas such as moral blind spots (overestimating one's ability to act ethically),[30] ethical erosion (the gradual decline of ethics over time),[28] and ethical fading (when the ethical concerns around a situation "fade" during decision making).[31]

References

  1. ^ a b c d e f g Palazzo, Guido; Krings, Franciska; Hoffrage, Ulrich (2012-09-01). "Ethical Blindness". Journal of Business Ethics. 109 (3): 323–338. doi:10.1007/s10551-011-1130-4. ISSN 1573-0697. S2CID 254381575.
  2. ^ a b Oberhelman, David D. (2001-06-01). "Stanford Encyclopedia of Philosophy". Reference Reviews. 15 (6). Emerald Group Publishing Limited: 9. doi:10.1108/rr.2001.15.6.9.311. ISSN 0950-4125.
  3. ^ a b Tucker, John A. (2015-02-03), Davis, Bret W (ed.), "Japanese Neo-Confucian Philosophy", The Oxford Handbook of Japanese Philosophy, Oxford University Press, pp. 272–290, doi:10.1093/oxfordhb/9780199945726.013.16, ISBN 978-0-19-994572-6, retrieved 2020-11-30
  4. ^ a b Burin, Frederic S.; Arendt, Hannah (March 1964). "Eichmann in Jerusalem: A Report on the Banality of Evil". Political Science Quarterly. 79 (1): 122. doi:10.2307/2146583. ISSN 0032-3195. JSTOR 2146583.
  5. ^ a b c Barsky, Adam (2011-06-16). "Investigating the Effects of Moral Disengagement and Participation on Unethical Work Behavior". Journal of Business Ethics. 104 (1): 59–75. doi:10.1007/s10551-011-0889-7. ISSN 1573-0697. S2CID 144577232.
  6. ^ a b Eldred, Tigran (2012-09-28). "Prescriptions for Ethical Blindness: Improving Advocacy for Indigent Defendants in Criminal Cases". Rutgers Law Review. Rochester, NY. SSRN 2153869.
  7. ^ de Klerk, J. J. (2017-04-01). "Nobody is as Blind as Those Who Cannot Bear to See: Psychoanalytic Perspectives on the Management of Emotions and Moral Blindness". Journal of Business Ethics. 141 (4): 745–761. doi:10.1007/s10551-016-3114-x. ISSN 1573-0697. S2CID 147226367.
  8. ^ Becoming Eichmann: rethinking the life, crimes, and trial of a "desk murderer". 2006-10-01.
  9. ^ a b Bandura, Albert; Caprara, Gian-Vittorio; Zsolnai, Laszlo (2000). "Corporate Transgressions through Moral Disengagement". Journal of Human Values. 6 (1): 57–64. doi:10.1177/097168580000600106. S2CID 143829357.
  10. ^ a b Larsen, Rasmus Rosenberg (2020-09-01). "Psychopathy as moral blindness: a qualifying exploration of the blindness-analogy in psychopathy theory and research". Philosophical Explorations. 23 (3): 214–233. doi:10.1080/13869795.2020.1799662. ISSN 1386-9795. S2CID 221361039.
  11. ^ Cohon, Rachel (2018), "Hume's Moral Philosophy", in Zalta, Edward N. (ed.), The Stanford Encyclopedia of Philosophy (Fall 2018 ed.), Metaphysics Research Lab, Stanford University, retrieved 2020-11-29
  12. ^ García Moriyon (2011). Moral Blindness. 15th ICPIC International Conference, Gyeongsang National University, Jinju, South Korea. doi:10.13140/2.1.1717.0885.
  13. ^ Hare, John (2019), "Religion and Morality", in Zalta, Edward N. (ed.), The Stanford Encyclopedia of Philosophy (Fall 2019 ed.), Metaphysics Research Lab, Stanford University, retrieved 2020-11-29
  14. ^ Cooper, Barton C. (1959-01-01). "The Alleged Indefinability of Good". The Journal of Philosophy. 56 (25): 977–985. doi:10.2307/2022719. JSTOR 2022719. Retrieved 2020-11-29.
  15. ^ a b c Bazerman, Max H.; Gino, Francesca (December 2012). "Behavioral Ethics: Toward a Deeper Understanding of Moral Judgment and Dishonesty". Annual Review of Law and Social Science. 8 (1): 85–104. doi:10.1146/annurev-lawsocsci-102811-173815. ISSN 1550-3585. S2CID 14311511.
  16. ^ Hallpike, C. R. (2004). The Evolution of Moral Understanding. Alton: Prometheus Research Group. ISBN 0-9542168-4-9. OCLC 56463709.
  17. ^ You, Di; Bebeau, Muriel J. (2013-11-01). "The independence of James Rest's components of morality: evidence from a professional ethics curriculum study". Ethics and Education. 8 (3): 202–216. doi:10.1080/17449642.2013.846059. ISSN 1744-9642. S2CID 144861318.
  18. ^ "Eichmann Trial". encyclopedia.ushmm.org. Retrieved 2020-11-30.
  19. ^ Russell, Nestar John Charles (2011). "Milgram's obedience to authority experiments: Origins and early evolution". British Journal of Social Psychology. 50 (1): 140–162. doi:10.1348/014466610X492205. ISSN 2044-8309. PMID 21366616.
  20. ^ Schulweis, Harold M. (2009). Conscience : the duty to obey and the duty to disobey. Woodstock, Vt.: Jewish Lights Pub. ISBN 978-1-58023-419-1. OCLC 731340449.
  21. ^ Blass, Thomas (March 1991). "Understanding behavior in the Milgram obedience experiment: The role of personality, situations, and their interactions". Journal of Personality and Social Psychology. 60 (3): 398–413. doi:10.1037/0022-3514.60.3.398. ISSN 1939-1315.
  22. ^ Bartels, Jared (2019-11-02). "Revisiting the Stanford prison experiment, again: Examining demand characteristics in the guard orientation". The Journal of Social Psychology. 159 (6): 780–790. doi:10.1080/00224545.2019.1596058. ISSN 0022-4545. PMID 30961456. S2CID 104295568.
  23. ^ Bandura, Albert (1999-08-01). "Moral Disengagement in the Perpetration of Inhumanities". Personality and Social Psychology Review. 3 (3): 193–209. doi:10.1207/s15327957pspr0303_3. ISSN 1088-8683. PMID 15661671. S2CID 1589183.
  24. ^ Bandura, Albert (2002-06-01). "Selective Moral Disengagement in the Exercise of Moral Agency". Journal of Moral Education. 31 (2): 101–119. doi:10.1080/0305724022014322. ISSN 0305-7240. S2CID 146449693.
  25. ^ Gino, Francesca (2015-06-01). "Understanding ordinary unethical behavior: why people who value morality act immorally". Current Opinion in Behavioral Sciences. Social behavior. 3: 107–111. doi:10.1016/j.cobeha.2015.03.001. ISSN 2352-1546. S2CID 53205769.
  26. ^ Chugh, Dolly; Bazerman, Max H.; Banaji, Mahzarin R. (2005-04-18), "Bounded Ethicality as a Psychological Barrier to Recognizing Conflicts of Interest", Conflicts of Interest, Cambridge University Press, pp. 74–95, doi:10.1017/cbo9780511610332.006, ISBN 978-0-521-84439-0, retrieved 2020-11-30
  27. ^ Gino, Francesca; Ayal, Shahar; Ariely, Dan (2009-03-01). "Contagion and Differentiation in Unethical Behavior: The Effect of One Bad Apple on the Barrel". Psychological Science. 20 (3): 393–398. doi:10.1111/j.1467-9280.2009.02306.x. ISSN 1467-9280. PMID 19254236. S2CID 10456659.
  28. ^ a b Gino, Francesca; Moore, Don A.; Bazerman, Max H. (2008). "See No Evil: When We Overlook Other People's Unethical Behavior". SSRN Electronic Journal. doi:10.2139/ssrn.1079969. ISSN 1556-5068. S2CID 145409936.
  29. ^ Hall, Katherine (2010), Why good intentions are often not enough: The potential for ethical blindness in legal decision-making, Routledge, ISBN 978-0-415-54653-9, retrieved 2020-11-30
  30. ^ Bazerman, Max H.; Tenbrunsel, Ann E. (2011-12-31). Blind Spots. Princeton: Princeton University Press. doi:10.1515/9781400837991. ISBN 978-1-4008-3799-1.
  31. ^ Tenbrunsel, Ann E.; Messick, David M. (June 2004). "Ethical Fading: The Role of Self-Deception in Unethical Behavior". Social Justice Research. 17 (2): 223–236. doi:10.1023/B:SORE.0000027411.35832.53. ISSN 0885-7466. S2CID 26603323.