Reason and Atrocity: Hindriks’ Observation

[email protected]

One compelling reason for studying moral psychology is that ethical abilities appear to play a central role in atrocities.
Bandura notes that the massive threats to human welfare stem mainly from deliberate acts of principle rather than from unrestrained acts of impulse. He goes on to develop, over several decades of work, the idea that those principles, and their role in guiding action, involve quite a significant role for reason.

‘The massive threats to human welfare stem mainly from deliberate acts of principle, rather than from unrestrained acts of impulse’ (Bandura, 2002, p. 116).

‘The executioners, who face the most daunting moral dilemma, [...] adopted moral, economic, and societal security justifications for the death penalty’ (Osofsky, Bandura, & Zimbardo, 2005, p. 387).

So, for example, he and his colleagues studied prison executioners: prison workers involved in the execution of prisoners, with different groups involved at different stages. They note, among other things, that the executioners, who face the most daunting moral dilemma, adopted moral, economic, and societal security justifications for the death penalty.
Here it looks like there is a clear role for reason in enabling people to perform moral actions, actions which are harmful and which those executioners were probably averse to performing initially, and also in enabling them to adopt attitudes which condone those actions. On the face of it, this points to a role for reason in moral intuition and, more generally, in ethically relevant actions: it seems that humans often have feelings which would prevent them from performing certain actions, but through a process of reasoning they can alter their intuitions and perform different actions. We’ll look more at that later.

‘If we ask people why they hold a particular moral view [their] reasons are often superficial and post hoc. If the reasons are successfully challenged, the moral judgment often remains.’

‘basic values are implemented in our psychology in a way that puts them outside certain practices of justification [...] basic values seem to be implemented in an emotional way’

(Prinz, 2007, p. 32).

Let me try to sharpen this.

An inconsistent dyad

1. moral reasoning always only ever follows moral judgement

‘moral reasoning is [...] usually engaged in after a moral judgment is made, in which a person searches for arguments that will support an already-made judgment’ (Haidt & Bjorklund, 2008, p. 189).

We’ll come back to this in Part II of the course because this is a component of Moral Foundations Theory.

Haidt & Bjorklund, 2008 figure 4.1

[Figure corrected: earlier version had the lower horizontal arrows pointing the wrong way.]
‘Links 5 and 6 are hypothesized to occur rarely but should be of great interest to philosophers because they are used to solve dilemmas’ (Haidt & Bjorklund, 2008, p. 188)
The social part is not relevant right now.

An inconsistent dyad

1. moral reasoning always only ever follows moral judgement

‘moral reasoning is [...] usually engaged in after a moral judgment is made, in which a person searches for arguments that will support an already-made judgment’ (Haidt & Bjorklund, 2008, p. 189).

2. moral reasoning sometimes enables a moral judgement which would otherwise be impossible

Moral reasoning can overcome (i) affective support for judgements about not harming and (ii) affective obstacles to deliberately harming others.

[UPDATE (todo): I’m not confident there really is a problem here. It looks like there is a problem, but Haidt et al. allow that A’s reasoning can affect B’s intuitions and conversely, so it may be that (2) involves the social loop.]
[If the UPDATE is right, we could present the inconsistent dyad as merely apparent, then introduce the loop diagram to explain how it might work.]
[UPDATE 2: Some of the Bandura cases are clearly intrapersonal, not interpersonal.]

significance?

Observations of the role of reason in enabling inhumane acts appear to provide grounds sufficient to reject the view that moral judgements are always, or even characteristically, entirely consequences of feelings.

Further significance: We also have evidence for the 'sometimes' part

puzzle

Why are moral judgements sometimes, but not always, a consequence of reasoning from known principles?

Can strengthen this by considering moral disengagement.