Utilitarianism and justice
How do we know the right thing to do in our lives? In other words, what moral principles are at the foundation of our actions? Utilitarianism is the philosophy that the moral worth of an action is determined solely by how many people benefit from it relative to how many are hurt by it. It's a form of consequentialism (aka "the ends justify the means"). Values become something you can measure in an absolute, mathematical sense. It's been described as "the greatest happiness for the greatest number" (a phrase Ayn Rand called "one of the most vicious slogans ever foisted on humanity"). It disregards who makes up that "greatest number." It is anti-individualistic, as its focus is on the group. Does utilitarianism provide the right answers by always maximizing happiness and minimizing unhappiness for the greatest number of people? Consider the following examples from one of the most popular courses in Harvard's history: Justice.
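To make that arithmetic concrete, here is a minimal sketch (in Python) of a Bentham-style calculus applied to the lecture's opening trolley case: score each option by summing everyone's gains and losses, then pick the higher net total. The option names and utility numbers are illustrative assumptions, not anything taken from the lecture or from Bentham himself.

    # A minimal, hypothetical sketch of the utilitarian "moral arithmetic" described above.
    # All option names and utility numbers are illustrative assumptions, not from the lecture.

    def net_utility(effects):
        # Sum happiness gained minus suffering caused across everyone affected.
        return sum(effects.values())

    # Trolley case: each affected person's change in welfare (negative numbers = harm).
    options = {
        "stay_on_course": {f"worker_{i}": -100 for i in range(1, 6)},  # five workers die
        "turn_onto_side_track": {"lone_worker": -100},                 # one worker dies
    }

    scores = {name: net_utility(effects) for name, effects in options.items()}
    print(scores)                       # {'stay_on_course': -500, 'turn_onto_side_track': -100}
    print(max(scores, key=scores.get))  # turn_onto_side_track -- the option with the higher net utility

On this accounting, turning is obviously right; the lecture's later cases (the fat man, the transplant surgeon, Dudley and Stephens) press on exactly what such a calculation leaves out.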
Transcript:
0:33 This is a course about Justice and we begin with a story
0:37 suppose you're the driver of a trolley car,
0:40 and your trolley car is hurtling down the track at sixty miles an hour
0:44 and at the end of the track you notice five workers working on the track
0:49 you tried to stop but you can't
0:51 your brakes don't work
0:53 you feel desperate because you know
0:56 that if you crash into these five workers
0:59 they will all die
1:01 let's assume you know that for sure
1:05 and so you feel helpless
1:07 until you notice that there is
1:09 off to the right
1:11 a side track
1:13 at the end of that track
1:15 there's one worker
1:17 working on the track
1:19 your steering wheel works
1:21 so you can
1:23 turn the trolley car if you want to
1:26 onto this side track
1:28 killing the one
1:30 but sparing the five.
1:33 Here's our first question
1:36 what's the right thing to do?
1:38 What would you do?
1:40 Let's take a poll,
1:42 how many
1:45 would turn the trolley car onto the side track?
1:52 How many wouldn't?
1:53 How many would go straight ahead
1:58 keep your hands up, those of you who'd go straight ahead.
2:04 A handful of people would, the vast majority would turn
2:08 let's hear first
2:09 now we need to begin to investigate the reasons why you think
2:14 it's the right thing to do. Let's begin with those in the majority, who would turn
2:19 to go onto the side track?
2:22 Why would you do it,
2:23 what would be your reason?
2:25 Who's willing to volunteer a reason?
2:30 Go ahead, stand up.
2:32 Because it can't be right to kill five people when you can only kill one person instead.
2:39 it wouldn't be right to kill five
2:42 if you could kill one person instead
2:47 that's a good reason
2:48 that's a good reason
2:52 who else?
2:53 does everybody agree with that
2:56 reason? go ahead.
3:01 Well I was thinking it was the same reason it was on
3:03 9/11 we regard the people
3:05 who flew the plane into the
3:08 Pennsylvania field as heroes
3:09 because they chose to kill the people on the plane
3:11 and not kill more people
3:14 in big buildings.
3:16 So the principle there was the same on 9/11
3:19 it's tragic circumstance,
3:21 but better to kill one so that five can live
3:25 is that the reason most of you have, those of you who would turn, yes?
3:30 Let's hear now
3:32 from
3:33 those in the minority
3:35 those who wouldn't turn.
3:40 Well I think that same type of mentality that justifies genocide and totalitarianism
3:45 in order to save one type of race you wipe out the other.
3:50 so what would you do in this case? You would
3:53 to avoid
3:55 the horrors of genocide
3:57 you would crash into the five and kill them?
4:03 Presumably yes.
4:07 okay who else?
4:09 That's a brave answer, thank you.
4:14 Let's consider another
4:16 trolley car case
4:20 and see
4:21 whether
4:24 those of you in the majority
4:27 want to adhere to the principle,
4:30 better that one should die so that five should live.
4:33 This time you're not the driver of the trolley car, you're an onlooker
4:38 standing on a bridge overlooking a trolley car track
4:42 and down the track comes a trolley car
4:45 at the end of the track are five workers
4:49 the brakes don't work
4:51 the trolley car is about to careen into the five and kill them
4:55 and now
4:57 you're not the driver
4:58 you really feel helpless
5:01 until you notice
5:03 standing next to you
5:06 leaning over
5:08 the bridge
5:09 is a very fat man.
5:17 And you could
5:20 give him a shove
5:22 he would fall over the bridge
5:24 onto the track
5:27 right in the way of
5:29 the trolley car
5:32 he would die
5:33 but he would spare the five.
5:38 Now, how many would push
5:41 the fat man over the bridge? Raise your hand.
5:48 How many wouldn't?
5:51 Most people wouldn't.
5:54 Here's the obvious question,
5:55 what became
5:56 of the principle
6:00 better to save five lives even if it means sacrificing one, what became of the principle
6:05 that almost everyone endorsed
6:07 in the first case
6:09 I need to hear from someone who was in the majority in both
6:12 cases,
6:13 how do you explain the difference between the two?
6:17 The second one I guess involves an active choice of
6:21 pushing a person
6:22 down, which
6:24 I guess that
6:25 that person himself would otherwise not have been involved in the situation at all
6:29 and so
6:31 to choose on his behalf I guess
6:33 to
6:36 involve him in something that he otherwise would have escaped is
6:39 I guess more than
6:41 what you have in the first case where
6:43 the three parties, the driver and
6:45 the two sets of workers are
6:47 already I guess in this situation.
6:50 but the guy working, the one on the track off to the side
6:55 he didn't choose to sacrifice his life any more than the fat guy did, did he?
7:02 That's true, but he was on the tracks.
7:05 this guy was on the bridge.
7:10 Go ahead, you can come back if you want.
7:13 Alright, it's a hard question
7:15 but you did well you did very well it's a hard question.
7:19 who else
7:21 can
7:22 find a way of reconciling
7:26 the reaction of the majority in these two cases? Yes?
7:30 Well I guess
7:31 in the first case where
7:32 you have the one worker and the five
7:35 it's a choice between those two, and you have to
7:37 make a certain choice and people are going to die because of the trolley car
7:41 not necessarily because of your direct actions. The trolley car is a runaway
7:45 thing and you need to make a split-second choice
7:48 whereas pushing the fat man over is an actual act of murder on your part
7:52 you have control over that
7:54 whereas you may not have control over the trolley car.
7:57 So I think that it's a slightly different situation.
8:00 Alright who has a reply? Is that, who has a reply to that? no that was good, who has a way
8:04 who wants to reply?
8:06 Is that a way out of this?
8:09 I don't think that's a very good reason because you choose
8:12 either way you have to choose who dies because you either choose to turn and kill a person
8:16 which is an act of conscious
8:18 thought to turn,
8:19 or you choose to push the fat man
8:21 over which is also an active
8:23 conscious action so either way you're making a choice.
8:27 Do you want to reply?
8:29 Well I'm not really sure that that's the case, it just still seems kind of different, the act of actually
8:34 pushing someone over onto the tracks and killing them,
8:38 you are actually killing him yourself, you're pushing him with your own hands you're pushing and
8:42 that's different
8:43 than steering something that is going to cause death
8:47 into another...you know
8:48 it doesn't really sound right saying it now when I'm up here.
8:52 No that's good, what's your name?
8:54 Andrew.
8:55 Andrew and let me ask you this question Andrew,
8:59 suppose
9:02 standing on the bridge
9:03 next to the fat man
9:04 I didn't have to push him, suppose he was standing
9:07 over a trap door that I could open by turning a steering wheel like that
9:17 would you turn it?
9:18 For some reason that still just seems more
9:20 more wrong.
9:24 I mean maybe if you just accidentally like leaned into this steering wheel or something like that
9:30 or but,
9:31 or say that the car is
9:33 hurtling towards a switch that will drop the trap
9:37 then I could agree with that.
9:39 Fair enough, it still seems
9:42 wrong in a way that it doesn't seem wrong in the first case to turn, you say
9:45 And in another way, I mean in the first situation you're involved directly in the situation
9:50 in the second one you're an onlooker as well.
9:52 So you have the choice of becoming involved or not by pushing the fat man.
9:56 Let's forget for the moment about this case,
9:59 that's good,
10:01 but let's imagine a different case. This time you're a doctor in an emergency room
10:06 and six patients come to you
10:11 they've been in a terrible trolley car wreck
10:18 five of them sustained moderate injuries one is severely injured you could spend all day
10:23 caring for the one severely injured victim,
10:27 but in that time the five would die, or you could look after the five, restore them to health, but
10:32 during that time the one severely injured
10:35 person would die.
10:36 How many would save
10:37 the five
10:39 now as the doctor?
10:40 How many would save the one?
10:44 Very few people,
10:46 just a handful of people.
10:49 Same reason I assume,
10:51 one life versus five.
10:55 Now consider
10:57 another doctor case
10:59 this time you're a transplant surgeon
11:02 and you have five patients each in desperate need
11:06 of an organ transplant in order to survive
11:09 one needs a heart, one a lung,
11:12 one a kidney,
11:13 one a liver
11:15 and the fifth
11:16 a pancreas.
11:20 And you have no organ donors
11:22 you are about to
11:24 see them die
11:27 and then
11:28 it occurs to you
11:30 that in the next room
11:32 there's a healthy guy who came in for a checkup.
11:39 and he is
11:43 you like that
11:47 and he's taking a nap
11:53 you could go in very quietly
11:56 yank out the five organs, that person would die
12:00 but you can save the five.
12:03 How many would do it? Anyone?
12:10 How many? Put your hands up if you would do it.
12:18 Anyone in the balcony?
12:21 You would? Be careful don't lean over too much
12:26 How many wouldn't?
12:29 All right.
12:30 What do you say, speak up in the balcony, you who would
12:33 yank out the organs, why?
12:35 I'd actually like to explore a slightly alternate
12:38 possibility of just taking the one
12:40 of the five who needs an organ who dies first
12:44 and using their four healthy organs to save the other four
12:50 That's a pretty good idea.
12:54 That's a great idea
12:57 except for the fact
13:00 that you just wrecked the philosophical point.
13:06 Let's step back
13:07 from these stories and these arguments
13:10 to notice a couple of things
13:12 about the way the arguments have begun to unfold.
13:17 Certain
13:18 moral principles
13:20 have already begun to emerge
13:23 from the discussions we've had
13:25 and let's consider
13:27 what those moral principles
13:29 look like
13:31 the first moral principle that emerged from the discussion said
13:35 that the right thing to do the moral thing to do
13:39 depends on the consequences that will result
13:43 from your action
13:45 at the end of the day
13:47 better that five should live
13:49 even if one must die.
13:52 That's an example
13:53 of consequentialist
13:56 moral reasoning.
13:59 consequentialist moral reasoning locates morality in the consequences of an act. In the state of the
14:04 world that will result
14:06 from the thing you do
14:09 but then we went a little further, we considered those other cases
14:12 and people weren't so sure
14:15 about
14:17 consequentialist moral reasoning
14:20 when people hesitated
14:22 to push the fat man
14:24 over the bridge
14:25 or to yank out the organs of the innocent
14:28 patient
14:29 people gestured towards
14:32 reasons
14:34 having to do
14:35 with the intrinsic
14:37 quality of the act
14:39 itself.
14:40 Consequences be what they may.
14:42 People were reluctant
14:45 people thought it was just wrong
14:47 categorically wrong
14:49 to kill
14:50 a person
14:51 an innocent person
14:53 even for the sake
14:54 of saving
14:55 five lives, at least these people thought that
14:58 in the second
15:00 version of each story we reconsidered
15:05 so this points to
15:06 a second
15:09 categorical
15:10 way
15:12 of thinking about
15:14 moral reasoning
15:16 categorical moral reasoning locates morality in certain absolute moral requirements in
15:22 certain categorical duties and rights
15:24 regardless of the consequences.
15:27 We're going to explore
15:29 in the days and weeks to come the contrast between
15:33 consequentialist and categorical moral principles.
15:36 The most influential
15:38 example of
15:40 consequentialist moral reasoning is utilitarianism, a doctrine invented by
15:45 Jeremy Bentham, the eighteenth century English political philosopher.
15:51 The most important
15:54 philosopher of categorical moral reasoning
15:56 is the
15:58 eighteenth century German philosopher Immanuel Kant.
16:02 So we will look
16:03 at those two different modes of moral reasoning
16:07 assess them
16:08 and also consider others.
16:10 If you look at the syllabus, you'll notice that we read a number of great and famous books.
16:16 Books by Aristotle
16:18 John Locke
16:19 Immanuel Kant, John Stuart Mill,
16:22 and others.
16:24 You'll notice too from the syllabus that we don't only read these books,
16:28 we also
16:30 take up
16:32 contemporary political and legal controversies that raise philosophical questions.
16:37 We will debate equality and inequality,
16:40 affirmative action,
16:41 free speech versus hate speech,
16:43 same sex marriage, military conscription,
16:47 a range of practical questions, why
16:50 not just to enliven these abstract and distant books
16:55 but to make clear, to bring out, what's at stake in our everyday lives, including our political
17:01 lives,
17:03 for philosophy.
17:05 So we will read these books
17:07 and we will debate these
17:09 issues and we'll see how each informs and illuminates the other.
17:15 This may sound appealing enough
17:17 but here
17:19 I have to issue a warning,
17:22 and the warning is this
17:25 to read these books
17:28 in this way,
17:31 as an exercise in self-knowledge,
17:34 to read them in this way carries certain risks
17:38 risks that are both personal and political,
17:42 risks that every student of political philosophy has known.
17:47 These risks spring from the fact
17:50 that philosophy
17:52 teaches us
17:54 and unsettles us
17:56 by confronting us with what we already know.
18:01 There's an irony
18:03 the difficulty of this course consists in the fact that it teaches what you already know.
18:09 It works by taking
18:12 what we know from familiar unquestioned settings,
18:16 and making it strange.
18:20 That's how those examples worked,
18:23 the hypotheticals with which we began, with their mix of playfulness and sobriety.
18:29 it's also how these philosophical books work. Philosophy
18:33 estranges us
18:35 from the familiar
18:37 not by supplying new information
18:40 but by inviting
18:41 and provoking
18:43 a new way of seeing
18:47 but, and here's the risk,
18:49 once
18:50 the familiar turns strange,
18:54 it's never quite the same again.
18:58 Self-knowledge
19:00 is like lost innocence,
19:03 however unsettling
19:04 you find it,
19:06 it can never
19:07 be unthought
19:09 or unknown
19:13 what makes this enterprise difficult
19:17 but also riveting,
19:19 is that
19:20 moral and political philosophy is a story
19:25 and you don't know where this story will lead but what you do know
19:29 is that the story
19:31 is about you.
19:34 Those are the personal risks,
19:37 now what of the political risks.
19:40 one way of introducing a course like this
19:43 would be to promise you
19:44 that by reading these books
19:46 and debating these issues
19:48 you will become a better more responsible citizen.
19:51 You will examine the presuppositions of public policy, you will hone your political
19:56 judgment
19:57 you'll become a more effective participant in public affairs
20:02 but this would be a partial and misleading promise
20:06 political philosophy for the most part hasn't worked that way.
20:11 You have to allow for the possibility
20:14 that political philosophy may make you a worse citizen
20:19 rather than a better one
20:21 or at least a worse citizen
20:23 before it makes you
20:25 a better one
20:27 and that's because philosophy
20:30 is a distancing
20:32 even debilitating
20:34 activity
20:36 And you see this
20:37 going back to Socrates
20:39 there's a dialogue, the Gorgias
20:42 in which one of Socrates’ friends
20:44 Callicles
20:45 tries to talk him out
20:47 of philosophizing.
20:49 Callicles tells Socrates philosophy is a pretty toy
20:54 if one indulges in it with moderation at the right time of life
20:57 but if one pursues it further than one should it is absolute ruin.
21:03 Take my advice, Callicles says,
21:06 abandon argument
21:08 learn the accomplishments of active life, take
21:11 for your models not those people who spend their time on these petty quibbles,
21:16 but those who have a good livelihood and reputation
21:20 and many other blessings.
21:22 So Callicles is really saying to Socrates
21:26 quit philosophizing,
21:28 get real
21:30 go to business school
21:35 and Callicles did have a point
21:38 he had a point
21:39 because philosophy distances us
21:42 from conventions from established assumptions
21:45 and from settled beliefs.
21:46 those are the risks,
21:48 personal and political
21:49 and in the face of these risks there is a characteristic evasion,
21:54 the name of the evasion is skepticism. It's the idea
21:57 well it goes something like this
21:58 we didn't resolve, once and for all,
22:03 either the cases or the principles we were arguing when we began
22:09 and if Aristotle
22:11 and Locke and Kant and Mill haven't solved these questions after all of these years
22:17 who are we to think
22:19 that we here in Sanders Theatre over the course of a semester
22:23 can resolve them
22:26 and so maybe it's just a matter of
22:29 each person having his or her own principles and there's nothing more to be said about
22:33 it
22:34 no way of reasoning
22:36 that's the
22:37 evasion. The evasion of skepticism
22:39 to which I would offer the following
22:41 reply:
22:42 it's true
22:43 these questions have been debated for a very long time
22:47 but the very fact
22:49 that they have reoccurred and persisted
22:52 may suggest
22:54 that though they're impossible in one sense
22:57 they're unavoidable in another
22:59 and the reason they're unavoidable
23:02 the reason they're inescapable is that we live some answer
23:06 to these questions every day.
23:10 So skepticism, just throwing up your hands and giving up on moral reflection,
23:16 is no solution
23:18 Emanuel Kant
23:19 described very well the problem with skepticism when he wrote
23:23 skepticism is a resting place for human reason
23:26 where it can reflect upon its dogmatic wanderings
23:29 but it is no dwelling place for permanent settlement.
23:33 Simply to acquiesce in skepticism, Kant wrote,
23:35 can never suffice to overcome the restlessness of reason.
23:42 I've tried to suggest through these stories and these arguments
23:47 some sense of the risks and temptations
23:49 of the perils and the possibilities I would simply conclude by saying
23:55 that the aim of this course
23:58 is to awaken
23:59 the restlessness of reason
24:02 and to see where it might lead
24:04 thank you very much.
24:15 Like, in a situation that desperate,
24:16 you have to do what you have to do to survive. You have to do what you have to do?
24:21 You've gotta do what you gotta do. Pretty much.
24:23 If you've been going nineteen days without any food
24:25 someone has to make the sacrifice so people can survive. Alright, that's good, what's your name? Marcus.
24:33 Marcus. What do you say to Marcus?
24:40 Last time
24:44 we started out last time
24:46 with some stories
24:48 with some moral dilemmas
24:51 about trolley cars
24:53 and about doctors
24:54 and healthy patients
24:56 vulnerable
24:57 to being victims of organ transplantation
25:00 we noticed two things
25:04 about the arguments we had
25:06 one had to do with the way we were arguing
25:10 it began with our judgments in particular cases
25:13 we tried to articulate the reasons or the principles
25:18 lying behind our judgments
25:22 and then confronted with a new case
25:25 we found ourselves re-examining those principles
25:30 revising each in the light of the other
25:34 and we noticed the built-in pressure to try to bring into alignment
25:38 our judgments about particular cases
25:41 and the principles we would endorse
25:43 on reflection
25:46 we also noticed something about the substance of the arguments
25:50 that emerged from the discussion.
25:55 We noticed that sometimes we were tempted to locate the morality of an act in the consequences
26:00 in the results, in the state of the world that it brought about.
26:06 We called this consequentialist
26:09 moral reasoning.
26:11 But we also noticed that
26:13 in some cases
26:16 we weren't swayed only
26:18 by the results
26:22 sometimes,
26:23 many of us felt,
26:25 that not just consequences but also the intrinsic quality or character of the act
26:31 matters morally.
26:35 Some people argued that there are certain things that are just categorically wrong
26:40 even if they bring about
26:42 a good result
26:44 even
26:45 if they save five people
26:47 at the cost of one life.
26:49 So we contrasted consequentialist
26:52 moral principles
26:54 with categorical ones.
26:58 Today
26:59 and in the next few days
27:00 we will begin to examine one of the most influential
27:06 versions of consequentialist
27:08 moral theory
27:10 and that's the philosophy of utilitarianism.
27:16 Jeremy Bentham,
27:17 the eighteenth century
27:19 English political philosopher
27:21 gave first
27:22 the first clear systematic expression
27:26 to the utilitarian
27:28 moral theory.
27:32 And Bentham's idea,
27:36 his essential idea
27:38 is a very simple one
27:42 with a lot of
27:44 morally
27:46 intuitive appeal.
27:48 Bentham's idea is
27:50 the following
27:51 the right thing to do
27:54 the just thing to do
27:57 is to
27:58 maximize
28:01 utility.
28:02 What did he mean by utility?
28:06 He meant by utility the balance
28:11 of pleasure over pain,
28:14 happiness over suffering.
28:16 Here's how he arrived
28:18 at the principle
28:19 of maximizing utility.
28:22 He started out by observing
28:24 that all of us
28:26 all human beings
28:27 are governed by two sovereign masters,
28:31 pain and pleasure.
28:34 We human beings
28:37 like pleasure and dislike pain
28:42 and so we should base morality
28:45 whether we are thinking of what to do in our own lives
28:49 or whether
28:50 as legislators or citizens
28:52 we are thinking about what the law should be,
28:57 the right thing to do individually or collectively
29:02 is to maximize, act in a way that maximizes
29:05 the overall level
29:07 of happiness.
29:11 Bentham's utilitarianism is sometimes summed up with the slogan
29:15 the greatest good for the greatest number.
29:18 With this
29:20 basic principle of utility on hand,
29:22 let's begin to test it and to examine it
29:26 by turning to another case
29:28 another story but this time
29:30 not a hypothetical story,
29:32 a real-life story
29:34 the case of
29:35 the Queen versus Dudley and Stephens.
29:38 This was a nineteenth-century British law case
29:41 that's famous
29:44 and much debated in law schools.
29:47 Here's what happened in the case
29:50 I'll summarize the story
29:51 and then I want to hear
29:54 how you would rule
29:57 imagining that you are the jury.
30:04 A newspaper account of the time
30:06 described the background:
30:09 A sadder story of disaster at sea
30:11 was never told
30:12 than that of the survivors of the yacht
30:15 Mignonette.
30:16 The ship foundered in the south Atlantic
30:19 thirteen hundred miles from the cape
30:21 there were four in the crew,
30:24 Dudley was the captain
30:26 Stephens was the first mate
30:28 Brooks was a sailor,
30:30 all men of
30:31 excellent character,
30:32 or so the newspaper account
30:34 tells us.
30:35 The fourth crew member was the cabin boy,
30:38 Richard Parker
30:40 seventeen years old.
30:42 He was an orphan
30:44 he had no family
30:46 and he was on his first long voyage at sea.
30:51 He went, the news account tells us,
30:53 rather against the advice of his friends.
30:56 He went in the hopefulness of youthful ambition
31:00 thinking the journey would make a man of him.
31:03 Sadly it was not to be,
31:05 the facts of the case were not in dispute,
31:07 a wave hit the ship
31:08 and the Mignonette went down.
31:12 The four crew members escaped to a lifeboat
31:14 the only
31:16 food they had
31:18 were two
31:19 cans of preserved
31:20 turnips
31:21 no fresh water
31:23 for the first three days they ate nothing
31:26 on the fourth day they opened one of the cans of turnips
31:30 and ate it.
31:31 The next day they caught a turtle
31:34 together with the other can of turnips
31:36 the turtle
31:38 enabled them to subsist
31:40 for the next few days and then for eight days
31:43 they had nothing
31:44 no food no water.
31:47 Imagine yourself in a situation like that
31:50 what would you do?
31:52 Here's what they did
31:55 by now the cabin boy Parker is lying at the bottom of the lifeboat in a corner
32:00 because he had drunk sea water
32:03 against the advice of the others
32:05 and he had become ill
32:07 and he appeared to be dying
32:10 so on the nineteenth day Dudley, the captain, suggested
32:14 that they should all
32:17 have a lottery. That they should
32:18 all draw lots to see
32:19 who would die
32:20 to save the rest.
32:24 Brooks
32:25 refused
32:26 he didn't like the lottery idea
32:29 we don't know whether this
32:30 was because he didn't want to take that chance or because he believed in categorical moral
32:35 principles
32:36 but in any case
32:38 no lots were drawn.
32:42 The next day
32:43 there was still no ship in sight
32:45 so Dudley told Brooks to avert his gaze
32:48 and he motioned to Stephens
32:50 that the boy Parker had better be killed.
32:53 Dudley offered a prayer
32:55 he told the boy his time had come
32:58 and he killed him with a pen knife
33:00 stabbing him in the jugular vein.
33:03 Brooks emerged from his conscientious objection to share in the gruesome bounty.
33:09 For four days
33:11 the three of them fed on the body and blood of the cabin boy.
33:15 True story.
33:17 And then they were rescued.
33:19 Dudley describes their rescue
33:22 in his diary
33:24 with staggering euphemism, quote:
33:27 "on the twenty fourth day
33:29 as we were having our breakfast
33:34 a ship appeared at last."
33:38 The three survivors were picked up by a German ship. They were taken back to Falmouth in England
33:44 where they were arrested and tried
33:47 Brooks
33:47 turned state's witness
33:49 Dudley and Stephens went to trial. They didn't dispute the facts
33:54 they claimed
33:55 they had acted out of necessity
33:58 that was their defense
33:59 they argued in effect
34:01 better that one should die
34:03 so that three could survive
34:06 the prosecutor
34:08 wasn't swayed by that argument
34:10 he said murder is murder
34:12 and so the case went to trial. Now imagine you are the jury
34:16 and just to simplify the discussion
34:19 put aside the question of law,
34:21 and let's assume that
34:23 you as the jury
34:25 are charged with deciding
34:28 whether what they did was morally
34:31 permissible or not.
34:34 How many
34:36 would vote
34:39 not guilty, that what they did was morally permissible?
34:49 And how many would vote guilty
34:51 what they did was morally wrong?
34:54 A pretty sizable majority.
34:57 Now let's see what people's reasons are, and let me begin with those who are in the minority.
35:03 Let's hear first from the defense
35:07 of Dudley and Stephens.
35:10 Why would you morally exonerate them?
35:14 What are your reasons?
35:17 I think it's I think it is morally reprehensible
35:20 but I think that there's a distinction between what's morally reprehensible
35:24 what makes someone legally accountable
35:26 in other words, as the judge said, what's immoral isn't necessarily
35:30 against the law and while I don't think that necessity
35:34 justifies
35:36 theft or murder or any illegal act,
35:38 at some point your degree of necessity does in fact
35:43 exonerate you from any guilt. Okay.
35:45 other defenders, other voices for the defense?
35:50 Moral justifications for
35:53 what they did?
35:56 yes, thank you
35:58 I just feel like
35:59 in a situation that desperate you have to do what you have to do to survive.
36:03 You have to do what you have to do
36:04 ya, you gotta do what you gotta do, pretty much.
36:06 If you've been
36:07 going nineteen days without any food
36:09 you know, someone just has to make the sacrifice so people can survive
36:14 and furthermore from that
36:16 let's say they survived and then they become productive members of society who go home and then start like
36:21 a million charity organizations and this and that and this and that, I mean they benefit everybody in the end so
36:26 I mean I don't know what they did afterwards, I mean they might have
36:28 gone on and killed more people
36:30 but whatever.
36:32 what? what if they were going home and turned out to be assassins?
36:35 What if they were going home and turned out to be assassins?
36:38 You would want to know who they assassinated.
36:42 That's true too, that's fair
36:45 I would wanna know who they assassinated.
36:49 alright that's good, what's your name? Marcus.
36:50 We've heard a defense
36:52 a couple voices for the defense
36:54 now we need to hear
36:55 from the prosecution
36:57 most people think
36:59 what they did was wrong, why?
37:05 One of the first things that I was thinking was, oh well if they haven't been eating for a really long time,
37:09 maybe
37:11 then
37:12 they're mentally affected
37:15 that could be used for the defense,
37:16 a possible argument that oh,
37:20 that they weren't in a proper state of mind, they were making
37:24 decisions that they otherwise wouldn't be making, and if that's an appealing argument
37:28 that you have to be in an altered mindset to do something like that it suggests that
37:33 people who find that argument convincing
37:36 do think that they're acting immorally. But I want to know what you think. You're defending,
37:41 you voted to convict, right? Yeah, I don't think that they acted in a morally
37:45 appropriate way. And why not? What do you say? Here's Marcus,
37:49 he just defended them,
37:51 he said,
37:52 you heard what he said,
37:53 yes I did
37:55 yes
37:56 that you've got to do what you've got to do in a case like that.
38:00 What do you say to Marcus?
38:04 They didn't,
38:06 that there is no situation that would allow human beings to take
38:13 the idea of fate or other people's lives into their own hands, that we don't have
38:17 that kind of power.
38:19 Good, okay
38:21 thank you, and what's your name?
38:24 Britt? okay.
38:24 who else?
38:26 What do you say? Stand up
38:28 I'm wondering if Dudley and Stephens had asked for Richard Parker's consent in, you know, dying,
38:35 if that would
38:37 would that exonerate them
38:41 from an act of murder, and if so is that still morally justifiable?
38:45 That's interesting, alright consent, now hang on, what's your name? Kathleen.
38:51 Kathleen says suppose so what would that scenario look like?
38:56 so in the story
38:56 Dudley is there, pen knife in hand,
39:00 but instead of the prayer
39:02 or before the prayer,
39:04 he says, Parker,
39:07 would you mind
39:11 we're desperately hungry,
39:14 as Marcus empathizes with
39:17 we're desperately hungry
39:19 you're not going to last long anyhow,
39:22 you can be a martyr,
39:23 would you be a martyr
39:25 how about it Parker?
39:29 Then, then
39:33 then what do you think, would be morally justified then? Suppose
39:37 Parker
39:38 in his semi-stupor
39:40 says okay
39:42 I don't think it'll be morally justifiable but I'm wondering. Even then, even then it wouldn't be? No
39:47 You don't think that even with consent
39:50 it would be morally justified.
39:52 Are there people who think
39:54 who want to take up Kathleen's
39:56 consent idea
39:57 and who think that that would make it morally justified? Raise your hand if it would
40:01 if you think it would.
40:05 That's very interesting
40:07 Why would consent
40:09 make a moral difference? Why would it?
40:15 Well I just think that if he was making his own original idea
40:18 and it was his idea to start with
40:20 then that would be the only situation in which I would
40:23 see it being appropriate in any way,
40:25 because that way you couldn't make the argument that
40:28 he was pressured you know it’s three
40:30 to one or whatever the ratio was,
40:32 and I think that
40:34 if he was making a decision to give his life then he took on the agency
40:38 to sacrifice himself which some people might see as admirable and other people
40:42 might disagree with that decision.
40:45 So if he came up with the idea
40:49 that's the only kind of consent we could have confidence in
40:52 morally, then it would be okay
40:55 otherwise
40:57 it would be kind of coerced consent
40:59 under the circumstances
41:01 you think.
41:05 Is there anyone who thinks
41:07 that the even the consent of Parker
41:10 would not justify
41:13 their killing him?
41:15 Who thinks that?
41:18 Yes, tell us why, stand up
41:19 I think that Parker
41:21 would be killed
41:22 with the hope that the other crew members would be rescued so
41:26 there's no definite reason that he should be killed
41:29 because you don't know
41:31 when they're going to get rescued so if you kill him you're killing him in vain
41:35 do you keep killing a crew member until you're rescued and then you're left with no one?
41:38 because someone's going to die eventually?
41:40 Well the moral logic of the situation seems to be that.
41:44 That they would
41:45 keep on picking off the weakest maybe, one by one,
41:50 until they were
41:51 rescued, and in this case, luckily, three at least were still alive.
41:57 Now if
41:58 if Parker did give his consent
42:01 would it be all right do you think or not?
42:04 No, it still wouldn't be right.
42:06 Tell us why wouldn't be all right.
42:08 First of all, cannibalism, I believe
42:10 is morally incorrect
42:13 so you shouldn’t be eating a human anyway.
42:14 So
42:17 cannibalism is morally objectionable outside
42:19 so then even in the scenario
42:22 of waiting until someone died
42:24 still it would be objectionable.
42:27 Yes, to me personally
42:27 I feel like of
42:29 it all depends on
42:31 one's personal morals, like we can't just, like this is just my opinion
42:35 of course other people are going to disagree.
42:39 Well let's see, let's hear what their disagreements are
42:41 and then we'll see
42:42 if they have reasons
42:44 that can persuade you or not.
42:46 Let's try that
42:48 Let's
42:50 now is there someone
42:53 who can explain, those of you who are tempted by consent
42:57 can you explain
42:59 why consent makes
43:02 such a moral difference,
43:03 what about the lottery idea
43:05 does that count as consent. Remember at the beginning
43:08 Dudley proposed a lottery
43:11 suppose that they had agreed
43:13 to a lottery
43:16 then
43:17 how many would then say
43:20 it was all right. Say there was a lottery,
43:23 cabin boy lost,
43:25 and the rest of the story unfolded. How many people would say it's morally permissible?
43:33 So the numbers are rising if we add a lottery, let's hear from one of you
43:37 for whom the lottery would make a moral difference
43:41 why would it?
43:43 I think the essential
43:44 element,
43:45 in my mind that makes it a crime is
43:47 the idea that they decided at some point that their lives were more important than his, and that
43:53 I mean that's kind of the basis for really any crime
43:56 right? It's like
43:57 my needs, my desires are more important than yours, and mine take precedence
44:01 and if they had done a lottery where everyone consented
44:04 that someone should die
44:06 and it's sort of like they're all sacrificing themselves,
44:09 to save the rest,
44:11 Then it would be all right?
44:12 A little grotesque but,
44:15 But morally permissible? Yes.
44:18 what's your name? Matt.
44:22 so, Matt for you
44:25 what bothers you is not
44:27 the cannibalism, but the lack of due process.
44:31 I guess you could say that
44:34 And can someone who agrees with Matt
44:38 say a little bit more
44:40 about why
44:41 a lottery
44:43 would make it, in your view,
44:47 morally permissible.
44:50 The way I understood it originally was that that was the whole issue is that the cabin boy was never
44:55 consulted
44:56 about whether or not something was going to happen to him, even though with the original
45:00 lottery
45:01 whether or not he would be a part of that it was just decided
45:04 that he was the one that was going to die. Yes that's what happened in the actual case
45:08 but if there were a lottery and they all agreed to the procedure
45:11 you think that would be okay?
45:13 Right, because everyone knows that there's gonna be a death
45:16 whereas
45:17 you know the cabin boy didn't know that
45:18 this discussion was even happening
45:21 there was no
45:21 you know forewarning
45:23 for him to know that, hey, I may be the one that's dying. Okay, now suppose everyone agrees
45:28 to the lottery, they have the lottery, the cabin boy loses, and changes his mind.
45:35 You've already decided, it's like a verbal contract, you can't go back on that. You've decided the decision was made
45:40 you know, if you know you're dying in order for others to live,
45:45 you would, you know,
45:45 if someone else had died,
45:47 you know that you would consume them, so
45:51 But then he could say I know, but I lost.
45:57 I just think that that's the whole moral issue is that there was no consulting of the cabin boy and that that's
46:01 what makes it the most horrible
46:04 is that he had no idea what was even going on, that if he had known what was going on
46:08 it would
46:10 be a bit more understandable.
46:13 Alright, good, now I want to hear
46:14 so there's some who think
46:17 it's morally permissible
46:18 but only about twenty percent,
46:24 led by Marcus,
46:26 then there are some who say
46:28 the real problem here
46:30 is the lack of consent
46:32 whether the lack of consent to a lottery to a fair procedure
46:37 or
46:38 Kathleen's idea,
46:39 lack of consent
46:40 at the moment
46:42 of death
46:45 and if we add consent
46:48 then
46:49 more people are willing to consider
46:51 the sacrifice morally justified.
46:54 I want to hear now finally
46:56 from those of you who think
46:58 even with consent
47:00 even with a lottery
47:01 even with
47:02 a final
47:04 murmur of consent from Parker
47:06 at the
47:08 very last moment
47:09 it would still
47:10 be wrong
47:12 and why would it be wrong
47:14 that's what I want to hear.
47:16 well the whole time
47:18 I've been leaning towards the categorical moral reasoning
47:22 and I think that
47:25 there's a possibility I'd be okay with the idea of the lottery and then the loser
47:29 taking it into their own hands to
47:31 kill themselves
47:33 so there wouldn't be an act of murder but I still think that
47:37 even that way it's coerced and also I don't think that there's any remorse like in
47:42 Dudley's diary
47:43 we're getting our breakfast
47:44 it seems as though he's just sort of like, oh,
47:47 you know that whole idea of not valuing someone else's life
47:51 so that makes me
47:53 feel like I have to take the categorical stance. You want to throw the book at him.
47:57 when he lacks remorse or a sense of having done anything wrong. Right.
48:02 Alright, good so are there any other
48:06 defenders who
48:08 say it's just categorically wrong, with or without consent? Yes, stand up. Why?
48:13 I think undoubtedly the way our society is shaped, murder is murder
48:17 murder is murder, and either way our society looks down on it in the same light
48:21 and I don't think it's any different in any case. Good now let me ask you a question,
48:24 there were three lives at stake
48:27 versus one,
48:30 the one, the cabin boy, he had no family
48:33 he had no dependents,
48:34 these other three had families back home in England they had dependents
48:38 they had wives and children
48:41 think back to Bentham,
48:43 Bentham says we have to consider
48:44 the welfare, the utility, the happiness
48:48 of everybody. We have to add it all up
48:51 so it's not just numbers three against one
48:54 it's also all of those people at home
48:58 in fact the London newspaper at the time
49:00 and popular opinion sympathized with them
49:04 Dudley and Stephens
49:05 and the paper said if they weren't
49:07 motivated
49:08 by affection
49:09 and concern for their loved ones at home and dependents, surely they wouldn't have
49:13 done this. Yeah, and how is that any different from people
49:15 on the corner
49:17 having the same desire to feed their family? I don't think it's any different. I think in any case
49:21 if I'm murdering you to advance my status, that's murder and I think that we should look at all
49:25 of that in the same light. Instead of criminalizing certain
49:28 activities
49:30 and making certain things seem more violent and savage
49:33 when in that same case it's all the same act and mentality
49:36 that goes into the murder, a necessity to feed their families.
49:40 Suppose there weren't three, suppose there were thirty,
49:43 three hundred,
49:44 one life to save three hundred
49:47 or in more time,
49:48 three thousand
49:49 or suppose the stakes were even bigger.
49:51 Suppose the stakes were even bigger
49:52 I think it's still the same deal.
49:54 Do you think Bentham was wrong to say the right thing to do
49:58 is to add
49:58 up the collective happiness, you think he's wrong about that?
50:02 I don't think he is wrong, but I think murder is murder in any case. Well then Bentham has to be wrong
50:06 if you're right he's wrong. okay then he's wrong.
50:09 Alright thank you, well done.
50:12 Alright, let's step back
50:14 from this discussion
50:16 and notice
50:19 how many objections we've heard to what they did.
50:23 we heard some defenses of what they did
50:26 the defense had to do with
50:28 necessity
50:28 the dire circumstance and,
50:32 implicitly at least,
50:33 the idea that numbers matter
50:36 and not only numbers matter
50:37 but the wider effects matter
50:40 their families back home, their dependents
50:43 Parker was an orphan,
50:44 no one would miss him.
50:47 so if you
50:49 add up
50:50 if you tried to calculate
50:52 the balance
50:53 of happiness and suffering
50:56 you might have a case for
50:58 saying what they did was the right thing
51:02 then we heard at least three different types of objections,
51:09 we heard an objection that said
51:11 what they did was categorically wrong,
51:14 right here at the end
51:15 categorically wrong.
51:17 Murder is murder it's always wrong
51:19 even if
51:20 it increases the overall happiness
51:23 of society
51:25 the categorical objection.
51:28 But we still need to investigate
51:30 why murder
51:32 is categorically wrong.
51:35 Is it because
51:38 even cabin boys have certain fundamental rights?
51:42 And if that's the reason
51:44 where do those rights come from if not from some idea
51:47 of the larger welfare or utility or happiness? Question number one.
51:53 Others said
51:56 a lottery would make a difference
51:58 a fair procedure,
52:00 Matt said.
52:05 And some people were swayed by that.
52:08 That's not a categorical objection exactly
52:12 it's saying
52:13 everybody has to be counted as an equal
52:16 even though, at the end of the day
52:18 one can be sacrificed
52:20 for the general welfare.
52:23 That leaves us with another question to investigate,
52:26 Why does agreement to a certain procedure,
52:29 even a fair procedure,
52:31 justify whatever result flows
52:34 from the operation of that procedure?
52:38 Question number two.
52:39 and question number three
52:42 the basic idea of consent.
52:45 Kathleen got us on to this.
52:48 If the cabin boy had agreed himself
52:52 and not under duress
52:54 as was added
52:57 then it would be all right to take his life to save the rest.
53:01 Even more people signed on to that idea
53:04 but that raises
53:06 a third philosophical question
53:08 what is the moral work
53:11 that consent
53:12 does?
53:14 Why does an act of consent
53:16 make such a moral difference
53:19 that an act that would be wrong, taking a life, without consent
53:23 is morally
53:25 permissible
53:26 with consent?
53:29 To investigate those three questions
53:31 we're going to have to read some philosophers
53:34 and starting next time
53:35 we're going to read
53:36 Bentham,
53:37 and John Stuart Mill, utilitarian philosophers.
53:43 Don't miss the chance to interact online with other viewers of Justice
53:43 join the conversation,
53:49 take a pop quiz, watch lectures you've missed, and a lot more. Visit www.justiceharvard.org. It's the right thing to do.
DISCUSS!
Original posting by Braincrave Second Life staff on Dec 19, 2010 at http://www.braincrave.com/viewblog.php?id=409
About braincrave
We all admire beauty, but the mind ultimately must be stimulated for maximum arousal. Longevity in relationships cannot occur without a meeting of the minds. And that is what Braincrave is: a dating venue where minds meet. Learn about the thoughts of your potential match on deeper topics... topics that spawn your own insights around what you think, the choices you make, and the actions you take.
We are a community of men and women who seek beauty and stimulation through our minds. We find ideas, education, and self-improvement sexy. We think intelligence is hot. But Braincrave is more than brains and I.Q. alone. We are curious. We have common sense. We value and offer wisdom. We experiment. We have great imaginations. We devour literature. We are intellectually honest. We support and encourage each other to be better.
You might be lonely but you aren't alone.
Sep, 2017 update: Although Braincrave resulted in two confirmed marriages, the venture didn't meet financial targets. Rather than updating our outdated code base, we've removed all previous dating profiles and retained the articles that continue to generate interest. Moving to valME.io's platform supports dating profiles (which you are welcome to post) but won't allow typical date-matching functionality (e.g., location proximity, attribute similarity).
The Braincrave.com discussion group on Second Life held twice-daily intellectual group discussions, typically at 12:00 PM SLT (PST) and 7:00 PM SLT. The discussions took place in Second Life group chat but are no longer formally scheduled or managed. The daily articles were used to encourage the discussions.