Objectivism Online Forum

The Trolley Problem


30 minutes ago, Doug Morris said:

Somebody challenged me once with the following.

You're driving a car down a mountain road.  The brakes fail.  You can steer but not slow or stop.  You are approaching a bridge packed with kids.  Your choices are steer for the bridge and mow down the kids or steer to the side and be killed yourself.  He indicated he wouldn't have any respect for anyone whose choice was to steer for the bridge.  I tried to explain that this was an abnormal situation with no fully satisfactory outcome and he said he would be perfectly satisfied to steer to the side and die. 

He had a military background.  I don't know to what extent that entered into it.  

The truth such "hypotheticals" reveal is that people like to think morality is only about choosing between sacrificing yourself or sacrificing others, so they manufacture a "situation" where those are the only choices. Of course, the setup is intended to imply (at least at the subconscious level) that non-altruism is akin to murder.

Reality is not "jiggered" to force you to make a choice of sacrifices.  Morality is about living.

In almost any situation, it would require omniscience to validate with any certainty that your choices ARE limited. Since you are fallible, and given that life is on the line, it would be IMMORAL simply to give up and kill yourself or the kids... i.e. the immorality would be to take any action which does not seek to save BOTH you and the kids. The moral course is to try some other action, even in the face of near certainty of failure.

E.g., steering back and forth to cause the car to roll over or slow down... trying to put it into a low gear, park, or reverse... steering up against the mountainside to reduce your speed.

 

But you know as well as I do, the type of person who poses these hypotheticals would not be happy with your answer... they don't care about cars, gears, steering, or mountains... they just want you to answer whether you will sacrifice yourself or sacrifice others, and then label you a martyr or a villain.

 

Edited by StrictlyLogical

10 hours ago, StrictlyLogical said:

they don't care about cars, gears, steering, or mountains... they just want you to answer whether you will sacrifice yourself or sacrifice others, and then label you a martyr or a villain.

Exactly. In real life there is always a context. In addition to the environmental context, there is also the personal. Who are you, and who are the children? What if I'm a spy in the Cold War Soviet Union and I'm trying to escape with a secret that will topple the communist network in Eastern Europe? I think I'm mowing down the kids. But if I'm a regular old fart coming back from the hospital in my home town, I'll probably go off the cliff. Context matters. Moral choice fantasies have absolutely no meaning outside a serious context. All you're doing is creating dogma.


On 2/18/2019 at 7:09 PM, dreadrocksean said:

Is there a number above 5 at which I would do nothing and let it remain on my child's track? 100? No. 1000? No. 10000? No. 100000? No. 1 million? Maybe. Depends upon which country I'm in. America? No. My home country? Hmm, now we're getting into a serious grey area.

Are you saying the nominal value of 1 million human traders might surpass the value of your child, and that's why you'd let your child die? If you're comfortable slaughtering 999,999 people, why not 1 million?


16 hours ago, MisterSwig said:

Are you saying the nominal value of 1 million human traders might surpass the value of your child, and that's why you'd let your child die? If you're comfortable slaughtering 999,999 people, why not 1 million?

Hahaha. I'm saying that at some point, which I arbitrarily measured at 1 million, a particular society's value rivals that of my offspring, whom, ironically, I'm raising to be a productive member of it.
Also, how would he feel, after he's grown up, when he finds out that 90% of his heritage and culture was sacrificed for him?

Edited by dreadrocksean

19 minutes ago, dreadrocksean said:

Those are irrelevant.  All we require is the decision AS IF it were real.  It was well thought out and it worked.
Most did not switch the tracks, even though in surveys they claim they would.

It was a joke.

Besides, it's silly to think that what you think you would do would always match what you actually do. Moral psychology research reveals that, however well people can imagine a scenario, the vast majority of people fail to predict their own behavior.


11 hours ago, dreadrocksean said:

I'm saying that at some point, which I arbitrarily measured at 1 million, a particular society's value rivals that of my offspring, whom, ironically, I'm raising to be a productive member of it.

I think there is a critical problem here. You cannot possibly assess the value of the million without knowing whether their deaths would result in the collapse of the rest of society. For example, among the million might be a set of individuals necessary to stop an extinction-level threat in the future. You have to assume that society can continue to your satisfaction despite the loss of the million. And if you arbitrarily make that assumption, why not slaughter a billion? Or a trillion? After all, they're just numbers now. It's not actually the number that's relevant. It's the effect on society in relation to your life. And I don't think you could possibly calculate that effect, especially not in the seconds you would have to make a decision at the lever.

11 hours ago, dreadrocksean said:

Also, how would he feel, after he's grown up, when he finds out that 90% of his heritage and culture was sacrificed for him?

If he were raised to be rational, he would not accept unearned guilt. He's not the one who killed a million people.

Edited by MisterSwig

7 hours ago, MisterSwig said:

why not slaughter a billion?

Didn't you already answer this yourself? I mean, the more people there are, the more likely you'd be to kill somebody who is a brilliant doctor or something like that. More versus less.

And anyway, I think you have it backwards. He was talking about letting a child die if it meant saving a million people.


1 hour ago, Eiuol said:

Didn't you already answer this yourself? I mean, you'd be more likely to kill somebody who is a brilliant doctor or something like that. More versus less.

Right, but I'm trying to establish a non-assumptive standard. Dreadrock is claiming that at some point the society's value itself rivals his own child's. So how do you calculate that value? How do you know when killing the plurality would destroy the value which is society? I'm suggesting that you can't know this through mere assumption or arbitrary assertion.

Perhaps you yourself don't like society much and would slaughter billions to save your child, because you wouldn't want to live knowing you let your child die. Or maybe you value society so highly that you would kill your child for a mere couple, because the number two is greater than one. Even this choice depends on the particular context and personal values.

Ultimately what I think it boils down to is whether you choose family or society. Which one is the greater value to you? If you had to completely destroy one, which one would it be?

Edited by MisterSwig

11 hours ago, MisterSwig said:

I think there is a critical problem here. You cannot possibly assess the value of the million without knowing whether their deaths would result in the collapse of the rest of society. For example, among the million might be a set of individuals necessary to stop an extinction-level threat in the future. You have to assume that society can continue to your satisfaction despite the loss of the million. And if you arbitrarily make that assumption, why not slaughter a billion? Or a trillion? After all, they're just numbers now. It's not actually the number that's relevant. It's the effect on society in relation to your life. And I don't think you could possibly calculate that effect, especially not in the seconds you would have to make a decision at the lever.

If he were raised to be rational, he would not accept unearned guilt. He's not the one who killed a million people.

You assume that the society is significantly more than a million. You need to get your vision out of the Murica box for a min.
So yes, I can calculate in a split second what would happen to my culture should I allow 1 million of us to be killed. No, they're not just numbers, since those other ones do not exist.

Good on your last point.  But then he'd just despise me.


22 hours ago, Eiuol said:

It was a joke.

Besides, it's silly to think that what you think you would do would always match what you actually do. Moral psychology research reveals that, however well people can imagine a scenario, the vast majority of people fail to predict their own behavior.

It's not silly when many of the decisions we vote for are based upon studies such as this.


On 10/22/2016 at 1:27 AM, Eiuol said:

No. In general, if a tragedy -is- bound to occur, and all the people involved are equally strangers to me, more strangers are more valuable. It's not the "greater good," insofar as more people in this case are more valuable to me. It doesn't matter to me whether this result requires me to act or would occur if I weren't there. I don't care about consequences per se; it's part of a virtuous nature to protect values as best I can.

I'll get to whatever necro'd this thread soon, but if for some reason I was in this insane situation caused by whatever, I'd be more likely to save the individual person over the five only because it would be slightly more likely that the single person is an individualist and that the group is composed of collectivists of some sort. This would be my snap judgement based on the only information available and no known context besides the immediately available visual stimuli, assuming no ability to signal the train to just stop, etc.


On 2/19/2019 at 11:28 AM, Doug Morris said:

Somebody challenged me once with the following.

You're driving a car down a mountain road.  The brakes fail.  You can steer but not slow or stop.  You are approaching a bridge packed with kids.  Your choices are steer for the bridge and mow down the kids or steer to the side and be killed yourself.  He indicated he wouldn't have any respect for anyone whose choice was to steer for the bridge.  I tried to explain that this was an abnormal situation with no fully satisfactory outcome and he said he would be perfectly satisfied to steer to the side and die. 

He had a military background.  I don't know to what extent that entered into it.  

Oh, I understand the necro now. It sucks, but obviously you yourself are a greater value than anyone else. People who wouldn't save themselves over any other person in existence shouldn't be allowed into the military to begin with.


6 hours ago, dreadrocksean said:

It's not silly when many of the decisions we vote for are based upon studies such as this.

Not sure what you're talking about; the only examples I can even think of are people talking about how intention doesn't match up with what people actually do, even in social policy discussions. Studies about trolley problems (or any similar choice between few and many) usually get into nudging, which has everything to do with using context to influence decisions.

But EC's post is more interesting to me.

More than likely, if any such scenario ever came up, I bet it would be in the context of a mass against an individual. I hadn't considered that before. As was mentioned on the first page, this whole thought experiment is only good for stimulating ideas. It's not so much "the answer" that philosophers care about as the justifications people use and their explanations of value.


On 2/18/2019 at 10:09 PM, dreadrocksean said:

To apply this in the first person: in that situation, I would know the difference between causation and choice. The first choice is - "Do I get involved or not?" Not - "Who should die?"

Exactly. If the lever is owned by the train company, what right do you have to get involved with the workings of their property?


New trolley problem:

The trolley is speeding towards a stack of dynamite that would explode and kill you if you crash into it. If you hit the switch, you will change tracks and be safe. But you don't own the trolley! Either you get involved in the workings of their property, or you die. What do you do?

Edited by Eiuol

11 hours ago, Doug Morris said:
14 hours ago, MisterSwig said:

Ayn Rand in her prime is tied to one track. Your newborn baby is on the other...

Can you derail the train?

I doubt it. But feel free to add whatever context you need.

These sorts of extreme hypotheticals don't provide moral guidance, but they might say something about your hierarchy of values. 


On 2/22/2019 at 10:47 PM, Eiuol said:

New trolley problem:

The trolley is speeding towards a stack of dynamite that would explode and kill you if you crash into it. If you hit the switch, you will change tracks and be safe. But you don't own the trolley! Either you get involved in the workings of their property, or you die. What do you do?

We have already established elsewhere that you're okay with violating others' rights, up to and including stealing and even murder, for the sake of your own survival, so I doubt flipping a switch would pose any trouble for you. For me it's a serious question, and I have to wonder how I managed to get on a trolley in which I'm strictly prohibited from pulling any of the levers in an emergency (levers that, for the sake of argument, would be perfectly reasonable and safe to pull, if only I had permission); it sounds pretty improbable to begin with.


2 hours ago, intrinsicist said:

We have already established elsewhere that you're okay with violating others' rights, up to and including stealing and even murder, for the sake of your own survival

This isn't quite accurate, because that position applies to specific scenarios, and in those specific scenarios I don't think any correct answer could be provided by a rational code of ethics (it would be something like an aesthetic choice instead). The trolley problem is not one of those circumstances.

My new trolley problem was just a joke; I wasn't trying to be serious. I mean, your response reads as though you take it seriously: you seem to say that any kind of interference with someone else's property must necessarily be a violation of property rights. But if it is a serious question to you, why don't you just answer? If you want a half-serious answer to a half-serious question: I would pull the lever, but I don't think it would constitute a violation of property rights. In that specific circumstance, I know I would actually be saving their property from certain destruction, on top of the fact that I don't think this imminent danger is even a reasonable expectation when I pay for a trolley ticket (violating a reasonable expectation, without prior explanation that this trolley was somehow different, would be fraud, I would argue).

