Moral Beliefs as Prices

Would you kill them for money?

No, it would not be for me. I cannot see myself doing that, but then again, how can we ever really know unless we are actually faced with something like this? We would like to believe otherwise. I am not so money-hungry. The money would go to the orphanage, or to St. Jude’s, or to rescue animals.

I might have to stipulate in a contract that it would not go to me - just those places - just in case. :laughing:

Considering someone to have little value does not directly translate into being prepared to kill.

If someone was to do the deed instead of you, then would you consider it to be good and moral? The payment would go to fund noble causes, of course.

That’s what Carleas is suggesting … “choose any charity, give to the poor” as the song goes.

If you enjoy killing, does the buyer need to pay less to keep things moral?

Psychopaths will be well off, in any case, even if they have to tally up a few more murders.

I suppose the price would also have to be higher, like if they wanted you to kill your kid or your mother and you also, coincidentally, felt affection for these family members.

But there is always a price Carleas and I guess most people would go for.

And again, while standing feeling a little guilty at mom’s cemetery plot, you can comfort yourself with the fact that you got a bigger sum of money, which you can give to Doctors Without Borders to help even more children get rid of harelips, or even fund life-saving operations. With the extra sum, you could even spend a little on yourself, a vacation, perhaps. I mean, one death in the family, 6 kids saved, and the family-murder bonus could go to a week in Barcelona. Still a net gain for others.

People killing loved ones or random people will have no negative side effects on how we bond and function as societies.

And then there’s the bonuses for raping members of your own family.

There’s always a price that convinces. If you think you wouldn’t rape your own daughter, you just don’t realize how tempting a billion dollars is. Your self-assessment must be wrong.

Is this your response to the vanilla trolley problem as well?

Yes, by random I mean there is an equal probability of it being any person. It’s the same ‘random person’ that stands on one side in the trolley problem, and five of whom stand on the other side. The people being saved by the charity are also random people.

And you can make the decision without knowing who they are specifically, because expected value is well defined, and because whatever the expected value of 1 person, the expected value of 5 people is 5 × [expected value of 1].
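The linearity claim here can be checked numerically: whatever distribution individual "values" are drawn from, the expected value of five independent draws is five times the expected value of one. A minimal sketch (the uniform distribution and its range are purely illustrative assumptions, not claims about actual moral worth):

```python
import random

random.seed(0)

# Assume each random person's "value" is a draw from some distribution;
# the uniform(0, 100) choice here is an invented placeholder.
def person_value():
    return random.uniform(0, 100)

N = 100_000
one = sum(person_value() for _ in range(N)) / N
five = sum(sum(person_value() for _ in range(5)) for _ in range(N)) / N

# By linearity of expectation, five ≈ 5 * one, regardless of the distribution.
print(five / one)
```

The point is that no property of the distribution beyond its mean matters, so the comparison goes through even for fully anonymous people.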

We can plug in specific people to change the question, but that’s just a different question.

I think the original idea was not a random person, but rather an anonymous stranger, a ‘known-unknown’ person. I don’t think this changes the math, but feel free to substitute if it keeps the question on track.

KT, I’m not proposing a policy, I’m proposing a thought experiment: put dollar values on your moral beliefs. You promised you’d try.

Carleas, you assume that the randomly drawn person must be someone else … what if the infinitely valuable person was you or me?

It’s defined as random, after all. Why should an infinitely valuable person give power to those who, again, seek to kill it? See, in your thought experiment, everyone, including us, is wearing the mask of randomness. So as a cost-benefit analysis, it makes no sense for anyone to give all their sustenance to another random person.

The vanilla trolley problem does not appear to have hidden consequences in the options. At least, I don’t see them.

It doesn’t suggest that killing the people on the tracks is good. It doesn’t suggest that random people ought to be killed in the future.

Phyllo wrote

Later Carleas wrote

Perhaps some folks believe that a life, any life, is priceless and not interchangeable with another life. Is this a thought experiment in how to be evil and justify it?

As Karpel Tunnel said, the psychopaths would get rich; their workloads would be tremendous, especially if they were willing to off people for $1.99 without any donations to charity. Why bother saving any lives through a charity if lives aren’t priceless? Letting people perish due to their poor luck and lot in life would be extremely cost-efficient. In World A, the mindset that possibly everyone is expendable for possibly a 1-cent payment does make a life-saving charity absurd. That type of mentality was not advertised in World B. World B would be better for everyone, for everyone would have greater odds of surviving without the rampant kill-for-a-buck mentality.

I’m sympathetic to the ‘hidden consequences’ argument, but that argument does not make the question meaningless or unanswerable or absurd. Hidden consequences distinguish the hypothetical world in which anything is possible from the real world. So what you’re saying in appealing to it seems to be, yes, in the hypothetical, we should kill the person if offered a trillion dollars, but in the real world we shouldn’t because XYZ.

I feel like you’re resisting that pretty strongly, but the rejection of an absurd hypo is missing the point. Look at Hilary Putnam’s twin earth thought experiment: it’s as absurd as can be, and it doesn’t matter, because it helps to isolate certain concepts.

First, the original problem does suggest that killing the one person is good, at least to a consequentialist who values human life: It is a moral good to cause the death of one person who would not die but for your intervention in order to save five people who will die but for your intervention.

Second, the problem I’m proposing doesn’t suggest anything about the future. Let’s concede, if you require it, that this will be just the worst if it happens all the time, and just mentally insert into the hypo whatever additional props you need to limit it to a one-time offer to you and only you.

Doesn’t depend on the future?

All cost benefit analysis would be null and void if no future for anyone existed after the event.

In World B, Joe Random had some kind of “right” to exist and to be free of harm. He doesn’t have that in World A.

I see that as very important - more important than the math.

It’s not stated but it’s there.

If you lose it once, then it’s very hard or perhaps impossible to get it back.

To clarify, I’m just not trying to extend the analysis to rearranging society so that what we’re talking about happens all the time. I see the questions of “Should you do X in this one-off situation” and “Should we as a society make doing X a regular part of our everyday lives” as separate questions that it is consistent to answer differently.

Sure, but the same is true if you pull the switch in the vanilla problem, right?

No

Unfairness exists, but you don’t create the unfairness.

Injustice happens, but you don’t make it happen.

You don’t choose a world where rights are destroyed.

:clap:

I don’t see how you aren’t doing that when you intentionally hit someone with a train, but you are when you intentionally shoot someone with a gun.

If you can enjoy your wealth after having obtained it by killing a random person, your life must have been supremely shitty beforehand.
That there are indeed a lot of such humans is the reason I rank all other mammals above humans (in general) qua degree of sentience.

Spending wealth badly or selfishly is a separate moral question. If rather than “enjoy[ing] your wealth”, you use that wealth to do more good than you have done wrong, you can leave the world better off for having done that wrong, and a consequentialist should conclude that the transaction was a good thing, i.e. if World A is better than World B, a consequentialist should be OK with someone taking actions that lead to World A instead of World B. If feeling bad about it weighs against World A, increase X to compensate.
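The "increase X to compensate" step is just a sum of weighted goods and harms. A toy ledger makes the structure explicit (every number below is an invented placeholder, not a claim about actual moral values):

```python
# Toy consequentialist ledger: all numeric weights are invented placeholders.
HARM_OF_KILLING = -1000   # hypothetical disvalue of the wrongful death
GOOD_PER_DOLLAR = 0.5     # hypothetical good bought per donated dollar

def net_value(payment, guilt_penalty=0):
    """World A's value relative to World B: harm done plus good bought,
    minus any weight given to feeling bad about it."""
    return HARM_OF_KILLING + GOOD_PER_DOLLAR * payment - guilt_penalty

# A large enough payment outweighs the harm...
assert net_value(5000) > 0
# ...guilt can tip the balance back...
assert net_value(5000, guilt_penalty=2000) < 0
# ...and increasing X compensates, as the argument claims.
assert net_value(10000, guilt_penalty=2000) > 0
```

On this accounting, any finite extra cost (including feeling bad) can in principle be offset by a larger X, which is exactly the structure the objections above are targeting.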

Has anyone, particularly you Carleas, seen the movie, The Box?

[youtube]https://www.youtube.com/watch?v=nSOjMkoBYYA[/youtube]

Haha, looks like this discussion has been done! I haven’t seen it, but I’ll add it to my list in case it ever comes on my streaming platforms.

I found the short story it’s based on, Button, Button.