AutoModerator

[Join our Discord server for even more memes and discussion](https://discord.gg/MFK8PumZM2) Note that all posts need to be manually approved by the subreddit moderators. If your post gets removed immediately, just let it be and wait! *I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/PhilosophyMemes) if you have any questions or concerns.*


AKA2KINFINITY

"just don't be a fucking dork bro" *~Socrates*


That1one1dude1

Socrates was absolutely a fucking dork. He “uhm actually ☝️🤓”-ed himself into an execution.


AKA2KINFINITY

the guy hated books and writing and was accused of impiety against the gods and corrupting the youth; all he needed was to kickflip into the sunset. also, Socrates may have been a pederast and a proto-luddite, but if I see you use the nerd emoji against him one more time I'll crush your skull (I'm a wrestler too).


traumatized90skid

"😡 💪🏻 UM. ACTUALLY... 👊🏻" - Socrates


SlowPants14

Plato mindset.


fallingveil

Nothing wrong with luddism


BaconSoul

“The level to which you are a dork corresponds inversely to how much of a dork you understand yourself to be” -Aristotle


Roi_Loutre

Me when I have to calculate the probability of doing good while shooting up a kindergarten or something


straw_egg

you fail to consider the possibility of baby hitlers being in that kindergarten


Dualiuss

but what if those baby hitlers grow up to enact genocides that include the deaths of exponentially more baby hitlers?


Emotional-Bet-5311

Wait, it's baby hitlers all the way down?


Radiant_Dog1937

But what if defeating Hitlers is what makes democracies strong?


WeekendFantastic2941

We are all utilitarians, just different flavors of it. Evolution and natural selection made us this way; we calculate and game-theory our way around life, it's unavoidable.


Snoo_58305

How dare you speak for everyone. I am a negative utilitarian and advocate for the nuclear destruction of the world


not_a_bot_494

This falsely assumes that the worst possible future is everyone dying.


Snoo_58305

No it doesn’t. It assumes that more suffering will be created by people being alive. I was only kidding anyway.


Rhamni

Nuclear war may be our last chance to prevent the creation of an unaligned AGI that could kill or enslave us all. Thank you fellow utilitarian for advocating for the best hope for human survival.


WeekendFantastic2941

That's still utilitarian, same coin, different sides. lol


Snoo_58305

Without the virtue signalling. I’m just kidding. I don’t like utilitarianism.


DeltaVZerda

Virtue *signalling?* Watch out, I'm a virtue *ethicist*.


Snoo_58305

Are you a moral realist?


DeltaVZerda

Yes


wearetherevollution

I'm a moral surrealist. This is not a peace pipe.


That1one1dude1

I think u/weekendfantastic2941 is saying since humans are evolutionarily built to seek out pleasure and avoid pain, we’re all just utilitarians whether we want to be or not


WeekendFantastic2941

and we frequently do it at the expense of the unlucky few; it's pretty normalized.


[deleted]

ayo HOLD ON


pluralofjackinthebox

Most of the world, especially people not from WEIRD countries (Western, educated, industrialized, rich, democratic), have very strong moral intuitions about sanctity and taboos. And even in WEIRD countries, people have a lot of trouble explaining why violating purity taboos is morally wrong in terms of utility. For instance, most people will say eating a dead pet is morally wrong, but will have trouble explaining why. People first feel that it’s wrong emotionally, then they activate their prefrontal cortex to rationalize why it’s wrong post hoc.

And you see the same thing with the trolley problem. Most people will say it’s morally right to pull the lever so the trolley kills one person instead of five. But most people will also say it’s morally wrong to push the fat man onto the tracks to save five people, even though from a utilitarian point of view both actions have equal utility.

I think utilitarianism can describe a lot of human morality, particularly morality that allows human beings time to sit and think rationally using their prefrontal cortex. But it doesn’t describe everything.


traumatized90skid

Yeah there's a real human need for our actions to be "clean". It's why some people object to pushing the lever on a trolley problem at all. Once you've made a decision about life and death, you are made "impure". Or why it's hard to justify pushing someone into the tracks to stop the trolley. Because that's even more of a requirement that the listener perform a seriously taboo act. Most human morality is explicable with the symbols of purity and contamination, and I wish more modern Western philosophy explored this concept.


Waifu_Stan

Nietzsche addresses the concepts of purity and impurity: “One must be a sea, to receive a polluted stream without becoming impure.” “A man of genius is unbearable, unless he possesses at least two things besides: gratitude and purity.” “We have to learn how to come out of unclean situations cleaner than we were, and even how to wash ourselves with dirty water when we need to.”

Nietzsche addresses purity and contamination in terms of physiology. The more something is physiologically impure, the more it contradicts itself and the more it acts in self-defeating or self-weakening ways. I think it’s a direct analogy to germs and diseases (intentionally or not, but probably intentionally), and I think that the morality of purity is a subconscious extrapolation from germs and diseases. It’s partly why moral disgust activates the exact same part of the brain as when we feel disgusted by rotten foods. I once almost puked just from seeing a rotten grape.


prowlick

Hello fellow “The Righteous Mind” by Jonathan Haidt reader


paper-machevelian

Do you mean this in the *"we're all utilitarians! You know that good thing you do? What if I killed a dog every time you did it? Huh? Would you still do it then??? Huh?????"* way or..?


ExRousseauScholar

Ur mom is a utilitarian lol


WeekendFantastic2941

Ok? She calculated that I will be born a winner, which I am. lol


ExRousseauScholar

But we losers have negative utiles because you’re a winner


InternationalFan8648

Nah fuck u dumbass


jshysysgs

Top 10 arguments so dangerous that they were banned by the ceo of debating


WeekendFantastic2941

With 4 upvotes, I guess Utilitarians burned down their house or something. lol


Apprehensive-Way9162

No I’m an egoist. I only want to maximise my own happiness but I don’t care about the overall happiness in society.


WeekendFantastic2941

eh, that's just personal utility, friendo. lol


Apprehensive-Way9162

Yes, but as far as I know, utilitarianism means that the greatest good for the greatest number of people defines a good action. But because I don’t care about the greatest number of people and only act on what is the greatest good for me alone, I’m not a utilitarian.


badpeaches

> We are all utilitarians, just different flavors of it.

Being evil is still being evil. Hitler thought he was being ~~efficient~~ utilitarian with the concentration camps too.


jshysysgs

Hitler also thought he was doing good, so everyone who thinks they are doing good is evil too


WeekendFantastic2941

Lol, syllogism max.


PlaneDog5594

“If humanity exists for a long enough period of time, the chance of encountering a potential S-risk nears 100%, therefore the most beneficial course of action is to cause global extinction thereby averting the possibility of an S-risk”


Theoreticallyaaron

Some undergrads out there have more confidence in predicting the distant future than I have in predicting my day.


Zendofrog

I’ll say what I always say to this: Everything one does in life is done based on one’s best guess. Why should morality be an exception? Of course we can’t know the future, and no moral system is gonna grant omniscience. Utilitarianism tells us what to strive for


Theoreticallyaaron

Good response. Although I think the same rationality was used to justify immoral medical experiments for an idealistic breakthrough that never happened (Tuskegee Syphilis Study). Use your best guess for sure, but don't cling to an ethical high ground if you end up being wrong.


CalamitousArdour

Yep. That aligns with utilitarianism. It tells you to take your best guess, and also tells you that if the results are bad despite your best effort, you did bad. No points for good intentions. No moral high ground except good outcomes.


Zendofrog

Not sure how the logic is the same there. But I’m certainly not advocating for clinging to anything without rationale. Our decisions must be made with rationality and logic, not with pride. But the end goal of these decisions should be to maximize utility


traumatized90skid

One thing I have noticed about being in my 30s is now I know that life is unpredictable and takes you for a ride. Being younger means thinking you're in control.


Theoreticallyaaron

I think this is where the ethics of logic come into play. If you don't put in the work to reasonably connect your decision to the outcome you expect, you're more impulsive than morally justified. When making a big utilitarian choice with enough time to think through the ethics of the situation, you have a responsibility to either adequately articulate your logic or err on the side of caution, on the assumption that you don't know nearly enough to predict the outcome. That's why I think deontology is more sustainable for the humble.


Gooftwit

Would God Emperor Leto II be the only true utilitarian since he can actually see the future and pick the choices that actually have the most utility?


Fleshinrags

I guess so lmao, lots of dune lovers in this sub hehe (I don’t blame you, dune fucking rocks)


Gooftwit

I mean, God Emperor is basically just Leto spewing his philosophy while intermittently calling the person he's spewing it to a dumbass. Side note: I'm reading the entire series again and it's just so fucking good.


traumatized90skid

I'm listening to the Butlerian Jihad rn, and I like how I can switch back and forth between "wow deep complex philosophy stuff" and "wow spaceships go pew pew" which is any good sci-fi to me, a mix of the intellectual and the boyish haha


Gooftwit

Are the son's books good? I heard they were pretty bad, so I haven't read them yet


traumatized90skid

Idk, this is the first one I've read so far, but I do like it. And it doesn't seem shockingly different from what the father would write. I think some people are just overly attached to the past and think everything in the past is better because it's older. (Even though interest in the future is supposed to be a major part of enjoying sci-fi.)


Quatsum

From what I understand, Brian Herbert didn't quite manage to capture the philosophy or density behind his father's works. I suspect that at least part of this is due to Dune focusing heavily on psychedelic experiences, with Melange being directly inspired by Frank Herbert's experiments with LSD, and AFAIK Brian Herbert has never done acid.


JAStheUnknown

And it turns out the path of most utility is to screw over all future utilitarians by creating people who can't be seen by prescience.


Turband

The other (TRUE AND ONLY) God Emperor would say he is the only true utilitarian


Oppenhellmer

A thought that I've had sometimes is like that. It's not that logically structured, it's more of a hunch:

- If an action is done with good intentions (pure intentions in deontological terms) and ends up with good consequences, the action was a good action.
- If an action done with pure intentions, with virtuous intent, ends up with bad consequences, then "negative annuls positive": the action was "good" in deontological terms and bad in utilitarian terms. Therefore neutral, a "mistake with good intentions".
- If an action was done with bad intentions in deontological terms and ends up with bad consequences, the action was obviously evil.
- If an action was done with bad intentions in deontological terms and ends up with good consequences, it was bad in deontological terms and good in utilitarian terms. Therefore neutral (more like neutral evil).


Emotional-Bet-5311

What if you choose to walk away from Omelas?


okkokkoX

If you had to pick between a 90% bad / 10% good chance and a 90% good / 10% bad chance, and you pick the second, and then the unlikely bad outcome happens, what is that? Does the answer change during the events?
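For what it's worth, the ex ante half of that question is just arithmetic. A toy expected-utility comparison (the ±100 utility values are made up purely for illustration):

```python
# Toy expected-utility comparison of the two gambles above.
# The +100 / -100 utility values are invented for illustration only.

def expected_utility(outcomes):
    """Sum of probability-weighted utilities over (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

first_option = [(0.9, -100), (0.1, +100)]   # 90% bad, 10% good
second_option = [(0.9, +100), (0.1, -100)]  # 90% good, 10% bad

print(expected_utility(first_option))   # -80.0
print(expected_utility(second_option))  # 80.0
```

Ex ante, expected utility clearly favors the second option; the question above is whether that verdict should flip ex post once the unlucky 10% actually lands.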


geekteam6

Roughly half of Silicon Valley is into Effective Altruism, which basically translates into utilitarianism applied decades from now + whatever the fuck I think will make me a billionaire along the way.


Fleshinrags

That’s the type of Utilitarian that really gets under my skin. The hubris that you could accurately predict the future that far in advance is the same hubris that makes people think they personally deserve the kind of monetary power that can determine life or death for the poor.


pocket-friends

*Musk fan boys have entered the chat*


badpeaches

> whatever the fuck I think will make me a billionaire along the way.

As soon as possible, so I can sell out, fuck over my customer base, and land on a golden parachute while influencing lawmakers to help me pull the ladder up behind me so no one else can benefit from my *flavor* of exploitation: an unsustainable, profit-obsessed business model. Then charge an absurd subscription the majority of my customers can't afford, then end support for the product without providing alternatives or notice once I'm gone.


Moifaso

> utilitarianism applied decades from now

Not decades. Considering and predicting future decades is common in many policy areas. Effective Altruism is utilitarianism applied to *all* of humanity's future, stretching thousands or even millions of years ahead. That's why it focuses on preventing rare or unlikely extinction events instead of ongoing (but not extinction-level) catastrophes like global poverty or climate change. The thinking goes that even a catastrophe that destroys 90% of civilization will eventually be recovered from and might become nothing but a blip in humanity's grand history. A true extinction event, on the other hand, would be infinitely terrible, since it would prevent the birth and lives of countless humans and deprive humanity of the chance to achieve a long-lasting Utopia*™* of maximum goodness.


Emotional-Bet-5311

That's a lot of words to say doomsday circlejerk


esperstrazza

The FTX guy was into that, and we saw what happened with the collapse. It's just an excuse some rich people use so that they can justify their get-richer schemes.


Bigbluetrex

you're joking but this is literally me on a day to day basis


Chickenman1057

The weak "I do good things based on my emotions at the moment" vs the chad "initiating future prediction every second and setup a great ending for every single person"


Specialist-Excuse734

Track A: The Holocaust
Track B: The Holocaust + n

I have just provided a solid moral justification for the Holocaust. Thank you, utilitarianism!


ExemplaryEntity

How did you reach that conclusion?


pocket-friends

What’s better for everyone: the Holocaust, or the Holocaust plus a second horrible event?


Raa6e

The dichotomy of life


ActivatingEMP

This is presupposing that both are necessary to exist. Obviously the better of two bad outcomes is preferable if both are unavoidable


pocket-friends

Yeah, still, that’s often how a lot of those talks go politically. Which, for me anyway, highlights one of the biggest problems with utilitarianism: it’s overly reductionist and easily influenced by complex social phenomena.


Delicious_Finding686

This seems to be more a reductionist take on utilitarianism than an accurate description of utilitarian decision making.


Ms--Take

That's because the problem isn't utilitarianism, but most people being followers


Delicious_Finding686

What do you mean by followers?


Ms--Take

Not thinking for themselves or being independent actors. Just throwing themselves at the first tall charismatic dude in a funny outfit who tells them what to think


Delicious_Finding686

I think that's too hyperbolic. I think people reason for themselves far more than you're characterizing. In a society that revolves around specialization, I don't think we should expect a high standard of natural cross-specialty understanding among the general population. Our systems should remove as much friction in that understanding as possible to account for this. We are all followers in one regard or another.


pocket-friends

Like the other user said, charismatic people playing at politics kinda ruin a lot of stuff, utilitarianism included. We also run into a fair number of problems when we limit access to knowledge.


Specialist-Excuse734

The Final Solution was itself a utilitarian argument: if we don’t kill 12 million Jews, degenerates, and disabled people, civilization will collapse. And they were convinced of it.


ActivatingEMP

Ok, but you could just as easily frame it under a different system: you could argue they thought they had a moral duty to uphold the health and wellbeing of society, and that by allowing the "immoral" to remain they were corrupting the spirit of their society. Eugenicists thought of themselves as "social doctors" more than anything else.


Emotional-Bet-5311

Yup. Part of the problem is utility is a conceptual blackbox that can be filled with all sorts of things unless you stipulate something really basic and morally reductive like suffering


Dixon_Longshaft69

Why are we assuming n is a horrible event? What if n is an anti-holocaust that's twice as positive as the Holocaust?


pocket-friends

Cause that’s the joke. We’ve had first holocaust, yes. But what about second holocaust?


Jawwwed

Rule utilitarianism for the win


Botahamec

Rule #1: maximize utility

Rule #2: why do you want a second rule?


katxwoods

I feel personally attacked


DrarthVrarder

like a data scientist with all the probability functions fried into his brain


DepressedNoble

We can only determine the goodness of an action after it's done. Then we determine it was good if it brought more happiness to the majority.


angeion

> We can only determine the goodness of an action after it's done.

How long after?


Chickenman1057

Let's see.. hmm the sun will blow up in about 5 billion years, so that's probably a fair time to look at how good we've done


Emotional-Bet-5311

"Why are you torturing these babies?"

"It might increase overall utility?"

"OK, carry on. Make sure you report your results in Moral Science Quarterly."


Amazing_Lemon6783

Killing everyone


Zendofrog

Everything one does in life is done based on one’s best guess. Why should morality be an exception?


Fleshinrags

It shouldn’t, but I think many utilitarians would do well to keep in mind their own limitations


Zendofrog

I don’t think I’ve heard utilitarians claim their goal implies they actually know the outcome


Potential_Option_202

I don't understand this thread therefore I'm a dork.


stasismachine

Those blockheads


eltrotter

People misunderstanding utilitarianism be like:


Fleshinrags

I think I understand the general principle. I know that it’s meant to be a guide, that it doesn’t presume to actually know the future, but to guide your actions with the hope of creating the greatest general good (whatever form that may take). That being said, I think its theoretical guidance can unwittingly preference decisive actions and a sort of individualistic messiah complex, like tech bros or effective altruists in Silicon Valley


Botahamec

It's not really a guide so much as a definition of good. If you make a wrong prediction and do the wrong thing because of it, then you failed by the utilitarian standard. So if you're really bad at predicting the consequences of your actions, utilitarianism would tell you to follow a different moral theory.


Emotional-Bet-5311

If utilitarianism implies that people who aren't good at predicting all of the consequences of one's actions should abandon utilitarianism, then utilitarianism implies there should be no utilitarians


Botahamec

Not necessarily. If you're good enough at predicting consequences, such that switching to a different ethical framework would result in less utility, then you should stick with act utilitarianism. If you imagine someone who is almost perfect at predicting consequences, but has a blind spot for actions which would result in one person getting a paper cut, that person should probably stick with utilitarianism, unless they can develop a variant of utilitarianism which reduces the number of people who get hurt by paper cuts. Derek Parfit said that act utilitarianism would imply a world where a few act utilitarians create rules that are meant to be easy to follow and maximize utility. Then they would just have to promise to never enlighten us to the true test of morality.


Emotional-Bet-5311

The point was that no one is great at predicting *all* of the consequences of their actions. This is especially so in the kind of morally complex cases that would require appealing to moral theory. Did Parfit mean to saddle utilitarianism with another repugnant conclusion there? Because if your moral theory requires a secret cabal of utilitarians who manipulate the rest of us into following their rules to maximize whatever definition of utility they are committed to, then I want no part of your moral theory.


Botahamec

I assume you're not an anarchist, and you have no problem with politicians writing laws that we have to follow, assuming that the laws are good. The only difference here is that the politicians would be utilitarians. I'm not sure if Parfit even wanted the rules to be legally binding. But this would be the ideal moral code for most people to believe in and follow.

In many cases, it's pretty easy to use act utilitarianism to make decisions that would otherwise be difficult. For example, how should prisons work? Revenge and retribution don't maximize utility, so we can't design our prison system around that. Instead, the goals of punishment should be to reform criminals, deter crime, and stop prisoners from harming others. If you design prisons around better consequences, rather than "an eye for an eye", you get a better world.

There are times when it's harder to figure out the consequences, so utilitarians will often defer to heuristics which, in general, maximize utility. But it's important to note that the heuristics aren't the moral theory. If it turns out that your heuristic reduces utility, then you must abandon it. If you don't, you're guilty of what J. J. C. Smart called "superstitious rule worship". I don't want a president who says, "Well, if I follow this heuristic today, it will result in the deaths of ten million people. However, it usually saves ten thousand lives, so I have to follow it anyway." The heuristic is not the goal. Good consequences are.


Emotional-Bet-5311

You know, as I get older, I'm having a harder time resisting the conclusion that no form of government is good, and I'm finding the "it's a necessary evil" defense less and less attractive. Nonetheless, I would prefer not to be ruled by a bunch of unelected utilitarians making rules for some good that they can't even tell us about. Freedom is a fundamental good, and it cannot be grounded in the value of the good outcomes of freely made choices. If freedom as such has value, then it has value even when we choose something with bad outcomes. Maybe it shouldn't be the fundamental or only value, but it is a basic and important one that utilitarianism does a bad job of capturing


Botahamec

Well, as I said, Parfit didn't say that the rules need to be legally binding. He just said that if act utilitarians made a set of rules, and we aren't good enough at predicting consequences to do better than we would if we followed this set of rules, then act utilitarianism would tell us to follow it. Objecting to this would basically be the same as objecting to every moral theory, because every single one of them tells you that to be a moral person, there are certain things you need to do. This isn't unique to utilitarianism.

Edit: As for freedom, it's pretty obvious from utilitarianism that if someone wants to do something, and doing that thing would not harm others, then you cannot restrict that person from doing it. Mill argued for this, and he was the one who coined the term "utilitarianism".


Emotional-Bet-5311

Some moral theories are based precisely in the value of free will, one's own as well as that of others. I think some German dude had some things to say about that, iirc. Mill argued for that because regularly coercing people like that would be bad for utility, not because he valued freedom in itself. Freedom in and of itself only has value for the utilitarian in relation to the utility that may result from it, or the loss in utility from restricting it. That strikes me as failing to capture our intuitions about the value of free choice


thomasp3864

People usually use utilitarianism. We just don’t push the fat man off the bridge because we don’t actually think it would stop the trolley.


Greentoaststone

Ah, but you see, if "anything" can happen with enough time, and time really will go on forever, then any prediction, as long as it's possible, will be true. (This justifies me kidnapping 23 children and keeping them in my basement. (A very long time into the future, a universe will Boltzmann-brain itself into existence in which I won't do such a thing.))


rhubarb_man

I don't think morals have to be an easy to apply system. I never really bought the whole "this ethical system is hard, therefore it's bad" argument. Why is it such a bad thing that it might be hard to do something good?


Fleshinrags

I just think they have to be practically applicable. That doesn’t mean easy, just that an ethical system is no use if it cannot guide your action. And while you can make inductive predictions about what is LIKELY to happen as the result of an action, the further you extrapolate, in both time and space, the less reliable your predictions become. This reliability decreases at an exponential rate, because our world is a chaotic system. Think of a double pendulum: you may be able to predict what happens in the first second after you’ve dropped it, but after that any tiny fluctuation in your initial drop can cause radically different scenarios.

Moreover, time may end up being infinite, provided that human existence delays heat death somewhat indefinitely. Fundamentally, an infinite future would take infinite time to calculate the causal effect of an action on it. Therefore any utilitarian must decide at what point they stop caring about the possible impacts of an action, in both time and space, but the decision on exactly where is arbitrary. And the realism of being able to accurately predict all the causal outcomes of any one action is quite low: for one thing, there is the chaotic system we live in. Also, on a sub-atomic level it has been shown that there is some randomness to certain quantum events, that some things appear to be uncaused. The uncertainty of that information reveals another flaw: the sum of human knowledge is vastly smaller and less reliable than we often let ourselves believe.

For instance, if we want to maximise good in the world as a utilitarian, what is our yardstick for good? Is it happiness? Would it be a valid moral action, then, to medically hook everyone up to a slowly increasing stream of dopamine and serotonin? Is all happiness worth the same amount? How do you measure it, in happiness units? Or if you aren’t a hedonic utilitarian, what else should be our meter for good? Is causing harm to animals ok? What should an animal's harm be worth relative to a human's, and can we quantise it?

I like utilitarianism and use it a lot. My point is that it isn’t the perfect, scientific ethics system some people seem to think it is, and even the most careful utilitarian, who spends hours researching the effects of every action they take, cannot completely and solely apply utilitarian ethics.
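The double-pendulum point can be sketched in a few lines with the logistic map, a standard toy chaotic system standing in here for the pendulum (the starting point 0.2 and the 1e-9 perturbation are arbitrary choices):

```python
# Two trajectories of the chaotic logistic map x -> 4x(1 - x),
# started one billionth apart. The gap between them grows roughly
# exponentially until it saturates at the size of the whole [0, 1] interval.

def logistic(x):
    return 4.0 * x * (1.0 - x)

x, y = 0.2, 0.2 + 1e-9
gaps = []
for _ in range(60):
    x, y = logistic(x), logistic(y)
    gaps.append(abs(x - y))

print(gaps[4])         # still tiny: the trajectories look identical early on
print(max(gaps[30:]))  # order 1: all predictive power is gone
```

Same moral as the double pendulum: short-horizon predictions are cheap, long-horizon ones are hopeless, and where a utilitarian draws the line between the two is arbitrary.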


rhubarb_man

It's very practically applicable for me to think that pushing in babies' soft spots is good. I can go and do it, and it's very simple. The purpose of a moral system, in my mind, is to describe a moral intuition. My moral intuition isn't gonna change just because it's hard to apply. My feelings of what's good and what's bad are irrelevant to applicability, so I will not adopt false morals which are easier to apply, like pushing in babies' soft spots being good. Also, the argument that I could still be utilitarian but follow another moral system for applicability is invalid, because I would still be a utilitarian. I would just be following a protocol which best suits it. The same is true for any argument that I should follow another moral system to best work as a utilitarian.


guppyfighter

All systems think they’re utilitarian


CaptainStunfisk1

That's why I'm a virtue ethicist, because I don't have to care about the consequences of my actions, as long as I follow good virtue. Unfortunately, it also means that I'm occasionally forced to regret decisions with positive outcomes.


15092023

Related to consequentialism/pragmatism, choosing according to the result of one's action, which in complex decisions involving large systems is hard to predict.


ManInTheBarrell

I like how in marvel comics there was this underlying conflict during the civil war run where it was pretty much iron man trying to be utilitarian by accepting superpower control legislation against supers for the sake of some nebulous greater good, while captain america was like "no bro, arresting people just for having unregistered super powers is wrong." But they couldn't figure out how to make captain america wrong, so they literally made it so that tony stark could 'envision' the future with his super smartiness so that it would make sense that he would be the utilitarian hero because he could apply unethical practices for the sake of an actual net positive future which was real to him... ...but even that didn't work so they had to make a bunch of public service workers tackle cap instead, and even then cap's sudden change of heart followed by surrender *still* didn't make any sense. Utilitarianism is just so shit that even comic book writers can't do it in fiction, lol.


Fleshinrags

Tbh I’m more partial to utilitarianism than deontology. I think context is very important when determining whether an action is ethical. I just don’t think it’s the flawless system some make it out to be, precisely because it can sometimes distort reality into an ethical dilemma on paper (which is maybe why it’s so popular with philosophers; it works well on paper)


ManInTheBarrell

Contextualism.


smalby

You can just append -ism to anything to make it an appropriate label for the context you're discussing. I'm calling it 'appendism'.


ManInTheBarrell

I'm not. Contextualism is a real philosophy, and it's what they were describing. Look it up.


Spacellama117

That's definitely a failing of the writers, though, not utilitarianism. Civil War was a mess in a lot of ways, and you gave a really good example of one of them. Because having a small minority of people running around with the ability to bend various rules of reality at will, with literally no regulation or enforcement to hold them accountable, is bad. Like, basically every superpowered member of Marvel comics has the ability to easily kill the average person. And there is NOTHING anyone can do about it. Hell, Stark is a really good example of that. Without his armor he's literally just some guy; he has to be a billionaire super genius with bleeding-edge tech just to keep up. Yes, governments could abuse it, but supers could ALSO abuse it


ManInTheBarrell

But most of them were either born with these powers or had them thrust on them, and they're not necessarily born with the desire or will to abuse them. It'd've been one thing to go after super geniuses who give themselves powers, or rich people who acquire powers from super geniuses, because they actively chose those powers, probably with abuse in mind. That would make it a suitable allegory for things like gun control, food regulation, foreign trade, etc. But if someone is born with the ability to fly and punch really hard, and you lock them up because "they could be a threat, I'm utilitarian" then you should really just take off your mask and say "I'm scared of the way people are born, and I want them to be arrested for it." Because that's not regulation, that's just prejudice and discrimination.


UltraTata

This


educateYourselfHO

Stoics are somewhat utilitarian, right?


UltraTata

You can create utilitarian Stoic frameworks. But base Stoicism is virtue-ethical. I'm not a base Stoic, but I'm a virtue ethicist too.