ChicagoCommunist

  • 0 Posts
  • 4 Comments
Joined 3 months ago
Cake day: August 19th, 2024

  • undermined their imperative power.

    Maybe this is me being a vulgar materialist, but on a larger scale I think ethical considerations are mostly normative and superstructural, derived from power relations. There’s a strong tendency to define right and wrong around whatever material/class interests are at play.

    It’s good that there are people thinking about ethics and trying to hash out detailed, coherent models and whatnot, but I think for most if not all people ethics is gonna be very vibey and dynamic, maybe instinctive or intuitive.

    Which meshes, I think, with your final statement. Actually-existing ethics isn’t particularly scientific or mathematical. Imo it’s constantly being produced between people at every level of relationship, and philosophical models are tools that help us communicate and home in on ethical concepts, and perhaps identify contradictions and power dynamics.


  • I don’t think any theoretical model is gonna be able to perfectly describe the complexities of human ethics, let alone prescribe “good” actions in broad strokes. But any of them might be a useful lens to judge a situation by.

    Utilitarianism is useful (ha) in situations like the prisoner’s dilemma, where a selfish action results in less overall benefit than a non-selfish action, despite potentially resulting in more personal benefit. Most of us won’t face the exact prisoner’s dilemma, but there are frequently decisions to be made with a similar structure (some toy numbers for each are sketched after the examples):

    Some task has to be done, and neither I nor my roommate wants to do it. But I know it’s harder for my roommate to do (for whatever reason), so the utilitarian action would be to do it myself, because that results in less misery than if my roommate did it.

    I have a pizza. Theoretically I could eat the whole pizza, but the second and third slices aren’t nearly as enjoyable as the first. So sharing the pizza with other people maximizes overall utility at the cost of only a little marginal personal utility.

    I gain omnipotence and decide to construct a universe where one person is tortured for all eternity to spare a billion trillion gazillion people from ever getting an eyelash stuck in their eye. This maximizes utility because a billion trillion gazillion times negative 0.01 utils is a far bigger total harm than 1 times negative 1000 utils.
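
    A rough sketch of the arithmetic behind those three examples, in Python. Every number here is invented purely for illustration; the asserts just restate the comparisons made in the text.

        # Toy utility numbers, all made up for illustration.

        # 1. Chore: total misery is lower if I do the task myself.
        my_misery = -2        # annoying but easy for me
        roommate_misery = -5  # much harder for them
        assert my_misery > roommate_misery  # the "I do it" world has higher total utility

        # 2. Pizza: my enjoyment of each extra slice drops off fast.
        marginal_utility = [10, 6, 3, 1, 0, 0]   # slices 1 through 6, if I eat them all
        me_eating_all = sum(marginal_utility)    # 20 utils
        shared = 6 * marginal_utility[0]         # 60 utils: six people each enjoy a "first slice"
        assert shared > me_eating_all

        # 3. Torture vs. eyelashes: naive summing prefers the torture world.
        eyelash = -0.01
        torture = -1000
        n_people = 10**30                        # "a billion trillion gazillion", give or take
        eyelash_world = n_people * eyelash       # roughly -1e28 utils in total
        torture_world = 1 * torture              # -1000 utils in total
        assert torture_world > eyelash_world     # the counterintuitive utilitarian conclusion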