
Pressing ethical questions


In general I spend surprisingly little time thinking about ethics. My thoughts tend to go like this: even if I don’t know exactly what I want now or what I will want in the future, there are some convergent instrumental goals which I want to pursue anyway, and I can mostly postpone ethical deliberation. (Here I am going to set aside my self-interest and focus on my altruistic interest.)

In particular, for a broad range of values, the first thing to do is to establish a stable, technologically sophisticated civilization at a large scale, which can then direct its action on the basis of careful argument and reflection. When I need to make a tradeoff between clarifying my ethics and increasing the probability of such a civilization existing, I’m not inclined to reflect on ethics. This might be an error, but it’s my current well-intentioned best guess.

However, there are a few decisions I face today that do require that I have some idea what I value. So it seems worth putting in a bit of time to get a clearer picture. Here are some ethical questions that seem to bear on immediate practical issues (albeit, often in a roundabout way):

  1. Who do I want to have influence over civilization?
    • Should I be OK if all humans die and other life on Earth later replaces us?
    • Should I be OK if humans build automata with significantly different values, and those automata replace us?
    • Should I work to change social values to be more in line with my own values?
    • How concerned should I be with gradually shifting social values over time, and how hard should I work to prevent such changes?
  2. How much should I value the long run?
    • Should I care about disorganized futures, in which the universe is not very efficiently converted into morally relevant experiences? These futures may have very low populations.
    • Should I have any altruistic concern for the events of today, or are they negligible compared to the (more numerous) events of the future?
      • Should I care about the experiences people have today?
      • Should I care particularly about the people alive today?
    • On net, is an expansive human civilization a good thing or a bad thing?
  3. Are we in a rush? Should I be happier if humanity gets wherever it is going sooner, or should I care only about where it ends up?

These questions will very loosely guide my own ethical inquiries. I realize I've already narrowed down the space of ethical theories enormously (basically accepting consequentialism and a strong presumption toward caring about the future). This narrowing is the result of a good bit of ethical thinking, more than most people do, though not enough to give me a clear picture of my own values.

ETA: At Luke’s suggestion, here is a publicly editable Workflowy with the same list. The modal outcome is no one adding anything, so if you have thoughts or references on any point, don’t be timid about adding them.

