Overview

This week we will consider some of the ethical positions that inspire effective altruism, how the history of changing ethical norms might affect how we want to do good, and how our own values line up with the tools EAs use.

EAs often advocate for impartiality when doing good. For example, many EAs would claim that there's no intrinsic moral reason why people who are 1000 miles away are less worth helping than people right next to you (though they may be more or less worth helping for other reasons, for example if you have better specific opportunities available to help one of these groups). There are many dimensions along which you might want to be impartial, such as across space, across time, and between species. Deciding which dimensions you think you should be impartial over might drastically change what you would prefer to work on, so this question is worth a lot of attention. For example, the priority you place on improving the conditions of animals in factory farms varies drastically depending on how much moral consideration you believe animals deserve.

We know our ethical beliefs have a big effect on how we do good, but perhaps we can't trust that our current ethical beliefs are good, or will line up with our future ethical beliefs. Societies’ moral beliefs have changed drastically over the course of history, and there may be reason to believe they’ll change again. Should we act in line with the moral beliefs we hold right now? Or should we try to figure out where our moral beliefs might be mistaken, or might change in future, and let that inform our actions?

Core Reading

Moral Progress as an Idea

Possible Cause Xs (read some, but you don’t need to read them all)

Exercise (1 hour)

In this exercise we will explore the idea of impartiality and where we might want to apply it. You will imagine you’re in a court that decides how much moral standing to give various groups: that is, how much an altruistic person should want to help them.

The court doesn’t care yet whether it is physically easier to help one group or another (that’s something to figure out later); it is simply asking how much we’d want to help them if it were equally easy to help each group. It’s useful to consider what we value independently of what we can do, as this makes it easier to act in line with our values when it comes time to consider practicalities.

For example, we might think that animals deserve moral standing, but that may not mean we prioritise helping them, because doing so may be very difficult (more difficult than the next-best opportunity).

First, you will imagine you are a lawyer on the side of always being impartial, and give the strongest arguments you can in favour of members of the group being worthy of equal moral consideration. Then you will take the opposite side, arguing against expanding the moral circle to include this group. We expect you will sometimes find it easier to argue for impartiality and sometimes easier to argue against it, but it is still worth considering the best case for both sides.

Go through the categories below and write down all the arguments for and against giving members of each group equal moral consideration. Try to spend about 10 minutes per category, with about 5 minutes on each side of the argument.