‘Who else belongs to my moral circle?’: The Foundations of Longtermism

By Professor Richard Pettigrew, Department of Philosophy, School of Arts

To celebrate World Philosophy Day, Professor Richard Pettigrew tells us about a new project that will challenge the radical philosophical view of Longtermism: the idea that the impact of our actions on the far future is the most important consideration today. The project recently received a Leverhulme Trust grant and runs until August 2027.

Should I tell a friend a lie to save them from upset? Should I spend my latest pay cheque on something for myself, or should I use it to treat a family member who’s been going through a tough time? In ethics, we ask questions like these. We ask what we should do when the different things we might do affect others in different ways. But this raises a question: Which others? Who else belongs to my ‘moral circle’? Many arguments in ethics in recent years have tried to show that our moral circle is larger than we often take it to be. Animal welfare advocates argue that animals other than humans should be included in our moral circle; more recently, some have argued that, at some point in the future, artificial intelligences might become sufficiently sophisticated that they too belong. Those who study philanthropy and charitable giving argue that people in countries far from our own, people we have never met, should be part of our moral circle. And others have argued that people who will exist in the future should also be included: not only the next generation or two, but all people who live in the future, whether in the next hundred years or the subsequent million years.


‘If you have to choose between doing something that has a small chance of improving the lives of every future person by quite a small amount, or doing something that will certainly improve the lives of all existing people by a very large amount, morality will often demand you do the first thing.’


‘The Foundations of Longtermism’ is a research project funded for three years by the Leverhulme Trust. The project aims to scrutinise an argument that begins with the claim that people in the near and far future should be included in our moral circle, and that purports to establish a dramatic conclusion from it. The argument points out that there will most likely be vastly more future people than current people: if there are a little over 8 billion current people, there might easily be 8 trillion future people. If all those future people are in our moral circle, then morality says we must take them into account when we decide what to do. And the sheer number of them suggests that doing things that benefit them is a higher priority than doing things that benefit people who are living now. So, the argument goes, if you have to choose between doing something that has a small chance of improving the lives of every future person by quite a small amount, or doing something that will certainly improve the lives of all existing people by a very large amount, morality will often demand you do the first thing. That’s the dramatic conclusion, and it strikes many as implausible.

The Foundations of Longtermism considers the various ingredients in this argument and asks whether they stand up to scrutiny. Does morality really require that we weigh the probability of great gains against the certainty of moderate gains in the way the argument does? What are the consequences of such a principle, and should we accept them? Must we really include a person living in a million years’ time in our moral circle, and must we give them the same weight we give people with whom we inhabit the world now?
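
To see how sheer numbers drive the conclusion, here is a minimal sketch of the expected-value calculation on the standard utilitarian reading of the argument. The probability and the per-person welfare gains below are purely illustrative assumptions; only the population figures come from the example above.

```python
# A minimal sketch of the expected-value arithmetic behind the argument.
# Every number here is an illustrative assumption, not a figure from the
# project; only the population sizes echo the article's examples.

current_people = 8e9   # a little over 8 billion people alive now
future_people = 8e12   # the article's "might easily be" 8 trillion

# Option A: a small chance of a small welfare gain for every future person.
p_success = 0.01       # assumed 1% probability the intervention works
small_gain = 1         # assumed welfare units gained per future person

# Option B: a certain, larger welfare gain for every existing person.
large_gain = 5         # assumed welfare units gained per current person

expected_a = p_success * future_people * small_gain  # 0.01 * 8e12 * 1 = 8e10
expected_b = current_people * large_gain             # 8e9 * 5 = 4e10

# The 1000-to-1 ratio of future to current people outweighs both the low
# probability and the larger per-person gain, so Option A comes out ahead.
print(f"Expected value of option A: {expected_a:.1e}")
print(f"Expected value of option B: {expected_b:.1e}")
```

Whether morality really should rank options by their expected value in this way is precisely one of the questions the project examines.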

Professor Richard Pettigrew is Professor of Philosophy in the Department of Philosophy. His research covers a number of different areas, from the philosophy of mathematics and the epistemology of uncertainty to the theory of rational decision-making and the politics of consent. To find out more about The Foundations of Longtermism project, contact richard.pettigrew@bristol.ac.uk.
