Lexicographic utility functions

The intuition that there are extreme kinds of suffering which cannot be outweighed by any amount of happiness, and which matter more than any amount of mild suffering, violates the continuity axiom* of the von Neumann-Morgenstern (vNM) utility theorem. Does that mean that holding extreme suffering to be impossible to outweigh (as, for example, threshold negative utilitarians do) makes it impossible to represent your preferences with a utility function? Can you then not maximize expected utility? Is it irrational to hold such preferences?
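
For reference, one common formal statement of the axiom (notation mine, not taken from any particular source):

$$A \succ B \succ C \;\Longrightarrow\; \exists\, p \in (0,1): \; p\,A + (1-p)\,C \sim B.$$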

It turns out there is a theorem which says, roughly, that such preferences can still be represented by a utility function; it just has to take values in a broader space. The function does not necessarily map outcomes to real numbers, but to some larger set of possible utilities. Specifically, even without continuity (but given the other axioms), there is always a utility function that maps outcomes onto members of a lexicographically ordered real-valued vector space and that accurately represents the given preferences. A very good and (to the mathematically literate) fairly accessible exposition is given by Blume, Brandenburger and Dekel (1989), ch. 1 and 2.
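
As a toy illustration (my own construction, not taken from the paper): with two levels, such a utility function can be written as a pair of real numbers that is compared lexicographically. Python tuples happen to compare lexicographically, so a minimal sketch under these assumptions looks like this:

```python
from fractions import Fraction

def utility(outcome):
    """Map an outcome to a 2-dimensional utility vector.

    First coordinate: minus the amount of extreme suffering; it dominates
    everything else under the lexicographic ordering.
    Second coordinate: ordinary hedonic balance (happiness minus mild suffering).
    """
    return (-outcome["extreme_suffering"],
            outcome["happiness"] - outcome["mild_suffering"])

def expected_utility(lottery):
    """Component-wise expected utility of a lottery.

    A lottery is a list of (probability, outcome) pairs; the result is
    again a 2-dimensional vector (a tuple).
    """
    return tuple(
        sum(p * utility(o)[i] for p, o in lottery)
        for i in range(2)
    )

# A tiny risk of extreme suffering versus a certain mild loss.
risky = [
    (Fraction(999, 1000), {"extreme_suffering": 0, "happiness": 10, "mild_suffering": 0}),
    (Fraction(1, 1000),   {"extreme_suffering": 1, "happiness": 0,  "mild_suffering": 0}),
]
safe = [
    (Fraction(1, 1), {"extreme_suffering": 0, "happiness": 0, "mild_suffering": 5}),
]

print(expected_utility(risky))                           # (Fraction(-1, 1000), Fraction(999, 100))
print(expected_utility(safe))                            # (Fraction(0, 1), Fraction(-5, 1))
# Python compares tuples lexicographically, so the comparison is settled by
# the first coordinate: no expected happiness outweighs expected extreme suffering.
print(expected_utility(safe) > expected_utility(risky))  # True
```

Because the comparison is settled by the first coordinate whenever it differs, no improvement in the second coordinate (ordinary happiness) can make up for even a tiny expected amount of extreme suffering in the first, which is exactly the threshold intuition.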

Complications arise when the space of possible outcomes (the things utility is assigned to) is infinite, because this allows for an infinite number of thresholds – an infinite hierarchy of outcomes, each of which is infinitely better than the ones below it. This can no longer be captured by a finite-dimensional, lexicographically ordered, real-valued vector space. However, in this case one can map lotteries into an infinite-dimensional space with a lexicographic ordering. Alternatively, one can add an axiom that limits the number of “levels” to some finite number n, and then an n-dimensional real-valued vector space suffices again. The latter is done by Fishburn (1971) in A Study of Lexicographic Expected Utility, which is paywalled and not as readable.
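
Spelled out (my paraphrase of the general shape of such results, not Fishburn’s exact statement): with n levels, the utility function assigns to each outcome a vector u(x) = (u_1(x), …, u_n(x)), and a lottery L is weakly preferred to L' exactly when

$$\big(\mathbb{E}_L[u_1], \dots, \mathbb{E}_L[u_n]\big) \;\geq_{\mathrm{lex}}\; \big(\mathbb{E}_{L'}[u_1], \dots, \mathbb{E}_{L'}[u_n]\big),$$

where the lexicographic order compares vectors by the first coordinate in which they differ.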

It would be good if those interested in suffering-focused ethics knew that continuity in the vNM axioms is not really an argument against thresholds. (In general, continuity seems less compelling than completeness and transitivity.) The claim that holding extreme suffering to be impossible to outweigh is irrational because it violates the vNM “rationality axioms” is an objection I would expect to be raised, and it would be good if proponents of such a view could simply point to a clarification somewhere instead of spending time on this red herring. Personally, I don’t think I would defend the view myself, but, moral anti-realism notwithstanding, “what can be destroyed by the truth, should be”, even in the field of ethics.

*E.g., if N and M are mild amounts of happiness/suffering such that M < N, and X is an outcome involving extreme suffering, then continuity requires that there be some probability p such that a lottery yielding N with probability p and X with probability 1−p is exactly as good as getting M for certain. That is, some risk of extreme suffering would have to be outweighed by a chance of the milder outcome N, which is just what the above intuition denies.

Edit: Simon Knutsson made me aware of some discussion of the continuity axiom in the philosophical literature:

  • Wolf, C. (1997): Person-Affecting Utilitarianism and Population Policy or, Sissy Jupe’s Theory of Social Choice. In J. Heller and N. Fotion (Eds.), Contingent Future Persons.

  • Arrhenius, G., & Rabinowicz, W. (2005): Value and Unacceptable Risk. Economics and Philosophy, 21(2), 177–197.

  • Danielsson, S. (2004): Temkin, Archimedes and the transitivity of ‘Better’. Patterns of Value: Essays on Formal Axiology and Value Analysis, 2, 175–179.

  • Klint Jensen, K. (2012): Unacceptable risks and the continuity axiom. Economics and Philosophy, 28(1), 31–42.

  • Temkin, L. (2001): Worries about continuity, transitivity, expected utility theory, and practical reasoning. In D. Egonsson, J. Josefsson, B. Petersson, & T. Rønnow-Rasmussen (Eds.), Exploring Practical Philosophy (pp. 95–108).

  • Note 7 in Hájek, A. (2012): Pascal’s Wager. In: Edward N. Zalta (ed.), The Stanford Encyclopedia of Philosophy (Winter 2012 Edition).

One thought on “Lexicographic utility functions”

  1. Thanks. 🙂

    > Saying that threshold NU is irrational because it violates the vNM “rationality axioms”

    I have an “ethics axiom”: “Views other than negative(-leaning) utilitarianism are wrong.” Clearly, views other than N(L)U are unethical because they violate my ethics axiom.
