Economic Decisions and Social Preferences

in honour of Philippe Mongin

Presentation titles and abstracts


Mohammed Abdellaoui: Ambiguity Attitudes and Learning.

We investigate learning in ambiguous situations where subjects sample from an unknown distribution before betting on it. Varying the number of draws prior to choice, while explicitly eliciting posterior beliefs, allows us to “scan” ambiguity attitudes and perception across differing amounts of information, as well as divergences from Bayesian updating. Both attitude to and perception of ambiguity are impacted by learning, in the direction of ambiguity neutrality. Moreover, the impact of these changes on preferences is comparable to that of the divergence from Bayesian updating, especially at small sample sizes. These findings emerge under both the smooth ambiguity model and biseparable preferences.
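
For orientation, the smooth ambiguity model (due to Klibanoff, Marinacci and Mukerji) evaluates a bet f by a double expectation of the following form; this is a standard statement of the model, not necessarily the paper's exact specification:

V(f) = \int \phi\Big( \int u(f) \, d\pi \Big) \, d\mu(\pi),

where \mu is a (posterior) belief over the candidate distributions \pi, u captures risk attitude, and the curvature of \phi captures ambiguity attitude. Ambiguity neutrality corresponds to a linear \phi, the benchmark towards which attitudes are found to move as the sample grows.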

Joint work with B. Hill, E. Kemel, and H. Maafi.

[Presentation slides]


Claude d’Aspremont: Bayesian implementation of public decision rules and belief-class efficiency.

Necessary and sufficient conditions on individual beliefs are introduced to characterize the possibility of Bayesian incentive compatible and budget-balanced mechanisms for any (resp. any belief-class efficient) public decision rule, with imposed (resp. voluntary) participation. The concept of belief-class efficiency is introduced to enlarge the scope for such possibility results.
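
In standard mechanism-design notation (ours, not necessarily the authors'), with public decision rule d, transfers t_i and type profiles \theta = (\theta_i, \theta_{-i}), the two requirements read:

\sum_i t_i(\theta) = 0 for all \theta (budget balance);
E_{\theta_{-i}}[\, v_i(d(\theta_i, \theta_{-i}), \theta_i) + t_i(\theta_i, \theta_{-i}) \,] \ge E_{\theta_{-i}}[\, v_i(d(\theta'_i, \theta_{-i}), \theta_i) + t_i(\theta'_i, \theta_{-i}) \,] for all \theta_i, \theta'_i (Bayesian incentive compatibility),

where the expectations are taken with respect to agent i's beliefs about the other agents' types; it is precisely restrictions on these beliefs that the characterization isolates.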

This is joint work with Jacques Crémer.

[Presentation slides]

Richard Bradley: Diamond Fairness and Risk Aversion

In his famous 1967 paper criticising Harsanyi, Peter Diamond argued that society could (and perhaps should) strictly prefer prospects that confer equal chances of benefits on all individuals to those that benefit some particular individual with certainty, even when the prospects confer equal expected total benefit from an impartial perspective. In this paper we examine the possible grounds (both individual and social) for the pattern of preferences proposed by Diamond. Firstly, we characterise the relationship between individual attitudes to chances (risks) and social attitudes to inequality that is required for social preferences to exhibit this pattern. Secondly, we argue that these conditions do not sit well with most of the rival theories to Expected Utility theory of decision making under risk. Finally, we present a theory of chance-sensitive preference that explains why reasonable social preferences should display Diamond Fairness.

Joint work with H. Orri Stefánsson.

[Presentation slides]



Christopher Chambers: Decreasing impatience

We characterize decreasing impatience, a common behavioral phenomenon in intertemporal choice, and a property with certain normative support in the literature on project evaluation. Discount factors that display decreasing impatience are characterized through a convexity axiom for investments at fixed interest rates. Then we show that they are equivalent to a geometric average of generalized quasi-hyperbolic discount rates. Finally, they emerge through parimutuel preference aggregation of exponential discount factors.
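
A standard formalization of the property (not necessarily the paper's exact axiomatization): a discount function D displays decreasing impatience when, for every delay \sigma > 0,

\frac{D(t+\sigma)}{D(t)} is nondecreasing in t, equivalently, \log D is convex.

Exponential discounting D(t) = \delta^t is the knife-edge case with a constant ratio, while hyperbolic discounting such as D(t) = (1+\alpha t)^{-\beta} satisfies the condition strictly.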

This is joint work with Fede Echenique and Alan Miller.


Pierre-André Chiappori: Gender preferences and social norms: an experiment

Are wine-related preferences gender specific? We run an experiment in which wine specialists (men and women) blind test different vintages of the same wines (all white Burgundy), and are asked (i) to indicate which wine they prefer, and (ii) to assess the 'tension' (i.e. acidity) of each tasted wine; acidity is then scientifically measured for each wine. We find that preferences are gender-specific, in the sense that women, unlike men, tend to prefer 'rounder' (i.e. less acidic) wines. More interestingly, women tend to overestimate their preference for 'tense' wines, and this directly translates into frequent mistakes regarding the acidity assessments. It appears that a social norm - 'tense wines are more interesting' - is mostly masculine, yet largely internalized by female tasters as well.

This is joint work with Jo Gryn.

[Presentation slides]


Hervé Crès: Aggregation of opinions in networks of individuals and collectives

This is a theoretical study of the formation of opinions in a bipartite network of firms' boards and directors. A director and a board are connected provided the director is a board member. Opinions are sets of beliefs about the likelihood of different states of the world tomorrow. Our basic assumption is that boards and directors aggregate each other's opinions: a production plan is better than another for a board (director) provided every director (board of which she is a member) finds it better. Opinions are stable provided aggregation does not result in revision of opinions. We show that for connected networks, opinions are stable if and only if they are unambiguous and identical, and that repeated aggregation leads to stable opinions. Hence, there will eventually be a single society-wide intersubjective "truth".
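
Roughly, in our notation (a sketch, not the paper's formal statement): write d \in b when director d sits on board b. The aggregation assumption is then

x \succ_b y whenever x \succ_d y for all d \in b, and x \succ_d y whenever x \succ_b y for all b \ni d,

and a profile of opinions is stable when one round of this mutual aggregation leaves every board's and every director's opinion unchanged.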

[Presentation slides]


Eric Danan: Partial utilitarianism

Mongin (1994) proved a multi-profile version of Harsanyi (1955)'s Aggregation Theorem: within the expected utility model, a social welfare functional mapping profiles of individual utility functions into social preference relations satisfies the Pareto and Independence of Irrelevant Alternatives principles if and only if it is utilitarian. The present paper extends Mongin's analysis by allowing individuals to have incomplete preferences, represented by sets of utility functions. An impossibility theorem is first established: social preferences cannot satisfy all the expected utility axioms, precluding utilitarian aggregation in this extended setting. Adapting the objective vs. subjective rationality approach of Gilboa et al. (2010) to the present social choice setting, representation theorems are then obtained by relaxing either the Completeness or the Independence axiom at the social level, yielding two forms of partial utilitarianism.
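
For reference, utilitarianism here means, roughly and in our notation, that the social preference assigned to a profile (u_1, \dots, u_n) of individual expected utility functions is represented by an affine combination of them:

U = \sum_{i=1}^{n} a_i u_i + b, with a_i \ge 0.

Partial utilitarianism, as developed in the paper, weakens this benchmark when individuals are instead represented by sets of utility functions.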

[Presentation slides]


Franz Dietrich: Statically and Dynamically Rational Groups

Rationality has a static and a dynamic side. The former requires having coherent attitudes at a given time. The latter requires coherent change in attitudes over time as information arrives. Aggregation rules should ideally generate a group agent that is statically and dynamically rational, but only special aggregation rules achieve this. I shall discuss the scope of group rationality within three aggregation problems on which Philippe Mongin has worked: binary judgment aggregation, probabilistic opinion pooling, and preference aggregation under uncertainty. I shall draw on two papers, “Dynamically Rational Judgment Aggregation” (with Christian List) and “Fully Bayesian Aggregation” (Journal of Economic Theory, 2021).

[Presentation slides]


Marc Fleurbaey: Universal social welfare orderings and risk

How to evaluate and compare social prospects when there may be risk about i) the actual allocation people will receive; ii) the existence of these future people; and iii) their preferences? This paper investigates this question, which arises when considering policies that endogenously affect future people, for instance climate policy. We show that there is no social ordering that meets minimal requirements of fairness, social rationality, and respect for people's ex ante preferences. We explore three ways to avoid this impossibility. First, if we drop the ex ante Pareto requirement, we can obtain fair ex post criteria that take an (arbitrary) expected utility of an equally-distributed equivalent level of well-being. Second, if the social ordering is not an expected utility, we can obtain fair ex ante criteria that assess uncertain individual prospects with a certainty-equivalent measure of well-being. Third, if we accept that interpersonal comparisons rely on VNM utility functions even in the absence of risk, we can construct expected utility social orderings that satisfy some version of ex ante Pareto.
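
To fix ideas, the equally-distributed equivalent of an allocation x, in the sense of Atkinson, is the level e(x) such that the perfectly equal allocation (e(x), \dots, e(x)) is socially indifferent to x. The first family of criteria mentioned above then ranks a risky social prospect \tilde{x} by (a sketch in our notation, not the paper's exact statement)

E[\, \varphi( e(\tilde{x}) ) \,],

an expected utility, for some (arbitrary) increasing transform \varphi, taken over the equally-distributed equivalent rather than over individual utilities.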

Joint work with Stéphane Zuber.

[Presentation slides]


Itzhak Gilboa: Economic Theory: Economics, Methods and Methodology

Economic theory comprises three types of inquiry. One examines economic phenomena, one develops analytical tools, and one studies the scientific endeavor in economics in general and in economic theory in particular. We refer to the first as economics, the second as the development of economic methods, and the third as the methodology of economics. The same mathematical result can often be interpreted as contributing to more than one of these categories. We discuss and clarify the distinctions between these categories, and argue that drawing the distinctions more sharply can be useful for economic research.

Joint work with Andrew Postlewaite, Larry Samuelson and David Schmeidler.

[Presentation slides]


Takashi Hayashi: Recursive median voter equilibrium in public capital accumulation

We study a dynamic public capital accumulation model with infinitely-lived agents who follow the standard discounted utility model in their roles as consumers and voters. Saving policies are determined sequentially, period-by-period, by majority voting. When there is discounting heterogeneity, but no heterogeneity in consumption smoothing, a unique and Pareto-efficient recursive median-voter equilibrium exists. Although equilibrium need not always exist when agents are heterogeneous in both dimensions, we show its existence and Pareto-efficiency when agents’ preferences are in the CES family.

Joint work with Michele Lombardi.

[Presentation slides]


Christian List: Dynamically rational judgment aggregation

Judgment-aggregation theory has always focused on the attainment of rational collective judgments. But so far, rationality has been understood in static terms: as “coherence” of judgments at a given time, understood as consistency, completeness, and/or deductive closure. By contrast, this paper discusses whether collective judgments can be dynamically rational, so that they change rationally in response to new information. Formally, a judgment aggregation rule is dynamically rational with respect to a given revision operator if, whenever all individuals revise their judgments in light of some information (a learnt proposition), then the new aggregate judgments are the old ones revised in light of this information, i.e., aggregation and revision commute. We prove a general impossibility theorem: if the propositions on the agenda are sufficiently interconnected, no judgment aggregation rule with standard properties is dynamically rational with respect to any revision operator satisfying some mild conditions (familiar from belief revision theory). Our theorem is the dynamic-rationality analogue of some well-known impossibility theorems for static rationality. We also explore how dynamic rationality might be achieved by relaxing some of the conditions on the aggregation rule and/or the revision operator.
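
In symbols, matching the verbal definition above: with F the aggregation rule and J|p the revision of judgment set J by learnt proposition p, dynamic rationality is the commutation identity

F(J_1|p, \dots, J_n|p) = F(J_1, \dots, J_n)|p

for every admissible profile (J_1, \dots, J_n) and every learnable proposition p. The impossibility theorem says this identity fails for every aggregation rule with the standard properties, and every well-behaved revision operator, once the agenda is sufficiently interconnected.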

Joint work with Franz Dietrich.


Edi Karni: Incomplete Preferences and Random Choice Behavior: Axiomatic Characterizations

This paper proposes axiomatic characterizations of random choice behavior that is due to incomplete preferences. It develops a model of irresolute choice and examines its applications to decision making under certainty, uncertainty and risk.


François Maniquet: Well-being measurement with reference consumption

We investigate how to define well-being measures when individuals have heterogeneous preferences and consumption bundles should be evaluated by comparison with some reference consumption (such as the average consumption in a reference group, a poverty line bundle, etc.). A well-being measure has three arguments: the current consumption bundle of an individual, her self-centered preferences and the reference consumption, which we call aspiration. Most of the axioms we impose on these measures build on the lattice structure of the set of indifference surfaces generated by convex preferences, as in Fleurbaey & Maniquet (2017). We characterize five families of relative well-being measures. For three of them, well-being is equal to the ratio between the money value of the consumption and the money value of the aspiration, with prices being endogenous to the preferences.
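
For the three ratio-based families, the measure takes the form (our notation, as a sketch)

W(x, R, a) = \frac{p \cdot x}{p \cdot a},

where x is the individual's consumption bundle, a her aspiration, and the price vector p is determined endogenously from her preferences R rather than taken from a market.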

This is joint work with Domenico Moramarco (ECARES, ULB).

[Presentation slides]



Bertrand Munier: A Maverick Guide to the sources of risk modeling: Maurice Allais’s precedence. An essay in honor and memory of Philippe Mongin

The set of concepts which support today’s prevailing risk and uncertainty theory is frequently credited to Tversky and Kahneman, due to their shrewdly entitled 1992 paper “Advances in Prospect Theory: Cumulative Representation of Uncertainty” (CPT). I will take here an approach similar to Philippe Mongin’s last ante mortem paper, which analyzed Allais’s Paradox mainly through the 1953 paper and through the discussions held at the Paris 1952 International Conference “Fondements et Applications de la théorie du Risque en Économétrie” (labeled “Econométrie”). Philippe conjectured that Allais’s Paradox had in fact been preparatory for a normative theory. Yet, the 1952 contribution to risk modeling still remains insufficiently considered, in contrast to the often-quoted 1953 paper. Credit should definitely be given to Tversky and Kahneman (1992) for the extension of the theory to the uncertainty case. Nevertheless, from the series of Allais’s contributions between 1952 and 1988, and from a few discussions that I have directly witnessed, a half-century-long genesis of economic thought on risk modeling emerges. I will argue that not only the seminal ideas, but also the main innovations in this domain, are to be credited to Maurice Allais, along with a separate development of these seminal ideas by John Quiggin (1982), as long as one stays within the bounds of the world of risk, as opposed to that of uncertainty. I offer a few tentative explanations for the relative neglect of Maurice Allais’s contributions, as well as an interpretation of Allais’s end goal in providing us with rank-dependence modeling. I see this paper as supporting, in a certain sense, Philippe Mongin’s conjecture.


Robert Nau: Arbitrage and rational choice

Most of the fundamental representation theorems for individual and group preferences in rational choice theory can be formulated as linear programming duality theorems. Essentially they are variations on a single result, a theorem-of-the-alternative which appeared in von Neumann and Morgenstern’s 1944 book and was anticipated in de Finetti’s 1937 paper but whose full scope was not appreciated until much later. The primal criterion of rationality is that preferences which are publicly revealed by offers to bet or trade should not lead to an arbitrage profit for an observer, and the equivalent dual criterion is that those preferences should be additively represented by subjective or intersubjective variables such as probabilities, utilities, strategic equilibria, state prices, or interpersonal weights. The primal variables are actions of the body and the dual variables are properties of the mind, so to speak. This characterization of rationality unifies the fundamental theorems of subjective probability theory, expected and subjective expected utility theory, noncooperative game theory (correlated rather than Nash equilibrium), asset pricing theory, and utilitarianism (Harsanyi’s theorem). It also highlights the importance of money for rational thought and communication that is quantifiable in exact terms, as well as the general impossibility and unnecessity of separating probability and utility or knowing an individual's prior stakes in events. In the best cases for measurement, preferences under uncertainty and ambiguity are modeled with risk neutral probabilities, interpretable as products of probabilities and state-dependent marginal utilities for money.
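
A minimal instance of the primal-dual equivalence, in the spirit of de Finetti (a textbook statement, not the paper's general theorem): let g_1, \dots, g_k be the monetary gambles, i.e. payoff vectors over states, that an agent has revealed herself willing to accept. Then exactly one of the following holds:

(i) there are stakes \lambda_1, \dots, \lambda_k \ge 0 such that \sum_j \lambda_j g_j(s) < 0 in every state s, so an observer taking the other side earns a sure arbitrage profit;
(ii) there is a probability distribution p on the states with \sum_s p(s) g_j(s) \ge 0 for every j, i.e. a subjective probability rationalizing the accepted gambles.

Replacing gambles by trades, strategies or social alternatives in this duality yields the other representation theorems mentioned above.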

[Presentation video] [Presentation slides]


Klaus Nehring: Generalized Borda Rules as a Resolution of Arrow's Impossibility Challenge

The Borda rule evaluates alternatives by the unweighted average of the majority margins with respect to all feasible alternatives. Generalized Borda rules (GBRs) evaluate alternatives by a weighted average of the majority margins with respect to all feasible alternatives where the weights may depend on the set of feasible alternatives. We call this weighting function the "relevance index" defining the GBR. A basic result of the paper characterizes GBRs in terms of an axiom of Ordinal Admissibility. Ordinal Admissibility expresses the normative desideratum that the social choice rule be justifiable as optimal on "purely ordinal" grounds.
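
In symbols (our notation): for a feasible set A, let m(x,y) = #\{i : x \succ_i y\} - #\{i : y \succ_i x\} denote the majority margin of x over y. A GBR with relevance index \lambda_A then scores each alternative by

S_A(x) = \sum_{y \in A} \lambda_A(y) \, m(x, y),

and the classical Borda rule is the special case of the uniform index \lambda_A(y) = 1/|A|.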

The Borda rule itself is not a satisfactory response to Arrow's impossibility challenge, since it is highly vulnerable to the exact specification of the feasible set. In particular, it starkly violates the Independence of Clones axiom due to Tideman (1987). Tideman's original formulation, however, has proved rather restrictive and is arguably too strong: in particular, it essentially forces Polarization, i.e. it forces the social choice of an alternative that is top-ranked by a majority of the agents and bottom-ranked by all others.

By contrast, we show that moderate versions of Invariance to Cloning are satisfied by GBRs whenever their relevance indices satisfy appropriate invariance conditions. Such GBRs need not, and typically do not, exhibit Polarization.

A particularly simple and naturally axiomatized index is the Plurality index, given by the distribution of agents' preference tops. The associated GBR, the "Pluri-Borda rule", has attractive properties and instantiates a new possibility result on Post-Arrowian social choice.

[Presentation slides]


Christina Pawlowitsch: Narrative analysis and game theory: the sequencing of functions in time vs. the matrix.

One of Philippe Mongin's more recent preoccupations was analytic-narrative studies (Mongin 2010, 2018). In preparation for the colloquium on "The Limits and Possibilities of Narrative Explanations in Game Theory," which he had organized in 2016 at the Wissenschaftskolleg zu Berlin, he had sent out a couple of classical texts on the analysis of narratives by representatives of the French structuralist school (Barthes 1966, Bremond 1966, Greimas 1966). In their analysis of narrative, the French structuralists were heavily influenced by the Russian formalist school, most importantly Vladimir Propp's analysis (1928) of the magical tale. Working with a corpus of about one hundred Russian magical tales, Propp had advanced the idea that all magical tales followed the same sequence of basic events, which he called "functions." A major endeavour of the French structuralists was to show that this sequence of functions, which anchors the narrative in time, can be reabsorbed by a "timeless" matrix structure. In this talk, I am going to work out some parallels between the debate on "the sequence in time vs. the matrix" in narrative theory and the debate on the relation between the extensive-form and the normal-form representation - the matrix - of a game.

[Presentation slides]


Clemens Puppe: Condorcet Solutions in Frugal Models of Budget Allocation

We study a voting model in which the evaluation of social welfare must be based on information about agents' top choices plus general qualitative background conditions on preferences. The former is elicited individually, while the latter is not. We apply this 'frugal aggregation' approach to budget allocation problems, relying on the specific assumptions of convexity and separability of preferences.

We propose a unifying solution concept of ex ante Condorcet winners, which incorporates the epistemic assumptions of particular frugal aggregation models. We show that for the case of convex preferences, the ex ante Condorcet approach naturally leads to a refinement of the Tukey median. By contrast, in the case of separably convex preferences, the same approach leads to a different solution, the L1-median, i.e. the minimization of the sum of the L1-distances to the agents' tops.
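
In symbols (our notation): with t_1, \dots, t_n the agents' elicited tops and X the set of feasible budget allocations, the L1-median solution is

\arg\min_{x \in X} \sum_{i=1}^{n} \lVert x - t_i \rVert_1 = \arg\min_{x \in X} \sum_{i=1}^{n} \sum_{k} | x_k - t_{ik} |,

i.e. the feasible allocation minimizing the total coordinate-wise distance to the reported tops.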

As the case of separably convex preferences has been dealt with extensively in earlier versions of this work, the talk will focus on the general ex ante Condorcet approach and its application to the convex case.

This is joint work with Klaus Nehring.

[Presentation slides]


Jean-Marc Tallon: Efficient Risk Sharing under Model Uncertainty

We study optimal risk and ambiguity sharing arrangements in an economy with smooth ambiguity averse agents. Ambiguity is embedded in model uncertainty as perceived by the agents. We study the case where models are point identified and revealed by the state. We define two notions of efficiency (ex ante and conditional on a model) and show how they relate to one another, yielding some results on efficient allocations. We next turn to more specific cases, where we specify the utility function and ambiguity aversion. We can then construct a representative agent, which is key to obtaining a full characterization of the way risk and ambiguity are shared in the economy in those cases.

Joint work with C. Hara, S. Mukerji and F. Riedel.

[Presentation slides]



Alain Trannoy: The outbreak of WW1: a contribution of rational decision theory.

We build a decision model that enables us to predict the choice between war and peace. The model articulates root causes, such as the risk of a future war, with contingent factors such as the potential gains/losses in case of victory/defeat and the potential losses linked to the war itself. We calibrate the model to the situation of British, French, German and Russian decision-makers at the end of July 1914, taking into account the decisions already taken by the Austro-Hungarian Empire. We consider the case of a short war that would not have gone beyond the end of 1914. Our model predicts that France and Germany go to war, the preventive-war argument (waging war today is better than waging it tomorrow) prevailing for both countries, with, for France, the additional benefit of the return of Alsace-Moselle in case of victory. For the Russian and British Empires, the case for choosing war was more dubious. The calibration reveals that, of the two countries France and Germany, France emerges as the one with the greater interest in going to war. This helps to explain the passive behavior of French decision-makers, Raymond Poincaré first among them, who, if they did not aim at war, did not do their best to avoid it. Beyond this, the timing of the decisions of the Entente partners, who did not coordinate, represents a major factor in the outbreak of war in July 1914.
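
A stylized reading of the preventive-war calculus (our notation, not the paper's calibration): let W_now and W_later be a country's expected payoffs from a war fought today and a war fought later, P the payoff of lasting peace, and \pi the perceived risk that war breaks out later anyway. Peace today is then worth (1-\pi)P + \pi W_later, so war today is chosen whenever

W_{now} > (1-\pi) P + \pi W_{later},

which can hold even when W_{now} < P, provided the risk \pi is high and fighting today is sufficiently better than fighting tomorrow (W_{now} > W_{later}).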

[Presentation slides]


Vassili Vergopoulos: Subjective Expected Utility through Stochastic Independence

This paper studies decision-making in the face of two stochastically independent sources of uncertainty. It characterizes axiomatically a Subjective Expected Utility representation of preferences where subjective beliefs consist of a product probability measure. The two key axioms in this characterization both involve some behavioral notions of stochastic independence. Our result can be understood as a purely subjective version of the Anscombe and Aumann (1963) theorem that avoids the controversial use of exogenous probabilities by appealing to stochastic independence. We also obtain an extension to Choquet Expected Utility representations.
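
Concretely, with the two sources modeled by state spaces S and T, the representation takes the product form (our notation)

V(f) = \int_{S \times T} u(f(s,t)) \, d(p \otimes q)(s,t),

where p and q are subjective probability measures on S and T respectively, so that beliefs about the two sources are stochastically independent by construction.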

Joint work with Michel Grabisch and Benjamin Monet.

[Presentation slides]


Peter Wakker: The Prettiest Preference Foundation of All for Time and Risk, and Other Ideas by Philippe Mongin.

This lecture presents valuable ideas that the speaker learned from Philippe Mongin, including the prettiest preference axiomatization of all for intertemporal risky choice.

[Presentation slides]


John Weymark: Precedent-Based Judgment Aggregation in the U.S. Supreme Court

In the U.S. legal system, when a case is before a court, a precedent may apply. In cases where a precedent is under consideration, the court needs to answer three questions: (1) Is the precedent good law? (2) Does the precedent apply to this case? (3) Should the court uphold the precedent? In the event that the court answers yes to the first two questions and no to the last, there is what David Cohen (Boston University Law Review, 2010) calls a precedent-based voting paradox. Cohen has identified eleven instances of this paradox in U.S. Supreme Court decisions prior to 2010. We review Cohen's paradox and relate it to the doctrinal paradox that has played a foundational role in the judgment aggregation literature. We also identify what is arguably one more instance of a precedent-based voting paradox in the period since Cohen's article was published. Furthermore, we show that Cohen's proposal for a generalized precedent-based voting paradox is problematic.
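
A stylized three-member illustration of the structure (ours, not one of Cohen's documented cases), where each justice answers question (3) affirmatively exactly when she answers both (1) and (2) affirmatively:

             (1) Good law?   (2) Applies?   (3) Uphold?
Justice A         Yes             Yes            Yes
Justice B         Yes             No             No
Justice C         No              Yes            No
Majority          Yes             Yes            No

Question-by-question majority voting answers yes to (1) and (2) but no to (3); this is the precedent-based analogue of the doctrinal paradox.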

Joint work with Sarah Friedman.