Decision science
Recent Working Papers
ABSTRACT: Using a trailing-digit betting design in four experiments, we identify familiarity bias as an explanation for home bias that is distinct from ambiguity aversion or informational advantage. The first two studies establish the existence of a strong familiarity bias within an investment context. A third experiment establishes that standard home bias and familiarity bias are of comparable magnitudes. The last study further demonstrates that home bias can be explained, within subject, by familiarity bias but not by ambiguity aversion. The studies consistently point to familiarity bias as a driver of home bias across investment contexts and subject types.
Published Papers
ABSTRACT: We axiomatize subjective probabilities on finite domains without requiring richness in the outcome space or restrictions on risk preference through event exchangeability, defined in Chew and Sagi (Econometrica, 74:771–786, 2006), which was implicit in the prior literature (Savage in The Foundations of Statistics, Wiley, New York, 1954; Machina and Schmeidler in Econometrica, 60:745–780, 1992; Grant in Econometrica, 63:159–189, 1995). We characterize the unique subjective probability representing the underlying exchangeability relation and apply this subjective probability to characterize finite-state probabilistic sophistication.
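For readers new to the key primitive, exchangeability can be stated roughly as follows (a paraphrase rather than the paper's exact formulation): two disjoint events E and F are exchangeable if, for every act, swapping the sub-acts assigned to E and F leaves the decision maker indifferent. The subjective probability characterized in the paper is the unique one representing this relation, and in particular it assigns equal likelihood to exchangeable events.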
ABSTRACT: Hart (J Polit Econ, 119(4):617–638, 2011) argues that the Aumann and Serrano (J Polit Econ, 116(5):810–836, 2008) and Foster and Hart (J Polit Econ, 117(5):785–814, 2009) measures of riskiness have an objective and universal appeal with respect to a subset of expected utility preferences, U_H. We show that mean-riskiness decision-making criteria using either measure violate expected utility and are generally inconsistent with optimal portfolio choices made by investors with preferences in U_H. We also demonstrate that riskiness measures satisfying Hart’s other behavioral requirements do not generally exist when his argument is generalized to incorporate non-expected utility preferences. Finally, we identify other attributes of the Aumann-Serrano and Foster-Hart measures that raise concerns over their operationalizability and usefulness in various decision-making, risk management, and risk assessment settings.
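For reference, the two measures under discussion are defined implicitly (standard statements of the originals, not the paper's notation): for a gamble g with positive expectation and some chance of loss, the Aumann-Serrano riskiness R_AS(g) is the unique positive solution of E[exp(-g / R_AS(g))] = 1, while the Foster-Hart riskiness R_FH(g) is the unique positive solution of E[log(1 + g / R_FH(g))] = 0.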
SUMMARY: The Ellsberg two-urn choice problem, first attributable to Keynes (1921), provides a clean and clear test of non-Bayesian behavior and the prevalence of “ambiguity aversion”. Most choices featuring ambiguity, in practice, are not as clear-cut and easy to understand as the Ellsberg two-urn choice problem. This leads one to ask whether individuals who are able to discern the presence of ambiguity in a “complex” situation (one can think of these as sophisticated or high-comprehension decision makers) exhibit ambiguity aversion. Within a large-scale experimental framework, involving a total of 3583 adult subjects with diverse demographics (i.e., not just university students), we address this question. Subjects are presented with screening questions before choosing between two alternatives represented by payoff matrices which are essentially equivalent to those in Ellsberg's (1961) two-urn problem. When facing this essentially equivalent yet more complex matrix-based choice task, high-comprehension subjects continue to exhibit ambiguity aversion typical of the standard two-urn problem while, as expected, low-comprehension subjects appear to behave randomly. Our design allows us to classify subjects as “probability minded” or “ambiguity minded” based on whether they assign probabilities to draws from a deck of cards with unknown composition during the screening phase. High-comprehension subjects who are ambiguity-minded are far more likely to be ambiguity averse than those who are probability-minded. Significantly, subject “mindedness” appears to explain ambiguity attitudes an order of magnitude better than all other demographic characteristics combined. Contrary to intuition about subjects' sophistication, ambiguity-minded high-comprehension subjects are younger, more educated, more analytic, and more reflective about their choices compared with their probability-minded counterparts. Our findings cast grave doubt on attempts to discount ambiguity aversion (or, more generally, non-Bayesian attitudes toward uncertainty) as mistakes made by less sophisticated decision makers. At the same time, one must conclude that the vast majority of individuals are unlikely to discern ambiguity in complex situations and may therefore exhibit randomization behavior that confounds standard models of ambiguity attitudes.
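To fix ideas, here is a small sketch (in Python, purely illustrative and not the paper's methodology) of the classic two-urn choice, with ambiguity aversion modeled by a worst-case (maxmin) evaluation over possible compositions of the unknown urn:

# Illustrative sketch (not the paper's code): Ellsberg's two-urn choice under a
# maxmin-expected-payoff criterion for ambiguity aversion.
# Known urn: 50 red, 50 black.  Unknown urn: 100 balls, red/black mix unknown.
# A bet pays 100 if the named colour is drawn, 0 otherwise.

def expected_payoff(p_red, bet_on_red=True, prize=100.0):
    """Expected payoff of a bet on red (or black) when red has probability p_red."""
    p_win = p_red if bet_on_red else 1.0 - p_red
    return p_win * prize

# Bet on red from the known urn: objective probability 0.5.
known_urn_value = expected_payoff(0.5)

# Bet on red from the unknown urn, evaluated by the worst case over the set of
# priors for the unknown composition (here, every composition is deemed possible).
priors = [k / 100.0 for k in range(101)]
unknown_urn_value = min(expected_payoff(p) for p in priors)

print(known_urn_value)    # 50.0
print(unknown_urn_value)  # 0.0 -> an ambiguity-averse (maxmin) ranking prefers the known urn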
SUMMARY: It is self-evident that fairness is something that matters to individuals even when the notion is not directly applicable to themselves or their own welfare. We bemoan the misfortunes of others, generally consider charity a virtue, and devote resources as a society to provide the “have-not” segment of society with opportunities for welfare improvement. The seminal works of Harsanyi (1955), Kolm (1969), Atkinson (1970), Rothschild and Stiglitz (1970), and Weymark (1981) contributed to the view that one can measure inequality using measures such as the Gini Coefficient. Diamond (1967) points out that such measures neglect the fact that randomization induces fairness. In other words, neglecting efficiency concerns, if there is to be inequality, it seems “more fair” if inequality is determined randomly rather than via a prejudicial mechanism. This is the notion of “ex-ante” fairness. Ben-Porath, Gilboa, and Schmeidler (1997) point out that correlations also matter where inequality is concerned, and it may be preferable that we all share similar destinies. This is the notion of “ex-post” fairness. Since the recognition of these various attributes of fairness, it has been seen as “difficult” to obtain a measure of inequality that exhibits an aversion to static inequality, as well as an affinity for ex-post and ex-ante fairness. We provide a set of intuitive axioms that lead to a simple family of inequality measures exhibiting the desired traits. The measures can be viewed as a one-parameter extension of the Generalized Gini Mean and can be easily used by econometricians. This measure can be used to compare inequality in two countries, where in one the distribution of income is broader and yet the lower-income cohorts have better opportunities for advancement. The measure can also be used in other contexts where one may wish to measure “inequality” but the rankings are not static or deterministic: for instance, competition in an industry.
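As a point of reference for the kind of measure being extended (shown in Python; the paper's one-parameter family itself is not reproduced here), the standard static Gini coefficient and the associated Gini social-evaluation income can be computed as follows:

# Illustrative sketch: the standard (static) Gini coefficient and the Gini
# social-evaluation value mean * (1 - Gini). The paper's measure is a
# one-parameter extension of this family and is not reproduced here.

def gini(incomes):
    """Gini coefficient via the mean-absolute-difference formula."""
    n = len(incomes)
    mean = sum(incomes) / n
    mad = sum(abs(a - b) for a in incomes for b in incomes) / (n * n)
    return mad / (2 * mean)

incomes = [10, 20, 30, 40, 100]
g = gini(incomes)
print(g)                                       # static inequality index in [0, 1)
print(sum(incomes) / len(incomes) * (1 - g))   # Gini social-evaluation ("equally distributed equivalent") income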
SUMMARY: Nearly 50 years after Savage’s Foundations of Statistics (1954), empirical and experimental evidence on individual decision making leaves little doubt that most of Savage’s assumptions are violated in practice. Arguably, violations of probabilistic sophistication (e.g., the Ellsberg paradox) present the most difficulties, as evidenced by the large body of theoretical literature on the subject. Our earlier paper on probabilistic sophistication finds the weakest conditions currently known that ensure probabilistic sophistication. The advantage of this parsimonious approach is that it leaves little doubt as to which behavioral assumption (or axiom) one ought to relax to depart from probabilistic sophistication so as to accommodate ‘Ellsbergian’ behavior. In this paper, we relax this one assumption and derive a model and representation of behavior in which distinct sources of uncertainty (e.g., an urn with 50 red and 50 black balls versus an urn with an unknown mixture of red and black balls) correspond to probabilistic sophistication within the distinct domains (e.g., red and black from the same urn are equally likely) but not across domains (e.g., I’d rather bet on red from the known versus red from the unknown urn). We show that this leads to a very simple yet intuitive and easily applied model of Ellsbergian behavior (preferring a bet on the known urn to the same bet on the unknown urn). The notion of “source dependent” preferences has come into vogue and our paper will hopefully continue to play a key part in the theoretical development of this literature.
SUMMARY: Savage’s ‘Foundations of Statistics’ (1954) is recognized as one of the most influential and impressive treatises on the theory of decision making. In it, Savage shows that any individual who adheres to six behavioral principles (or axioms) will make decisions as if she assigns subjective probabilities to uncertain events. The latter is termed ‘probabilistic sophistication’ and is at the heart of much of modern Bayesian statistics and economic theory. One problem with Savage’s approach is that his axioms are too strong. In particular, they rule out probabilistic behavior even in cases where a decision maker is obviously probabilistically sophisticated. One simple example is an individual with mean-variance preferences (written out explicitly below). Such an individual violates three of Savage’s axioms even though her preferences are explicitly probabilistic. This paper provides the simplest behavioral basis to date for either determining or requiring probabilistic sophistication. In comparison with other theories, our approach subsumes the largest set of non-pathological preferences consistent with a subjective Bayesian approach to uncertainty. As such, it is a fundamental contribution to social science and statistics and extends the work of Machina and Schmeidler (1992) and Grant (1995).
Because it offers the most parsimonious rationalization of probabilistic attitudes towards uncertainty to date, the paper also paves the way to understanding what assumptions ought to be relaxed in order to accommodate the growing experimental evidence against probabilistic sophistication (e.g., the Ellsberg paradox).
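The mean-variance example mentioned above can be written explicitly as follows (a standard formulation, not the paper's notation): V(f) = E_P[f] - (gamma/2) Var_P(f), for some fixed probability measure P and gamma > 0. Such a decision maker ranks acts solely through the distribution each act induces under P, and so is probabilistically sophisticated by construction, yet these preferences violate Savage's sure-thing principle, among other axioms.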
ABSTRACT: We derive an inter-temporal theory of choice, in the spirit of Kreps and Porteus [Kreps, D.M., Porteus, E.L., 1978. Temporal resolution of uncertainty and dynamic choice theory. Econometrica 46, 185–200], where decision makers have incomplete preferences. This can be used to model indecisiveness as well as unforeseen contingencies. The key to our approach is a time consistency condition and therefore the normative connection between ex-ante and ex-post choice. The time consistency condition enables a representation that is a straightforward extension of recursive utility with the exception that it features an inter-temporal ‘utility for flexibility’.
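For context, the benchmark being extended is the Kreps-Porteus recursion (standard statement, not the paper's notation): V_t = W(c_t, E_t[V_{t+1}]), where c_t is current consumption, W is an aggregator, and E_t is the conditional expectation over next-period continuation utility. Roughly speaking, the representation in the paper preserves this recursive structure while incorporating an inter-temporal 'utility for flexibility', reflecting the value of keeping options open when preferences are incomplete.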
SUMMARY: Perhaps the most cited and descriptively successful theory of static individual choice is Cumulative Prospect Theory, a variant on Kahneman and Tversky’s Nobel-winning (1979) paper. One of its central tenets is the notion that losses loom larger than gains (loss aversion); to model this, one assumes that the utility for a prospect is measured relative to some status quo benchmark, such as current wealth. Even aside from this application, the idea of a status quo bias in decision-making plays an important role elsewhere. An example is the well-documented endowment effect, in which decision makers opt for a mug rather than $5 if they have the mug, or prefer to keep the $5 if they do not have the mug. Other examples, in the form of an often large disparity between willingness to pay for an item and willingness to accept cash for it, are widely documented in the consumer behavior literature.
All of these phenomena are attributed to reference dependence in the way decision makers approach choice problems. A potential problem with a generic model of reference-dependent preferences, however, is that it may inadvertently allow for the possibility of systematic manipulation. My paper poses the following question: What model restrictions are implied if one is to rule out situations in which a decision maker has a choice between two prospects, selects one, and subsequently changes her mind and selects the other, even if the change is costly? Although this condition is quite appealing and simple, my paper shows that it places very stringent constraints on any theory that models reference dependence. In particular, it rules out cumulative prospect theory and virtually every other model commonly used. This is a strong result. However, my purpose is not to prove that reference dependence is unreasonable. Quite to the contrary, I believe that reference dependence plays an important role in decision making even while decision makers do not systematically expose themselves to manipulation. To demonstrate that it is indeed possible to ‘have your cake and eat it too,’ I explicitly show the existence of a class of reference-dependent models that do not exhibit the cycling described.
SUMMARY: Kreps (1979) and Dekel, Lipman, and Rustichini (2001), among others, derive models in which the decision maker is not sure about his/her future preferences because of “unforeseen contingencies”. Their axiomatic models derive utility representations in which the decision maker behaves as if there are a multitude of future “utility states” and each state corresponds to a possible future utility function. The set of states thus derived is interpreted as a subjective state space even though subsequent rankings need not conform to any one of the aggregated utilities. This paper proposes a definition for a subjective state space under unforeseen contingencies that is topologically unique, derives its existence from preference primitives as opposed to the representation of preferences, and does not commit to an interpretation in which states correspond to future realized rankings. Roughly, a unique endogenous state space is defined in the paper to exist if the maximal Pareto Frontier of every choice problem can be made to look topologically the same by making infinitesimal changes in the choice set. The maximal Pareto Frontier can then be identified with the state space. Such an approach allows one to give the endogenous state space a more objective sense, because it can be, in principle, observed through experiment.
Unpublished Manuscripts
ABSTRACT: Rabin’s (2000) calibration theorem established that intuitive risk attitudes towards small gambles, across a wide range of endowments, cannot be reconciled in the context of expected utility with sensible attitudes towards highly lucrative large-payoff gambles. Safra and Segal (2008, 2009) confirm that most of the commonly employed non-expected utility specifications also cannot be calibrated to attitudes to risks in the small and in the large if endowments are stochastic. We present an axiomatically motivated non-expected utility theory that is capable of satisfying Rabin’s calibration requirement even with stochastic endowments, and is also consistent with standard violations of the Independence Axiom. The theory hinges on restricting the latter axiom to hold only when mixing equal-mean lotteries. While the model still exhibits unreasonably large risk premia for small-probability large losses, it appears to address the bulk of the critique offered by Safra and Segal (2008).
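To convey the flavour of the calibration argument, here is a rough numerical sketch (in Python; a conservative back-of-the-envelope bound in the spirit of Rabin's logic, not taken from the paper and looser than his exact constants): rejecting a 50-50 lose-$100/gain-$110 gamble at every wealth level forces the marginal utility of a concave expected-utility maximizer to fall by a factor of at least 10/11 every $210, which caps the utility of arbitrarily large gains.

# Illustrative sketch of the Rabin (2000) calibration logic, using a loose
# bound rather than the theorem's exact constants. Rejecting a 50-50
# lose-$100/gain-$110 gamble at every wealth level w implies, for concave u,
#     u'(w + 110) <= (100/110) * u'(w - 100),
# so marginal utility decays geometrically in $210 steps above w.

DECAY = 100 / 110   # decay factor of marginal utility per $210 step above w
STEP = 210

def gain_utility_bound(gain):
    """Upper bound on u(w + gain) - u(w), in units of u'(w - 100)."""
    bound, level, x = min(gain, 110) * 1.0, DECAY, 110
    while x < gain:
        bound += min(STEP, gain - x) * level
        level *= DECAY
        x += STEP
    return bound

def loss_utility_bound(loss):
    """Lower bound on u(w) - u(w - loss), in units of u'(w - 100)."""
    bound, level, x = 0.0, 1.0, 100   # conservatively ignore the first $100 of loss
    while x < loss:
        bound += min(STEP, loss - x) * level
        level /= DECAY
        x += STEP
    return bound

# Even an astronomically large gain delivers a bounded utility gain (about 2210 units):
cap = gain_utility_bound(10_000_000)
print(cap)

# Smallest loss whose (bounded) utility cost exceeds that cap: the agent rejects a
# 50-50 gamble losing this amount against ANY gain. This loose bound gives roughly
# $1,700; Rabin's sharper bookkeeping famously yields $1,000.
loss = 100
while loss_utility_bound(loss) < cap:
    loss += 10
print(loss)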
ABSTRACT: Machina (2004) introduced the notion of an ‘almost objective’ event in a continuous state space—high-frequency events in a subjective setting such as ‘the realization of the nth decimal place of a stock index.’ Payoffs on such events intuitively appear as objective lotteries in the sense that decision makers should not prefer to place bets on any particular digit when n is large even if the state space is fully subjective. This paper investigates the implications of requiring decision makers to treat almost objective events the same regardless of their source (e.g., regardless of the identity of the stock index). Multi-prior models in which the representing priors are smooth (i.e., possess densities) can accommodate such source indifference. The major contribution of this paper is to demonstrate that, under mild behavioral conditions, a multi-prior representation with smooth priors is also necessary.
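The flavour of an 'almost objective' event can be conveyed by a small simulation (in Python; purely illustrative, since the paper's analysis is axiomatic rather than numerical): for a smoothly distributed state variable, the nth decimal digit is nearly uniform over {0, ..., 9} when n is large, so a bet on any particular digit looks like an objective 1-in-10 lottery regardless of the underlying source.

# Illustrative simulation (not from the paper): the nth decimal digit of a
# smoothly distributed random quantity is close to uniform on {0,...,9} when n
# is large, which is the intuition behind Machina's 'almost objective' events.
import random
from collections import Counter

def nth_decimal_digit(x, n):
    """Return the nth digit after the decimal point of x."""
    return int(abs(x) * 10 ** n) % 10

random.seed(0)
# Two different "sources", e.g., two differently distributed index levels.
sources = {
    "lognormal index": lambda: random.lognormvariate(7.0, 0.2),
    "gaussian index": lambda: random.gauss(3000.0, 150.0),
}

for name, draw in sources.items():
    digits = Counter(nth_decimal_digit(draw(), n=4) for _ in range(100_000))
    freqs = [round(digits[d] / 100_000, 3) for d in range(10)]
    print(name, freqs)   # all frequencies close to 0.1, whatever the source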
ABSTRACT: This paper axiomatically explores minimal conditions under which reference-dependent preferences over risky prospects are normatively admissible. It is shown that two simple and intuitive conditions are sufficient to place strong requirements on such preferences. The first condition is tantamount to ‘no-cycling’ when the reference point is the status quo; the second condition requires the reference-dependent representations to be continuous with respect to the reference point. In particular, these conditions rule out Cumulative Prospect Theory as well as any theory in which all reference-dependent indifference surfaces are smooth; the latter case also holds for riskless theories of the endowment effect (e.g., Tversky and Kahneman (1991)). It is also shown that one can construct satisfactory alternatives, axiomatically derived or otherwise, to Cumulative Prospect Theory as well as Tversky and Kahneman’s (1991) theory of the riskless endowment effect. The alternative theories I propose take the form of max-min representations over a set of expected (or Choquet-expected) utility differences, where the utility difference is measured between the prospect being evaluated and the reference point.
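One possible reading of the representation just described (a stylized rendering, not the paper's exact statement): a prospect x is evaluated relative to a reference point r by V(x; r) = min over u in U of ( E_x[u] - E_r[u] ), where U is a fixed set of utility indices and E_x[u] denotes the (Choquet-)expected utility of x under u; the decision maker picks the feasible prospect maximizing this worst-case utility difference, so in particular x is weakly preferred to the status quo r whenever V(x; r) >= 0.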