Revise and Resubmit, Journal of Political Economy, Feb 2021
Updated July 2020
Policy: SEC Commissioner
Abstract: This paper builds a model of stock exchange competition tailored to the institutional and regulatory details of the modern U.S. stock market. The model shows that under the status quo market design: (i) trading behavior across the seemingly fragmented exchanges is as if there is just a single synthesized exchange; (ii) as a result, trading fees are perfectly competitive; (iii) however, exchanges are able to capture and maintain economic rents from the sale of speed technology such as proprietary data feeds and co-location — arms for the high-frequency trading arms race. We document stylized empirical facts consistent with each of the three main results of the theory. We then use the model to examine the private and social incentives for exchanges to adopt new market designs, such as frequent batch auctions, that address the negative aspects of high-frequency trading. The robust conclusion is that private innovation incentives are much smaller than the social incentives, especially for incumbents who face the loss of speed technology rents. A policy insight that emerges from the analysis is that a regulatory "push," as opposed to a market design mandate, may be sufficient to tip the balance of incentives and encourage "the market to fix the market."
Arbitrage Comovement [Link]
Updated May 2020
Abstract: I argue that arbitrage mistranslates factor information from ETFs to constituent securities and distorts comovement. The intuition behind this distortion is that arbitrageurs trade constituent securities based not on their fundamental exposures but on their portfolio weights, causing securities to comove with the ETF according to a measure I call arbitrage sensitivity — a combination of portfolio weight and price impact sensitivity — rather than fundamental exposures. Arbitrage sensitivity predicts comovement between stock and ETF returns, especially in periods of high ETF volume and volatility, but not before 2008, when ETFs were not as heavily traded. Arbitrage-induced comovement leads to over-reaction to ETF returns for stocks more sensitive to arbitrage and under-reaction for those less sensitive. A long-short portfolio constructed based on arbitrage sensitivity generates an alpha of around 7.5% per year. Unlike most anomalies, arbitrage comovement is strongest in large-cap stocks, which are held by the most actively traded ETFs. Arbitrage comovement implies that observed factor loadings are less reliable for assessing risk, since they are at least partially driven by mechanical arbitrage trading instead of fundamental exposures.
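The abstract describes arbitrage sensitivity only as "a combination of portfolio weight and price impact sensitivity." One stylized way to write such a measure down is sketched below; the notation ($w_i$, $\lambda_i$, $AS_i$) and the multiplicative form are illustrative assumptions, not the paper's definition:

```latex
% Illustrative sketch only, not the paper's definition:
%   w_i       = weight of stock i in the ETF portfolio
%   \lambda_i = price impact sensitivity of stock i
%   AS_i      = arbitrage sensitivity of stock i
AS_i = w_i \, \lambda_i
```

Under this reading, the arbitrageur's flow into stock $i$ scales with its portfolio weight $w_i$, and the resulting price response scales with $\lambda_i$, so a stock's comovement with the ETF is increasing in $AS_i$ rather than in its fundamental factor exposure — consistent with the over-/under-reaction pattern the abstract describes.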
Work in Progress
Do Electronic Markets Improve Execution If You Cannot Identify Yourself? [Link]
with Yenan Wang
New, Feb 2021
Abstract: We study two aspects of the transition to electronic markets for large uninformed traders: (1) the increase in trading frequency, and (2) the move to anonymous trading, which challenged the implementation of reputation-based strategies. We present a model of optimal execution where a risk-neutral trader must sell over multiple trading periods before a fixed deadline to several risk-averse market makers. We model reputation-based strategies as the ability to commit ex-ante to a sequence of market orders, which we define as scheduling. While scheduling and more trading periods are both valuable, a trader typically prefers implementing scheduling over having more trading periods. Moreover, the benefits of scheduling significantly increase when coupled with more frequent trading, enough so to overcome "predatory traders" that profit from predictable orders. Our model suggests that transparent execution algorithms, which could be used to implement scheduling in anonymous electronic markets, may have important benefits for large, uninformed traders.
The High Frequency Trading Arms Race: Frequent Batch Auctions as a Market Design Response [Link]
Quarterly Journal of Economics, 2015
Abstract: The high-frequency trading arms race is a symptom of flawed market design. Instead of the continuous limit order book market design that is currently predominant, we argue that financial exchanges should use frequent batch auctions: uniform price double auctions conducted, for example, every tenth of a second. That is, time should be treated as discrete instead of continuous, and orders should be processed in a batch auction instead of serially. Our argument has three parts. First, we use millisecond-level direct-feed data from exchanges to document a series of stylized facts about how the continuous market works at high-frequency time horizons: (i) correlations completely break down, which (ii) leads to obvious mechanical arbitrage opportunities, and (iii) competition has not affected the size or frequency of the arbitrage opportunities; it has only raised the bar for how fast one has to be to capture them. Second, we introduce a simple theory model which is motivated by and helps explain the empirical facts. The key insight is that obvious mechanical arbitrage opportunities, like those observed in the data, are built into the market design: continuous-time serial processing implies that even symmetrically observed public information creates arbitrage rents. These rents harm liquidity provision and induce a never-ending socially wasteful arms race for speed. Third, we show that frequent batch auctions directly address the flaws of the continuous limit order book. Discrete time reduces the value of tiny speed advantages, and the auction transforms competition on speed into competition on price. Consequently, frequent batch auctions eliminate the mechanical arbitrage rents, enhance liquidity for investors, and stop the high-frequency trading arms race.
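The clearing step of the frequent batch auction the abstract proposes — collect orders over a discrete interval, then clear them all at a single uniform price — can be sketched as follows. This is a minimal illustration, not the paper's specification: the function name and the midpoint tie-break for the clearing price are assumptions.

```python
def clear_batch_auction(bids, asks):
    """Clear one batch as a uniform-price double auction.

    bids, asks: lists of (price, quantity) collected over the batch
    interval. Returns (clearing_price, volume), or (None, 0) if the
    order books do not cross. Using the midpoint of the marginal bid
    and ask as the clearing price is one common convention, assumed
    here for illustration.
    """
    bids = sorted(bids, key=lambda o: -o[0])  # best (highest) bids first
    asks = sorted(asks, key=lambda o: o[0])   # best (lowest) asks first
    bi = ai = 0
    b_rem = a_rem = 0
    volume = 0
    marginal_bid = marginal_ask = None
    while True:
        if b_rem == 0:                 # advance to the next bid level
            if bi == len(bids):
                break
            b_price, b_rem = bids[bi]
            bi += 1
        if a_rem == 0:                 # advance to the next ask level
            if ai == len(asks):
                break
            a_price, a_rem = asks[ai]
            ai += 1
        if b_price < a_price:          # demand and supply no longer cross
            break
        q = min(b_rem, a_rem)          # quantity matched at this level
        volume += q
        b_rem -= q
        a_rem -= q
        marginal_bid, marginal_ask = b_price, a_price
    if volume == 0:
        return None, 0
    return (marginal_bid + marginal_ask) / 2, volume


# Every matched order in the batch trades at the same price, so being
# microseconds faster than other participants confers no advantage.
price, volume = clear_batch_auction(
    bids=[(10.01, 100), (10.00, 200)],
    asks=[(9.99, 150), (10.02, 100)],
)
# price == 9.995, volume == 150
```

Because all crossing orders execute at one price, competition within the batch is on price rather than on speed, which is the mechanism behind the abstract's claim that batching eliminates mechanical arbitrage rents.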