Selected International Publications


A Short Term Credibility Index for Central Banks under Inflation Targeting: An Application to Brazil (with Alain Hecq and Elisa Voisin) 

Journal of International Money and Finance, Vol. 143, 103057, 2024.


Abstract

Our goal is to provide econometric tools that can serve as a near-real-time warning system for central banks operating under an inflation-targeting regime. In any given month, our index computes the probability that inflation will remain within the tolerance bounds set in advance by the regime. This short-term index thus gives central banks a proper response time, something the long-term indices prevalent in the literature do not provide. Although we showcase Brazil in our application, our method can be applied broadly to other countries that operate under an inflation-targeting regime. Our key statistic is the probability that inflation will be within the bounds 1, 3, and 6 months ahead. It is based on predictive densities obtained from a mixed causal-noncausal autoregressive (MAR) model. We refine the accuracy of our key statistic using the receiver operating characteristic (ROC) curve, something new in this literature.
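The key statistic described above, the probability that inflation stays within the tolerance band, can be sketched with simulated draws from a predictive density. The normal density, the 3% target, and the 1.5 p.p. band below are hypothetical stand-ins for the MAR-based predictive density used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical draws from a predictive density of inflation a few months
# ahead. The paper obtains this density from a MAR model; a normal
# density is used here purely for illustration.
draws = rng.normal(loc=4.2, scale=1.1, size=100_000)

# Tolerance band: a 3% target with a +/- 1.5 p.p. band (hypothetical).
lower, upper = 1.5, 4.5

# Credibility index: probability that inflation stays within the band.
index = np.mean((draws >= lower) & (draws <= upper))
print(round(index, 3))
```

With these illustrative numbers the index comes out near 0.6, a roughly 60% chance of inflation ending inside the band.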


Incentive-Driven Inattention (with Wagner Gaglianone, Raffaella Giacomini, and Vasiliki Skreta) 

Journal of Econometrics, Vol. 231(1), pp. 188-212, 2022.


Abstract

“Rational inattention” is becoming increasingly prominent in economic modeling, but there is little empirical evidence for its central premise – that the choice of attention results from a cost-benefit optimization. Observational data typically do not allow researchers to infer attention choices from observables. We fill this gap in the literature by exploiting a unique dataset of professional forecasters who update their inflation forecasts on days of their choice. In the data we observe how many forecasters update (the extensive margin of updating), the magnitude of the update (the intensive margin), and the objective of optimization (forecast accuracy). There are also “shifters” in incentives: a contest that increases the benefit of accurate forecasting, and the release of official data that reduces the cost of processing information. These features allow us to link observables to attention and incentive parameters. We structurally estimate a model where the decision to update and the magnitude of the update are endogenous, and the latter is the outcome of a rational-inattention optimization. The empirical findings support the key implication of rational inattention that information-processing effort reacts to changing incentives. Counterfactuals reveal that accuracy is maximized when the contest date coincides with the release of information, aligning higher benefits with lower costs of attention.


Machine Learning and Oil Price Point and Density Forecasting (with Alexandre Costa, Pedro C. Ferreira, Wagner Gaglianone, Osmani Guillén, and Yihao Lin)

Energy Economics, vol. 102, 105494, 2021.


Abstract

The purpose of this paper is to explore machine learning techniques to forecast the oil price. In the era of big data, we investigate whether new automated tools can improve on traditional approaches in terms of forecast accuracy. Oil price point and density forecasts are built from 23 methods, including regression trees (random forest, quantile regression forest, xgboost), regularization procedures (elastic net, lasso, ridge), standard econometric models, and forecast combinations, in addition to the structural factor model of Schwartz and Smith (2000). The database contains 315 macroeconomic and financial variables, used to build high-dimensional models. To evaluate the predictive power of each method, an extensive pseudo out-of-sample forecasting exercise is conducted, at monthly and quarterly frequencies, with horizons from one month up to five years. Overall, the results indicate a good performance of the machine learning methods in the short run. Up to six months, lasso-based models, oil futures prices, the VECM, and the Schwartz-Smith model provide the best forecasts. At longer horizons, forecast combinations also become relevant. In several cases, the accuracy gains with respect to the random-walk forecast are statistically significant and reach two-digit figures, in percentage terms, using the out-of-sample R² statistic, a notable achievement compared to the previous literature.
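The out-of-sample R² comparison against the random walk can be illustrated in a few lines. The price series, the drift model, and the sample split below are all simulated and hypothetical, not the paper's 315-variable setup:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy oil-price series: a random walk with drift (purely illustrative).
T = 200
price = 60.0 + np.cumsum(rng.normal(0.1, 1.0, T))

# Pseudo out-of-sample split: estimate on the first 100 observations,
# evaluate one-step-ahead forecasts on the rest.
drift = np.mean(np.diff(price[:100]))

actual = price[101:]
rw_forecast = price[100:-1]              # random walk: forecast no change
model_forecast = price[100:-1] + drift   # hypothetical competing model

# Out-of-sample R^2 against the random-walk benchmark; positive values
# mean the model improves on the random walk in squared-error terms.
r2_oos = 1.0 - np.sum((actual - model_forecast) ** 2) / np.sum((actual - rw_forecast) ** 2)
print(round(r2_oos, 4))
```

A two-digit percentage gain, as reported in the paper, corresponds to `r2_oos` of 0.10 or more.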


Mixed Causal-Noncausal Autoregressions with Strictly Exogenous Regressors (with Alain Hecq and Sean Telg) 

Journal of Applied Econometrics, vol. 35(3), pp. 328-343, 2020.


Abstract

The mixed causal-noncausal autoregressive (MAR) model has been proposed to estimate time series processes involving explosive roots in the autoregressive part, as it allows for stationary forward and backward solutions. Possible exogenous variables are usually substituted into the error term to preserve the univariate MAR structure of the variable of interest. To study the impact of fundamental exogenous variables directly, we instead consider a MARX representation, which allows for the inclusion of exogenous regressors. We argue that, contrary to MAR models, MARX models might be identified using second-order properties. The asymptotic distribution of the MARX parameters is derived assuming a class of non-Gaussian densities. We assume a Student's t likelihood to derive closed-form solutions for the corresponding standard errors. By means of Monte Carlo simulations, we evaluate the accuracy of MARX model selection based on information criteria. We examine the influence of the U.S. exchange rate and industrial production index on several commodity prices.


Testing Consumption Optimality using Aggregate Data (with Fábio Augusto Reis Gomes).

Macroeconomic Dynamics, Vol. 21(5), pp. 1119-1140, 2017.


Abstract

This paper tests the optimality of consumption decisions at the aggregate level, taking into account popular deviations from the canonical constant-relative-risk-aversion (CRRA) utility model -- rule-of-thumb behavior and habit. First, based on the critique in Carroll (2001) and Weber (2002) of linearization and testing strategies using Euler equations for consumption, we provide extensive empirical evidence of their inappropriateness -- a drawback for standard rule-of-thumb tests. Second, we propose a novel approach to test for consumption optimality in this context: nonlinear estimation coupled with return aggregation, where rule-of-thumb behavior and habit are special cases of an all-encompassing model. We estimated 48 Euler equations using GMM. At the 5% level, we rejected optimality only twice out of 48 times. Moreover, out of 24 regressions, we found the rule-of-thumb parameter to be statistically significant only twice. Hence, lack of optimality in consumption decisions represents the exception, not the rule. Finally, we found the habit parameter to be statistically significant on four occasions out of 24.


Forecasting multivariate time series under present-value model short- and long-run co-movement restrictions (with Osmani Guillén, Alain Hecq, and Diogo Saraiva)

International Journal of Forecasting, vol. 31(3), pp. 862-875, 2015.


Abstract

Using a sequence of VAR-based nested multivariate models, we discuss the different layers of restrictions that present-value models (PVMs hereafter) impose on the VAR in levels for series that are subject to present-value restrictions. Our focus is novel: we are interested in the short-run restrictions entailed by PVMs (Vahid and Engle, 1993, 1997) and their implications for forecasting. Using a well-known database maintained by Robert Shiller, we implement a forecasting competition that imposes different layers of PVM restrictions. Our exhaustive investigation of several different multivariate models reveals that better forecasts can be achieved when restrictions are applied to the unrestricted VAR. Moreover, imposing short-run restrictions produces forecast winners 70% of the time for the target variables of PVMs and 63.33% of the time when all variables in the system are considered.


On the Welfare Costs of Business-Cycle Fluctuations and Economic-Growth Variation in the 20th Century and Beyond (with Osmani Teixeira de Carvalho Guillén and Afonso Arinos de Mello Franco Neto).

Journal of Economic Dynamics and Control, Vol. 39, pp. 62-78, 2014.


Abstract

The main objective of this paper is to propose a novel setup that allows estimating separately the welfare costs of the uncertainty stemming from business-cycle fluctuations and from economic-growth variation, when the two types of shocks associated with them (respectively, transitory and permanent shocks) hit consumption simultaneously. Separating these welfare costs requires dealing with degenerate bivariate distributions. Lévy's Continuity Theorem and the Disintegration Theorem allow us to adequately define the one-dimensional limiting marginal distributions. Under Normality, we show that the parameters of the original marginal distributions are not affected, providing the means for calculating separately the welfare costs of business-cycle fluctuations and of economic-growth variation.

Our empirical results show that, if we consider only transitory shocks, the welfare cost of business cycles is much smaller than previously thought; indeed, we found it to be negative, at -0.03% of per-capita consumption. On the other hand, the welfare cost of economic-growth variation is relatively large: for reasonable preference-parameter values, our estimate is 0.71% of consumption -- US$208.98 per person, per year.


Using Common Features to Understand the Behavior of Metal-Commodity Prices and Forecast them at Different Horizons (with Claudia F. Rodrigues and Rafael Burjack).

Journal of International Money and Finance, Vol. 42, pp. 310-335, 2014.


Abstract

The objective of this article is to study (understand and forecast) spot metal price levels and changes at monthly, quarterly, and annual frequencies, using the techniques put forth by the common-feature literature. The data consist of metal-commodity prices at monthly and quarterly frequencies from 1957 to 2012, and annual data from 1900 to 2010 provided by the U.S. Geological Survey (USGS). We show, theoretically, that there must be a positive correlation between metal-price variation and industrial-production variation if metal supply is held fixed in the short run while demand is chosen optimally, taking into account optimal production in the industrial sector. This is simply a consequence of the derived-demand model for cost-minimizing firms. Regarding out-of-sample forecasts, our main contribution is to show the benefits of forecast-combination techniques, which outperform individual-model forecasts -- including the random-walk model. We use a variety of models (linear and nonlinear, single-equation and multivariate) and a variety of covariates and functional forms to forecast the returns and prices of metal commodities, as proposed by the common-feature literature on forecast combination. Empirically, we show that models incorporating (short-run) common-cycle restrictions perform better than unrestricted models, with an important role for industrial production as a predictor of metal-price variation.


Model Selection, Estimation and Forecasting in VAR Models with Short-run and Long-run Restrictions (with George Athanasopoulos, Osmani Teixeira de Carvalho Guillén and Farshid Vahid).

Journal of Econometrics, Vol. 164(1), pp. 116-129, 2011.


Abstract

We study the joint determination of the lag length, the dimension of the cointegrating space, and the rank of the matrix of short-run parameters of a vector autoregressive (VAR) model using model selection criteria. We consider model selection criteria which have data-dependent penalties as well as the traditional ones. We suggest a new two-step model selection procedure which is a hybrid of traditional criteria and criteria with data-dependent penalties, and we prove its consistency. Our Monte Carlo simulations measure the improvements in forecasting accuracy that can arise from the joint determination of lag length and rank using our proposed procedure, relative to an unrestricted VAR or a cointegrated VAR estimated by the commonly used procedure of selecting the lag length only and then testing for cointegration. Two empirical applications, forecasting Brazilian inflation and the growth rates of U.S. macroeconomic aggregates, show the usefulness of the model-selection strategy proposed here. The gains in different measures of forecasting accuracy are substantial, especially for short horizons.
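The model selection criteria involved trade goodness of fit against a penalty on the number of parameters. A minimal univariate sketch, selecting an AR lag length with the Hannan-Quinn criterion; the paper's joint selection of lag length, cointegrating rank, and short-run rank over a VAR is more involved than this:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate an AR(2) process so the true lag length is known.
T = 500
y = np.zeros(T)
for t in range(2, T):
    y[t] = 0.5 * y[t - 1] + 0.3 * y[t - 2] + rng.normal()

def hq(y, p, p_max=6):
    """Hannan-Quinn criterion for an AR(p) fitted by least squares.

    All candidates use the same effective sample so the criteria are
    comparable; a univariate sketch, not the paper's VAR procedure.
    """
    Y = y[p_max:]
    n = len(Y)
    if p == 0:
        resid = Y
    else:
        X = np.column_stack([y[p_max - i:-i] for i in range(1, p + 1)])
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        resid = Y - X @ beta
    return np.log(resid @ resid / n) + 2 * p * np.log(np.log(n)) / n

# Pick the lag length that minimizes the criterion over p = 0, ..., 6.
best_p = min(range(7), key=lambda p: hq(y, p))
print(best_p)
```

With a reasonably long sample the criterion recovers a lag length close to the true value of 2.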


A Panel-Data Approach to Economic Forecasting: The Bias-Corrected Average Forecast (with Luiz Renato Lima).

Journal of Econometrics, Vol. 152(2), pp. 153-164, 2009.


Abstract

In this paper, we propose a novel approach to econometric forecasting of stationary and ergodic time series within a panel-data framework. Our key element is to employ the (feasible) bias-corrected average forecast. Using panel-data sequential asymptotics, we show that it is potentially superior to other techniques in several contexts. In particular, it is asymptotically equivalent to the conditional expectation, i.e., it has an optimal limiting mean-squared error. We also develop a zero-mean test for the average bias and discuss the forecast-combination puzzle in small and large samples. Monte Carlo simulations are conducted to evaluate the performance of the feasible bias-corrected average forecast in finite samples. An empirical exercise, based upon data from a well-known survey, is also presented. Overall, these results show promise for the feasible bias-corrected average forecast.
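The mechanics of the bias-corrected average forecast are easy to sketch: average the panel's forecasts, estimate the average bias on a training window, and subtract it. The panel below is simulated and purely illustrative, not the paper's survey data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy panel: N forecasters predict a target y over T periods. Each
# forecaster adds an individual bias and idiosyncratic noise.
N, T = 30, 200
y = rng.normal(0.0, 1.0, T)                      # target series
bias = rng.normal(0.5, 0.2, N)                   # forecaster-specific biases
forecasts = y + bias[:, None] + rng.normal(0.0, 0.5, (N, T))

# The plain average forecast inherits the average bias...
avg_forecast = forecasts.mean(axis=0)

# ...so estimate the average bias on a training window and subtract it.
train = slice(0, 100)
bias_hat = np.mean(avg_forecast[train] - y[train])
bc_forecast = avg_forecast - bias_hat

# Out-of-sample comparison: bias correction should cut squared error.
test = slice(100, None)
mse_avg = np.mean((y[test] - avg_forecast[test]) ** 2)
mse_bc = np.mean((y[test] - bc_forecast[test]) ** 2)
print(round(mse_avg, 3), round(mse_bc, 3))
```

Averaging across forecasters kills the idiosyncratic noise; the bias correction then removes the component the average cannot.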


The Missing Link: Using the NBER Recession Indicator to Construct Coincident and Leading Indices of Economic Activity (with Farshid Vahid).

Journal of Econometrics, vol. 132, pp. 281-303, 2006.


Abstract

We use the information content in the decisions of the NBER Business Cycle Dating Committee to construct coincident and leading indices of economic activity for the United States. We identify the coincident index by assuming that the coincident variables have a common cycle with the unobserved state of the economy, and that the NBER business cycle dates signify the turning points in the unobserved state. This model allows us to estimate our coincident index as a linear combination of the coincident series. We compare the performance of our index with other currently popular coincident indices of economic activity.


Testing Production Functions Used in Empirical Growth Studies (with Pedro Cavalcanti Ferreira and Samuel de Abreu Pessoa).

Economics Letters, vol. 83, pp. 29-35, 2004.


Abstract

We estimate and test alternative functional forms, which have been used in the growth literature, representing the aggregate production function for a panel of countries. Functional forms are confronted using a Box–Cox test, and the results favor the Mincerian formulation of schooling returns to skills.


The Importance of Common Cyclical Features in VAR Analysis: A Monte Carlo Study (with Farshid Vahid).

Journal of Econometrics, vol. 109, pp. 341-363, 2002.


Abstract

Despite the commonly held belief that aggregate data display short-run comovement, there has been little discussion about the econometric consequences of this feature of the data. We use exhaustive Monte Carlo simulations to investigate the importance of restrictions implied by common cyclical features for estimates and forecasts based on vector autoregressive models. First, we show that the “best” empirical model developed without common-cycle restrictions need not nest the “best” model developed with those restrictions. This is due to possible differences in the lag lengths chosen by model selection criteria for the two alternative models. Second, we show that the costs of ignoring common cyclical features in vector autoregressive modelling can be high, both in terms of forecast accuracy and efficient estimation of variance decomposition coefficients. Third, we find that the Hannan–Quinn criterion performs best among model selection criteria in simultaneously selecting the lag length and rank of vector autoregressions.


Common cycles and the importance of transitory shocks to macroeconomic aggregates (with Farshid Vahid).

Journal of Monetary Economics, vol. 47, pp. 449-475, 2001.


Abstract

Although there has been substantial research using long-run co-movement (cointegration) restrictions in the empirical macroeconomics literature, little or no work has been done investigating the existence of short-run co-movement (common-cycle) restrictions and discussing their implications. In this paper we first investigate the existence of common cycles in an aggregate data set comprising per-capita output, consumption, and investment. We then discuss their usefulness in measuring the relative importance of transitory shocks. We show that, once common-cycle restrictions are taken into account, transitory shocks are more important at business-cycle horizons than previously thought. The central argument relies on efficiency gains from imposing these short-run restrictions on the estimation of the dynamic model. Finally, we discuss how the evidence here and elsewhere can be interpreted to support the view that nominal shocks may be important in the short run.


Public Debt Sustainability and Endogenous Seigniorage in Brazil: Time Series Evidence from 1947-92 (with Luiz Renato Lima).

Journal of Development Economics, vol. 62, pp. 131-147, 2000.


Abstract

In this paper, we investigate three central issues in public finance. First, was the path of public debt sustainable during 1947–1992? Second, how has the government balanced the budget after shocks to either revenues or expenditures were observed? Third, are expenditures exogenous? The results show that (i) debt is sustainable in econometric tests, with the budget being balanced almost entirely through changes in taxes, regardless of the cause of the initial imbalance. Expenditures are weakly exogenous; (ii) the behavior of a “rational” Brazilian consumer may be consistent with Ricardian Equivalence; (iii) seigniorage revenues are critical in restoring intertemporal budget equilibrium.


Estimating Common Sectoral Cycles (with Robert F. Engle).

Journal of Monetary Economics, vol. 35, pp. 83-113, 1995.


Abstract

We investigate in this paper the degree of short-run and long-run comovement in U.S. sectoral output data by estimating sectoral trends and cycles. A theoretical model based on Long and Plosser (1983) is used to derive a reduced form for sectoral outputs from first principles. Cointegration and common-cycle tests are performed; sectoral output data seem to share a relatively high number of common trends and a relatively low number of common cycles. A special trend-cycle decomposition of the data set is performed, and the results indicate a very similar cyclical behavior across sectors and very different behavior for trends. In a variance decomposition analysis, prominent sectors such as Manufacturing and Wholesale/Retail Trade exhibit relatively important transitory shocks.