Risky Oil: It's All in the Tails
Christiane Baumeister, Florian Huber, Massimiliano Marcellino
NBER Working Paper,
No. 32524,
2024
Abstract
The substantial fluctuations in oil prices in the wake of the COVID-19 pandemic and the Russian invasion of Ukraine have highlighted the importance of tail events in the global market for crude oil, which call for careful risk assessment. In this paper we focus on forecasting tail risks in the oil market by setting up a general empirical framework that allows for flexible predictive distributions of oil prices that can depart from normality. This model, based on Bayesian additive regression trees, remains agnostic about the functional form of the conditional mean relations and assumes that the shocks are driven by a stochastic volatility model. We show that our nonparametric approach improves upon three competing models in terms of tail forecasts: quantile regressions commonly used for studying tail events, the Bayesian VAR with stochastic volatility, and the simple random walk. We illustrate the practical relevance of our new approach by tracking the evolution of predictive densities during three recent economic and geopolitical crisis episodes, by developing consumer and producer distress indices that signal the build-up of upside and downside price risk, and by conducting a risk scenario analysis for 2024.
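Tail forecasts of the kind evaluated above are typically compared with the tick (pinball) loss at a given quantile level. A minimal sketch of that evaluation criterion, on simulated data rather than the paper's oil-price series:

```python
import numpy as np

def quantile_score(y, q_hat, tau):
    """Average tick (pinball) loss of quantile forecasts q_hat at level tau."""
    u = y - q_hat
    return np.mean(np.maximum(tau * u, (tau - 1.0) * u))

# Toy comparison: a forecast near the true 5% quantile scores lower
# than a naive forecast of zero.
rng = np.random.default_rng(1)
y = rng.standard_normal(5000)
true_q = -1.645                       # 5% quantile of N(0,1)
good = quantile_score(y, np.full_like(y, true_q), 0.05)
bad = quantile_score(y, np.full_like(y, 0.0), 0.05)
```

A model "improves in terms of tail forecasts" when its average tick loss at the relevant tail quantile is lower than a competitor's.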
The Appropriateness of the Macroeconomic Imbalance Procedure for Central and Eastern European Countries
Geraldine Dany-Knedlik, Martina Kämpfe, Tobias Knedlik
Empirica,
No. 1,
2021
Abstract
The European Commission’s Scoreboard of Macroeconomic Imbalances is a rare case of a publicly released early warning system. It was first published in 2012 by the European Commission as a reaction to the public debt crises in Europe. So far, the Macroeconomic Imbalance Procedure takes a one-size-fits-all approach to the identification of thresholds. The experience of Central and Eastern European countries during the global financial crisis and the resulting public debt crises has differed markedly from that of other European countries. This paper examines the appropriateness of the scoreboard of the Macroeconomic Imbalance Procedure of the European Commission for this group of catching-up countries. It is shown that while some of the indicators of the scoreboard are helpful for predicting crises in the region, thresholds are in most cases set too narrowly, since they largely disregard the specifics of catching-up economies, in particular the higher and more volatile growth rates of various macroeconomic variables.
Optimizing Policymakers’ Loss Functions in Crisis Prediction: Before, Within or After?
Peter Sarlin, Gregor von Schweinitz
Macroeconomic Dynamics,
No. 1,
2021
Abstract
Recurring financial instabilities have led policymakers to rely on early-warning models to signal financial vulnerabilities. These models rely on ex-post optimization of signaling thresholds on crisis probabilities accounting for preferences between forecast errors, but come with the crucial drawback of unstable thresholds in recursive estimations. We propose two alternatives for threshold setting with similar or better out-of-sample performance: (i) including preferences in the estimation itself and (ii) setting thresholds ex-ante according to preferences only. Given probabilistic model output, it is intuitive that a decision rule is independent of the data or model specification, as thresholds on probabilities represent a willingness to issue a false alarm vis-à-vis missing a crisis. We provide real-world and simulation evidence that this simplification results in stable thresholds, while keeping or improving on out-of-sample performance. Our solution is not restricted to binary-choice models, but directly transferable to the signaling approach and all probabilistic early-warning models.
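The ex-ante alternative (ii) can be sketched in a few lines: with a preference weight theta on missing a crisis, the expected cost of staying silent, theta * p, exceeds the cost of a false alarm, (1 - theta) * (1 - p), exactly when p > 1 - theta, so the threshold follows from preferences alone. The rate-based loss below is an illustrative simplification, not the paper's exact specification:

```python
import numpy as np

def policymaker_loss(y, p, tau, theta):
    """theta-weighted sum of the missed-crisis rate and the false-alarm rate."""
    alarm = p >= tau
    crises = y.astype(bool)
    t1 = np.mean(~alarm[crises]) if crises.any() else 0.0      # missed crises
    t2 = np.mean(alarm[~crises]) if (~crises).any() else 0.0   # false alarms
    return theta * t1 + (1.0 - theta) * t2

def ex_ante_threshold(theta):
    """Alarm iff theta * p > (1 - theta) * (1 - p), i.e. p > 1 - theta."""
    return 1.0 - theta
```

Because the threshold depends only on theta, it is constant across recursive estimation samples by construction, which is the stability property the paper emphasizes.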
Early-Stage Business Formation: An Analysis of Applications for Employer Identification Numbers
Kimberly Bayard, Emin Dinlersoz, Timothy Dunne, John Haltiwanger, Javier Miranda, John Stevens
NBER Working Paper,
No. 24364,
2018
Abstract
This paper reports on the development and analysis of a newly constructed dataset on the early stages of business formation. The data are based on applications for Employer Identification Numbers (EINs) submitted in the United States, known as IRS Form SS-4 filings. The goal of the research is to develop high-frequency indicators of business formation at the national, state, and local levels. The analysis indicates that EIN applications provide forward-looking and very timely information on business formation. The signal of business formation provided by counts of applications is improved by using the characteristics of the applications to model the likelihood that applicants become employer businesses. The results also suggest that EIN applications are related to economic activity at the local level. For example, application activity is higher in counties that experienced higher employment growth since the end of the Great Recession, and application counts grew more rapidly in counties engaged in shale oil and gas extraction. Finally, the paper provides a description of a new public-use dataset, the “Business Formation Statistics (BFS),” which contains new data series on business applications and formation. The initial release of the BFS shows that the number of business applications in the third quarter of 2017 that have a relatively high likelihood of becoming job creators is still far below pre-Great Recession levels.
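The idea of improving raw application counts by weighting each application with a modeled transition probability can be sketched as follows. The logistic specification, the two characteristics, and all coefficients are illustrative assumptions for exposition, not estimates from the paper:

```python
import math

def employer_likelihood(plans_wages, is_corporate,
                        b0=-2.0, b1=1.5, b2=0.8):
    """Logistic probability that an EIN application becomes an employer
    business; specification and coefficients are purely illustrative."""
    z = b0 + b1 * plans_wages + b2 * is_corporate
    return 1.0 / (1.0 + math.exp(-z))

def formation_signal(applications):
    """Expected number of new employer businesses implied by a batch of
    applications: the sum of predicted transition probabilities."""
    return sum(employer_likelihood(a["wages"], a["corp"]) for a in applications)
```

Summing predicted probabilities rather than counting applications downweights filings (e.g. for trusts or one-off tax purposes) that rarely turn into employer businesses.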
Tail-risk Protection Trading Strategies
Natalie Packham, Jochen Papenbrock, Peter Schwendner, Fabian Wöbbeking
Quantitative Finance,
No. 5,
2017
Abstract
Starting from well-known empirical stylized facts of financial time series, we develop dynamic portfolio protection trading strategies based on econometric methods. As a criterion for riskiness, we consider the evolution of the value-at-risk spread from a GARCH model with normal innovations relative to a GARCH model with generalized innovations. These generalized innovations may, for example, follow a Student t, a generalized hyperbolic, an alpha-stable or a Generalized Pareto distribution (GPD). Our results indicate that the GPD provides the strongest signals for avoiding tail risks. This is not surprising, as the GPD arises as a limit of tail behaviour in extreme value theory and is therefore especially suited to dealing with tail risks. Out-of-sample backtests on 11 years of DAX futures data indicate that the dynamic tail-risk protection strategy effectively reduces the tail risk while outperforming traditional portfolio protection strategies. The results are further validated by assessing their statistical significance using bootstrap methods. A number of robustness tests, including application to other assets, further underline the effectiveness of the strategy. Finally, by empirically testing for second-order stochastic dominance, we find that risk-averse investors would be willing to pay a positive premium to move from a static buy-and-hold investment in the DAX future to the tail-risk protection strategy.
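The value-at-risk spread criterion can be illustrated with a Student-t alternative in place of the paper's full menu of generalized innovations: given a fitted conditional volatility, the fat-tailed VaR exceeds the normal VaR, and the size of the (relative) gap serves as a de-risking signal. A minimal sketch, assuming the trading rule is a simple fixed threshold on that relative spread:

```python
import numpy as np
from scipy.stats import norm, t

def var_spread(sigma, alpha=0.01, nu=4.0):
    """Value-at-risk (as a positive loss) under Student-t innovations,
    rescaled to unit variance, minus VaR under normal innovations."""
    var_normal = -sigma * norm.ppf(alpha)
    var_t = -sigma * t.ppf(alpha, nu) * np.sqrt((nu - 2.0) / nu)
    return var_t - var_normal

def protection_signal(sigma, alpha=0.01, nu=4.0, threshold=0.1):
    """De-risk when the relative VaR spread exceeds the threshold."""
    return var_spread(sigma, alpha, nu) / (-sigma * norm.ppf(alpha)) > threshold
```

With heavy tails (small nu) the relative spread is large and the signal fires; as nu grows the t model converges to the normal one and the signal switches off.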
The Quantity Theory Revisited: A New Structural Approach
Makram El-Shagi, Sebastian Giesen
Macroeconomic Dynamics,
No. 1,
2015
Abstract
We propose a unified identification scheme to identify monetary shocks and track their propagation through the economy. We combine three approaches dealing with the consequences of monetary shocks. First, we adjust a state space version of the P-star type model employing money overhang as the driving force of inflation. Second, we identify the contemporaneous impact of monetary policy shocks by applying a sign restriction identification scheme to the reduced form given by the state space signal equations. Third, to ensure that our results are not distorted by the measurement error exhibited by the official monetary data, we employ the Divisia M4 monetary aggregate provided by the Center for Financial Stability. Our approach overcomes one of the major difficulties of previous models by using a data-driven identification of equilibrium velocity. Thus, we are able to show that a P-star model can fit U.S. data and money did indeed matter in the United States.
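The sign restriction step can be sketched with the standard rotation algorithm: draw orthogonal matrices Q, form candidate impact matrices B = chol(Sigma) @ Q, and keep draws whose impact responses carry the required signs. The specific restriction below (a loosening money shock raising the monetary aggregate and lowering the short rate on impact) is an illustrative example, not the paper's exact restriction set:

```python
import numpy as np

def sign_restricted_impact(Sigma, satisfies, n_draws=2000, seed=0):
    """Candidate impact matrices B = chol(Sigma) @ Q with Haar-distributed
    orthogonal Q; keep draws whose columns satisfy the sign checks."""
    rng = np.random.default_rng(seed)
    C = np.linalg.cholesky(Sigma)
    kept = []
    for _ in range(n_draws):
        Q, R = np.linalg.qr(rng.standard_normal(Sigma.shape))
        Q = Q @ np.diag(np.sign(np.diag(R)))  # sign fix -> uniform (Haar) draw
        B = C @ Q
        if satisfies(B):
            kept.append(B)
    return kept

# Illustrative restriction: shock 1 raises variable 1 and lowers variable 2.
Sigma = np.array([[1.0, 0.3], [0.3, 1.0]])
accepted = sign_restricted_impact(Sigma, lambda B: B[0, 0] > 0 and B[1, 0] < 0)
```

Every accepted B reproduces the reduced-form covariance exactly (B @ B.T = Sigma), so the restrictions only select among observationally equivalent structural rotations.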
Regulation, Innovation and Technology Diffusion - Evidence from Building Energy Efficiency Standards in Germany
Makram El-Shagi, Claus Michelsen, Sebastian Rosenschon
Discussion Papers of DIW Berlin,
No. 1371,
2014
Abstract
The impact of environmental regulation on technology diffusion and innovation is studied using a unique data set of German residential buildings. We analyze how energy efficiency regulations, in terms of minimum standards, affect energy use in newly constructed buildings and how they induce innovation in the residential-building industry. The data used consist of a large sample of German apartment houses built between 1950 and 2005. Based on this information, we determine their real energy requirements from energy performance certificates and energy billing information. We develop a new measure of regulation intensity and apply a panel error-correction regression model to the energy requirements of low- and high-quality housing. Our findings suggest that regulation significantly affects technology adoption in low-quality housing. This, in turn, induces improvements in the high-quality segment, where innovators respond to market signals.