Optimizing Policymakers' Loss Functions in Crisis Prediction: Before, Within or After?
Peter Sarlin, Gregor von Schweinitz
Abstract
Early-warning models most commonly optimize signaling thresholds on crisis probabilities. The ex-post threshold optimization is based upon a loss function accounting for preferences between forecast errors, but comes with two crucial drawbacks: unstable thresholds in recursive estimations and an in-sample overfit at the expense of out-of-sample performance. We propose two alternatives for threshold setting: (i) including preferences in the estimation itself and (ii) setting thresholds ex-ante according to preferences only. We provide simulated and real-world evidence that this simplification results in stable thresholds and improves out-of-sample performance. Our solution is not restricted to binary-choice models, but directly transferable to the signaling approach and all probabilistic early-warning models.
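The loss function and the two threshold rules described above can be illustrated with a minimal sketch. It assumes the widely used specification in which a preference parameter mu weights the rate of missed crises against the rate of false alarms; the function names, the threshold grid, the ex-ante rule tau = 1 - mu, and the simulated data are illustrative and not taken from the paper.

```python
import numpy as np

def loss(mu, y_true, signal):
    """Policymaker loss as a preference-weighted sum of the two error rates.

    mu      : relative preference for avoiding missed crises (0 < mu < 1)
    y_true  : 1 = crisis observation, 0 = tranquil observation
    signal  : 1 = warning issued, 0 = no warning
    """
    crisis, tranquil = y_true == 1, y_true == 0
    t1 = np.mean(signal[crisis] == 0) if crisis.any() else 0.0    # missed crises
    t2 = np.mean(signal[tranquil] == 1) if tranquil.any() else 0.0  # false alarms
    return mu * t1 + (1 - mu) * t2

def ex_post_threshold(mu, y_true, prob):
    """Grid-search the threshold that minimises in-sample loss (the ex-post approach)."""
    grid = np.linspace(0, 1, 201)
    losses = [loss(mu, y_true, (prob >= tau).astype(int)) for tau in grid]
    return grid[int(np.argmin(losses))]

def ex_ante_threshold(mu):
    """Ex-ante alternative: fix the threshold from preferences alone, before estimation."""
    return 1 - mu

# Toy illustration with simulated crisis indicators and probabilities
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 200)
p = np.clip(0.3 * y + rng.normal(0.35, 0.2, 200), 0, 1)
print(ex_post_threshold(0.8, y, p), ex_ante_threshold(0.8))
```

In recursive exercises, the ex-post threshold is re-optimized on every estimation window, which is where the instability noted in the abstract arises; the ex-ante rule is fixed by construction.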
The Joint Dynamics of Sovereign Ratings and Government Bond Yields
Makram El-Shagi, Gregor von Schweinitz
Abstract
In the present paper, we build a bivariate semiparametric dynamic panel model to reproduce the joint dynamics of sovereign ratings and government bond yields. While the individual equations resemble Pesaran-type cointegration models, we allow for different long-run relationships in both equations, nonlinearities in the level effect of ratings, and asymmetric effects in changes of ratings and yields. We find that the interest rate equation and the rating equation imply significantly different long-run relationships. While the high persistence in both interest rates and ratings might lead to the misconception that they follow a unit root process, the joint analysis reveals that they converge slowly to a joint equilibrium. While this indicates that there is no vicious cycle driving countries into default, the persistence of ratings is high enough that a rating shock can have substantial costs. Generally, the interest rate adjusts rather quickly to the risk premium that is in line with the rating. For most ratings, this risk premium is only marginal. However, it becomes substantial when ratings are downgraded to highly speculative (a rating of B) or lower. Rating shocks that drive the rating below this threshold can increase the interest rate sharply, and for a long time. Yet, simulation studies based on our estimations show that it is highly improbable that rating agencies can be held responsible for the most dramatic spikes in interest rates.
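A stylized rendering of the kind of bivariate error-correction system described above, in notation of my own (the paper's semiparametric level effects and exact asymmetry terms are not reproduced):

```latex
\begin{aligned}
\Delta r_{it} &= \alpha_r\,\bigl(r_{i,t-1} - f_r(R_{i,t-1})\bigr)
  + \gamma^{+}\,\Delta R_{it}^{+} + \gamma^{-}\,\Delta R_{it}^{-} + \mu_{r,i} + \varepsilon_{r,it},\\
\Delta R_{it} &= \alpha_R\,\bigl(R_{i,t-1} - f_R(r_{i,t-1})\bigr)
  + \delta^{+}\,\Delta r_{it}^{+} + \delta^{-}\,\Delta r_{it}^{-} + \mu_{R,i} + \varepsilon_{R,it},
\end{aligned}
```

where r_it is the government bond yield and R_it the rating of country i, the long-run relations f_r and f_R are allowed to differ across the two equations, and the superscripts +/- split changes into increases and decreases to capture asymmetric effects.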
Switching to Exchange Rate Flexibility? The Case of Central and Eastern European Inflation Targeters
Andrej Drygalla
FIW Working Paper, No. 139, 2015
Abstract
This paper analyzes changes in monetary policy in the Czech Republic, Hungary, and Poland following the policy shift from exchange rate targeting to inflation targeting around the turn of the millennium. Applying a Markov-switching dynamic stochastic general equilibrium model, switches in the policy parameters and the volatilities of shocks hitting the economies are estimated and quantified. Results indicate the presence of regimes of weak and strong responses of the central banks to exchange rate movements as well as periods of high and low volatility. Whereas all three economies switched to a less volatile regime over time, findings on changes in the policy parameters reveal a lower reaction to exchange rate movements in the Czech Republic and Poland, but increased attention to them in Hungary. Simulations for the Czech Republic and Poland also suggest that their respective central banks, rather than a sound macroeconomic environment, were responsible for reducing volatility in variables such as inflation and output. In Hungary, the favorable development of these variables can be attributed to a larger extent to the reduction in the size of external disturbances.
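A stylized example of the kind of regime-switching interest-rate rule such a model typically embeds (notation mine; the paper's actual policy rule and shock structure may differ):

```latex
i_t = \rho(s_t)\, i_{t-1}
    + \bigl(1-\rho(s_t)\bigr)\bigl[\phi_\pi(s_t)\,\pi_t + \phi_y(s_t)\,y_t + \phi_e(s_t)\,\Delta e_t\bigr]
    + \sigma_i(v_t)\,\varepsilon_{i,t},
```

where s_t is a Markov chain over policy regimes (a weak versus a strong response phi_e to exchange rate movements) and v_t an independent chain over high- and low-volatility regimes.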
International Side-payments to Improve Global Public Good Provision when Transfers are Refinanced through a Tax on Local and Global Externalities
Martin Altemeyer-Bartscher, A. Markandya, Dirk T. G. Rübbelke
International Economic Journal, No. 1, 2014
Abstract
This paper discusses a tax-transfer scheme that aims to address the under-provision problem associated with the private supply of international public goods and to bring about Pareto optimal allocations internationally. In particular, we consider the example of the global public good ‘climate stabilization’, both in an analytical and a numerical simulation model. The proposed scheme levies Pigouvian taxes globally, while international side-payments are employed in order to provide incentives for individual countries not to free-ride on the international Pigouvian tax scheme. The side-payments, in turn, are financed via environmental taxes. As a distinctive feature, we take into account ancillary benefits that may be associated with local public characteristics of climate policy. We determine the positive impact that ancillary effects may exert on the scope for financing side-payments via environmental taxation. A particularly attractive feature of ancillary benefits is that they arise shortly after the implementation of climate policies and therefore yield an almost immediate payback of investments in abatement efforts. Especially in times of high public debt levels, long periods of amortization would tend to reduce political support for investments in climate policy.
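One compact way to summarize the logic of the scheme, under strong simplifying assumptions of my own (a single emission good and separable damages, rather than the paper's full analytical model): the environmental tax in country i internalizes both the global externality and the local ancillary benefit of abatement, and the side-payment to country i must be financeable out of that country's environmental tax revenue:

```latex
\tau_i = \sum_{j}\frac{\partial D_j}{\partial E} + \frac{\partial A_i}{\partial a_i},
\qquad
S_i \le \tau_i\, e_i,
```

where D_j is country j's damage from global emissions E, A_i the ancillary benefit of local abatement a_i, e_i the local tax base, and S_i the side-payment received by country i.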
Note on the Hidden Risk of Inflation
Makram El-Shagi, Sebastian Giesen
Journal of Economic Policy Reform, No. 1, 2014
Abstract
The continued expansionary policy of the Federal Reserve gives rise to speculation as to whether the Fed will be able to maintain price stability in the coming decades. Most of the scientific work relating money to prices relies on broad monetary aggregates (i.e. M2 for the United States). In our paper, we argue that this view falls short. The historically unique monetary expansion has not yet fully reached M2. Using a cointegration approach, we aim to show the hidden risks for the future development of M2 and, correspondingly, prices. In a simulation analysis we show that even if the multiplier remains substantially below its pre-crisis level, M2 will exceed its current growth path with a probability of 95%.
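A minimal sketch of the simulation logic behind the M2 scenario described above, with purely illustrative numbers (the paper works with estimated cointegration relationships and US data, neither of which is reproduced here):

```python
import numpy as np

rng = np.random.default_rng(42)
n_sim, horizon = 10_000, 40          # quarters

# Illustrative starting values in arbitrary units (not the paper's data)
base0, mult0 = 4.0, 2.5              # expanded monetary base, depressed money multiplier
mult_target = 4.5                    # mean-reversion level, still below its pre-crisis value
m2_0, trend_growth = base0 * mult0, 0.015   # current M2 and its previous quarterly growth path

exceed = np.zeros(n_sim, dtype=bool)
for s in range(n_sim):
    mult, base = mult0, base0
    for t in range(horizon):
        # multiplier mean-reverts with shocks; base keeps growing slowly
        mult += 0.1 * (mult_target - mult) + rng.normal(0, 0.05)
        base *= 1 + rng.normal(0.005, 0.002)
    exceed[s] = mult * base > m2_0 * (1 + trend_growth) ** horizon

print(f"Share of paths with M2 above its previous growth path: {exceed.mean():.1%}")
```

In this toy calibration, even a partial recovery of the multiplier, applied to a greatly expanded monetary base, pushes M2 above its previous growth path in the vast majority of simulated paths, which is the mechanism the abstract describes.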
Exploring the Evolution of Innovation Networks in Science-driven and Scale-intensive Industries: New Evidence from a Stochastic Actor-based Approach
T. Buchmann, D. Hain, Muhamed Kudic, M. Müller
IWH Discussion Papers, No. 1, 2014
Abstract
Our primary goal is to analyse the drivers of evolutionary network change processes by using a stochastic actor-based simulation approach. We contribute to the literature by combining two unique datasets, concerning the German laser and automotive industry, between 2002 and 2006 to explore whether geographical, network-related, and technological determinants affect the evolution of networks, and if so, to what extent these determinants systematically differ for science-driven industries compared to scale-intensive industries. Our results provide empirical evidence for the explanatory power of network-related determinants in both industries. The ‘experience effect’ as well as the ‘transitivity effects’ are significant for both industries but more pronounced for laser manufacturing firms. When it comes to ‘geographical effects’ and ‘technological effects’ the picture changes considerably. While geographical proximity plays an important role in the automotive industry, firms in the laser industry seem to be less dependent on geographical closeness to cooperation partners; instead, they rather search for cooperation opportunities at a distance. This might reflect the strong need of firms in science-driven industries to access diverse external knowledge, which cannot necessarily be found in their close geographical surroundings. Technological proximity negatively influences cooperation decisions for laser source manufacturers, yet has no impact on automotive firms. In other words, technological heterogeneity seems to explain, at least in science-driven industries, the attractiveness of potential cooperation partners.
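In stochastic actor-based models of this kind (Snijders-type models, as implemented for instance in RSiena), each firm i evaluates potential tie changes with an objective function that is linear in a set of effect statistics. A stylized version covering the effects discussed above (notation mine; the paper's exact effect set is not reproduced):

```latex
f_i(\beta, x) = \sum_k \beta_k\, s_{ik}(x)
  = \beta_1\, s_i^{\text{transitivity}}(x)
  + \beta_2\, s_i^{\text{experience}}(x)
  + \beta_3\, s_i^{\text{geographical proximity}}(x)
  + \beta_4\, s_i^{\text{technological proximity}}(x) + \ldots
```

Estimated coefficients beta_k that differ in sign or size across the laser and automotive networks are what the abstract summarizes as industry-specific drivers of network evolution.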
Effects of Incorrect Specification on the Finite Sample Properties of Full and Limited Information Estimators in DSGE Models
Sebastian Giesen, Rolf Scheufele
Abstract
In this paper we analyze the small sample properties of full information and limited information estimators in a potentially misspecified DSGE model. To this end, we conduct a simulation study based on a standard New Keynesian model including price and wage rigidities. We then study the effects of omitted variable problems on the structural parameter estimates of the model. We find that FIML is superior when the model is correctly specified. In cases where some of the model characteristics are omitted, the performance of FIML is highly unreliable, whereas GMM estimates remain approximately unbiased and significance tests are mostly reliable.
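The mechanism behind the main finding can be illustrated with a deliberately stripped-down Monte Carlo. This is not the paper's New Keynesian setup; the two-regressor design, variable names, and numbers are mine. The point it isolates is that a likelihood fit of a model that omits a relevant feature inherits the misspecification, whereas a moment condition that does not involve the omitted feature remains approximately unbiased:

```python
import numpy as np

rng = np.random.default_rng(7)
n_rep, T, beta, gamma = 2000, 200, 0.5, 0.8
bias_lik, bias_mom = [], []

for _ in range(n_rep):
    w = rng.normal(size=T)                    # observed driver, unrelated to the omitted feature
    z = rng.normal(size=T)                    # omitted model feature
    x = 0.8 * w + 0.6 * z + rng.normal(0, 0.5, T)
    y = beta * x + gamma * z + rng.normal(0, 0.5, T)

    # Gaussian-likelihood fit of the researcher's (misspecified) model y = beta*x + e:
    # equivalent to least squares here, and it inherits the omitted-variable bias
    b_lik = np.sum(x * y) / np.sum(x * x)

    # Moment-based fit using only E[w * (y - beta*x)] = 0, a condition that still
    # holds when z is omitted, so the estimate stays approximately unbiased
    b_mom = np.sum(w * y) / np.sum(w * x)

    bias_lik.append(b_lik - beta)
    bias_mom.append(b_mom - beta)

print(f"mean bias, misspecified likelihood fit: {np.mean(bias_lik):+.3f}")
print(f"mean bias, moment-based fit:            {np.mean(bias_mom):+.3f}")
```

The paper's Monte Carlo performs this comparison within a fully specified New Keynesian model with price and wage rigidities; the toy above only isolates the moment-validity argument.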
Money and Inflation: Consequences of the Recent Monetary Policy
Makram El-Shagi, Sebastian Giesen
Journal of Policy Modeling, No. 4, 2013
Abstract
We use a multivariate state space framework to analyze the short-run impact of money on prices in the United States. The key contribution of this approach is that it allows us to identify the impact of money growth on inflation without having to model money demand explicitly.
Using our results, which provide evidence for a substantial impact of money on prices in the US, we analyze the consequences of the Fed's response to the financial crisis. Our results indicate a rise of US inflation above 5% for more than a decade. Alternative exit strategies that we simulate cannot fully compensate for the monetary pressure without risking serious repercussions on the real economy. Further simulations of a double dip in the United States indicate that a repetition of the unusually expansive monetary policy – in addition to increased inflation – might cause growth losses exceeding the contemporary easing of the crisis.
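One stylized way to write down a state space system of the kind described (notation mine; the paper's measurement and transition equations are not reproduced): inflation and money growth both load on a common unobserved component, so the effect of money on prices is identified from the joint dynamics rather than from an explicit money demand function.

```latex
\begin{aligned}
\begin{pmatrix} \pi_t \\ \Delta m_t \end{pmatrix}
  &= \begin{pmatrix} \lambda_\pi \\ \lambda_m \end{pmatrix} \mu_t
   + \begin{pmatrix} u_{\pi,t} \\ u_{m,t} \end{pmatrix},\\[4pt]
\mu_t &= \phi\,\mu_{t-1} + \eta_t .
\end{aligned}
```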
Bottom-up or Direct? Forecasting German GDP in a Data-rich Environment
Katja Drechsel, Rolf Scheufele
Abstract
This paper presents a method to conduct early estimates of GDP growth in Germany. We employ MIDAS regressions to circumvent the mixed-frequency problem and use pooling techniques to efficiently summarize the information content of the various indicators. More specifically, we investigate whether it is better to disaggregate GDP (either via total value added of each sector or by the expenditure side) or whether a direct approach is more appropriate when it comes to forecasting GDP growth. Our approach combines a large set of monthly and quarterly coincident and leading indicators and takes into account the respective publication delay. In a simulated out-of-sample experiment we evaluate the different modelling strategies conditional on the given state of information and depending on the model averaging technique. The proposed approach is computationally simple and can be easily implemented as a nowcasting tool. Finally, this method also allows us to retrace the driving forces of the forecast and hence makes the forecast outcome interpretable.
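A minimal sketch of a single MIDAS equation with exponential Almon lag weights, fitted by nonlinear least squares. The exponential Almon polynomial is a standard weighting choice in the MIDAS literature; the function names, lag length, and simulated data below are illustrative and do not reproduce the paper's indicator set or pooling scheme.

```python
import numpy as np
from scipy.optimize import minimize

def exp_almon(theta1, theta2, n_lags):
    """Exponential Almon lag weights, normalised to sum to one."""
    j = np.arange(1, n_lags + 1)
    w = np.exp(theta1 * j + theta2 * j**2)
    return w / w.sum()

def midas_fit(y, X_hf, n_lags=9):
    """Fit y_t = b0 + b1 * sum_j w_j(theta) * x_{t,j} + e_t by nonlinear least squares.

    y    : (T,) low-frequency target (e.g. quarterly GDP growth)
    X_hf : (T, n_lags) matrix of the high-frequency indicator, lag j in column j-1
    """
    def ssr(params):
        b0, b1, t1, t2 = params
        fitted = b0 + b1 * X_hf @ exp_almon(t1, t2, n_lags)
        return np.sum((y - fitted) ** 2)
    res = minimize(ssr, x0=np.array([0.0, 1.0, 0.0, 0.0]), method="Nelder-Mead")
    return res.x

# Toy illustration with simulated data (not the paper's indicators)
rng = np.random.default_rng(1)
T, n_lags = 120, 9
X = rng.normal(size=(T, n_lags))
true_w = exp_almon(0.2, -0.1, n_lags)
y = 0.3 + 0.8 * X @ true_w + rng.normal(0, 0.1, T)
print(midas_fit(y, X, n_lags))
```

In the paper's setting, one such equation would presumably be estimated per indicator (and, in the bottom-up variants, per GDP component), with the resulting forecasts combined through the model-averaging step mentioned in the abstract.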
Financial Factors in Macroeconometric Models
Sebastian Giesen
Volkswirtschaft, Ökonomie, Shaker Verlag GmbH, Aachen, 2013
Abstract
The important role of credit has long been identified as a key factor for economic development (see e.g. Wicksell (1898), Keynes (1931), Fisher (1933) and Minsky (1957, 1964)). Even before the financial crisis most researchers and policy makers agreed that financial frictions play an important role in business cycles and that financial turmoil can result in severe economic downturns (see e.g. Mishkin (1978), Bernanke (1981, 1983), Diamond (1984), Calomiris (1993) and Bernanke and Gertler (1995)). However, in practice researchers and policy makers mostly used simplified models for forecasting and simulation purposes. They often neglected the impact of financial frictions and emphasized other non-financial market frictions when analyzing business cycle fluctuations (prominent exceptions include Kiyotaki and Moore (1997), Bernanke, Gertler, and Gilchrist (1999) and Christiano, Motto, and Rostagno (2010)). This has been due to the fact that most economic downturns did not seem to be closely related to financial market failures (see Eichenbaum (2011)). The outbreak of the subprime crisis ― which caused panic in financial markets and led to the default of Lehman Brothers in September 2008 ― then led to a reconsideration of such macroeconomic frameworks (see Caballero (2010) and Trichet (2011)). To address the economic debate from a new perspective, it is therefore necessary to integrate the relevant frictions which help to explain what we have experienced during recent years.
In this thesis, I analyze different ways to incorporate relevant frictions and financial variables in macroeconometric models. I discuss the potential consequences for standard statistical inference and macroeconomic policy. I cover three different aspects in this work. Each aspect presents an idea in a self-contained unit. The following paragraphs present more detail on the main topics covered.