Do We Need New Modelling Approaches in Macroeconomics?
Claudia M. Buch, Oliver Holtemöller
IWH Discussion Papers,
No. 8,
2014
Abstract
The economic and financial crisis that emerged in 2008 also initiated an intense discussion on macroeconomic research and the role of economists in society. The debate focuses on three main issues. First, it is argued that economists failed to predict the crisis and to design early warning systems. Second, it is claimed that the macroeconomic models economists use fail to integrate financial markets and are inadequate for modelling large economic crises. Third, it has been argued that economists invoke unrealistic assumptions concerning human behaviour by assuming that all agents are self-centred, rationally optimizing individuals. In this paper, we focus on the first two issues. Overall, our main argument is that the above statements are a caricature of modern economic theory and empirics. A rich field of research that addressed the shortcomings of earlier models had already developed before the crisis.
Liquidity in the Liquidity Crisis: Evidence from Divisia Monetary Aggregates in Germany and the European Crisis Countries
Makram El-Shagi
Economics Bulletin,
No. 1,
2014
Abstract
While there has been much discussion of the role of liquidity in the recent financial crises, there has been little discussion of using macroeconomic aggregation techniques to measure the total liquidity available to the market. In this paper, we provide an approximation of the development of liquidity in six Euro area countries from 2003 to 2013. We show that properly measured monetary aggregates contain significant information about liquidity risk.
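As a rough illustration of the aggregation technique behind such measures, the sketch below computes a Törnqvist-Theil approximation of a Divisia monetary growth rate from asset quantities and user costs. The function names and the interest rates in the usage note are illustrative, not taken from the paper:

```python
import math

def user_cost(benchmark_rate, own_rate):
    # Barnett's user cost: the opportunity cost of holding a monetary asset
    # that pays own_rate instead of the benchmark rate.
    return (benchmark_rate - own_rate) / (1.0 + benchmark_rate)

def divisia_growth(q_prev, q_curr, u_prev, u_curr):
    """Tornqvist-Theil approximation to the Divisia index growth rate.

    q_* are asset quantities, u_* the corresponding user costs,
    for two adjacent periods."""
    def shares(q, u):
        total = sum(qi * ui for qi, ui in zip(q, u))
        return [qi * ui / total for qi, ui in zip(q, u)]
    s_prev, s_curr = shares(q_prev, u_prev), shares(q_curr, u_curr)
    # Weight each component's log growth by its average expenditure share.
    return sum(0.5 * (sp + sc) * math.log(qc / qp)
               for sp, sc, qp, qc in zip(s_prev, s_curr, q_prev, q_curr))
```

With two assets (say cash at a zero own rate and deposits at 1 per cent, against a 3 per cent benchmark), uniform 10 per cent growth of all components yields a Divisia growth rate of ln(1.1), coinciding with simple-sum growth; the two measures diverge only when composition shifts.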
Modelling Macroeconomic Risk: The Genesis of the European Debt Crisis
Gregor von Schweinitz
Dissertation, Juristische und Wirtschaftswissenschaftliche Fakultät der Martin-Luther-Universität Halle-Wittenberg,
2013
Abstract
Diverging European sovereign bond yields after 2008 are the most visible sign of the European debt crisis. In a first step, this dissertation examines the extent to which the development of yields is driven by credit and liquidity risk, and how it is influenced by general uncertainty in financial markets. It can be shown that, in times of high market uncertainty, yields are driven to a significant degree by a flight towards highly liquid bonds. In a second step, high yields are interpreted as a sign of an existing crisis in the respective country. Using the signals approach, the early-warning capabilities of four different proposals for the design of the scoreboard of the “Macroeconomic Imbalances Procedure” (introduced in December 2011 by the European Commission) are tested, advocating a scoreboard that includes a variety of different indicators. In a third step, the methodology of the signals approach is extended to also provide results on significance.
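The signals approach mentioned here works, in its textbook form, by issuing a warning whenever an indicator crosses a threshold and by scoring thresholds with the noise-to-signal ratio. A minimal sketch under that standard definition, not the dissertation's actual implementation (the data in the usage note are invented):

```python
def noise_to_signal(indicator, crisis, threshold):
    """Noise-to-signal ratio of the rule 'warn if indicator > threshold'.

    crisis is a 0/1 list marking crisis periods."""
    warn = [x > threshold for x in indicator]
    hits = sum(1 for w, c in zip(warn, crisis) if w and c)
    false_alarms = sum(1 for w, c in zip(warn, crisis) if w and not c)
    n_crisis = sum(crisis)
    n_calm = len(crisis) - n_crisis
    hit_rate = hits / n_crisis
    false_rate = false_alarms / n_calm
    return false_rate / hit_rate if hit_rate > 0 else float("inf")

def best_threshold(indicator, crisis, candidates):
    # Pick the candidate threshold minimizing the noise-to-signal ratio.
    return min(candidates, key=lambda t: noise_to_signal(indicator, crisis, t))
```

For example, with indicator readings [1, 2, 3, 10, 11] and crises in the last two periods, a threshold of 5 produces only correct warnings (noise-to-signal ratio of zero) and is preferred to a threshold of 0, which warns in every period.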
Big Banks and Macroeconomic Outcomes: Theory and Cross-Country Evidence of Granularity
Franziska Bremus, Claudia M. Buch, K. Russ, Monika Schnitzer
NBER Working Paper No. 19093,
2013
Abstract
Does the mere presence of big banks affect macroeconomic outcomes? In this paper, we develop a theory of granularity (Gabaix, 2011) for the banking sector, introducing Bertrand competition and heterogeneous banks charging variable markups. Using this framework, we show conditions under which idiosyncratic shocks to bank lending can generate aggregate fluctuations in the credit supply when the banking sector is highly concentrated. We empirically assess the relevance of these granular effects in banking using a linked micro-macro dataset covering more than 80 countries for the years 1995-2009. The banking sector in many countries is indeed granular, as the right tail of the bank size distribution follows a power law. We then demonstrate granular effects of the banking sector on macroeconomic outcomes: the presence of big banks, measured by high market concentration, is associated with a positive and significant relationship between bank-level credit growth and aggregate growth of credit or gross domestic product.
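A standard way to check whether the right tail of a size distribution follows a power law, as the paper reports for bank sizes, is the Hill estimator of the tail index. The following is a generic sketch of that estimator, not the authors' actual procedure; in the granularity literature a tail index near one (Zipf's law) is the case in which idiosyncratic shocks fail to average out:

```python
import math

def hill_estimator(sizes, k):
    """Hill estimator of the power-law tail index alpha in P(X > x) ~ x^(-alpha),
    computed from the k largest observations relative to the (k+1)-th largest."""
    x = sorted(sizes, reverse=True)
    threshold = x[k]  # the (k+1)-th largest observation serves as the threshold
    return k / sum(math.log(xi / threshold) for xi in x[:k])
```

In practice one would plot the estimate across several choices of k (a Hill plot) rather than rely on a single cutoff, since the estimator is sensitive to where the tail is assumed to begin.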
Financial Factors in Macroeconometric Models
Sebastian Giesen
Volkswirtschaft, Ökonomie, Shaker Verlag GmbH, Aachen,
2013
Abstract
Credit has long been identified as a key factor for economic development (see e.g. Wicksell (1898), Keynes (1931), Fisher (1933) and Minsky (1957, 1964)). Even before the financial crisis, most researchers and policy makers agreed that financial frictions play an important role in business cycles and that financial turmoil can result in severe economic downturns (see e.g. Mishkin (1978), Bernanke (1981, 1983), Diamond (1984), Calomiris (1993) and Bernanke and Gertler (1995)). In practice, however, researchers and policy makers mostly used simplified models for forecasting and simulation purposes. They often neglected the impact of financial frictions and emphasized other, non-financial frictions when analyzing business cycle fluctuations (prominent exceptions include Kiyotaki and Moore (1997), Bernanke, Gertler, and Gilchrist (1999) and Christiano, Motto, and Rostagno (2010)). This was largely because most economic downturns did not seem to be closely related to financial market failures (see Eichenbaum (2011)). The outbreak of the subprime crisis, which caused panic in financial markets and led to the default of Lehman Brothers in September 2008, then prompted a reconsideration of such macroeconomic frameworks (see Caballero (2010) and Trichet (2011)). To address the economic debate from a new perspective, it is therefore necessary to integrate the relevant frictions that help to explain what we have experienced in recent years.
In this thesis, I analyze different ways to incorporate relevant frictions and financial variables into macroeconometric models. I discuss the potential consequences for standard statistical inference and macroeconomic policy. I cover three different aspects in this work, each presenting an idea in a self-contained unit. The following paragraphs present more detail on the main topics covered.
The GVAR Handbook: Structure and Applications of a Macro Model of the Global Economy for Policy Analysis
Filippo di Mauro, M. Hashem Pesaran
Oxford University Press,
2013
Abstract
The recent crisis has shown yet again how the world's economies are globally interlinked via a complex net of transmission channels. When it comes to building econometric frameworks aimed at analysing such linkages, however, modellers face what is called the "curse of dimensionality": there are far too many parameters to be estimated relative to the available observations. The GVAR, a VAR-based model of the global economy, offers a solution to this problem. The basic model is composed of a large number of country-specific models comprising domestic, foreign and purely global variables. The foreign variables, however, are treated as weakly exogenous. This assumption, which typically holds when empirically tested for virtually all economies (with the notable exception of the US, which is treated differently), makes it possible to first estimate the individual country models separately. Only in a second stage are the country-specific models solved simultaneously, thus allowing for global interactions. This volume presents, for the first time in a compact and rather accessible format, the principles and structure of the basic GVAR model together with a number of its many applications and extensions developed in recent years by a growing literature. Its main objective is to show how powerful the model can be as a tool for forecasting and scenario analysis. The clear modelling structure of the GVAR appeals to policy makers and practitioners, as shown by its growing use among major institutions, as well as to econometricians, as shown by the main extensions and applications.
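The country-specific foreign ("starred") variables described above are, in the GVAR literature, typically constructed as weighted cross-country averages, with weights often based on trade shares. A minimal sketch of that weighting step only, with invented country values and weights (not taken from the book):

```python
def foreign_variables(data, weights):
    """Country-specific 'foreign' (starred) variables as weighted averages
    of partner-country values, a basic building block of a GVAR.

    data:    {country: value} for one variable in one period
    weights: {country: {partner: weight}}, each row summing to one"""
    return {i: sum(w * data[j] for j, w in weights[i].items())
            for i in data}
```

Each country's starred variable then enters that country's VARX* model as a weakly exogenous regressor, which is what permits estimating the country models one at a time before solving them jointly.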
Testing for Structural Breaks at Unknown Time: A Steeplechase
Makram El-Shagi, Sebastian Giesen
Computational Economics,
No. 1,
2013
Abstract
This paper analyzes the role of common data problems in identifying structural breaks in small samples. Most notably, we survey the small sample properties of the most commonly applied endogenous break tests, developed by Brown et al. (J R Stat Soc B 37:149–163, 1975) and Zeileis (Stat Pap 45(1):123–131, 2004), Nyblom (J Am Stat Assoc 84(405):223–230, 1989) and Hansen (J Policy Model 14(4):517–533, 1992), and Andrews et al. (J Econ 70(1):9–38, 1996). Power and size properties are derived using Monte Carlo simulations. We find that, in small samples, the Nyblom test is on par with the commonly used F-type tests in terms of power. While the Nyblom test's power decreases if the structural break occurs close to the margin of the sample, it proves far more robust to non-normal distributions of the error term, which are found to matter strongly in small samples although they are asymptotically irrelevant for all tests analyzed in this paper.
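Among the tests surveyed, the Andrews-type F tests search for a break at an unknown date by maximizing an F statistic over candidate break points. A minimal sketch for a single break in the mean, for illustration only; the critical values of such sup-F statistics are non-standard, which is one reason Monte Carlo evidence of the kind the paper provides matters:

```python
def sup_f_mean_break(y, trim=0.15):
    """Sup-F (Quandt-Andrews type) statistic for one break in the mean of y
    at an unknown date, searching over trimmed candidate break points."""
    n = len(y)
    mean_all = sum(y) / n
    ssr0 = sum((v - mean_all) ** 2 for v in y)          # restricted SSR
    best = 0.0
    for k in range(int(n * trim), int(n * (1 - trim))):
        y1, y2 = y[:k], y[k:]
        m1, m2 = sum(y1) / k, sum(y2) / (n - k)
        ssr1 = sum((v - m1) ** 2 for v in y1) + sum((v - m2) ** 2 for v in y2)
        if ssr1 == 0.0:                                 # degenerate perfect fit
            return float("inf")
        f = (ssr0 - ssr1) / (ssr1 / (n - 2))            # F with one restriction
        best = max(best, f)
    return best
```

On a series whose mean jumps mid-sample the statistic is large, while on a stable series every candidate split leaves the residual sum of squares essentially unchanged and the statistic stays near zero.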
Central Bank, Trade Unions, and Reputation – Is there Room for an Expansionist Manoeuvre in the European Union?
Toralf Pusch, A. Heise
A. Heise (ed.), Market Constellation Research: A Modern Governance Approach to Macroeconomic Policy. Institutionelle und Sozial-Ökonomie, Vol. 19,
2011
Abstract
The objective of this reader is manifold. On the one hand, it intends to establish a new perspective at the policy level, named 'market constellations': institutionally embedded systems of macroeconomic governance which are able to explain differences in growth and employment developments. At the polity level, the question raised is whether market constellations can be governed and, thus, whether institutions can be created which provide the incentives necessary for favourable market constellations.
An Evolutionary Algorithm for the Estimation of Threshold Vector Error Correction Models
Makram El-Shagi
International Economics and Economic Policy,
No. 4,
2011
Abstract
We develop an evolutionary algorithm to estimate threshold vector error correction models (TVECM) with more than two cointegrated variables. Since disregarding a threshold in cointegration models renders standard approaches to the estimation of the cointegration vectors inefficient, TVECM necessitate a simultaneous estimation of the cointegration vector(s) and the threshold. When only two cointegrated variables are considered, this is commonly achieved by a grid search. However, a grid search quickly becomes computationally infeasible if more than two variables are cointegrated. Therefore, the likelihood function has to be maximized using heuristic approaches. Depending on the precise problem structure, the evolutionary approach developed in the present paper saves 90 to 99 per cent of the computation time of a grid search.
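The mutation-and-selection loop underlying such evolutionary heuristics can be sketched generically as follows. This is not the paper's TVECM estimator: the objective there would be the model likelihood as a function of the cointegration vector(s) and threshold, whereas the test function in the usage note is invented:

```python
import random

def evolve(objective, bounds, pop_size=30, generations=200, seed=0):
    """Minimal elitist evolutionary search maximizing `objective` over a box."""
    rng = random.Random(seed)
    def rand_point():
        return [rng.uniform(lo, hi) for lo, hi in bounds]
    def mutate(p):
        # Gaussian mutation, clipped to the search box.
        return [min(hi, max(lo, x + rng.gauss(0.0, 0.1 * (hi - lo))))
                for x, (lo, hi) in zip(p, bounds)]
    pop = [rand_point() for _ in range(pop_size)]
    for _ in range(generations):
        children = [mutate(rng.choice(pop)) for _ in range(pop_size)]
        # Elitist selection: keep the best pop_size of parents and children.
        pop = sorted(pop + children, key=objective, reverse=True)[:pop_size]
    return pop[0]
```

For example, maximizing the invented objective -(x-1)^2 - (y+2)^2 over the box [-5, 5]^2 drives the population towards (1, -2) in a few hundred generations; the computational advantage over a grid search grows with the dimension, since the grid's cost is exponential in the number of parameters.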