Author

Marta Banbura

Bio: Marta Banbura is an academic researcher at the European Central Bank. Her research focuses on Bayesian vector autoregression and vector autoregression. She has an h-index of 16 and has co-authored 30 publications receiving 3,358 citations.

Papers
Journal ArticleDOI
TL;DR: In this article, the authors show that vector autoregression with Bayesian shrinkage is an appropriate tool for large dynamic models and that large VARs with shrinkage produce credible impulse responses and are suitable for structural analysis.
Abstract: This paper shows that vector autoregression (VAR) with Bayesian shrinkage is an appropriate tool for large dynamic models. We build on the results of De Mol and co-workers (2008) and show that, when the degree of shrinkage is set in relation to the cross-sectional dimension, the forecasting performance of small monetary VARs can be improved by adding additional macroeconomic variables and sectoral information. In addition, we show that large VARs with shrinkage produce credible impulse responses and are suitable for structural analysis. © 2009 John Wiley & Sons, Ltd.

813 citations
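The core idea of the paper above can be sketched in a few lines: a Gaussian prior centred on zero makes the posterior-mean VAR coefficients equivalent to ridge-regression estimates, with the shrinkage parameter tightened as the cross-section grows. This is a minimal illustrative sketch, not the authors' exact prior or implementation; the function name and the simulated data are assumptions.

```python
import numpy as np

def var_ridge(Y, p=1, lam=1.0):
    """Estimate a VAR(p) equation-by-equation with ridge shrinkage.

    A zero-centred Gaussian prior gives posterior-mean coefficients of
    ridge form; `lam` plays the role of the prior tightness and, in the
    spirit of the paper, would be scaled up with the number of series.
    """
    T, n = Y.shape
    # Stack lagged regressors [Y_{t-1}, ..., Y_{t-p}]
    X = np.hstack([Y[p - k - 1:T - k - 1] for k in range(p)])
    Yt = Y[p:]
    k = X.shape[1]
    # Ridge / posterior-mean coefficients: (X'X + lam*I)^{-1} X'Y
    B = np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ Yt)
    return B  # (n*p, n) coefficient matrix

# Hypothetical usage on simulated random-walk data
rng = np.random.default_rng(0)
Y = rng.standard_normal((200, 4)).cumsum(axis=0)
B = var_ridge(Y, p=1, lam=10.0)
```

Larger `lam` shrinks all coefficients toward zero, which is what disciplines the model when the cross-section is large.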


Journal ArticleDOI
TL;DR: The expectation maximization algorithm is modified in order to estimate the parameters of the dynamic factor model on a dataset with an arbitrary pattern of missing data and the model is extended to the case with a serially correlated idiosyncratic component.
Abstract: SUMMARY In this paper we modify the expectation maximization algorithm in order to estimate the parameters of the dynamic factor model on a dataset with an arbitrary pattern of missing data. We also extend the model to the case with a serially correlated idiosyncratic component. The framework allows us to handle efficiently and in an automatic manner sets of indicators characterized by different publication delays, frequencies and sample lengths. This can be relevant, for example, for young economies for which many indicators have been compiled only recently. We evaluate the methodology in a Monte Carlo experiment and we apply it to nowcasting of the euro area gross domestic product. Copyright © 2012 John Wiley & Sons, Ltd.

330 citations
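The missing-data idea in the paper above can be illustrated with a deliberately simplified EM-style iteration: impute missing entries from the current factor fit, re-extract factors by principal components, and repeat. The actual paper works with the full state-space form and the Kalman smoother; this sketch, with assumed names and simulated data, captures only the imputation logic.

```python
import numpy as np

def em_factors(X, r=1, n_iter=50):
    """EM-style factor extraction under an arbitrary pattern of missing data.

    Simplified sketch: alternate (E) imputing missing entries from the
    current common component and (M) re-estimating factors and loadings
    from the leading principal components of the completed panel.
    """
    X = np.asarray(X, dtype=float)
    mask = np.isnan(X)
    Z = np.where(mask, 0.0, X)  # initial fill with zeros (data assumed demeaned)
    for _ in range(n_iter):
        # M-step: factors from the leading principal components
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        F = U[:, :r] * s[:r]          # (T, r) factors
        L = Vt[:r]                    # (r, n) loadings
        common = F @ L
        # E-step: replace only the missing entries with fitted values
        Z = np.where(mask, common, X)
    return F, L, Z

# Illustrative use: a one-factor panel with roughly 10% missing entries
rng = np.random.default_rng(0)
T, n = 100, 10
truth = np.outer(rng.standard_normal(T), rng.standard_normal(n))
Xobs = truth + 0.1 * rng.standard_normal((T, n))
Xmiss = Xobs.copy()
Xmiss[rng.random((T, n)) < 0.1] = np.nan
F, L, Z = em_factors(Xmiss, r=1)
```

Because indicators with different publication delays and sample lengths are just particular patterns of missing observations, the same loop handles "ragged edges" and short samples automatically.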

Journal ArticleDOI
TL;DR: In this paper, the authors derive forecast weights and uncertainty measures for assessing the roles of individual series in a dynamic factor model (DFM) for forecasting the euro area GDP from monthly indicators.

327 citations

Posted Content
TL;DR: The authors survey recent developments in economic now-casting, with special focus on models that formalize how market participants and policy makers read macroeconomic data releases in real time: monitoring many data, forming expectations about them, and revising the assessment of the state of the economy whenever realizations diverge sizeably from those expectations.
Abstract: The term now-casting is a contraction of now and forecasting; it has long been used in meteorology and more recently in economics. In this paper we survey recent developments in economic now-casting with special focus on those models that formalize key features of how market participants and policy makers read macroeconomic data releases in real time, which involves: monitoring many data, forming expectations about them and revising the assessment of the state of the economy whenever realizations diverge sizeably from those expectations. (Prepared for G. Elliott and A. Timmermann, eds., Handbook of Economic Forecasting, Volume 2, Elsevier-North Holland)

279 citations
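The release-by-release logic described in the abstract can be caricatured in a few lines: each data release carries "news" (realization minus expectation), and the nowcast is revised in proportion to that news. In the surveyed models the expectations and weights come from a full state-space model; the function name and all numbers below are purely illustrative assumptions.

```python
def update_nowcast(nowcast, expected, released, weight):
    """Revise a nowcast by the weighted 'news' in a data release."""
    news = released - expected
    return nowcast + weight * news

# Purely illustrative: a GDP nowcast revised after an upside surprise
# in some monthly release, with a hypothetical news weight of 0.3.
gdp = update_nowcast(nowcast=0.5, expected=1.0, released=1.4, weight=0.3)
```

A release that exactly matches expectations carries no news and leaves the nowcast unchanged, which is the key property these models formalize.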


Cited by

Posted Content
TL;DR: In this paper, the authors provide a unified and comprehensive theory of structural time series models, including a detailed treatment of the Kalman filter for modeling economic and social time series, and address the special problems which the treatment of such series poses.
Abstract: In this book, Andrew Harvey sets out to provide a unified and comprehensive theory of structural time series models. Unlike the traditional ARIMA models, structural time series models consist explicitly of unobserved components, such as trends and seasonals, which have a direct interpretation. As a result the model selection methodology associated with structural models is much closer to econometric methodology. The link with econometrics is made even closer by the natural way in which the models can be extended to include explanatory variables and to cope with multivariate time series. From the technical point of view, state space models and the Kalman filter play a key role in the statistical treatment of structural time series models. The book includes a detailed treatment of the Kalman filter. This technique was originally developed in control engineering, but is becoming increasingly important in fields such as economics and operations research. This book is concerned primarily with modelling economic and social time series, and with addressing the special problems which the treatment of such series poses. The properties of the models and the methodological techniques used to select them are illustrated with various applications. These range from the modelling of trends and cycles in US macroeconomic time series to an evaluation of the effects of seat belt legislation in the UK.

4,252 citations
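The Kalman filter at the heart of the book above is easiest to see in the simplest structural model, the local level (random-walk-plus-noise) model. A minimal sketch with assumed variance parameters and a diffuse-style initialization:

```python
import numpy as np

def local_level_filter(y, sigma_eps2=1.0, sigma_eta2=0.1, a0=0.0, p0=1e7):
    """Kalman filter for the local level model:
        y_t  = mu_t + eps_t,      eps_t ~ N(0, sigma_eps2)
        mu_t = mu_{t-1} + eta_t,  eta_t ~ N(0, sigma_eta2)
    Returns filtered estimates of the unobserved level mu_t.
    """
    a, p = a0, p0  # initial state mean and (large) variance
    levels = []
    for yt in y:
        # Prediction: random-walk state, mean unchanged, variance grows
        p = p + sigma_eta2
        # Update: Kalman gain, then correct by the one-step forecast error
        k = p / (p + sigma_eps2)
        a = a + k * (yt - a)
        p = (1 - k) * p
        levels.append(a)
    return np.array(levels)

# Illustrative run on a constant series: the filter locks onto the level
filtered = local_level_filter(np.full(100, 5.0))
```

The gain `k` balances trust in the new observation against the current state estimate; as the state variance shrinks, so does the weight on incoming data.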

Journal ArticleDOI
TL;DR: In this article, the authors show that the common factors based on maximum likelihood are consistent for the size of the cross-section (n) and the sample size (T) going to infinity along any path of n and T and therefore maximum likelihood is viable for n large.
Abstract: Is maximum likelihood suitable for factor models in large cross-sections of time series? We answer this question from both an asymptotic and an empirical perspective. We show that estimates of the common factors based on maximum likelihood are consistent for the size of the cross-section (n) and the sample size (T) going to infinity along any path of n and T, and that therefore maximum likelihood is viable for large n. The estimator is robust to misspecification of the cross-sectional and time series correlation of the idiosyncratic components. In practice, the estimator can be easily implemented using the Kalman smoother and the EM algorithm as in traditional factor analysis.

497 citations

Journal ArticleDOI
TL;DR: In this article, the authors discuss VAR, factor augmented VARs, and time-varying parameter extensions and show how Bayesian inference proceeds for state space models, including Markov chain Monte Carlo (MCMC) methods.
Abstract: Macroeconomic practitioners frequently work with multivariate time series models such as VARs, factor augmented VARs as well as time-varying parameter versions of these models (including variants with multivariate stochastic volatility). These models have a large number of parameters and, thus, over-parameterization problems may arise. Bayesian methods have become increasingly popular as a way of overcoming these problems. In this monograph, we discuss VARs, factor augmented VARs and time-varying parameter extensions and show how Bayesian inference proceeds. Apart from the simplest of VARs, Bayesian inference requires the use of Markov chain Monte Carlo methods developed for state space models and we describe these algorithms. The focus is on the empirical macroeconomist and we offer advice on how to use these models and methods in practice and include empirical illustrations. A website provides Matlab code for carrying out Bayesian inference in these models.

497 citations

Posted Content
TL;DR: In this article, the authors consider Bayesian regression with normal and double-exponential priors as forecasting methods based on large panels of time series and show that these forecasts are highly correlated with principal component forecasts and that they perform equally well for a wide range of prior choices.
Abstract: This paper considers Bayesian regression with normal and double-exponential priors as forecasting methods based on large panels of time series. We show that, empirically, these forecasts are highly correlated with principal component forecasts and that they perform equally well for a wide range of prior choices. Moreover, we study the asymptotic properties of the Bayesian regression under Gaussian prior under the assumption that data are quasi collinear to establish a criterion for setting parameters in a large cross-section.

488 citations
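The claimed closeness between Bayesian (ridge-type) forecasts and principal-component forecasts is easy to reproduce on simulated data with a factor structure. This is a sketch under assumed dimensions and noise levels, not the paper's empirical exercise: fitted values from ridge regression and from a one-principal-component regression are compared on the same panel.

```python
import numpy as np

rng = np.random.default_rng(1)
T, n = 200, 50
f = rng.standard_normal(T)                  # single common factor
X = np.outer(f, rng.standard_normal(n)) + 0.5 * rng.standard_normal((T, n))
y = f + 0.3 * rng.standard_normal(T)        # target loads on the factor

# Bayesian regression with a Gaussian prior = ridge fitted values
lam = 10.0
b_ridge = np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)
yhat_ridge = X @ b_ridge

# One-principal-component regression fitted values
U, s, Vt = np.linalg.svd(X, full_matrices=False)
pc1 = U[:, 0] * s[0]
yhat_pc = pc1 * (pc1 @ y) / (pc1 @ pc1)

# Correlation between the two fitted series
corr = np.corrcoef(yhat_ridge, yhat_pc)[0, 1]
```

When the panel is dominated by a common factor, both methods end up projecting the target on essentially the same low-dimensional signal, which is the mechanism behind the high correlation the paper documents.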