Essays in real-time forecasting
Joëlle Liebermann
ULB Institutional Repository from ULB -- Université Libre de Bruxelles
Abstract:
This thesis contains three essays in the field of real-time econometrics, and more particularly
forecasting.
The issue of using data as they were available in real time to forecasters, policymakers or financial markets is an important one which has only recently been taken on board in the empirical literature. Data available and used in real time are preliminary and differ from ex-post revised data, and, given that data revisions may be quite substantial, using the latest available rather than the real-time data can substantially affect empirical findings (see, among others, Croushore's (2011) survey). Furthermore, as variables are released on different dates and with varying publication lags, real-time datasets are characterized by the so-called “ragged-edge” structure: if timely information is not to be disregarded, special econometric frameworks, such as the one developed by Giannone, Reichlin and Small (2008), must be used.
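To fix ideas, the sketch below (with hypothetical series names and publication lags, not the thesis data) builds a small monthly panel whose most recent rows contain missing values because the slower series have not yet been released; this is the “ragged-edge” pattern such frameworks are designed to handle.

```python
# A minimal sketch of a ragged-edge panel: each series has its own publication
# lag, so at the current date the latest observations of the slower series are
# still missing. Series names and lags are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
dates = pd.period_range("2010-01", "2012-06", freq="M")

# Publication lag in months for each (hypothetical) indicator:
# soft/survey data are timely, hard data arrive later.
publication_lag = {"ism_survey": 0, "employment": 1, "industrial_production": 2}

panel = pd.DataFrame(rng.standard_normal((len(dates), 3)),
                     index=dates, columns=list(publication_lag))

# Mask the observations that would not yet have been released as of the
# last date in the sample, producing the ragged edge.
for name, lag in publication_lag.items():
    if lag > 0:
        panel.iloc[-lag:, panel.columns.get_loc(name)] = np.nan

print(panel.tail())   # NaNs in the bottom-right corner: the "ragged edge"
```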
The first Chapter, “The impact of macroeconomic news on bond yields: (in)stabilities over time and relative importance”, studies the reaction of U.S. Treasury bond yields to the real-time, market-based news contained in the daily flow of macroeconomic releases, which provide most of the relevant information on their fundamentals, i.e. the state of the economy and inflation. We find that yields react systematically to a set of news consisting of the soft data, which have very short publication lags, and the most timely hard data, with the employment report being the most important release. However, sub-sample evidence reveals parameter instability in the absolute and relative size, as well as the significance, of the yields' response to news. In particular, the often-cited dominance of the employment report for markets has been evolving over time, as the size of the yield reaction to it was steadily increasing. Moreover, over the recent crisis period there has been an overall switch in the relative importance of soft and hard data compared to the pre-crisis period, with hard data becoming more important even though less timely, and the range of hard data to which markets react has broadened and become more balanced, i.e. less concentrated on the employment report. Markets have become more reactive to news over the recent crisis period, particularly to hard data. This is a consequence of the fact that in periods of high uncertainty (a bad state), markets crave information and attach a higher value to the marginal information content of these news releases.
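As an illustration of the kind of specification typically used in this literature (not necessarily the chapter's exact regression), daily yield changes can be regressed on standardized surprises, i.e. the released value minus the market expectation, scaled by the standard deviation of that difference; the sketch below uses simulated data and hypothetical series names.

```python
# A minimal event-study sketch: daily changes in a Treasury yield regressed on
# standardized macroeconomic surprises. All data are simulated and the series
# names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_days = 500

# Hypothetical surprises: actual release minus median survey expectation,
# standardized by the sample standard deviation of the surprise.
surprises = pd.DataFrame(
    rng.standard_normal((n_days, 3)),
    columns=["nonfarm_payrolls", "ism_manufacturing", "cpi"],
)
surprises = (surprises - surprises.mean()) / surprises.std()

# Daily change in the 10-year yield (basis points), simulated here.
dy = (4.0 * surprises["nonfarm_payrolls"]
      + 2.0 * surprises["ism_manufacturing"]
      + rng.standard_normal(n_days))

X = sm.add_constant(surprises)            # intercept plus standardized surprises
ols = sm.OLS(dy, X).fit(cov_type="HC1")   # heteroskedasticity-robust s.e.
print(ols.summary())                      # coefficients: yield response in bps
                                          # per one-std.-dev. surprise
```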
The second and third Chapters focus on the real-time ability of models to nowcast and forecast in a data-rich environment. They use an econometric framework that can deal with large panels with a “ragged-edge” structure and, to evaluate the models in real time, we constructed a database of vintages of US variables reproducing the exact information that was available to a real-time forecaster.
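The sketch below (a hypothetical helper, not the thesis code) illustrates the discipline such a database imposes: for any forecast date, only the vintage released on or before that date may be used.

```python
# A minimal sketch of real-time evaluation: given data vintages keyed by their
# release date, return the vintage a forecaster could actually have used on a
# given day. Function and variable names are hypothetical.
from datetime import date

import pandas as pd


def vintage_as_of(vintages: dict[date, pd.DataFrame],
                  forecast_date: date) -> pd.DataFrame:
    """Return the latest vintage released on or before forecast_date."""
    released = [d for d in vintages if d <= forecast_date]
    if not released:
        raise ValueError("no vintage had been released by this date")
    return vintages[max(released)]


# Usage: with vintages released on the 1st of each month, a forecast made on
# 2008-09-15 must rely on the 2008-09-01 vintage, not on later revisions.
vintages = {date(2008, m, 1): pd.DataFrame({"gdp_growth": [0.5, 0.4]})
            for m in (7, 8, 9, 10)}
print(vintage_as_of(vintages, date(2008, 9, 15)))
```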
The second Chapter, “Real-time nowcasting of GDP: a factor model versus professional forecasters”, performs a fully real-time nowcasting (forecasting) exercise of US real GDP growth using Giannone, Reichlin and Small's (2008), henceforth GRS, dynamic factor model (DFM) framework, which can handle the large unbalanced datasets available in real time. We track the daily evolution of the model's nowcasting performance throughout the current and next quarter. Similarly to GRS's pseudo real-time results, we find that the precision of the nowcasts increases with information releases. Moreover, the Survey of Professional Forecasters does not carry additional information with respect to the model, suggesting that the often-cited superiority of the former, attributable to judgment, is weak over our sample. As one moves forward along the real-time data flow, the continuous updating of the model provides a more precise estimate of current-quarter GDP growth and the Survey of Professional Forecasters becomes stale. These results are robust to the recent recession period.
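A flavour of such a setup is sketched below using statsmodels' DynamicFactorMQ, a mixed-frequency dynamic factor model in the spirit of GRS; the simulated data, series names and model settings are hypothetical stand-ins, not the chapter's specification.

```python
# A minimal sketch of a GRS-style nowcast with a mixed-frequency dynamic
# factor model. Data, series names and settings are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
idx_m = pd.period_range("2000-01", periods=240, freq="M")  # through 2019-12
idx_q = pd.period_range("2000Q1", periods=79, freq="Q")    # GDP released through 2019Q3

# Hypothetical monthly indicators with a ragged edge: the slower series have
# not yet been released for the most recent month.
monthly = pd.DataFrame(rng.standard_normal((240, 4)), index=idx_m,
                       columns=["ip", "employment", "ism", "retail_sales"])
monthly.iloc[-1, 2:] = np.nan
quarterly = pd.DataFrame(rng.standard_normal((79, 1)), index=idx_q,
                         columns=["gdp_growth"])

# One common factor estimated by EM; the Kalman filter/smoother handles the
# missing observations at the ragged edge and the mixed frequencies.
model = sm.tsa.DynamicFactorMQ(monthly, endog_quarterly=quarterly,
                               factors=1, factor_orders=2,
                               idiosyncratic_ar1=True)
res = model.fit()

# Current-quarter (2019Q4) GDP growth has not been released, so the model-
# implied value for the last month of that quarter is read as the nowcast.
nowcast = res.get_prediction(start="2019-10", end="2019-12").predicted_mean
print(nowcast["gdp_growth"].iloc[-1])
```

As new releases arrive during the quarter, re-estimating or updating such a model on each day's information set traces out the daily evolution of the nowcast described above.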
The last Chapter, “Real-time forecasting in a data-rich environment”, evaluates the ability of different models to forecast key real and nominal U.S. monthly macroeconomic variables in a data-rich environment and from the perspective of a real-time forecaster. Among the approaches used to forecast in a data-rich environment, we consider the pooling of bi-variate forecasts, which is an indirect way of exploiting a large cross-section, and the direct pooling of information using a high-dimensional model (a DFM and a Bayesian VAR). Furthermore, forecast combination schemes are used to overcome the model-specification choices faced by the practitioner (e.g. which criteria to use to select the parametrization of the model), as we seek evidence on model performance that is robust across specifications and combination schemes. Our findings show that predictability of the real variables is confined to the recent recession/crisis period. This is in line with the findings of D'Agostino and Giannone (2012) over an earlier period, namely that gains in the relative performance of models using large datasets over univariate models are driven by downturn periods, which are characterized by higher comovements. These results are robust to the combination schemes or models used. A point worth mentioning is that, for nowcasting GDP, exploiting cross-sectional information along the real-time data flow also helps over the end of the Great Moderation period. Since GDP is a quarterly aggregate proxying the state of the economy, monthly variables carry information content for it. But, similarly to the findings for the monthly variables, predictability, as measured by the gains relative to the naive random walk model, is higher during crisis/recession periods than during tranquil times. Regarding inflation, results are stable across time, but predictability is mainly found at the nowcasting and one-month-ahead horizons, with the BVAR standing out at nowcasting. The results show that the forecasting gains at these short horizons stem mainly from exploiting timely information. The results also show that the direct pooling of information using a high-dimensional model (DFM or BVAR), which takes into account the cross-correlation between the variables and efficiently deals with the “ragged-edge” structure of the dataset, yields more accurate forecasts than the indirect pooling of bi-variate forecasts/models.
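The sketch below (simulated data, hypothetical dimensions) illustrates two of the ingredients just described: an equal-weight combination of many bivariate forecasts as the indirect way of pooling information, and predictability measured as the RMSE ratio relative to a naive random-walk benchmark, where a ratio below one indicates a gain.

```python
# A minimal sketch of forecast combination and of relative-RMSE evaluation
# against a random-walk benchmark. All data are simulated.
import numpy as np

rng = np.random.default_rng(2)
T = 120
target = np.cumsum(rng.standard_normal(T))            # hypothetical monthly series

# Out-of-sample forecasts from many bivariate models (one per predictor),
# simulated here as the truth plus model-specific noise.
bivariate_forecasts = target + rng.standard_normal((30, T))
combined = bivariate_forecasts.mean(axis=0)           # equal-weight combination

rw = np.roll(target, 1)                               # random walk: last observed value
rw[0] = target[0]

def rmse(f, y):
    return np.sqrt(np.mean((f - y) ** 2))

# Ratio < 1 means the combination beats the naive benchmark.
print("relative RMSE vs random walk:", rmse(combined, target) / rmse(rw, target))
```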
Keywords: Macroeconomics -- Mathematical models; Bond market; Econometrics; Economic forecasting; Macroéconomie -- Modèles mathématiques; Marché obligataire; Econométrie; Prévision économique; BVAR; forecasting; nowcasting; real-time; factor model; news
Pages: 1 v. (x, 125 p.)
Date: 2012-09-12
Note: Degree: Doctorat en Sciences économiques et de gestion
Downloads:
https://dipot.ulb.ac.be/dspace/bitstream/2013/2096 ... a71-bfe3e3dcf6ad.txt Complete work or part of the work (application/pdf)
https://dipot.ulb.ac.be/dspace/bitstream/2013/2096 ... be3-1d408b88707d.txt These_Liebermann_ToC (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:ulb:ulbeco:2013/209644
Ordering information: This working paper can be ordered from
http://hdl.handle.ne ... lb.ac.be:2013/209644