
Smoothing-based Initialization for Learning-to-Forecast Algorithms

Michele Berardi and Jaqueson Galimberti

No 17-425, KOF Working papers from KOF Swiss Economic Institute, ETH Zurich

Abstract: Under adaptive learning, recursive algorithms are proposed to represent how agents update their beliefs over time. For applied purposes these algorithms require initial estimates of agents' perceived law of motion. Obtaining appropriate initial estimates can become prohibitive within the usual data availability restrictions of macroeconomics. To circumvent this issue we propose a new smoothing-based initialization routine that optimizes the use of a training sample of data to obtain initial estimates consistent with the statistical properties of the learning algorithm. Our method is formulated generically to cover different specifications of the learning mechanism, such as the Least Squares and Stochastic Gradient algorithms. Using simulations, we show that our method speeds up the convergence of initial estimates at the cost of a higher computational burden.
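As a rough illustration of the kind of recursion the abstract refers to, the Python sketch below implements a constant-gain recursive least squares (RLS) learning update and a hypothetical smoothing-style initialization that runs the recursion forward and backward over a training sample and averages the two passes. The function names (rls_update, smoothed_initials), the gain value, and the forward/backward averaging are assumptions made for illustration only, not the authors' actual routine.

    import numpy as np

    def rls_update(phi, R, x, y, gain):
        """One constant-gain recursive least squares step.

        phi : current belief (coefficient) vector
        R   : current second-moment matrix of the regressors
        x   : regressor vector at time t
        y   : observed outcome at time t
        """
        R = R + gain * (np.outer(x, x) - R)                    # update moment matrix
        phi = phi + gain * np.linalg.solve(R, x) * (y - x @ phi)  # update beliefs
        return phi, R

    def smoothed_initials(X, Y, gain=0.05):
        """Illustrative smoothing-based initialization (hypothetical):
        run the learning recursion forward and backward over a training
        sample (X, Y) and average the two passes to form initial estimates."""
        k = X.shape[1]
        phi_f, R_f = np.zeros(k), np.eye(k)
        for x, y in zip(X, Y):                  # forward (filtering) pass
            phi_f, R_f = rls_update(phi_f, R_f, x, y, gain)
        phi_b, R_b = np.zeros(k), np.eye(k)
        for x, y in zip(X[::-1], Y[::-1]):      # backward pass over the same sample
            phi_b, R_b = rls_update(phi_b, R_b, x, y, gain)
        return 0.5 * (phi_f + phi_b), 0.5 * (R_f + R_b)

Replacing the belief update with phi + gain * x * (y - x @ phi) would give the Stochastic Gradient variant mentioned in the abstract, which drops the moment matrix from the update.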

Pages: 16 pages
Date: 2017-01
New Economics Papers: this item is included in nep-cmp
Citations: View citations in EconPapers (1)

Downloads: (external link)
http://dx.doi.org/10.3929/ethz-a-010820132 (application/pdf)

Related works:
Journal Article: SMOOTHING-BASED INITIALIZATION FOR LEARNING-TO-FORECAST ALGORITHMS (2019) Downloads
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:kof:wpskof:17-425

Access Statistics for this paper

More papers in KOF Working papers from KOF Swiss Economic Institute, ETH Zurich Contact information at EDIRC.
Bibliographic data for series maintained by ().

Handle: RePEc:kof:wpskof:17-425