Model Averaging Estimation Method by Kullback–Leibler Divergence for Multiplicative Error Model
Wanbo Lu,
Wenhui Shi and
Danilo Comminiello
Complexity, 2022, vol. 2022, 1-13
Abstract:
In this paper, we propose a model averaging estimation method for the multiplicative error model and construct the corresponding weight-choosing criterion based on the Kullback–Leibler divergence with a hyperparameter to avoid overfitting. The resulting model averaging estimator is proved to be asymptotically optimal. We show that the Kullback–Leibler model averaging (KLMA) estimator asymptotically minimizes the in-sample Kullback–Leibler divergence and improves out-of-sample forecast accuracy even under different loss functions. In simulations, the KLMA estimator compares favorably with the smoothed-AIC (SAIC), smoothed-BIC (SBIC), and Mallows model averaging (MMA) estimators, especially when nonlinear noise is added to the data-generating process. Empirical applications to the daily range of the S&P 500 and the price duration of IBM show that the out-of-sample forecasting capacity of the KLMA estimator is better than that of the other methods.
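To illustrate the idea described in the abstract, the sketch below shows one way model-averaging weights for a multiplicative error model (MEM) could be chosen by minimizing a Kullback–Leibler-type criterion over the weight simplex, with a penalty governed by a hyperparameter. This is a minimal illustration, not the paper's actual criterion: the exponential quasi-likelihood loss, the penalty form, the hyperparameter lam, and the toy candidate fits are all assumptions made here for demonstration.

import numpy as np
from scipy.optimize import minimize

def kl_criterion(w, x, mu_candidates, k_params, lam=2.0):
    """KL-type (exponential quasi-likelihood) loss of the weighted conditional
    mean plus a complexity penalty; lam is a hypothetical hyperparameter."""
    mu_w = mu_candidates @ w                       # averaged conditional mean
    loss = np.mean(np.log(mu_w) + x / mu_w)        # in-sample KL-type divergence
    penalty = lam * (w @ k_params) / len(x)        # penalize effective model size
    return loss + penalty

def klma_weights(x, mu_candidates, k_params, lam=2.0):
    """Minimize the criterion over the simplex {w >= 0, sum(w) = 1}."""
    m = mu_candidates.shape[1]
    w0 = np.full(m, 1.0 / m)
    cons = {"type": "eq", "fun": lambda w: w.sum() - 1.0}
    bounds = [(0.0, 1.0)] * m
    res = minimize(kl_criterion, w0, args=(x, mu_candidates, k_params, lam),
                   bounds=bounds, constraints=cons)
    return res.x

# Toy usage: x holds positive observations (e.g., daily ranges); each column of
# mu_candidates is the fitted conditional mean from one candidate MEM.
rng = np.random.default_rng(0)
x = rng.gamma(shape=2.0, scale=0.5, size=500)
mu_candidates = np.column_stack([np.full(500, x.mean()),
                                 np.convolve(x, np.ones(5) / 5, mode="same")])
k_params = np.array([1, 5])                        # assumed parameter counts
print(klma_weights(x, mu_candidates, k_params))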
Date: 2022
Downloads: (external link)
http://downloads.hindawi.com/journals/complexity/2022/7706992.pdf (application/pdf)
http://downloads.hindawi.com/journals/complexity/2022/7706992.xml (application/xml)
Persistent link: https://EconPapers.repec.org/RePEc:hin:complx:7706992
DOI: 10.1155/2022/7706992