EconPapers

Training trees on tails with applications to portfolio choice

Guillaume Coqueret () and Tony Guida
Additional contact information
Guillaume Coqueret: EM - EMLyon Business School
Tony Guida: RAM Alternative Investments

Post-Print from HAL

Abstract: In this article, we investigate the impact of truncating training data when fitting regression trees. We argue that training times can be curtailed by reducing the training sample without any loss in out-of-sample accuracy, as long as the prediction model has been trained on the tails of the dependent variable, that is, when 'average' observations have been discarded from the training sample. Filtering instances has an impact on the features that are selected to yield the splits and can help reduce overfitting by favoring predictors with monotonic impacts on the dependent variable. We test this technique in an out-of-sample portfolio selection exercise, which demonstrates its benefits. The implications of our results are decisive for time-consuming tasks such as hyperparameter tuning and validation.
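The truncation step described in the abstract — discarding 'average' observations and keeping only the tails of the dependent variable — can be sketched as follows. This is a minimal illustration of the general idea, not the authors' code; the function name `tail_filter`, the quantile parameter `q`, and the synthetic data are all assumptions for the example.

```python
import numpy as np

def tail_filter(X, y, q=0.2):
    """Keep only observations whose dependent variable y lies in the
    lower q-quantile or upper (1 - q)-quantile tail.
    Hypothetical helper illustrating the truncation step; the reduced
    sample (X_t, y_t) would then be used to train a regression tree."""
    lo, hi = np.quantile(y, [q, 1.0 - q])
    mask = (y <= lo) | (y >= hi)  # drop the 'average' middle observations
    return X[mask], y[mask]

# Synthetic data standing in for a feature matrix and a return-like target
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = X[:, 0] + 0.1 * rng.normal(size=1000)

X_t, y_t = tail_filter(X, y, q=0.2)
print(len(y_t), "of", len(y), "observations retained")
```

With `q=0.2`, roughly 40% of the sample is retained (the bottom and top quintiles of `y`), which is what shortens training times for downstream tasks such as hyperparameter tuning.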

Date: 2020-05
Note: View the original document on HAL open archive server: https://hal.science/hal-04144665v1

Published in Annals of Operations Research, 2020, 288 (1), pp.181-221. ⟨10.1007/s10479-020-03539-2⟩

Downloads: (external link)
https://hal.science/hal-04144665v1/document (application/pdf)


Persistent link: https://EconPapers.repec.org/RePEc:hal:journl:hal-04144665

DOI: 10.1007/s10479-020-03539-2

More papers in Post-Print from HAL
Bibliographic data for series maintained by CCSD ().

 
Page updated 2025-03-19
Handle: RePEc:hal:journl:hal-04144665