Response versus gradient boosting trees, GLMs and neural networks under Tweedie loss and log-link

Donatien Hainaut, Julien Trufin and Michel Denuit

Scandinavian Actuarial Journal, 2022, vol. 2022, issue 10, 841-866

Abstract: Thanks to its outstanding performance, boosting has rapidly gained wide acceptance among actuaries. To speed up calculations, boosting is often applied to gradients of the loss function rather than to the responses themselves (hence the name gradient boosting). When the model is trained by minimizing the Poisson deviance, this amounts to applying the least-squares principle to raw residuals. This exposes gradient boosting to the same problems that led to the replacement of least squares with Poisson Generalized Linear Models (GLMs) for analysing low counts (typically, the number of reported claims at policy level in personal lines). This paper shows that boosting can be conducted directly on the responses under the Tweedie loss function and log-link, by adapting the weights at each step. Numerical illustrations demonstrate similar or better performance compared to gradient boosting when trees are used as weak learners, together with a higher level of transparency since responses are used instead of gradients.
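The core idea described in the abstract, boosting directly on the responses under a Tweedie loss with log-link while re-weighting the observations at each step, can be illustrated in the Poisson case (Tweedie power p = 1). The sketch below is one possible reading of that idea, not the authors' implementation: the simulated claim-count data, the use of scikit-learn regression trees with the Poisson splitting criterion as weak learners, and the learning-rate choice are all assumptions made for this example.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Hypothetical portfolio: three rating factors and Poisson claim counts.
rng = np.random.default_rng(0)
n = 5000
X = rng.normal(size=(n, 3))
true_mu = np.exp(0.2 * X[:, 0] - 0.3 * X[:, 1])
y = rng.poisson(true_mu)

n_rounds, max_depth, lr = 50, 2, 0.5   # illustrative hyperparameters
mu = np.full(n, y.mean())              # start from the overall claim frequency
trees = []

for m in range(n_rounds):
    # Boost on the responses: the tree targets are the raw counts expressed
    # relative to the current prediction, with weights mu.  Under the Poisson
    # deviance and log-link, this weighted fit is equivalent to fitting y
    # with offset log(mu), so no gradients (raw residuals) are needed.
    ratio = y / np.clip(mu, 1e-12, None)
    tree = DecisionTreeRegressor(criterion="poisson", max_depth=max_depth)
    tree.fit(X, ratio, sample_weight=mu)
    trees.append(tree)
    # Multiplicative update, i.e. an additive update of the score on the
    # log-link scale, shrunk by the learning rate.
    mu *= tree.predict(X) ** lr

def predict(X_new, trees, base, lr):
    """Score new policies by compounding the multiplicative tree updates."""
    mu_new = np.full(X_new.shape[0], float(base))
    for tree in trees:
        mu_new *= tree.predict(X_new) ** lr
    return mu_new

For a general Tweedie power p, the same offset/weight equivalence holds with weights mu**(2 - p) instead of mu, which is the sense in which the weights are "adapted at each step"; the exact weak learners and weighting scheme studied in the paper may of course differ from this sketch.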

Date: 2022
Citations: 1 (as listed in EconPapers)

Downloads: (external link)
http://hdl.handle.net/10.1080/03461238.2022.2037016 (text/html)
Access to full text is restricted to subscribers.



Persistent link: https://EconPapers.repec.org/RePEc:taf:sactxx:v:2022:y:2022:i:10:p:841-866

Ordering information: This journal article can be ordered from
http://www.tandfonline.com/pricing/journal/sact20

DOI: 10.1080/03461238.2022.2037016


Scandinavian Actuarial Journal is currently edited by Boualem Djehiche


 
Handle: RePEc:taf:sactxx:v:2022:y:2022:i:10:p:841-866