EconPapers    
Deep Learning Diagnostics ‒ How to Avoid Being Fooled by TensorFlow, PyTorch, or MXNet with the Help of Modern Econometrics, vol 24

Frank Lehrbass

in ifes Schriftenreihe from FOM Hochschule für Oekonomie & Management, ifes Institut für Empirie & Statistik

Abstract: Training a Multi-Layer Perceptron (MLP) to achieve a minimum level of mean squared error (MSE) is akin to performing Non-Linear Regression (NLR). We therefore draw on established econometric theory and the corresponding tools in R. Only if certain assumptions about the error term in the Data Generating Process hold can the trained MLP be regarded as a consistent estimator. Verifying these assumptions requires careful diagnostics. Using controlled experiments, we show that even in an ideal setting an MLP may fail to learn a relationship where NLR succeeds. We also illustrate how the MLP is outperformed by Non-Linear Quantile Regression in the presence of outliers. A third situation in which the MLP is often led astray is when there is no relationship at all, yet the MLP still "learns" one, producing high levels of R². We show that this trap of spurious learning can only be avoided with the help of diagnostics.
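The spurious-learning trap described above can be illustrated with a minimal sketch. The book works in R; the following is a hedged Python analogue (not the authors' code) of the classic spurious-regression setup: two independent random walks fitted by OLS. The levels regression can report a flattering R² even though no relationship exists, while a Durbin-Watson statistic near 0 flags heavily autocorrelated residuals; re-running the regression on first differences makes the apparent fit vanish.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000

# Two independent random walks: no true relationship between x and y.
x = np.cumsum(rng.standard_normal(n))
y = np.cumsum(rng.standard_normal(n))

def ols_r2_dw(x, y):
    """OLS of y on a constant and x; return R-squared and the Durbin-Watson statistic."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    dw = np.sum(np.diff(resid) ** 2) / (resid @ resid)
    return r2, dw

# Levels regression: the fit may look convincing, but DW near 0
# reveals strongly autocorrelated residuals -- a spurious fit.
r2_lvl, dw_lvl = ols_r2_dw(x, y)

# Diagnostic check on first differences: the apparent relationship
# disappears (R-squared near 0) and DW returns to about 2.
r2_dif, dw_dif = ols_r2_dw(np.diff(x), np.diff(y))

print(f"levels:      R2 = {r2_lvl:.3f}, DW = {dw_lvl:.3f}")
print(f"differences: R2 = {r2_dif:.3f}, DW = {dw_dif:.3f}")
```

The same diagnostic logic carries over to a trained MLP: inspect the residuals, not just the headline R².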

Date: 2021

Downloads: (external link)
https://www.econstor.eu/bitstream/10419/249987/1/FOM-ifes-Bd24.pdf (application/pdf)



Persistent link: https://EconPapers.repec.org/RePEc:zbw:fomies:24



 
Handle: RePEc:zbw:fomies:24