SHAP Informed Neural Network

Jarrod Graham and Victor S. Sheng
Additional contact information
Jarrod Graham: Department of Computer Science, College of Engineering, Texas Tech University, Lubbock, TX 79409, USA
Victor S. Sheng: Department of Computer Science, College of Engineering, Texas Tech University, Lubbock, TX 79409, USA

Mathematics, 2025, vol. 13, issue 5, 1-22

Abstract: In the context of neural network optimization, this study explores the performance and computational efficiency of learning rate adjustment strategies applied with the Adam and SGD optimizers. The methods evaluated include exponential annealing, step decay, and SHAP-informed adjustments across three datasets: Breast Cancer, Diabetes, and California Housing. The SHAP-informed adjustments integrate feature importance metrics derived from cooperative game theory, either scaling the global learning rate or directly modifying the gradients of first-layer parameters. A comprehensive grid search was conducted to optimize the hyperparameters, and performance was assessed using metrics such as test loss, RMSE, R² score, accuracy, and training time. Results revealed that while step decay consistently delivered strong performance across datasets, SHAP-informed methods often demonstrated even higher accuracy and generalization; for example, SHAP achieved the lowest test loss and RMSE on the California Housing dataset. However, the computational overhead of SHAP-based approaches was significant, particularly for targeted gradient adjustments. This study highlights the potential of SHAP-informed methods to guide optimization through feature-level insights, offering advantages on datasets with complex feature interactions. Despite computational challenges, these methods provide a foundation for exploring how feature importance can inform neural network training, presenting promising directions for future research on scalable and efficient optimization techniques.
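The second SHAP-informed variant described in the abstract, rescaling first-layer gradients by feature importance before the optimizer step, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the per-feature importances (in practice, mean absolute SHAP values) and all variable names here are hypothetical.

```python
import numpy as np

def shap_scaled_update(W1, grad_W1, base_lr, shap_importance):
    """One SGD step on the first-layer weights, with each input feature's
    gradient row rescaled by its (hypothetical) mean |SHAP| importance.

    W1, grad_W1: arrays of shape (n_features, n_hidden).
    shap_importance: array of shape (n_features,), assumed positive.
    """
    # Normalize importances to mean 1 so the average step size is preserved;
    # influential features get larger updates, unimportant ones smaller.
    scale = shap_importance / shap_importance.mean()
    return W1 - base_lr * scale[:, None] * grad_W1

# Toy example: 3 input features, 2 hidden units.
W1 = np.zeros((3, 2))
grad = np.ones((3, 2))
imp = np.array([0.1, 0.2, 0.3])  # hypothetical mean |SHAP| per feature
W1_new = shap_scaled_update(W1, grad, base_lr=0.1, shap_importance=imp)
```

The global-learning-rate variant mentioned alongside it would instead collapse `shap_importance` to a single scalar (e.g., its mean or sum) and multiply the shared learning rate, leaving individual gradients untouched.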

Keywords: SHAP; Adam optimizer; learning rate adjustments; neural networks; grid search; performance evaluation
JEL-codes: C
Date: 2025

Downloads: (external link)
https://www.mdpi.com/2227-7390/13/5/849/pdf (application/pdf)
https://www.mdpi.com/2227-7390/13/5/849/ (text/html)



Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:13:y:2025:i:5:p:849-:d:1605023


Mathematics is currently edited by Ms. Emma He


 
Page updated 2025-04-05
Handle: RePEc:gam:jmathe:v:13:y:2025:i:5:p:849-:d:1605023