Interpretable Diagnostics with SHAP-Rule: Fuzzy Linguistic Explanations from SHAP Values

Alexandra I. Khalyasmaa, Pavel V. Matrenin and Stanislav A. Eroshenko
Additional contact information
Alexandra I. Khalyasmaa: Ural Power Engineering Institute, Ural Federal University Named After the First President of Russia B.N. Yeltsin, Ekaterinburg 620062, Russia
Pavel V. Matrenin: Ural Power Engineering Institute, Ural Federal University Named After the First President of Russia B.N. Yeltsin, Ekaterinburg 620062, Russia
Stanislav A. Eroshenko: Ural Power Engineering Institute, Ural Federal University Named After the First President of Russia B.N. Yeltsin, Ekaterinburg 620062, Russia

Mathematics, 2025, vol. 13, issue 20, 1-20

Abstract: This study introduces SHAP-Rule, a novel explainable artificial intelligence method that integrates Shapley additive explanations (SHAP) with fuzzy logic to automatically generate interpretable linguistic IF-THEN rules for diagnostic tasks. Unlike purely numeric SHAP vectors, which are difficult for decision-makers to interpret, SHAP-Rule translates feature attributions into concise, human-readable explanations. The method was evaluated against baseline SHAP and AnchorTabular explanations on three representative datasets: the CWRU Bearing dataset for industrial predictive maintenance, a power-transformer failure-analysis dataset, and the medical Pima Indians Diabetes dataset. Experimental results demonstrated that SHAP-Rule consistently provided clearer and more easily comprehensible explanations, achieving high expert ratings for simplicity and understandability. SHAP-Rule also exhibited superior computational efficiency and robust consistency compared to the alternative methods, making it particularly suitable for real-time diagnostic applications. Although SHAP-Rule showed minor trade-offs in coverage, it maintained high global fidelity, often approaching 100%. These findings highlight the practical advantages of linguistic fuzzy explanations and SHAP-Rule's potential for enhancing the interpretability, efficiency, and reliability of diagnostic decision-support systems.
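To make the idea concrete, the following is a minimal, illustrative Python sketch of the general approach the abstract describes: fuzzifying precomputed SHAP attributions into a linguistic IF-THEN rule. The membership breakpoints, linguistic terms, rule template, and example feature names are assumptions for illustration only, not the authors' SHAP-Rule specification.

```python
# Illustrative sketch only: turning precomputed SHAP attributions into a
# linguistic IF-THEN rule. Breakpoints, terms, and the rule template are
# assumptions, not the paper's SHAP-Rule algorithm.

def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(impact):
    """Map a normalized |SHAP| impact in [0, 1] to its best linguistic term."""
    terms = {
        "weak": tri(impact, -0.01, 0.0, 0.5),
        "moderate": tri(impact, 0.0, 0.5, 1.0),
        "strong": tri(impact, 0.5, 1.0, 1.01),
    }
    return max(terms, key=terms.get)

def shap_to_rule(shap_values, prediction, top_k=3):
    """Build one IF-THEN rule from the top-k attributions by magnitude."""
    max_abs = max(abs(v) for v in shap_values.values()) or 1.0
    ranked = sorted(shap_values.items(), key=lambda kv: -abs(kv[1]))[:top_k]
    clauses = [
        f"{name} has a {fuzzify(abs(v) / max_abs)} "
        f"{'positive' if v > 0 else 'negative'} influence"
        for name, v in ranked
    ]
    return f"IF {' AND '.join(clauses)} THEN diagnosis = {prediction}"

# Hypothetical SHAP values for one transformer-diagnostics instance.
shap_vals = {"H2_ppm": 0.42, "C2H2_ppm": 0.31, "oil_temp": -0.05, "load": 0.02}
print(shap_to_rule(shap_vals, "incipient fault"))
# IF H2_ppm has a strong positive influence AND C2H2_ppm has a moderate
# positive influence AND oil_temp has a weak negative influence
# THEN diagnosis = incipient fault
```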

Keywords: explainable artificial intelligence; Shapley additive explanations; fuzzy rules; linguistic variables; diagnostics
JEL-codes: C
Date: 2025

Downloads: (external link)
https://www.mdpi.com/2227-7390/13/20/3355/pdf (application/pdf)
https://www.mdpi.com/2227-7390/13/20/3355/ (text/html)

Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:13:y:2025:i:20:p:3355-:d:1776342

Mathematics is currently edited by Ms. Emma He

More articles in Mathematics from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.

Handle: RePEc:gam:jmathe:v:13:y:2025:i:20:p:3355-:d:1776342