How Explainable AI Methods Support Data-Driven Decision-Making

Dominik Stoffels, Susanne Grabl, Thomas Fischer and Marina Fiedler
Additional contact information
Dominik Stoffels: University of Passau
Susanne Grabl: University of Passau
Thomas Fischer: University of Passau
Marina Fiedler: University of Passau

A chapter in Conceptualizing Digital Responsibility for the Information Age, 2025, pp 325-340 from Springer

Abstract: Explainable AI (XAI) holds great potential to reveal the patterns in black-box AI models and to support data-driven decision-making. We apply four post-hoc explanatory methods to demonstrate their capabilities for data-driven decision-making, using the illustrative example of unwanted job turnover and human resource management (HRM) support. We show that XAI can be a useful aid in data-driven decision-making, but we also highlight potential drawbacks and limitations of which users in research and practice should be aware.

Keywords: Explainable AI; Machine learning; Data-driven decision-making
Date: 2025

There are no downloads for this item; see the EconPapers FAQ for hints about obtaining it.

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:spr:lnichp:978-3-031-80119-8_21

Ordering information: This item can be ordered from
http://www.springer.com/9783031801198

DOI: 10.1007/978-3-031-80119-8_21

More chapters in Lecture Notes in Information Systems and Organization from Springer
Bibliographic data for this series is maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Handle: RePEc:spr:lnichp:978-3-031-80119-8_21