EconPapers    
Economics at your fingertips  
 

A Note on the Interpretability of Machine Learning Algorithms

Dominique Guegan
Additional contact information
Dominique Guegan: Université Panthéon-Sorbonne, Centre d'économie de la Sorbonne (CES, UP1/CNRS); University of Ca' Foscari, Venice, Italy

Post-Print from HAL

Abstract: We are interested in the analysis of the concept of interpretability associated with an ML algorithm. We distinguish between the "How", i.e., how a black box or a very complex algorithm works, and the "Why", i.e., why an algorithm produces a given result. These questions matter to many actors: users, professionals, and regulators, among others. Using a formal, standardized framework, we indicate the solutions that exist, specifying which elements of the supply chain are affected when we answer the previous questions. By standardizing the notation, this presentation allows the different approaches to be compared and highlights the specificities of each of them: both their objective and their process. The study is not exhaustive and the subject is far from closed.
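The abstract's "How" question is the one addressed by agnostic, local-surrogate methods such as LIME. As a minimal sketch (not the paper's own framework), the idea can be illustrated in pure numpy: perturb an input, query the black box, and fit a proximity-weighted linear model whose slopes serve as a local explanation. The function `black_box` below is a hypothetical stand-in for an opaque model.

```python
import numpy as np

def black_box(X):
    # Hypothetical opaque model: a nonlinear function of two features.
    return np.sin(X[:, 0]) + X[:, 1] ** 2

def local_surrogate(f, x0, n_samples=5000, scale=0.1, seed=0):
    """Fit a proximity-weighted linear model to f around x0 (LIME-style sketch)."""
    rng = np.random.default_rng(seed)
    # Sample perturbations of x0 and query the black box on them.
    Z = x0 + rng.normal(scale=scale, size=(n_samples, x0.size))
    y = f(Z)
    # Proximity kernel: samples closer to x0 get larger weights.
    w = np.exp(-np.sum((Z - x0) ** 2, axis=1) / (2 * scale ** 2))
    # Weighted least squares with an intercept, via sqrt-weight rescaling.
    A = np.hstack([np.ones((n_samples, 1)), Z - x0])
    W = np.sqrt(w)[:, None]
    coef, *_ = np.linalg.lstsq(A * W, y * W.ravel(), rcond=None)
    return coef  # [intercept, slope_1, slope_2]: the local explanation

x0 = np.array([0.5, 1.0])
coef = local_surrogate(black_box, x0)
# The fitted slopes approximate the local gradient of the black box at x0.
```

Here the slopes recover roughly (cos(0.5), 2), the gradient of the stand-in model at x0, which is what a faithful local surrogate should report.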

Keywords: Interpretability; Counterfactual approach; Artificial Intelligence; Agnostic models; LIME method; Machine learning
Date: 2020-07
New Economics Papers: this item is included in nep-big and nep-cmp
Note: View the original document on HAL open archive server: https://halshs.archives-ouvertes.fr/halshs-02900929

Published in 2020

Downloads: (external link)
https://halshs.archives-ouvertes.fr/halshs-02900929/document (application/pdf)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:hal:journl:halshs-02900929



 
Handle: RePEc:hal:journl:halshs-02900929