Factual and Counterfactual Explanations in Fuzzy Classification Trees

IEEE Transactions on Fuzzy Systems (2022)

Abstract
Classification algorithms have recently gained great popularity due to their ability to generate models capable of solving highly complex problems. In particular, black-box models offer the best results, since they greatly benefit from the enormous amount of data available to learn increasingly accurate models. However, their main disadvantage compared to simpler algorithms, e.g., decision trees, is the loss of interpretability of both the model and the individual classifications, which can become a major drawback given the increasing number of applications where it is advisable, or even compulsory, to provide an explanation. A well-accepted practice is to build an explainable model that mimics the behavior of the (more complex) classifier in the neighborhood of the instance to be explained. Nonetheless, generating explanations in such white-box models is not trivial either, which has prompted intense research. It is common to generate two types of explanations, factual and counterfactual, which complement each other to justify why an instance has been assigned to a certain class or category. In this work, we propose definitions of factual and counterfactual explanations in the framework of fuzzy decision trees, where multiple branches can be fired at once. Our proposal centers on factual explanations that can contain more than a single rule, in contrast to the current standard, which limits a factual explanation to a single rule. Moreover, we introduce the idea of a robust factual explanation. Finally, we provide procedures to obtain counterfactual explanations both from the instance itself and from a factual explanation.
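To make the setting concrete, the sketch below illustrates why a factual explanation in a fuzzy tree may need more than one rule: several branches can fire simultaneously for the same instance, each with its own firing degree. This is only a minimal illustration under common fuzzy-inference assumptions (triangular memberships, minimum t-norm, maximum aggregation, and a hypothetical firing threshold); the names Rule, triangular, and factual_explanation are illustrative and are not taken from the paper's implementation.

```python
# Minimal sketch: fuzzy rules firing in parallel and a multi-rule factual explanation.
# Membership shapes, t-norm, aggregation, and the threshold are assumptions for
# illustration, not the authors' method.

from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple


def triangular(a: float, b: float, c: float) -> Callable[[float], float]:
    """Triangular membership function with support [a, c] and peak at b."""
    def mu(x: float) -> float:
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu


@dataclass
class Rule:
    """A fuzzy rule: conjunction of per-feature memberships -> a class label."""
    antecedents: Dict[str, Callable[[float], float]]
    consequent: str

    def firing_degree(self, instance: Dict[str, float]) -> float:
        # Minimum t-norm: the rule fires to the degree of its weakest premise.
        return min(mu(instance[f]) for f, mu in self.antecedents.items())


def classify(rules: List[Rule], instance: Dict[str, float]) -> Tuple[str, Dict[str, float]]:
    """Aggregate firing degrees per class (maximum) and return the winning class."""
    support: Dict[str, float] = {}
    for r in rules:
        support[r.consequent] = max(support.get(r.consequent, 0.0),
                                    r.firing_degree(instance))
    winner = max(support, key=support.get)
    return winner, support


def factual_explanation(rules: List[Rule], instance: Dict[str, float],
                        predicted: str, threshold: float = 0.1) -> List[Tuple[Rule, float]]:
    """Factual explanation as the set of rules of the predicted class that fire
    above a threshold -- possibly several, since multiple branches fire at once."""
    fired = [(r, r.firing_degree(instance)) for r in rules if r.consequent == predicted]
    return sorted([(r, d) for r, d in fired if d > threshold], key=lambda t: -t[1])


if __name__ == "__main__":
    # Hypothetical rules and instance, purely for demonstration.
    rules = [
        Rule({"age": triangular(40, 60, 80), "bmi": triangular(25, 32, 40)}, "high"),
        Rule({"age": triangular(55, 70, 90)}, "high"),
        Rule({"age": triangular(15, 30, 50)}, "low"),
    ]
    x = {"age": 62.0, "bmi": 32.0}
    label, support = classify(rules, x)
    explanation = factual_explanation(rules, x, label)
    # Both "high" rules fire (degrees 0.9 and ~0.47), so the factual
    # explanation contains two rules rather than one.
    print(label, support, [round(d, 2) for _, d in explanation])
```

Under these assumptions, a counterfactual explanation would instead look for a small change to the instance (e.g., a lower age) that makes the rules of another class dominate; the paper provides procedures for deriving such counterfactuals from the instance and from a factual explanation.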
Keywords
Counterfactual explanations, explainable artificial intelligence (XAI), fuzzy reasoning, factual explanations, fuzzy decision trees, robustness