SUBPLEX: A Visual Analytics Approach to Understand Local Model Explanations at the Subpopulation Level

Bibliographic Details
Published in: IEEE computer graphics and applications. - 1991. - 42(2022), 6, dated 28 Nov., pages 24-36
First Author: Yuan, Jun (Author)
Other Authors: Chan, Gromit Yeuk-Yin; Barr, Brian; Overton, Kyle; Rees, Kim; Nonato, Luis Gustavo; Bertini, Enrico; Silva, Claudio T
Format: Online Article
Language: English
Published: 2022
Access to parent work: IEEE computer graphics and applications
Subjects: Journal Article
Description
Abstract: Understanding the interpretation of machine learning (ML) models is of paramount importance when making decisions with societal impact, such as transport control, financial activities, and medical diagnosis. While local explanation techniques are popular methods for interpreting ML models on a single instance, they do not scale to understanding a model's behavior on the whole dataset. In this article, we outline the challenges and needs of visually analyzing local explanations and propose SUBPLEX, a visual analytics approach that helps users understand local explanations through subpopulation visual analysis. SUBPLEX provides steerable clustering and projection visualization techniques that allow users to derive interpretable subpopulations of local explanations informed by their own expertise. We evaluate our approach through two use cases and experts' feedback.
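To make the idea of subpopulation analysis of local explanations concrete, the sketch below illustrates the general workflow the abstract describes; it is not the authors' SUBPLEX implementation. It assumes SHAP attribution vectors as the local explanations, k-means as the clustering step, and PCA as the projection; the dataset, model, and cluster count are arbitrary stand-ins.

    # Illustrative sketch only (not the SUBPLEX system): cluster per-instance
    # SHAP explanations into subpopulations and project them to 2-D for inspection.
    import numpy as np
    import shap
    from sklearn.datasets import load_diabetes
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA

    X, y = load_diabetes(return_X_y=True)
    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

    # Local explanations: one SHAP attribution vector per instance.
    shap_values = shap.TreeExplainer(model).shap_values(X)  # (n_instances, n_features)

    # Group instances with similar explanations into candidate subpopulations.
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(shap_values)

    # Project the explanation vectors to 2-D; coords can feed a scatter plot
    # in which each subpopulation is colored by its cluster label.
    coords = PCA(n_components=2).fit_transform(shap_values)

    # Summarize each subpopulation by its dominant feature attribution.
    feature_names = load_diabetes().feature_names
    for k in np.unique(labels):
        mean_abs = np.abs(shap_values[labels == k]).mean(axis=0)
        print(f"subpopulation {k}: {np.sum(labels == k)} instances, "
              f"top feature: {feature_names[mean_abs.argmax()]}")

In SUBPLEX the clustering and projection are steerable by the user rather than fixed as above; this sketch only shows the kind of grouping and layout such an analysis starts from.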
Description: Date Completed 06.04.2023
Date Revised 28.04.2023
Published: Print
Citation Status MEDLINE
ISSN:1558-1756
DOI:10.1109/MCG.2022.3199727