MemeNet: Toward a Reliable Local Projection for Image Recognition via Semantic Featurization
Published in: | IEEE Transactions on Image Processing: A Publication of the IEEE Signal Processing Society, Vol. 33 (2024), pp. 1670-1682 |
Author: | |
Other authors: | |
Format: | Online article |
Language: | English |
Published: | 2024 |
Parent work: | IEEE Transactions on Image Processing: A Publication of the IEEE Signal Processing Society |
Subject headings: | Journal Article |
Abstract: | When we recognize images with the help of Artificial Neural Networks (ANNs), we often wonder how they make decisions. A widely accepted solution is to point out local features as decisive evidence. A question then arises: can local features in the latent space of an ANN explain the model output to some extent? In this work, we propose a modularized framework named MemeNet that can construct a reliable surrogate from a Convolutional Neural Network (CNN) without changing its perception. Inspired by the idea of time series classification, this framework recognizes images in two steps. First, local representations named memes are extracted from the activation map of a CNN model. Then an image is transformed into a series of understandable features. Experimental results show that MemeNet can achieve accuracy comparable to that of most models through a set of reliable features and a simple classifier. Thus, it is a promising interface for using the internal dynamics of a CNN, representing a novel approach to constructing reliable models. |
Description: | Date revised: 06.03.2024; published: Print-Electronic; citation status: PubMed-not-MEDLINE |
ISSN: | 1941-0042 |
DOI: | 10.1109/TIP.2024.3359331 |
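The abstract's two-step recipe (extract local "memes" from CNN activation maps, then describe each image by its relation to those memes) echoes shapelet-based time series classification, where a series is featurized by its minimum distance to a set of discriminative subsequences. The sketch below illustrates that reading on toy 2-D "activation maps"; the function names (`extract_memes`, `featurize`), the random-patch sampling, and the min-distance feature are assumptions for illustration, not the paper's actual procedure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def extract_memes(activation_maps, size=3, n_memes=8):
    """Sample candidate local patches ('memes') from 2-D activation maps.
    (Hypothetical stand-in for the paper's meme-extraction step.)"""
    memes = []
    for _ in range(n_memes):
        amap = activation_maps[rng.integers(len(activation_maps))]
        r = rng.integers(0, amap.shape[0] - size + 1)
        c = rng.integers(0, amap.shape[1] - size + 1)
        memes.append(amap[r:r + size, c:c + size].copy())
    return memes

def featurize(amap, memes):
    """One feature per meme: its minimum distance to any same-size patch,
    analogous to shapelet-to-series distance in time series classification."""
    size = memes[0].shape[0]
    feats = []
    for m in memes:
        best = np.inf
        for r in range(amap.shape[0] - size + 1):
            for c in range(amap.shape[1] - size + 1):
                best = min(best, np.linalg.norm(amap[r:r + size, c:c + size] - m))
        feats.append(best)
    return np.array(feats)

# Toy data: class-0 maps are smooth noise, class-1 maps contain a bright blob.
maps, labels = [], []
for i in range(40):
    amap = rng.normal(0.0, 0.1, (8, 8))
    if i % 2:
        amap[2:5, 2:5] += 3.0
    maps.append(amap)
    labels.append(i % 2)

memes = extract_memes(maps, n_memes=8)
X = np.vstack([featurize(a, memes) for a in maps])

# "Simple classifier" on the meme-distance features, as in the abstract.
clf = LogisticRegression().fit(X, labels)
acc = clf.score(X, labels)
```

The point of the sketch is the division of labor the abstract describes: the CNN supplies the activation maps, the memes turn each map into a short, inspectable feature vector, and a plain linear model does the final classification.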