Citation

BibTeX format

@inproceedings{Ayoobi:2024,
author = {Ayoobi, H. and Potyka, N. and Toni, F.},
pages = {3--15},
publisher = {CEUR Workshop Proceedings},
title = {Argumentative interpretable image classification},
url = {http://hdl.handle.net/10044/1/114928},
year = {2024}
}

RIS format (EndNote, RefMan)

TY  - CPAPER
AB  - We propose ProtoSpArX, a novel interpretable deep neural architecture for image classification in the spirit of prototypical-part-learning as found, e.g., in ProtoPNet. While earlier approaches associate every class with multiple prototypical-parts, ProtoSpArX uses super-prototypes that combine prototypical-parts into single class representations. Furthermore, while earlier approaches use interpretable classification layers, e.g. logistic regression in ProtoPNet, ProtoSpArX improves accuracy with multi-layer perceptrons while relying upon an interpretable reading thereof based on a form of argumentation. ProtoSpArX is customisable to user cognitive requirements by a process of sparsification of the multi-layer perceptron/argumentation component. Also, as opposed to other prototypical-part-learning approaches, ProtoSpArX can recognise spatial relations between different prototypical-parts from various regions in images, similar to how CNNs capture relations between patterns recognised in earlier layers.
AU  - Ayoobi, H.
AU  - Potyka, N.
AU  - Toni, F.
EP  - 15
PB  - CEUR Workshop Proceedings
PY  - 2024///
SN  - 1613-0073
SP  - 3
TI  - Argumentative interpretable image classification
UR  - http://hdl.handle.net/10044/1/114928
ER -