Nested Learning for Multi-Level Classification - Département Image, Données, Signal
Conference paper, 2021

Nested Learning for Multi-Level Classification

Abstract

Deep neural network models are generally designed and trained for a specific type and quality of data. In this work, we address this problem in the context of nested learning. For many applications, both the input data, at training and testing time, and the prediction can be conceived at multiple nested quality/resolution levels. We show that by leveraging this multiscale information, the problems of poor generalization and prediction overconfidence, as well as the exploitation of training data of multiple qualities, can be efficiently addressed. We evaluate the proposed ideas on six public datasets: MNIST, Fashion-MNIST, CIFAR10, CIFAR100, Plantvillage, and DBPEDIA. We observe that coarsely annotated data can help improve fine predictions and significantly reduce overconfidence. We also show that hierarchical learning produces models that are intrinsically more robust to adversarial attacks and data perturbations.
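A minimal sketch (not the authors' implementation) of the core idea the abstract describes: when fine-class probabilities are aggregated into coarse-class probabilities along the label hierarchy, samples that only carry a coarse annotation can still provide a training signal. The two-level hierarchy and all names below are hypothetical, for illustration only.

```python
import numpy as np

# Hypothetical nesting: 4 fine classes grouped into 2 coarse classes.
# Fine class i belongs to coarse class FINE_TO_COARSE[i].
FINE_TO_COARSE = np.array([0, 0, 1, 1])

def softmax(z):
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def nested_nll(logits, label, level):
    """Negative log-likelihood at whichever annotation level this sample has."""
    p_fine = softmax(logits)
    if level == "fine":
        return -np.log(p_fine[label])
    # Coarse label only: marginalize fine probabilities within each coarse group.
    n_coarse = FINE_TO_COARSE.max() + 1
    p_coarse = np.array([p_fine[FINE_TO_COARSE == c].sum() for c in range(n_coarse)])
    return -np.log(p_coarse[label])

logits = np.array([2.0, 0.5, -1.0, -1.0])       # model output for one sample
loss_fine = nested_nll(logits, 0, "fine")       # exact fine annotation available
loss_coarse = nested_nll(logits, 0, "coarse")   # only the coarse group is known
```

Because the coarse probability sums over the fine classes it contains, the coarse loss never exceeds the fine loss for a consistent label pair: coarse supervision constrains the model without over-penalizing uncertainty among sibling fine classes.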
Main file: ICASSP(1).pdf (436.19 KB)
Origin: files produced by the author(s)

Dates and versions

hal-03177336, version 1 (23-03-2021)

Identifiers

  • HAL Id: hal-03177336, version 1

Cite

Raphaël Achddou, J. Matias Di Martino, Guillermo Sapiro. Nested Learning for Multi-Level Classification. ICASSP, Jun 2021, Toronto (virtual), Canada. ⟨hal-03177336⟩
111 views
1041 downloads
