Published on Mon Jul 06 2020

Leveraging Class Hierarchies with Metric-Guided Prototype Learning

Vivien Sainte Fare Garnot, Loic Landrieu

The severity of errors can be summarized in the form of a cost matrix. When the target classes are organized into a hierarchical structure, this matrix defines a metric. We propose to integrate this metric into a new and versatile classification layer.

Abstract

Not all errors are created equal. This is especially true for many key machine learning applications. In the case of classification tasks, the severity of errors can be summarized in the form of a cost matrix, which assesses the gravity of confusing each pair of classes. When the target classes are organized into a hierarchical structure, this matrix defines a metric. We propose to integrate this metric into a new and versatile classification layer in order to model the disparity of errors. Our method relies on jointly learning a feature-extracting network and a set of class representations, or prototypes, which incorporate the error metric into their relative arrangement in the embedding space. Our approach consistently reduces the severity of the network's errors with respect to the cost matrix. Furthermore, when the induced metric contains insight into the data structure, our approach improves the overall precision as well. Experiments on four different public datasets -- from agricultural time series classification to depth image semantic segmentation -- validate our approach.
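To make the general idea concrete, here is a minimal PyTorch sketch of a prototype-based classification layer combined with a cost-matrix regularizer. It is an illustration of the overall principle described in the abstract, not the authors' exact formulation: the class `PrototypeClassifier`, the squared-distance logits, the `metric_loss` term, and the loss weighting are all assumptions made for the example.

```python
import torch
import torch.nn as nn


class PrototypeClassifier(nn.Module):
    """Sketch of a prototype-based classification layer.

    Logits are negative squared distances between an embedding and a set of
    learned class prototypes. A regularization term (``metric_loss``)
    encourages the pairwise distances between prototypes to reflect a
    user-supplied class cost matrix. Names and scaling choices are
    illustrative, not the paper's exact method.
    """

    def __init__(self, num_classes: int, embed_dim: int):
        super().__init__()
        # One learnable prototype per class in the embedding space.
        self.prototypes = nn.Parameter(torch.randn(num_classes, embed_dim))

    def forward(self, embeddings: torch.Tensor) -> torch.Tensor:
        # embeddings: (batch, embed_dim) -> logits: (batch, num_classes)
        dists = torch.cdist(embeddings, self.prototypes)
        return -dists.pow(2)

    def metric_loss(self, cost_matrix: torch.Tensor) -> torch.Tensor:
        # Penalize mismatch between prototype distances and the cost matrix,
        # so the prototypes' arrangement encodes the error metric.
        proto_dists = torch.cdist(self.prototypes, self.prototypes)
        return ((proto_dists - cost_matrix) ** 2).mean()


# Hypothetical usage: joint training of a feature-extracting backbone and the
# prototype layer. The backbone, cost matrix, and 0.1 weight are placeholders.
backbone = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 16))
head = PrototypeClassifier(num_classes=5, embed_dim=16)

cost_matrix = torch.rand(5, 5)
cost_matrix = (cost_matrix + cost_matrix.T) / 2  # symmetric costs
cost_matrix.fill_diagonal_(0)                    # zero cost for correct class

x, y = torch.randn(8, 32), torch.randint(0, 5, (8,))
logits = head(backbone(x))
loss = nn.functional.cross_entropy(logits, y) + 0.1 * head.metric_loss(cost_matrix)
loss.backward()
```

In this reading, the classification loss drives the embeddings toward their class prototypes, while the regularizer shapes the prototypes so that classes that are cheap to confuse (low cost) sit close together and expensive confusions are pushed apart.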