Published on Tue Oct 18 2011

AOSO-LogitBoost: Adaptive One-Vs-One LogitBoost for Multi-Class Problem

Peng Sun, Mark D. Reid, Jie Zhou


Abstract

This paper presents an improvement to model learning when using multi-class LogitBoost for classification. Motivated by the statistical view, LogitBoost can be seen as additive tree regression. Two important factors in this setting are: 1) coupled classifier outputs due to the sum-to-zero constraint, and 2) the dense Hessian matrices that arise when computing tree node split gains and fitting node values. In general, the setting is too complicated for a tractable model learning algorithm; however, simplifying it too aggressively can degrade performance. For example, the original LogitBoost is outperformed by ABC-LogitBoost thanks to the latter's more careful treatment of these two factors. In this paper we propose techniques that address both difficulties: 1) we adopt a vector tree (i.e., each node value is a vector) that naturally enforces the sum-to-zero constraint, and 2) we use an adaptive block coordinate descent that exploits the dense Hessian when computing tree split gains and node values. Higher classification accuracy and faster convergence rates are observed on a range of public data sets when compared to both the original and the ABC-LogitBoost implementations.
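
To make the two ingredients concrete, here is a minimal Python sketch of the one-vs-one Newton step at a tree node. This is not the authors' code: the function name is my own, and the exhaustive pair search is a simplification of the paper's adaptive pair selection. An update of the form F_r += v, F_s -= v keeps the K outputs summing to zero by construction, and each candidate pair (r, s) is scored with the second-order gain derived from the dense Hessian of the multi-class logistic loss:

```python
import numpy as np

def best_pair_and_value(y, p, eps=1e-12):
    """One-vs-one Newton step at a tree node (illustrative sketch).

    y : (n, K) one-hot labels of the samples reaching this node.
    p : (n, K) current model probabilities for those samples.
    Returns (r, s, v, gain): update class r by +v and class s by -v,
    which keeps the K node values summing to zero by construction.
    """
    g = (p - y).sum(axis=0)                # node gradient of the logistic loss
    H = np.diag(p.sum(axis=0)) - p.T @ p   # dense node Hessian: sum_i diag(p_i) - p_i p_i^T
    K = g.shape[0]
    best = (0, 1, 0.0, -np.inf)
    for r in range(K):                     # exhaustive pair search for clarity;
        for s in range(r + 1, K):          # the paper selects the pair adaptively
            d2 = H[r, r] + H[s, s] - 2.0 * H[r, s] + eps  # curvature along e_r - e_s
            d1 = g[r] - g[s]                              # slope along e_r - e_s
            gain = d1 * d1 / (2.0 * d2)                   # second-order loss reduction
            if gain > best[3]:
                best = (r, s, -d1 / d2, gain)
    return best
```

The same quantity can also serve as a split criterion: score each candidate threshold by the summed gain of its left and right children, which is where the dense Hessian enters the split-gain computation as well.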

Thu Mar 15 2012
Machine Learning
Robust LogitBoost and Adaptive Base Class (ABC) LogitBoost
LogitBoost is an influential boosting algorithm for classification. We develop Robust LogitBoost to provide an explicit formulation of the tree-split criterion for building weak learners. We then propose ABC-LogitBoost for multi-class classification.
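
For context, the explicit split criterion in question has, in the binary case, the familiar second-order form below, written with residual r_i = y_i − p_i and Hessian weight w_i = p_i(1 − p_i) for sample i; this is a sketch from my reading of the paper, not a quotation:

```latex
\mathrm{Gain}(L, R) =
    \frac{\left(\sum_{i \in L} r_i\right)^{2}}{\sum_{i \in L} w_i}
  + \frac{\left(\sum_{i \in R} r_i\right)^{2}}{\sum_{i \in R} w_i}
  - \frac{\left(\sum_{i \in L \cup R} r_i\right)^{2}}{\sum_{i \in L \cup R} w_i}
```

Here L and R denote the samples routed to the left and right child of the split.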

Fri Aug 28 2009
Artificial Intelligence
ABC-LogitBoost for Multi-class Classification
ABC-LogitBoost is based on the prior work on ABC-Boost and Robust LogitBoost. Our extensive experiments on a variety of datasets demonstrate the considerable improvement of ABC-LogitBoost over LogitBoost and ABC-MART.

Thu Jan 07 2010
Artificial Intelligence
An Empirical Evaluation of Four Algorithms for Multi-Class Classification: Mart, ABC-Mart, Robust LogitBoost, and ABC-LogitBoost
This empirical study is mainly devoted to comparing four tree-based boosting algorithms. The algorithms are MART, ABC-MART, Robust LogitBoost, and ABC-LogitBoost. These four algorithms are competitive with the best deep learning methods.

Sat Nov 08 2008
Machine Learning
Adaptive Base Class Boost for Multi-class Classification
The original MART (Multiple Additive Regression Trees) algorithm has been very successful in large-scale applications. We develop the concept of ABC-Boost (Adaptive Base Class Boost) for multi-class classification.

Fri Aug 28 2020
Machine Learning
agtboost: Adaptive and Automatic Gradient Tree Boosting Computations
agtboost is an R package implementing fast gradient tree boosting. The package automatically takes care of split/no-split decisions and selects the number of trees in the ensemble.

Mon Aug 29 2011
Artificial Intelligence
Datum-Wise Classification: A Sequential Approach to Sparsity
We propose a novel classification technique whose aim is to select an appropriate representation for each datapoint. We compare our classifier to classical L1-regularized linear models (L1-SVM and LARS) on a set of common binary and multi-class datasets.