Published on Fri Oct 16 2020

Universal guarantees for decision tree induction via a higher-order splitting criterion

Guy Blanc, Neha Gupta, Jane Lange, Li-Yang Tan

We propose a simple extension of top-down decision tree learning heuristics such as ID3, C4.5, and CART. Our algorithm achieves provable guarantees for all target functions with respect to the uniform distribution.

Abstract

We propose a simple extension of top-down decision tree learning heuristics such as ID3, C4.5, and CART. Our algorithm achieves provable guarantees for all target functions $f : \{-1,1\}^n \to \{-1,1\}$ with respect to the uniform distribution, circumventing impossibility results showing that existing heuristics fare poorly even for simple target functions. The crux of our extension is a new splitting criterion that takes into account the correlations between $f$ and small subsets of its attributes. The splitting criteria of existing heuristics (e.g. Gini impurity and information gain), in contrast, are based solely on the correlations between $f$ and its individual attributes. Our algorithm satisfies the following guarantee: for all target functions $f : \{-1,1\}^n \to \{-1,1\}$, sizes $s \in \mathbb{N}$, and error parameters $\epsilon$, it constructs a decision tree of size $s^{\tilde{O}((\log s)^2/\epsilon^2)}$ that achieves error $\le O(\mathrm{opt}_s) + \epsilon$, where $\mathrm{opt}_s$ denotes the error of the optimal size-$s$ decision tree. A key technical notion that drives our analysis is the noise stability of $f$, a well-studied smoothness measure.
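To make the contrast with first-order criteria concrete, here is a minimal, hypothetical Python sketch (not the paper's algorithm or its pseudocode; the function names, sample size, and subset-size bound k = 2 are illustrative assumptions): it estimates the correlations of $f$ with individual attributes and with attribute subsets of size at most $k$ from uniformly drawn labeled samples, and scores candidate splits by those correlations.

```python
import itertools
import numpy as np

def correlation(f_vals, X, S):
    """Empirical correlation E[f(x) * prod_{i in S} x_i] over a labeled sample.

    f_vals : (m,) array of +/-1 labels f(x)
    X      : (m, n) array of +/-1 attribute vectors drawn uniformly
    S      : tuple of attribute indices (the subset being scored)
    """
    chi_S = np.prod(X[:, list(S)], axis=1)  # parity of the attributes in S
    return np.mean(f_vals * chi_S)

def first_order_scores(f_vals, X):
    """Single-attribute scores: the kind of signal Gini/information gain rely on."""
    n = X.shape[1]
    return {(i,): abs(correlation(f_vals, X, (i,))) for i in range(n)}

def higher_order_scores(f_vals, X, k=2):
    """Scores for all attribute subsets of size at most k (k a small constant)."""
    n = X.shape[1]
    return {S: abs(correlation(f_vals, X, S))
            for size in range(1, k + 1)
            for S in itertools.combinations(range(n), size)}

# Toy target: f(x) = x_0 * x_1, i.e. XOR written multiplicatively over {-1, 1}.
# Every single attribute has (near-)zero correlation with f, so first-order
# scores show no signal, while the pair {0, 1} has correlation close to 1.
rng = np.random.default_rng(0)
X = rng.choice([-1, 1], size=(4000, 6))
f_vals = X[:, 0] * X[:, 1]

print(max(first_order_scores(f_vals, X).items(), key=lambda kv: kv[1]))
print(max(higher_order_scores(f_vals, X).items(), key=lambda kv: kv[1]))
```

A parity-style target like this is a standard example of why purely single-attribute criteria such as Gini impurity and information gain can see no informative split, while scoring small subsets of attributes exposes the relevant pair immediately.

For reference, the noise stability mentioned in the abstract has a standard definition in Boolean function analysis (the paper's exact parameterization may differ): for $\rho \in [0,1]$,
$$\mathrm{Stab}_\rho[f] = \mathbb{E}_{x,\, y \sim N_\rho(x)}\big[f(x)\,f(y)\big],$$
where $x$ is uniform over $\{-1,1\}^n$ and $y$ is obtained from $x$ by independently setting $y_i = x_i$ with probability $\tfrac{1+\rho}{2}$ and $y_i = -x_i$ otherwise.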

Mon Nov 18 2019
Machine Learning
Top-down induction of decision trees: rigorous guarantees and inherent limitations
Mon Jun 15 2020
Machine Learning
Generalized and Scalable Optimal Sparse Decision Trees
Decision tree optimization is notoriously difficult from a computational perspective but essential for the field of interpretable machine learning. Despite efforts over the past 40 years, only recently have optimization breakthroughs been made that have allowed practical algorithms to find optimal decision trees.
Mon Jun 01 2020
Machine Learning
Provable guarantees for decision tree induction: the agnostic setting
We give strengthened provable guarantees on the performance of widely employed and empirically successful top-down decision tree learning algorithms. We show that for all monotone functions $f$ and parameters $s$, these heuristics construct a decision tree…
Tue Aug 10 2021
Machine Learning
On Learning and Testing Decision Tree
In this paper, we study learning and testing of decision trees of size $s$ and depth $d$ that are significantly smaller than the number of attributes $n$. Our main result addresses the problem of polynomial-time algorithms. We also study the testability of depth-$d$ decision trees.
Thu Jul 11 2019
Machine Learning
On the Optimality of Trees Generated by ID3
We introduce a metric of a decision tree algorithm's performance called mean iteration statistical consistency (MIC), which measures the optimality of trees generated by ID3. MIC can differentiate between different decision tree algorithms and compare their performance.
Tue Nov 03 2020
Machine Learning
Estimating decision tree learnability with polylogarithmic sample complexity
We show that top-down decision tree learning heuristics are amenable to highly efficient learnability estimation. For monotone target functions, the error of the decision tree hypothesis constructed by these heuristics can be estimated with polylogarithmically many labeled examples.