Published on Mon Jul 20 2020

ELECTRE Tree: A Machine Learning Approach to Infer ELECTRE Tri-B Parameters

Gabriela Montenegro de Barros, Valdecy Pereira
Abstract

Purpose: This paper presents an algorithm that can elicit (infer) all or any combination of ELECTRE Tri-B parameters. For example, a decision-maker can keep the values for the indifference, preference, and veto thresholds, and the algorithm will find the criteria weights, reference profiles, and the lambda cutting level. Our approach is inspired by a Machine Learning ensemble technique, the Random Forest, and for that reason we named it the ELECTRE Tree algorithm.

Methodology: First, we generate a set of ELECTRE Tri-B models, each of which solves a random sample of criteria and alternatives. Each sample is drawn with replacement and contains at least two criteria and between 10% and 25% of the alternatives. Each model has its parameters optimized by a genetic algorithm that can use an ordered cluster or an assignment example as a reference for the optimization. After the optimization phase, two procedures can be performed: the first merges all models, thereby obtaining the elicited parameters; in the second, each alternative is classified (voted on) by each separate model, and the majority vote decides the final class.

Findings: We have noted that the voting procedure generates non-linear decision boundaries, which can be suitable for analyzing problems of the same nature. In contrast, the merged model generates linear decision boundaries.

Originality: The elicitation of ELECTRE Tri-B parameters is performed by an ensemble technique composed of a set of multicriteria models that together generate robust solutions.
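The sampling and voting steps described in the methodology can be sketched as follows. This is a minimal illustration, not the authors' implementation: the helper names `sample_subproblem` and `majority_vote` are hypothetical, and the genetic optimization and the ELECTRE Tri-B outranking procedure itself are omitted; only the with-replacement sampling rule (at least two criteria, 10% to 25% of alternatives) and the majority-vote aggregation are shown.

```python
import random
from collections import Counter

def sample_subproblem(criteria, alternatives, rng):
    """Draw one random sub-problem: at least two criteria and between
    10% and 25% of the alternatives, both sampled with replacement."""
    k = rng.randint(2, len(criteria))
    crit_sample = [rng.choice(criteria) for _ in range(k)]
    frac = rng.uniform(0.10, 0.25)
    n = max(1, round(frac * len(alternatives)))
    alt_sample = [rng.choice(alternatives) for _ in range(n)]
    return crit_sample, alt_sample

def majority_vote(assignments):
    """Aggregate per-model class assignments: the final class of each
    alternative is the most common class over all models that saw it.
    `assignments` is a list of dicts, one per model, mapping an
    alternative to the class that model assigned it."""
    votes = {}
    for model_result in assignments:
        for alt, cls in model_result.items():
            votes.setdefault(alt, []).append(cls)
    return {alt: Counter(v).most_common(1)[0][0] for alt, v in votes.items()}
```

In a full implementation, each sampled sub-problem would be solved by an ELECTRE Tri-B model whose parameters are tuned by the genetic algorithm, and the resulting per-model assignments would then be fed to the vote.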

Thu Nov 17 2016
Machine Learning
GENESIM: genetic extraction of a single, interpretable model
GENESIM is an algorithm that transforms an ensemble of decision trees into a single decision tree with enhanced predictive performance. The resulting model has very low complexity, making it highly interpretable.
Mon Apr 11 2005
Artificial Intelligence
Experimental Comparison of Classification Uncertainty for Randomised and Bayesian Decision Tree Ensembles
Mon May 18 2020
Machine Learning
Optimal survival trees ensemble
The method is implemented in an R package called "OSTE". The proposed method is assessed using 17 benchmark datasets, and the results are compared with those of random survival forest, conditional inference forest, bagging, and a non-tree-based method.
Mon Jun 24 2019
Machine Learning
Analyzing CART
Decision trees with binary splits are popularly constructed using CART methodology. This paper aims to study the bias and adaptive properties of regression trees constructed with CART. The main technical tool is an exact characterization of the conditional probability content of the nodes.
Mon Jan 13 2020
Machine Learning
Trees, forests, and impurity-based variable importance
Tree ensemble methods such as random forests are very popular for handling high-dimensional tabular data sets, yet it is not well understood what their impurity-based variable importances actually estimate. We prove that if the input variables are independent and interactions are absent, MDI (Mean Decrease in Impurity) provides a variance decomposition that is clearly identified.
Fri Jan 01 2021
Neural Networks
A Multi-disciplinary Ensemble Algorithm for Clustering Heterogeneous Datasets
Clustering is a commonly used method for exploring and analysing data, with the primary objective of categorising observations into similar clusters. We propose a new evolutionary clustering algorithm (ECAStar) based on social class ranking and meta-heuristic algorithms.