Published on Fri Jan 19 2018

A Dirichlet Process Mixture Model of Discrete Choice

Rico Krueger, Akshay Vij, Taha H. Rashidi

Abstract

We present a mixed multinomial logit (MNL) model, which leverages the truncated stick-breaking process representation of the Dirichlet process as a flexible nonparametric mixing distribution. The proposed model is a Dirichlet process mixture model and accommodates discrete representations of heterogeneity, like a latent class MNL model. Yet, unlike a latent class MNL model, the proposed discrete choice model does not require the analyst to fix the number of mixture components prior to estimation, as the complexity of the discrete mixing distribution is inferred from the evidence. For posterior inference in the proposed Dirichlet process mixture model of discrete choice, we derive an expectation-maximisation algorithm. In a simulation study, we demonstrate that the proposed model framework can flexibly capture differently shaped taste parameter distributions. Furthermore, we empirically validate the model framework in a case study on motorists' route choice preferences and find that the proposed Dirichlet process mixture model of discrete choice outperforms a latent class MNL model and mixed MNL models with common parametric mixing distributions in terms of both in-sample fit and out-of-sample predictive ability. Compared to extant modelling approaches, the proposed discrete choice model substantially abbreviates specification searches, as it relies on less restrictive parametric assumptions and does not require the analyst to specify the complexity of the discrete mixing distribution prior to estimation.
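
The generative core of the model is simple to illustrate. The following is a minimal NumPy sketch, not the authors' estimation code: it draws mixture weights via the truncated stick-breaking construction, pairs each component with its own taste parameters, and averages component-level MNL probabilities under those weights. The truncation level K, concentration parameter alpha, the normal base measure, and all numeric values are illustrative assumptions, and the expectation-maximisation algorithm the paper derives for posterior inference is not shown.

    import numpy as np

    rng = np.random.default_rng(seed=42)

    # Hypothetical settings: truncation level K and DP concentration alpha.
    K = 10
    alpha = 1.0

    # Truncated stick-breaking construction of the mixture weights:
    #   v_k ~ Beta(1, alpha) for k < K, v_K = 1,
    #   pi_k = v_k * prod_{j<k} (1 - v_j).
    v = rng.beta(1.0, alpha, size=K)
    v[-1] = 1.0  # truncation: the last stick absorbs the remaining mass
    pi = v * np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))

    # Component-specific taste parameters (the atoms of the discrete
    # mixing distribution), drawn here from a normal base measure.
    n_attrs = 3
    beta = rng.normal(loc=0.0, scale=1.0, size=(K, n_attrs))

    def mnl_probs(X, b):
        """MNL choice probabilities for one choice set.
        X has shape (n_alternatives, n_attrs)."""
        u = X @ b
        e = np.exp(u - u.max())  # numerically stabilised softmax
        return e / e.sum()

    # Mixed MNL choice probability: the stick-breaking weights average
    # the component-level MNL probabilities.
    X = rng.normal(size=(4, n_attrs))  # a toy choice set of 4 alternatives
    p = sum(pi[k] * mnl_probs(X, beta[k]) for k in range(K))
    print(p, p.sum())  # mixture probabilities; they sum to one

Estimation inverts this generative story: roughly, the EM algorithm alternates between computing each decision-maker's posterior component memberships and updating the weights and taste parameters, with K acting as an upper bound on complexity rather than a fixed number of classes.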

Related Papers

Mon Jul 06 2020
Machine Learning
Semi-nonparametric Latent Class Choice Model with a Flexible Class Membership Component: A Mixture Model Approach
This study presents a semi-nonparametric Latent Class Choice Model (LCCM) with a flexible class membership component. Mixture models are parametric model-based clustering techniques that have been widely used in areas such as machine learning, data mining and pattern recognition.
Tue Jan 14 2020
Machine Learning
Sparse Covariance Estimation in Logit Mixture Models
This paper introduces a new data-driven methodology for estimating sparse covariance matrices of the random coefficients in logit mixture models. We propose a new estimator, called MISC, that uses a mixed-integer optimization program to find an optimal block diagonal structure.
Thu Jan 28 2021
Machine Learning
Gaussian Process Latent Class Choice Models
We present a Gaussian Process - Latent Class Choice Model (GP-LCCM) to integrate a non-parametric class of probabilistic machine learning within discrete choice models. The proposed model assigns individuals probabilistically to behaviorally homogeneous clusters (latent classes) using GPs.
Sat Dec 15 2007
Machine Learning
Variational inference for large-scale models of discrete choice
Discrete choice models are commonly used by applied statisticians in marketing, economics, finance, and operations research. Markov chain Monte Carlo techniques make approximate inference possible, but their computational cost is prohibitive on large data sets. Variational methods provide a deterministic alternative for approximation of the posterior distribution.
Mon Jan 22 2018
Artificial Intelligence
Estimating Heterogeneous Consumer Preferences for Restaurants and Travel Time Using Mobile Location Data
This paper analyzes consumer choices over lunchtime restaurants using data from a sample of several thousand anonymous mobile phone users in the San Francisco Bay Area. The data is used to identify users' approximate typical morning location. We build a model where restaurants have latent characteristics, and each user has preferences for these latent characteristics.
Sun Apr 07 2019
Machine Learning
Bayesian Estimation of Mixed Multinomial Logit Models: Advances and Simulation-Based Evaluations
Variational Bayes (VB) methods have emerged as a fast alternative to Markov chain Monte Carlo (MCMC) methods for Bayesian estimation of mixed multinomial logit (MMNL) models. It has been established that VB is substantially faster than MCMC with practically no compromise in predictive accuracy.