Published on Fri Apr 10 2015

Learning Arbitrary Statistical Mixtures of Discrete Distributions

Jian Li, Yuval Rabani, Leonard J. Schulman, Chaitanya Swamy

We study the problem of learning from unlabeled samples very general statistical mixture models on large finite sets. We give the first efficient algorithms for learning this mixture model. Our model and results have applications to a variety of unsupervised learning scenarios, including collaborative filtering.

Abstract

We study the problem of learning from unlabeled samples very general statistical mixture models on large finite sets. Specifically, the model to be learned, $\vartheta$, is a probability distribution over probability distributions $p$, where each such $p$ is a probability distribution over $[n] = \{1,2,\dots,n\}$. When we sample from $\vartheta$, we do not observe $p$ directly, but only indirectly and in very noisy fashion, by sampling from $[n]$ repeatedly, independently $K$ times from the distribution $p$. The problem is to infer $\vartheta$ to high accuracy in transportation (earthmover) distance. We give the first efficient algorithms for learning this mixture model without making any restricting assumptions on the structure of the distribution $\vartheta$. We bound the quality of the solution as a function of the size of the samples $K$ and the number of samples used. Our model and results have applications to a variety of unsupervised learning scenarios, including learning topic models and collaborative filtering.
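The generative model described above can be illustrated with a minimal sketch. Here the mixture $\vartheta$, its two component distributions, the support size $n$, and the snapshot length $K$ are all hypothetical values chosen for illustration; the learner would observe only the $K$-tuples of draws, never the hidden component $p$ itself:

```python
import random

# Illustrative instance of the sampling model: a mixture over two
# hypothetical component distributions p on [n] = {1, ..., n}.
n = 4
mixture = [
    (0.5, [0.7, 0.1, 0.1, 0.1]),  # (mixture weight, component p)
    (0.5, [0.1, 0.1, 0.1, 0.7]),
]
K = 5  # number of independent draws from the hidden p per sample

def sample_snapshot(rng):
    """Draw p ~ vartheta, then return K i.i.d. draws from p (p stays hidden)."""
    r = rng.random()
    acc = 0.0
    for weight, p in mixture:
        acc += weight
        if r <= acc:
            return rng.choices(range(1, n + 1), weights=p, k=K)
    # fallback for floating-point rounding at the boundary
    return rng.choices(range(1, n + 1), weights=mixture[-1][1], k=K)

rng = random.Random(0)
snapshots = [sample_snapshot(rng) for _ in range(3)]
print(snapshots)  # the learner's input: K-tuples of outcomes in {1, ..., n}
```

The inference task is then to recover $\vartheta$ from many such snapshots, with accuracy measured in transportation distance.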