Discriminative Densities from Maximum Contrast Estimation

Part of Advances in Neural Information Processing Systems 15 (NIPS 2002)



Peter Meinicke, Thorsten Twellmann, Helge Ritter


We propose a framework for classifier design based on discriminative densities, which represent the differences between the class-conditional distributions in a way that is optimal for classification. The densities are selected from a parametrized set by constrained maximization of an objective function which measures the average (bounded) difference, i.e. the contrast, between the discriminative densities. We show that maximization of the contrast is equivalent to minimization of an approximation of the Bayes risk. Therefore, using suitable classes of probability density functions, the resulting maximum contrast classifiers (MCCs) can approximate the Bayes rule for the general multiclass case. In particular, for a certain parametrization of the density functions we obtain MCCs which have the same functional form as the well-known Support Vector Machines (SVMs). We show that MCC training in general requires nonlinear optimization, but under certain conditions the problem is concave and can be tackled by a single linear program. We indicate the close relation between SVM and MCC training, and in particular we show that Linear Programming Machines can be viewed as an approximate realization of MCCs. In experiments on benchmark data sets, the MCC shows competitive classification performance.
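As a rough illustration of the ideas in the abstract, the two-class decision rule (assign the class whose discriminative density is larger) and a clipped average density difference can be sketched in a few lines. The Gaussian kernel mixtures, uniform mixture weights, and the clipping bound below are illustrative assumptions for a toy example, not the paper's constrained maximization procedure:

```python
import numpy as np

def gaussian_kernel(x, centers, h):
    # Isotropic Gaussian kernel values of point x against each center (bandwidth h).
    d = x - centers
    return np.exp(-np.sum(d * d, axis=1) / (2 * h * h)) / ((2 * np.pi * h * h) ** (x.size / 2))

def density(x, centers, weights, h):
    # Mixture density with nonnegative weights summing to one.
    return float(weights @ gaussian_kernel(x, centers, h))

def contrast(X, y, dens_pos, dens_neg, bound):
    # Average bounded difference between the two discriminative densities,
    # signed so that differences on the correct side count positively.
    total = 0.0
    for xi, yi in zip(X, y):
        diff = dens_pos(xi) - dens_neg(xi)   # positive favors class +1
        total += min(yi * diff, bound)       # clip ("bounded") contribution
    return total / len(X)

# Toy data: two well-separated Gaussian blobs, labels in {+1, -1}.
rng = np.random.default_rng(0)
X_pos = rng.normal(loc=+2.0, size=(50, 2))
X_neg = rng.normal(loc=-2.0, size=(50, 2))
X = np.vstack([X_pos, X_neg])
y = np.array([+1] * 50 + [-1] * 50)

h = 1.0
w_pos = np.full(len(X_pos), 1.0 / len(X_pos))  # uniform weights: a crude stand-in
w_neg = np.full(len(X_neg), 1.0 / len(X_neg))  # for the learned mixture weights

d_pos = lambda x: density(x, X_pos, w_pos, h)
d_neg = lambda x: density(x, X_neg, w_neg, h)

# Classify by the sign of the density difference (plug-in Bayes rule).
pred = np.array([1 if d_pos(x) >= d_neg(x) else -1 for x in X])
print("training accuracy:", np.mean(pred == y))
print("bounded contrast:", contrast(X, y, d_pos, d_neg, bound=0.05))
```

In the paper's setting, the mixture weights would instead be chosen by constrained maximization of the contrast itself, which under the stated conditions reduces to a single linear program.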