This paper presents a latent variable representation of regularized
support vector machines (SVM's) that enables EM, ECME or MCMC algorithms
to provide parameter estimates. We verify our representation by demonstrating
that minimizing the SVM optimality criterion together with the parameter
regularization penalty is equivalent to finding the mode of a mean-variance mixture
of normals pseudo-posterior distribution. The latent variables in the mixture rep-
resentation lead to EM and ECME point estimates of SVM parameters, as well
as MCMC algorithms based on Gibbs sampling that can bring Bayesian tools for
Gaussian linear models to bear on SVM's. We show how to implement SVM's with
spike-and-slab priors and run them against data from a standard spam filtering
data set.
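
To make the stated equivalence concrete, the hinge-loss pseudo-likelihood admits the following mean-variance mixture-of-normals representation, sketched here in notation the abstract leaves implicit: labels $y_i \in \{-1, 1\}$, feature vectors $x_i$, and coefficient vector $\beta$:
\[
e^{-2\max(1 - y_i x_i^{\top}\beta,\, 0)}
  \;=\; \int_0^{\infty} \frac{1}{\sqrt{2\pi\lambda_i}}
    \exp\!\left\{ -\frac{\bigl(1 + \lambda_i - y_i x_i^{\top}\beta\bigr)^2}{2\lambda_i} \right\} \, d\lambda_i .
\]
Multiplying these terms over observations and combining with a prior $p(\beta)$ gives a pseudo-posterior whose mode coincides with the regularized SVM solution; the latent scales $\lambda_i$ are the augmentation variables that the EM, ECME, and Gibbs sampling schemes exploit.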