Ph.D. Thesis

This page contains links to compressed postscript (and now pdf) files of my Ph.D. thesis, "Latent Variable Models for Neural Data Analysis." You may download the entire document as a single file, or else you can obtain each chapter individually.

Note for Windows users accessing the postscript files: some browsers under certain versions of Windows appear to strip the '.gz' part of these filenames without actually uncompressing the saved files. If the postscript files you obtain appear to be corrupted, try renaming them so that the filenames all end with '.ps.gz' and then running WinZip.

Entire Thesis

This contains the chapters below, as well as the front matter (acknowledgements, preface, contents etc.) and bibliography, all in the official Caltech format. You may download versions suitable for either single-sided (pdf) or double-sided (pdf) printing.

Chapter by Chapter

  1. Latent Variable Models

    This is a review chapter. Introduces statistical modelling, quickly reviews parameter estimation, and discusses model selection in some depth. Introduces latent variable models and the EM algorithm. Ends with the free-energy interpretation of EM (a one-line summary appears at the end of this page).

    22 pages postscript, pdf

  2. Clustering

    This is mostly review. Discusses k-means and similar clustering approaches. Quickly switches to likelihood-based clustering and mixture models. Derives the EM algorithm for mixtures (a short sketch appears at the end of this page). Closes with a discussion of outliers, multiple maxima, and Cheeseman-Stutz model selection.

    15 pages postscript, pdf

  3. Relaxation EM

    Reviews simulated annealing and deterministic annealing (DA). Introduces a generalization of DA to generic mixture models (REM-1). Discusses the phase-transition structure of relaxation likelihoods and the relation to model selection. Introduces a variant of the previous algorithm (REM-2) which allows cascading model selection.

    25 pages postscript, pdf

  4. Hidden Markov Models

    Reviews Markov chains, HMMs, Forward-Backward and Baum-Welch. Introduces sparse HMMs and mixtures of sparse HMMs. Derives an approximate learning algorithm for mixtures of sparse HMMs.

    17 pages postscript, pdf

  5. Spike Sorting

    Analysis of the spike sorting problem from a latent variable point of view. Provides a toolbox of new and powerful techniques for resolving spikes.

    56 pages postscript, pdf

  6. Doubly Stochastic Poisson Models

    A randomly scaled inhomogeneous Poisson model for smoothing and clustering spike trains (a short sketch appears at the end of this page).

    15 pages postscript, pdf
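
The free-energy interpretation of EM mentioned under Chapter 1 can be summarised in one line (this summary is mine, not an excerpt from the thesis): for any distribution q(z) over the latent variables,

    \mathcal{F}(q,\theta) = \mathbb{E}_{q(z)}\!\left[\log p(x,z\mid\theta)\right] + H[q]
                          = \log p(x\mid\theta) - \mathrm{KL}\!\left(q(z)\,\|\,p(z\mid x,\theta)\right),

so the E-step maximises F over q (setting q(z) = p(z | x, theta)) and the M-step maximises F over theta; neither step can decrease F, so the likelihood never decreases across iterations.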
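
As a rough illustration of the likelihood-based clustering in Chapter 2, the following is a minimal sketch of EM for a one-dimensional Gaussian mixture. It is not code from the thesis; the use of Python/NumPy and all names are assumptions of mine.

    import numpy as np

    def em_gaussian_mixture(x, K, n_iter=100, seed=0):
        """Minimal EM sketch for a 1-D Gaussian mixture (illustrative only)."""
        rng = np.random.default_rng(seed)
        x = np.asarray(x, dtype=float)
        N = len(x)
        # Initialise mixing weights, means, and variances.
        pi = np.full(K, 1.0 / K)
        mu = rng.choice(x, size=K, replace=False)
        var = np.full(K, np.var(x))
        for _ in range(n_iter):
            # E-step: responsibilities r[n, k] = p(component k | x[n]).
            log_p = (-0.5 * (x[:, None] - mu) ** 2 / var
                     - 0.5 * np.log(2 * np.pi * var) + np.log(pi))
            log_p -= log_p.max(axis=1, keepdims=True)
            r = np.exp(log_p)
            r /= r.sum(axis=1, keepdims=True)
            # M-step: re-estimate parameters from the weighted data.
            Nk = r.sum(axis=0)
            pi = Nk / N
            mu = (r * x[:, None]).sum(axis=0) / Nk
            var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / Nk
        return pi, mu, var

    # Example: two well-separated clusters.
    data = np.concatenate([np.random.normal(0.0, 1.0, 200),
                           np.random.normal(5.0, 1.0, 200)])
    weights, means, variances = em_gaussian_mixture(data, K=2)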
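
In the same spirit, a minimal sketch of the kind of model Chapter 6 refers to as a randomly scaled inhomogeneous Poisson process: each trial's firing rate is a common rate function multiplied by a random per-trial gain. Again, this is not code from the thesis; the gamma-distributed gain and the binned Bernoulli approximation are assumptions of mine.

    import numpy as np

    def sample_scaled_poisson_trial(rate_fn, T, dt=0.001, gain_shape=5.0, seed=None):
        """Simulate one spike train with rate s * rate_fn(t), where the scale s
        is drawn once per trial from a gamma distribution with mean 1."""
        rng = np.random.default_rng(seed)
        s = rng.gamma(gain_shape, 1.0 / gain_shape)        # random per-trial gain, E[s] = 1
        t = np.arange(0.0, T, dt)
        p_spike = np.clip(s * rate_fn(t) * dt, 0.0, 1.0)   # spike probability per small bin
        spikes = rng.random(t.shape) < p_spike
        return t[spikes]

    # Example: a sinusoidally modulated rate around 20 Hz over a 1-second trial.
    spike_times = sample_scaled_poisson_trial(
        lambda t: 20.0 * (1.0 + 0.5 * np.sin(2 * np.pi * 2.0 * t)), T=1.0, seed=1)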