Nonparametric Bayesian Modelling in Machine Learning

dc.contributor.author: Habli, Nada
dc.description.abstract: Nonparametric Bayesian inference has widespread applications in statistics and machine learning. In this thesis, we examine the most popular priors used in nonparametric Bayesian inference. The Dirichlet process and its extensions are priors on an infinite-dimensional space. Originally introduced by Ferguson (1973), its conjugacy property allows tractable posterior inference, which has recently given rise to significant developments in applications related to machine learning. Another widespread prior used in nonparametric Bayesian inference is the Beta process and its extensions. It was originally introduced by Hjort (1990) for applications in survival analysis. It is a prior on the space of cumulative hazard functions, and it has recently been widely used as a prior on an infinite-dimensional space for latent feature models. Our contribution in this thesis is to collect many diverse groups of nonparametric Bayesian tools and to explore algorithms for sampling from them. We also explore the machinery behind the theory and expose some distinguished features of these procedures. These tools can be used by practitioners in many applications.
dc.publisher: Université d'Ottawa / University of Ottawa
dc.subject: Nonparametric Bayesian
dc.subject: Dirichlet process
dc.subject: Gamma process
dc.subject: Beta process
dc.subject: Machine Learning
dc.subject: Beta Bernoulli process
dc.title: Nonparametric Bayesian Modelling in Machine Learning
dc.contributor.supervisor: Zarepour, Mahmoud / Science
uottawa.department: Mathématiques et statistique / Mathematics and Statistics
Collection: Thèses, 2011 - // Theses, 2011 -

Habli_Nada_2016_thesis.pdf (Master's thesis, 589.83 kB, Adobe PDF)