Lecture “Bayesian Learning”
General Information
Lecture: Thursdays 14-16 (22.10, 29.10, 5.11.2015, 7.1, 14.1, 21.1.2016)
Room: MAR 4.062
Teachers: Shinichi Nakajima
Contact:
Language: English
Credits: 3 ECTS, Elective Course in Machine Learning Module I (Computer Science M.Sc.)
Topics
Bayesian learning is a family of machine learning methods based on a basic law of probability, Bayes' theorem. Its advantage is that it offers an assessment of estimation quality and model selection within a single framework; its disadvantage is that it requires computing integrals, which is often a bottleneck. In this course, we introduce Bayesian learning, discuss its pros and cons, show how to perform the integral computation exactly based on "conjugacy", and show how to approximate Bayesian learning when it is intractable.
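As a minimal illustration of the conjugacy mentioned above (not course material): for Bernoulli observations with a Beta prior, Bayes' theorem yields a Beta posterior in closed form, so no numerical integration is needed. The hyperparameters and data below are hypothetical.

```python
# Conjugate Bayesian update: Beta prior + Bernoulli likelihood -> Beta posterior.
# Beta(a, b) prior on the success probability; observing h successes and
# t failures gives the posterior Beta(a + h, b + t).
a, b = 2.0, 2.0                      # assumed prior hyperparameters
data = [1, 0, 1, 1, 0, 1, 1, 1]      # hypothetical coin flips
heads = sum(data)
tails = len(data) - heads
a_post, b_post = a + heads, b + tails
posterior_mean = a_post / (a_post + b_post)
print(a_post, b_post, posterior_mean)  # posterior mean of the success probability
```

In non-conjugate models this closed-form update is unavailable, which is where the approximation methods covered in the course come in.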
The course covers
- Bayesian modeling and model selection.
- Bayesian learning in conjugate cases.
- Approximate Bayesian learning in conditionally conjugate cases:
  - Gibbs sampling,
  - variational Bayesian learning.
- Approximate Bayesian learning in non-conjugate cases:
  - the Metropolis-Hastings algorithm,
  - local variational approximation,
  - expectation propagation.
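To give a flavor of the sampling-based methods in the list above, here is a generic random-walk Metropolis-Hastings sketch (an illustration under assumed settings, not the course's implementation). It proposes a Gaussian perturbation of the current state and accepts it with probability min(1, p(x')/p(x)); the target below is a standard normal, given by its log density up to a constant.

```python
import math
import random

def metropolis_hastings(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings sketch.

    Proposes x' = x + N(0, step^2); since the proposal is symmetric,
    the acceptance probability reduces to min(1, p(x') / p(x)),
    evaluated in log space for numerical stability.
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        x_prop = x + rng.gauss(0.0, step)
        if math.log(rng.random()) < log_target(x_prop) - log_target(x):
            x = x_prop          # accept the proposal
        samples.append(x)       # on rejection, the current state repeats
    return samples

# Target: standard normal, log p(x) = -x^2/2 up to an additive constant.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
mean = sum(samples) / len(samples)
print(round(mean, 2))  # sample mean should be close to 0
```

Note that only the log density up to a constant is needed, since the normalizing integral cancels in the acceptance ratio; this is exactly why such methods sidestep the intractable integral.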
The lecture will be given on the blackboard.
- Introduction: BayesianLearningIntroduction.pdf
- Variational Bayesian learning: VariationalBayes.pdf
- Markov chain Monte Carlo: MCMC_short.pdf
- Summary: Summary.pdf
Homework
Students who need the credits must complete the homework assignments by the due dates.
Homework 1 (modified on 9.12.2015): BayesianLearning_HW1.pdf data.txt (due date: 7.1.2016)
Homework 2: BayesianLearning_HW2.pdf data2.txt initial.txt (due date: 18.2.2016)