Lecture “Bayesian Learning”
General Information
Lecture: Thursdays 14-16 (22.10., 29.10., 5.11.2015, 7.1., 14.1., 21.1.2016)
Room: MAR 4.062
Teachers: Shinichi Nakajima
Contact:
Language: English
Credits: 3 ECTS, Elective Course in Machine Learning Module I (computer science M.Sc.)
Topics
Bayesian learning is a category of machine learning methods based on a basic law of probability, Bayes’ theorem. Its advantage is that it offers assessment of estimation quality and model selection functionality within a single framework; its disadvantage is that it requires “integral” computation, which is often a bottleneck. In this course, we introduce Bayesian learning, discuss its pros and cons, show how to perform the integral computation based on “conjugacy”, and show how to approximate Bayesian learning when it is intractable.
The course covers:
- Bayesian modeling and model selection
- Bayesian learning in conjugate cases
- Approximate Bayesian learning in conditionally conjugate cases:
  - Gibbs sampling
  - variational Bayesian learning
- Approximate Bayesian learning in non-conjugate cases:
  - Metropolis-Hastings algorithm
  - local variational approximation
  - expectation propagation
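As a small illustration of the conjugate case mentioned above, the following sketch (not course material; a hypothetical Beta-Bernoulli example) shows how conjugacy makes the "integral" computation tractable: with a Beta prior on a coin's bias and Bernoulli observations, the posterior is again a Beta distribution, and the marginal likelihood used for model selection has a closed form.

```python
import math

def beta_bernoulli_posterior(data, alpha=1.0, beta=1.0):
    """Update a Beta(alpha, beta) prior with 0/1 observations.

    Conjugacy: the posterior is Beta(alpha + #ones, beta + #zeros),
    so no numerical integration is needed.
    """
    heads = sum(data)
    tails = len(data) - heads
    return alpha + heads, beta + tails

def log_marginal_likelihood(data, alpha=1.0, beta=1.0):
    """log p(D) = log [B(alpha + heads, beta + tails) / B(alpha, beta)].

    B is the Beta function; conjugacy gives this integral in closed form,
    which is what enables Bayesian model selection in a single framework.
    """
    heads = sum(data)
    tails = len(data) - heads
    def log_beta(a, b):
        return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)
    return log_beta(alpha + heads, beta + tails) - log_beta(alpha, beta)

data = [1, 0, 1, 1, 0, 1, 1, 1]                # 6 ones, 2 zeros
a_post, b_post = beta_bernoulli_posterior(data)
print(a_post, b_post)                          # Beta(7, 3)
print(a_post / (a_post + b_post))              # posterior mean 0.7
```

In non-conjugate models no such closed form exists, which is where the approximate methods listed above (Gibbs sampling, variational Bayes, Metropolis-Hastings, and so on) come in.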
The lecture will be given on a blackboard, except for the introduction: BayesianLearningIntroduction.pdf
Homeworks
Students who need credits must submit answer sheets for the two homeworks by the deadlines.
- Homework 1: BayesianLearning_HW1.pdf, data.txt (deadline: 7.1.2016)
- Homework 2: will be available after 21.1.2016 (deadline: 21.2.2016)