Lecture “Bayesian Learning”
General Information
Lecture | Thursdays 14-16 (22.10, 29.10, 5.11.2015, 7.1, 14.1, 21.1.2016)
Room | MAR 4.062
Teachers | Shinichi Nakajima
Contact |
Language | English
Credits | 3 ECTS, Elective Course in Machine Learning Module I (computer science M.Sc.)
Topics
Bayesian learning is a category of machine learning methods based on a fundamental law of probability called Bayes’ theorem. Among its advantages, Bayesian learning offers assessment of estimation quality and model selection functionality within a single framework; its main disadvantage is that it requires “integral” computations, which are often a bottleneck. In this course, we introduce Bayesian learning, discuss its pros and cons, show how to perform the integral computation based on “conjugacy”, and present ways to approximate Bayesian learning when it is intractable.
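As a minimal illustration of the conjugate case, consider a Beta prior on the success probability of a Bernoulli model: conjugacy makes the integral (the posterior normalizing constant) available in closed form, so the posterior is again a Beta distribution and quantities such as credible intervals come for free. The sketch below assumes illustrative data and prior parameters that are not part of the course material.

# A minimal sketch of Bayesian learning in a conjugate case (Beta-Bernoulli).
# Prior, data, and numbers are illustrative assumptions, not course content.
from scipy.stats import beta

# Prior: Beta(a0, b0); here a flat prior a0 = b0 = 1.
a0, b0 = 1.0, 1.0

# Observed coin flips: 1 = heads, 0 = tails.
data = [1, 1, 0, 1, 1, 1, 0, 1, 0, 1]
heads = sum(data)
tails = len(data) - heads

# Conjugate update: the posterior is Beta(a0 + heads, b0 + tails).
posterior = beta(a0 + heads, b0 + tails)

print("posterior mean:", posterior.mean())                 # point estimate
print("95% credible interval:", posterior.interval(0.95))  # estimation quality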
The course covers
- Bayesian modeling and model selection.
- Bayesian learning in conjugate cases.
- Approximate Bayesian learning in conditionally conjugate cases:
  - Gibbs sampling.
  - Variational Bayesian learning.
- Approximate Bayesian learning in non-conjugate cases (see the Metropolis-Hastings sketch after this list):
  - Metropolis-Hastings algorithm.
  - Local variational approximation.
  - Expectation propagation.
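In the non-conjugate case the posterior normalizer is intractable, but the Metropolis-Hastings algorithm only needs the target density up to a constant. The following is a minimal sketch with a random-walk proposal; the target density, step size, and sample count are illustrative assumptions, not part of the course material.

# A minimal Metropolis-Hastings sketch for sampling an unnormalized density.
# The target (a two-component Gaussian mixture) is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Unnormalized log-density of a mixture of two unit-variance Gaussians.
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

def metropolis_hastings(n_samples, step=1.0, x0=0.0):
    x = x0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        proposal = x + step * rng.standard_normal()  # symmetric random walk
        # Accept with probability min(1, target(proposal) / target(x)).
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        samples[i] = x
    return samples

samples = metropolis_hastings(10_000)
print("estimated mean under the target:", samples.mean())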
The lecture will be given on a blackboard.
Introduction: BayesianLearningIntroduction.pdf
Variational Bayesian learning: VariationalBayes.pdf
Markov chain Monte Carlo: MCMC_short.pdf
Summary: Summary.pdf
Homeworks
Students who need credits must complete both homeworks by the due dates.
Homework 1 (modified on 9.12.2015): BayesianLearning_HW1.pdf data.txt (due date: 7.1.2016)
Homework 2: BayesianLearning_HW2.pdf data2.txt initial.txt (due date: 18.2.2016)