Lecture “Bayesian Learning”

General Information

Lecture: Thursdays, 14-16
Room: MAR 4.062
Teachers: Shinichi Nakajima
Contact: nakajima@tu-berlin.de

Topics

Bayesian learning is a category of machine learning methods based on a fundamental law of probability, Bayes’ theorem. Its main advantage is that it offers assessment of estimation quality and model selection within a single framework; its main disadvantage is that it requires computing integrals, which is often a bottleneck. In this course, we introduce Bayesian learning, discuss its pros and cons, show how to perform the integral computation exactly by exploiting “conjugacy”, and show how to approximate Bayesian learning when exact computation is intractable.
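
As an illustration of the role of conjugacy (this example is not part of the original page; the Beta-Bernoulli pair is just one standard conjugate choice), Bayes’ theorem requires a normalizing integral over the parameter,

p(\theta \mid \mathcal{D}) = \frac{p(\mathcal{D} \mid \theta)\, p(\theta)}{\int p(\mathcal{D} \mid \theta)\, p(\theta)\, d\theta},

and for a Bernoulli likelihood with a Beta(a, b) prior on \theta, this integral is available in closed form, so the posterior is again a Beta distribution:

p(\theta \mid x_1, \dots, x_N) = \mathrm{Beta}\!\left(\theta;\; a + \sum_{i=1}^{N} x_i,\; b + N - \sum_{i=1}^{N} x_i\right).

When no such conjugate pair exists, the normalizing integral must be approximated, which is where the sampling and variational methods listed below come in.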

The course covers:
 * Bayesian modeling and model selection.
 * Bayesian learning in conjugate cases.
 * Approximate Bayesian learning in conditionally conjugate cases, e.g.,
   * Gibbs sampling and variational Bayesian learning.
 * Approximate Bayesian learning in non-conjugate cases, e.g.,
   * Metropolis-Hastings algorithm, local variational approximation and expectation propagation (see the sketch below).
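
For orientation only, a minimal sketch of a random-walk Metropolis-Hastings sampler is given below. It is not course material; the Gaussian proposal, the step size, and the example target density are assumptions made for illustration.

import numpy as np

def metropolis_hastings(log_post, x0, n_samples=5000, step=0.5, rng=None):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal.

    log_post: function returning the unnormalized log posterior at x.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + step * rng.standard_normal()
        # Accept with probability min(1, p(proposal) / p(x)); the
        # unknown normalizing constant cancels in the ratio.
        if np.log(rng.uniform()) < log_post(proposal) - log_post(x):
            x = proposal
        samples.append(x)
    return np.array(samples)

# Example: sample from a standard normal, whose log density is -x^2/2 up to a constant.
samples = metropolis_hastings(lambda x: -0.5 * x**2, x0=0.0)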
