Probabilistic Graphical Models for Image Analysis

Dr. Brian McWilliams, Dr. Aurelien Lucchi - Autumn Semester, 2014


News:

18 Dec: Added solutions for SSVM exercises.
17 Dec: Added solutions for LBP exercises.
11 Dec: Added homework for SSVM lecture.
10 Dec: Added SSVM slides.
10 Dec: Added solutions for SVM and CRF lectures.
4 Dec: Added LBP exercises and solutions for factored Gaussians example.
3 Dec: Added SVM slides.
3 Dec: Added homework for SVM lecture.
26 Nov: Added CRF slides.
26 Nov: Added homework for CRF lecture.
17 Nov: Added Sampling slides.
5 Nov: Added solutions for belief nets and belief prop, and reading for Loopy BP.
27 Oct: Added homework for belief prop and Variational slides.
22 Oct: Added solutions for Holmes/Watson network and another inference exercise.
22 Oct: Added solutions for homework 5 and 6.
13 Oct: Added solutions for homework 4.
2 Oct: Added lecture 3 slides and additional exercises for lecture 1.
23 Sept: Added more reading for lecture 1. FAQs section added.
2014 course website updated.

Course Description

This course focuses on inference with statistical models for image analysis. We use the framework of probabilistic graphical models, which includes Bayesian networks and Markov random fields. We apply this approach to traditional vision problems such as image denoising, as well as to more recent problems such as object recognition. The topics covered are listed in the syllabus below.
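As a small taste of the material, here is a minimal illustrative sketch (in Python; not course code) of the classic MRF denoising setup: a noisy binary image is cleaned up by greedily minimizing a pairwise MRF energy with iterated conditional modes. The function name icm_denoise, the parameter values, and the stripe test image are arbitrary choices made up for the example.

    import numpy as np

    def icm_denoise(noisy, beta=2.0, eta=1.0, n_iters=10):
        """Greedy approximate MAP denoising of a {-1, +1} image under a pairwise MRF.

        Energy: E(x) = -eta * sum_i x_i * y_i - beta * sum_{i~j} x_i * x_j,
        where y is the noisy observation and i~j ranges over 4-neighbours.
        Iterated conditional modes: repeatedly set each pixel to its locally
        optimal state given its neighbours.
        """
        x = noisy.copy()
        H, W = x.shape
        for _ in range(n_iters):
            for i in range(H):
                for j in range(W):
                    # Sum of the 4-neighbourhood (border pixels have fewer neighbours).
                    s = 0.0
                    if i > 0:
                        s += x[i - 1, j]
                    if i < H - 1:
                        s += x[i + 1, j]
                    if j > 0:
                        s += x[i, j - 1]
                    if j < W - 1:
                        s += x[i, j + 1]
                    # The data term pulls towards the observation, the smoothness
                    # term towards agreement with the neighbours.
                    x[i, j] = 1 if eta * noisy[i, j] + beta * s >= 0 else -1
        return x

    # Toy usage: a striped image with 10% of its pixels flipped.
    rng = np.random.default_rng(0)
    clean = np.where(np.arange(32)[:, None] % 8 < 4, 1, -1) + np.zeros((32, 32), dtype=int)
    noisy = np.where(rng.random(clean.shape) < 0.1, -clean, clean)
    print("wrong pixels before:", (noisy != clean).sum())
    print("wrong pixels after: ", (icm_denoise(noisy) != clean).sum())

Note that ICM only finds a local minimum of the energy; the MAP inference lecture covers graph cuts, which solve this class of binary pairwise models globally.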

Time and Place

PLEASE NOTE THE NEW TIMES AND LOCATION:

Lectures:
Monday, 15:00-16:00, CAB G 51
Thursday, 10:00-12:00, CLA E 4

Exam

30-minute oral exam in English.

Syllabus

Each entry lists the lecture topics for that day, followed by the slides, additional exercises, and reading/background material.

Sep 18: Introduction / Learning from Data
Slides: Lecture 1
Exercises: hw solutions: p1, p2, p3, p4
Reading: Barber Ch. 1; notes on machine learning; probability background

Sep 22: Introduction / Learning from Data (cont.)
Exercises: Learning from data basics (solutions)
Reading: Barber Ch. 1, 8, 13.2, 17.1, 18.1.1

Sep 25: Probabilistic models
Slides: Lecture 2
Exercises: hw solutions: p1, p2
Reading: Barber Ch. 8, 10; Ghahramani on Bayesian modeling; a nice example of a generative model

Sep 29: Probabilistic models (cont.)
Reading: Barber Ch. 17.4, 29.3-5

Oct 02: Belief Networks
Slides: Lecture 3
Exercises: worked example solutions; Inference in belief nets (solutions)
Reading: Barber Ch. 2, 3

Oct 09: Markov Random Fields
Slides: Lecture 4
Exercises: hw4 solutions
Reading: Barber Ch. 4

Oct 16: Learning as Inference
Slides: Lecture 5
Exercises: hw5 solutions
Reading: Barber Ch. 9

Oct 16: MAP Inference
Slides: Lecture 6
Exercises: hw6 solutions
Reading: Barber Ch. 9, 28.9; energy minimization via graph cuts; texture synthesis; photomontage

Oct 23: Belief Propagation (see the sum-product sketch after this syllabus)
Slides: Lecture 7
Reading: Barber Ch. 5

Oct 27: Belief Propagation (cont.) / Variational Approximation
Slides: Lecture 8
Exercises: belief-prop homework (solution)
Reading: Barber Ch. 18.2.2, 28

Nov 6: Variational Approximation (cont.) / Loopy Belief Propagation
Slides: Lecture 9
Exercises: additional exercises; solution to factored Gaussians; LBP exercises (solutions)
Reading: Barber Ch. 28 and 28.7; Wainwright and Jordan, Secs. 3-4.1.6; Challis and Barber, Gaussian Kullback-Leibler Approximate Inference

Nov 17: Sampling
Slides: Lecture 10
Reading: Barber Ch. 27

Nov 27: Conditional Random Fields
Slides: Lecture 11
Exercises: series11.pdf; solutions11.pdf; hw11 solutions
Reading: Barber 9.6.5 and 23.4.3; intro to CRFs; application to image segmentation; learning CRFs with graph cut

Dec 1: No class

Dec 4: SVMs
Slides: Lecture 12
Exercises: series12.pdf; solutions12.pdf
Reading: SVM tutorial; learning the kernel; discriminative MRFs

Dec 11: Structured SVMs
Slides: Lecture 13
Exercises: series13.pdf; solutions13.pdf

Dec 15: No class
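As a companion to the belief propagation lectures, here is a minimal sketch of exact sum-product message passing on a chain-structured model, the simplest case treated in Barber Ch. 5. The function chain_marginals and the toy potentials are hypothetical choices made up for the illustration, not course material.

    import numpy as np

    def chain_marginals(unaries, pairwise):
        """Exact marginals on a chain MRF via sum-product message passing.

        unaries:  n non-negative potential vectors, one per node.
        pairwise: n-1 matrices; pairwise[i][a, b] couples x_i = a and x_{i+1} = b.
        Returns the n normalized single-node marginals.
        """
        n = len(unaries)
        # Forward pass: alpha[i] is the message reaching node i from the left.
        alpha = [np.ones_like(unaries[0]) for _ in range(n)]
        for i in range(1, n):
            alpha[i] = pairwise[i - 1].T @ (alpha[i - 1] * unaries[i - 1])
        # Backward pass: beta[i] is the message reaching node i from the right.
        beta = [np.ones_like(unaries[0]) for _ in range(n)]
        for i in range(n - 2, -1, -1):
            beta[i] = pairwise[i] @ (beta[i + 1] * unaries[i + 1])
        # Each marginal is proportional to unary * (left message) * (right message).
        return [m / m.sum() for m in
                (unaries[i] * alpha[i] * beta[i] for i in range(n))]

    # Toy usage: four binary nodes with an attractive coupling.
    u = [np.array([0.6, 0.4])] * 4
    p = [np.array([[0.9, 0.1], [0.1, 0.9]])] * 3
    for i, m in enumerate(chain_marginals(u, p)):
        print(f"p(x{i}) =", m.round(3))

On a chain these two passes give the exact marginals; running the same local updates on a graph with cycles yields loopy belief propagation (Lecture 9), which is only approximate.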

Resources

Primary References

D. Barber. Bayesian Reasoning and Machine Learning. Cambridge University Press 2012.
The main course text. A recent book that covers many topics in graphical models and machine learning. Available for free from here.

M. Wainwright and M.I. Jordan. Graphical Models, Exponential Families, and Variational Inference. Foundations and Trends in Machine Learning, 2008.
Advanced treatment of graphical models and variational inference. Available free from here.

David J.C. MacKay. Information Theory, Inference, and Learning Algorithms. Cambridge University Press, 2003.
Available for free from here.

Additional references

C. Bishop. Pattern Recognition and Machine Learning. Springer 2007.
An excellent introduction to machine learning that covers most of the topics treated in the lecture. Contains many exercises, some with worked solutions.

D. Koller and N. Friedman. Probabilistic Graphical Models: Principles and Techniques. The MIT Press 2009.
Covers Bayesian networks and undirected graphical models in great detail.

Frequently Asked Questions

Q: What is a good reference for probability theory required for the course?
A: See Barber Ch. 1 and MacKay Ch. 2-3. Make sure you are also comfortable with the exercises in the first week's slides.

Q: What is the scope of the course?
A: We cover all of Part I, some of Parts II and III, and all of Part V of Barber. We also look briefly at the first four sections of Wainwright & Jordan.

Contact

Dr. Brian McWilliams
Dr. Aurelien Lucchi