Representation Learning
This reading assignment covers techniques used for training and evaluating undirected graphical models that have intractable partition functions.
From the Deep Learning Book - Chapter 18: Confronting the Partition Function, please read these sections:
18.1 The Log-Likelihood Gradient
18.2 Stochastic Maximum Likelihood and Contrastive Divergence
(This is the most important section.)
18.6 Noise-Contrastive Estimation
18.7 Estimating the Partition Function
(Subsections 18.7.1 and 18.7.2 are optional.)
Sections 18.3, 18.4 and 18.5 are optional. These topics will not be covered in the course.
Furthermore, please read the following sections from the Deep Learning Book - Chapter 20: Deep Generative Models:
20.1 Boltzmann Machines
20.2 Restricted Boltzmann Machines
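Not part of the assigned reading, but as a concrete point of reference for the discussion: the sketch below shows a single contrastive-divergence (CD-1) update for a binary restricted Boltzmann machine in NumPy, tying the Chapter 18 and Chapter 20 readings together. It is a minimal illustration under my own assumptions; the variable names (W, b, c, lr) and the toy dimensions are illustrative choices, not the book's notation.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b, c, lr=0.01):
    # One CD-1 step on a batch of binary visible vectors v0.
    # W: (n_visible, n_hidden) weights, b: visible biases, c: hidden biases.
    # Positive phase: hidden probabilities and samples given the data.
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(v0.dtype)

    # Negative phase: a single Gibbs step v0 -> h0 -> v1 -> h1,
    # which is the CD-1 approximation to the model expectation.
    pv1 = sigmoid(h0 @ W.T + b)
    v1 = (rng.random(pv1.shape) < pv1).astype(v0.dtype)
    ph1 = sigmoid(v1 @ W + c)

    # Approximate log-likelihood gradient: data statistics minus model statistics.
    n = v0.shape[0]
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / n
    b += lr * (v0 - v1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
    return W, b, c

# Toy usage: 6 visible units, 4 hidden units, a batch of 8 binary vectors.
W = 0.01 * rng.standard_normal((6, 4))
b = np.zeros(6)
c = np.zeros(4)
v0 = (rng.random((8, 6)) < 0.5).astype(float)
W, b, c = cd1_update(v0, W, b, c)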
The deadline for questions to be considered in class is January 14, 7am. I will also try to accommodate questions that come in later, but I cannot make any guarantees. The earlier you bring up questions, the better.
Like last week, we will spend about 50% of the class time discussing the reading assignments. This will leave less time for the course project. Therefore, if you would like to present and/or discuss progress on a specific aspect of the project, please prepare accordingly. Also, please send an email with the topic and the approximate amount of time required by January 16, 10am.