Probabilistic graphical models (PGMs) are a rich framework for encoding probability distributions over complex domains: joint (multivariate) distributions over large numbers of random variables that interact with each other.
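For concreteness, here is a minimal sketch of how a directed graphical model factorizes such a joint distribution. The chain network A → B → C and all of its numbers are made up for illustration; they are not course material:

```python
import numpy as np

# Toy Bayesian network over three binary variables: A -> B -> C.
# The joint factorizes as P(A, B, C) = P(A) * P(B | A) * P(C | B),
# so we store 1 + 2 + 2 = 5 free parameters instead of 2^3 - 1 = 7.
p_a = np.array([0.6, 0.4])                 # P(A)
p_b_given_a = np.array([[0.7, 0.3],        # P(B | A=0)
                        [0.2, 0.8]])       # P(B | A=1)
p_c_given_b = np.array([[0.9, 0.1],        # P(C | B=0)
                        [0.4, 0.6]])       # P(C | B=1)

# Full joint via broadcasting: joint[a, b, c] = P(A=a) P(B=b|A=a) P(C=c|B=b)
joint = p_a[:, None, None] * p_b_given_a[:, :, None] * p_c_given_b[None, :, :]
assert np.isclose(joint.sum(), 1.0)        # a valid distribution

# Any marginal follows by summing out the other variables, e.g. P(C):
p_c = joint.sum(axis=(0, 1))
print(p_c)  # P(C) = [0.65, 0.35]
```

The parameter savings grow dramatically with the number of variables, which is what makes this framework practical for the large domains described above.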
Zhangyuan Wang (zywang17 at stanford.edu)

Course Information

Course Description: Probabilistic graphical models are a powerful framework for representing complex domains using probability distributions, with numerous applications in machine learning, computer vision, natural language processing, and computational biology. Graphical models bring together graph theory and probability theory, and provide a flexible framework for modeling large collections of random variables with complex interactions. This course provides a comprehensive survey of the topic, introducing the key formalisms and the main techniques used to construct graphical models, make predictions, and support decision-making under uncertainty. The aim of this course is to develop the knowledge and skills necessary to design, implement, and apply these models to solve real problems. The course will cover: (1) Bayesian networks, undirected graphical models, and their temporal extensions; (2) exact and approximate inference methods; (3) estimation of the parameters and the structure of graphical models.

Prerequisites: Students are expected to have a background in basic probability theory, statistics, programming, and algorithm design and analysis.
Readings

Required Textbook: ("PGM") Probabilistic Graphical Models: Principles and Techniques by Daphne Koller and Nir Friedman. MIT Press.
Course Notes: Available. Student contributions welcome!
Lecture Videos: Instructions on finding them are available.
Further Readings:
("GEV") Graphical Models, Exponential Families, and Variational Inference by Martin J. Wainwright and Michael I. Jordan. Available.
Modeling and Reasoning with Bayesian Networks by Adnan Darwiche. Available (through Stanford).
Pattern Recognition and Machine Learning by Chris Bishop. Available.
Machine Learning: A Probabilistic Perspective by Kevin P. Murphy. Available (through Stanford).
Information Theory, Inference, and Learning Algorithms by David J. MacKay. Available.
Bayesian Reasoning and Machine Learning by David Barber. Available.

Grading Policy

Homeworks (70%): There will be five homeworks with both written and programming parts. Each homework is centered around an application and will also deepen your understanding of the theoretical concepts. Homeworks will be posted on Piazza.
Final Exam (30%): Wednesday, March 20, 2019, 8:30-11:30am.
Extra Credit (+3%): You will be awarded up to 3% extra credit if you answer other students' questions on Piazza in a substantial and helpful way, or contribute to the course notes on GitHub with pull requests.

Assignments

Written Assignments: Homeworks should be written up clearly and succinctly; you may lose points if your answers are unclear or unnecessarily complicated.
You are encouraged to use LaTeX to write up your homeworks, but this is not a requirement.
Homework Submission: All students (non-SCPD and SCPD) should submit their assignments electronically via Gradescope.
Late Homework: You have 6 late days to use at any time during the term without penalty. For a particular homework, you can use at most two late days. Once you run out of late days, you will incur a 25% penalty for each extra late day you use. Each late homework should be clearly marked as "Late" on the first page.
Regrade Policy: If you believe that the course staff made an error in grading, you may submit a regrade request through Gradescope within one week of receiving your grade. Please be as specific as possible with your regrade request.
Collaboration Policy and Honor Code: You are free to form study groups and discuss homeworks and projects. However, you must write up homeworks and code from scratch independently, without referring to any notes from the joint session. You should not copy, refer to, or look at the solutions from previous years' homeworks in preparing your answers.
It is an honor code violation to intentionally refer to a previous year's solutions, either official or written up by another student. Anybody violating the Honor Code will be referred to the Office of Community Standards.

Syllabus

Week 1 (Jan. 8 & 10): Introduction, Probability Theory, Bayesian Networks. Readings: PGM Ch. 1-3. HW 1 released.
Week 2 (Jan. 15 & 17): Undirected models. Readings: PGM Ch. 4.
Week 3 (Jan. 22 & 24): Learning Bayes Nets. Readings: PGM Ch. 16-17. HW 2 released.
Week 4 (Jan. 29 & 31): Exact Inference; Message Passing. Readings: PGM Ch. 9-10. HW 3 released.
Week 5 (Feb. 5 & 7): Sampling. Readings: PGM Ch. 12.
Week 6 (Feb. 12 & 14): MAP Inference; Structured Prediction. Readings: PGM Ch. 13. HW 4 released.
Week 7 (Feb. 19 & 21): Parameter Learning. Readings: PGM Ch. 19-20.
Week 8 (Feb. 26 & 28): Bayesian Learning; Structure Learning. Readings: PGM Ch. 17-18. HW 5 released.
Week 9 (Mar. 5 & 7): Exponential Families; Variational Inference. Readings: PGM Ch. 8 & 11; GEV Section 3.
Week 10 (Mar. 12 & 14): Advanced topics and conclusions.

Many thanks to David Sontag, Adnan Darwiche, Vibhav Gogate, and Tamir Hazan for sharing material used in slides and homeworks.

Other Resources

There are many software packages available that can greatly simplify the use of graphical models.
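As a taste of the exact-inference material, marginals can often be computed without ever materializing the full joint by summing variables out one at a time (variable elimination, the core idea behind message passing). A minimal sketch on a made-up chain network A → B → C; the network and its numbers are purely illustrative, not from the course:

```python
import numpy as np

# Chain Bayesian network A -> B -> C over binary variables (illustrative numbers).
p_a = np.array([0.3, 0.7])                 # P(A)
p_b_given_a = np.array([[0.8, 0.2],        # P(B | A=0)
                        [0.5, 0.5]])       # P(B | A=1)
p_c_given_b = np.array([[0.6, 0.4],        # P(C | B=0)
                        [0.1, 0.9]])       # P(C | B=1)

# Naive approach: build the full joint P(A, B, C) (2^3 entries), then sum
# out A and B. The table size is exponential in the number of variables.
joint = p_a[:, None, None] * p_b_given_a[:, :, None] * p_c_given_b[None, :, :]
p_c_naive = joint.sum(axis=(0, 1))

# Variable elimination: eliminate A first, then B. Each step is a small
# matrix-vector product, so the largest intermediate table has 2 entries.
msg_b = p_b_given_a.T @ p_a        # P(B) = sum_a P(A=a) P(B | A=a)
p_c = p_c_given_b.T @ msg_b        # P(C) = sum_b P(B=b) P(C | B=b)

assert np.allclose(p_c, p_c_naive)
print(p_c)
```

On a chain of n binary variables the naive joint has 2^n entries, while elimination in this order only ever handles length-2 vectors; choosing a good elimination order is exactly what the message-passing lectures formalize.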
Important announcements will be posted on Piazza.

Time/Location:
Lectures: Tue/Thu 9:00-10:20am, Skilling Auditorium.
Office Hours: See the course calendar.
Instructor:
Course Assistants: Aditya Grover, Alex Bishara, Ethan Chan, Kratarth Goel, Xiaocheng Li, Bo Wang.
Calendar: See the calendar for detailed information on all lectures, office hours, and due dates.
Contact: Please use Piazza for all questions related to lectures and coursework. For SCPD students, please email [email protected] or call 650-741-1542.