Advanced Graph Algorithms and Optimization, Spring 2021
- Lecturers: Rasmus Kyng and Max Probst
- Teaching Assistants: Ming Ding, Simon Meierhans, and Lukas Vogl
- Time and Place: Mondays 10:00-11:00 and Tuesdays 16:00-18:00 on Zoom
- Exercise Session Time and Place: Thursdays 15:00-16:00 and 17:00-18:00 on Zoom
- ECTS credits: 8 credits
- Final deregistration date: April 1
-
INFORMATION ON ONLINE TEACHING: We will email registered students with information on how to attend lectures and exercise sessions online. Each will be held live on Zoom. Most lectures will be available as a recording later the same day, except when technological issues prevent this. Exercise sessions will also be held live on Zoom, but they will *NOT* be recorded. If you are *NOT* a registered student, but wish to attend the lectures online, you should send us an email to get the necessary information.
-
Course Objective: The course will take students on a deep dive into modern approaches to graph algorithms using convex optimization techniques. By studying convex optimization through the lens of graph algorithms, students should develop a deeper understanding of fundamental phenomena in optimization. The course will cover some traditional discrete approaches to various graph problems, especially flow problems, and then contrast these approaches with modern, asymptotically faster methods based on combining convex optimization with spectral and combinatorial graph theory.
-
Course Content: Students should leave the course understanding key concepts in optimization such as first and second-order optimization, convex duality, multiplicative weights and dual-based methods, acceleration, preconditioning, and non-Euclidean optimization. Students will also be familiarized with central techniques in the development of graph algorithms in the past 15 years, including graph decomposition techniques, sparsification, oblivious routing, and spectral and combinatorial preconditioning.
-
Prerequisites: This course is targeted toward masters and doctoral students with an interest in theoretical computer science. Students should be comfortable with design and analysis of algorithms, probability, and linear algebra. Having passed the course Algorithms, Probability, and Computing (APC) is highly recommended, but not formally required. If you are not sure whether you are ready for this class or not, please consult the lecturer.
-
Literature:
- Convex Optimization by Stephen Boyd and Lieven Vandenberghe.
- This book is a helpful reference for convex optimization.
- Differential Calculus by Henri Cartan.
- If you have a powerful urge to learn all about multivariate calculus, you can start here. You shouldn't need it for the course though.
- Convex Analysis by R. Tyrrell Rockafellar.
- This is the classic text on basic convex analysis (not optimization). Again, you shouldn't need this.
Questions?
Course Moodle (link)
You can ask the course TAs and instructors about the course materials, exercises, and any other issues on the course Moodle page.
Please make sure you receive Moodle notifications from this course as some announcements will be made exclusively through Moodle.
Notes
The notes are available here (link).
These notes will be updated throughout the course. There may sometimes be a delay between a lecture being held and the corresponding chapter being added to the notes.
We will usually also publish the board notes for each lecture.
You can find the notes for an earlier version of the course here: AGAO 2020. Note that a lot of new material will be added this year.
Performance Assessment
Graded Homework 1 (out approx. March 30th, due April 18th): 15% of grade. The homework consists of exercises.
Graded Homework 2 (out approx. May 18th, due June 6th): 15% of grade. The homework consists of exercises.
Oral Exam (First Week of June): 70% of the grade. The exam will last 15 minutes. We will discuss topics from the lectures, but may also ask questions about problems from the graded homework.
Graded Homework
Each graded homework assignment accounts for 15% of your grade. You should submit your solutions as a PDF typeset in LaTeX by the listed due date, 23:59 local time in Zurich, by submitting them through the course Moodle page. You must come up with the solutions on your own and write everything yourself.
Exercises
Most weeks, we will publish a problem set consisting of exercises designed to help you understand the course material. The exercises will not count toward your grade, but you are highly encouraged to solve them all. Exercises will be published on Tuesdays on the course Moodle. On Thursday the following week, the TAs will discuss solutions to the exercises in the weekly exercise sessions. If you wish to get feedback on your problem set solutions, you must submit them as a PDF typeset in LaTeX by Sunday, 23:59 local time in Zurich, through the course Moodle page. You're welcome to work with your classmates on these problem sets!
Schedule, Notes, and Exercises
Date | Lecture | Topic | Problem Sets and Graded Homework (with due dates) |
---|---|---|---|
02/22 Mon | 1 | Chpt. 1: Course Introduction. | |
02/23 Tue | 2 | Chpt. 1 & Chpt. 2: Basic optimization, convex geometry, linear algebra. | Problem Set 1 |
03/01 Mon | 3 | Chpt. 2: Basic optimization, convex geometry, linear algebra. | |
03/02 Tue | 4 | Chpt. 3: Second derivatives and convexity, gradient descent. | Problem Set 2 |
03/08 Mon | 5 | Chpt. 3: Accelerated gradient descent. | |
03/09 Tue | 6 | Chpt. 4: Introduction to spectral graph theory. | Problem Set 3 |
03/15 Mon | 7 | Chpt. 5: Conductance, expanders and Cheeger’s inequality. | |
03/16 Tue | 8 | Chpt. 5 contd. | Problem Set 4 |
03/22 Mon | 9 | Chpt. 6: Random walks. | |
03/23 Tue | 10 | Chpt. 7: Pseudo-inverses and effective resistance. | Problem Set 5 |
03/29 Mon | 11 | Chpt. 8: Different perspectives on Gaussian elimination. | |
03/30 Tue | 12 | Chpt. 8 contd. Chpt. 9: Random matrix concentration and spectral sparsification. | Graded Homework 1 |
04/12 Mon | 13 | Chpt. 9 contd. | |
04/13 Tue | 14 | Chpt. 10: Solving Laplacian linear equations. | |
04/19 Mon | 15 | Chpt. 11: Classical maximum flow algorithms I. | |
04/20 Tue | 16 | Chpt. 11 contd. Chpt. 12: Classical maximum flow algorithms II. | Problem Set 6 |
04/26 Mon | 17 | Chpt. 12 contd. | |
04/27 Tue | 18 | Chpt. 13: Link-Cut trees. | Problem Set 7 |
05/03 Mon | 19 | Chpt. 14: The Cut-Matching game: Expanders via maximum flow. | |
05/04 Tue | 20 | Chpt. 14 contd. Chpt. 15: Separating hyperplanes, Lagrange multipliers, and convex duality. | Problem Set 8 |
05/10 Mon | 21 | Chpt. 15 contd. | |
05/11 Tue | 22 | Chpt. 16: Karush-Kuhn-Tucker conditions, Fenchel conjugates, Newton’s method. | Problem Set 9 |
05/17 Mon | 23 | Chpt. 17: Interior point methods for maximum flow. | |
05/18 Tue | 24 | Chpt. 17 contd. | Graded Homework 2 |
05/24 Mon | - | Pfingsten (Whit Monday): no class. | |
05/25 Tue | 25 | Chpt. 18: Distance oracles. | |