Advanced Graph Algorithms and Optimization, Spring 2021


Lecturers: Rasmus Kyng and Max Probst
Teaching Assistants: Ming Ding, Simon Meierhans, and Lukas Vogl
Lecture Time and Place: Mondays 10:00-11:00 and Tuesdays 16:00-18:00 on Zoom
Exercise Session Time and Place: Thursdays 15:00-16:00 and 17:00-18:00 on Zoom
ECTS credits: 8
Final deregistration date: April 1

INFORMATION ON ONLINE TEACHING: We will email registered students with information on how to attend lectures and exercise sessions online. Each lecture will be held live on Zoom. Most lectures will be available as a recording later the same day, except when technical issues prevent this. Exercise sessions will also be held live on Zoom, but they will *NOT* be recorded. If you are *NOT* a registered student but wish to attend the lectures online, please send us an email to get the necessary information.

Course Objective: The course will take students on a deep dive into modern approaches to graph algorithms using convex optimization techniques. By studying convex optimization through the lens of graph algorithms, students should develop a deeper understanding of fundamental phenomena in optimization. The course will cover some traditional discrete approaches to various graph problems, especially flow problems, and then contrast these approaches with modern, asymptotically faster methods based on combining convex optimization with spectral and combinatorial graph theory.

Course Content: Students should leave the course understanding key concepts in optimization such as first and second-order optimization, convex duality, multiplicative weights and dual-based methods, acceleration, preconditioning, and non-Euclidean optimization. Students will also be familiarized with central techniques in the development of graph algorithms in the past 15 years, including graph decomposition techniques, sparsification, oblivious routing, and spectral and combinatorial preconditioning.

Prerequisites: This course is targeted toward masters and doctoral students with an interest in theoretical computer science. Students should be comfortable with the design and analysis of algorithms, probability, and linear algebra. Having passed the course Algorithms, Probability, and Computing (APC) is highly recommended, but not formally required. If you are not sure whether you are ready for this class, please consult the lecturers.

Literature:

  • Convex Optimization by Stephen Boyd and Lieven Vandenberghe.
    • This book is a helpful reference for convex optimization.
  • Differential Calculus by Henri Cartan.
    • If you have a powerful urge to learn all about multivariate calculus, you can start here. You shouldn't need it for the course though.
  • Convex Analysis by R. Tyrrell Rockafellar.
    • This is the classic text on basic convex analysis (not optimization). Again, you shouldn't need this.

Questions? You can ask the course TAs and instructors about the course materials, exercises, and any other issues on the course Moodle page.


Course Moodle

The course Moodle page is available here (link). You can ask the course TAs and instructors there about the course materials, exercises, and any other issues.

Please make sure you receive Moodle notifications from this course as some announcements will be made exclusively through Moodle.


Lecture Notes

The lecture notes are available here (link).

These notes will be updated throughout the course. There may sometimes be a delay between a lecture being held and the corresponding chapter being added to the notes.

We will usually also publish the board notes for each lecture.

You can find the notes for an earlier version of the course here: AGAO 2020. Note that a lot of new material will be added this year.


Performance Assessment

Graded Homework 1 (out approx. March 30th, due April 18th): 15% of grade. The homework consists of exercises.

Graded Homework 2 (out approx. May 18th, due June 6th): 15% of grade. The homework consists of exercises.

Oral Exam (first week of June): 70% of the grade. The exam will last 15 minutes. We will discuss topics from the lectures, and we may also ask questions about problems from the graded homework.


Graded Homework

Each graded homework assignment accounts for 15% of your grade. Submit your solutions as a PDF typeset in LaTeX through the course Moodle page by the listed due date, 23:59 local time in Zurich. You must come up with the solutions on your own and write everything yourself.

Exercises

Most weeks, we will publish a problem set consisting of exercises designed to help you understand the course material. The exercises will not count toward your grade, but you are highly encouraged to solve them all. Exercises will be published on Tuesdays on the course Moodle. On the Thursday of the following week, the TAs will discuss solutions to the exercises in the weekly exercise sessions. If you wish to get feedback on your solutions, submit them as a PDF typeset in LaTeX through the course Moodle page by Sunday, 23:59 local time in Zurich. You're welcome to work with your classmates on these problem sets!


Schedule, Lecture Notes, and Exercises

Date    Lecture    Problem Sets and Graded Homework (with due dates)
02/22 Mon Lecture 1. Chapter 1: Course Introduction.
02/23 Tue Lecture 2. Chapter 1 & Chapter 2: Basic optimization, convex geometry, linear algebra. Problem Set 1
03/01 Mon Lecture 3. Chapter 2: Basic optimization, convex geometry, linear algebra.
03/02 Tue Lecture 4. Chapter 3: Second derivatives and convexity, gradient descent. Problem Set 2
03/08 Mon Lecture 5. Chapter 3: Accelerated gradient descent.
03/09 Tue Lecture 6. Chapter 4: Introduction to spectral graph theory. Problem Set 3
03/15 Mon Lecture 7. Chapter 5: Conductance, expanders, and Cheeger's inequality.
03/16 Tue Lecture 8. Chapter 5 continued. Problem Set 4