September 13-17, 2021
Aims and Scope:
The school provides an introduction to some of the main topics of the trimester program on discrete optimization. The lectures will address the interface between tropical geometry and discrete optimization; recent developments in continuous optimization with applications to combinatorial problems; topics in approximation algorithms; and fixed-parameter tractability. The lectures will be mainly directed towards PhD students and junior researchers.
Michał Pilipczuk (Warsaw University): Introduction to parameterized algorithms and applications in discrete optimization
Aaron Sidford (Stanford University): Introduction to interior point methods for discrete optimization
Ngoc Mai Tran (UT Austin): Tropical solutions to hard problems in auction theory and neural networks, semigroups and extreme value statistics
Rico Zenklusen (ETH Zürich): Approximation algorithms for hard augmentation problems
Abstracts can be found here:
Schedule can be found here:
Interested in attending the School?
Here is the link for the online (and free) registration!
This Summer School is part of the HIM trimester in Discrete Optimization
Organizers: Daniel Dadush (Amsterdam), Jesper Nederlof (Utrecht), Neil Olver (London), Laura Sanità (Eindhoven), László Végh (London)
July 26 – August 13, 2021
ADFOCS is an international summer school held annually at the Max Planck Institute for Informatics (MPII). The topic of this year’s edition is Convex Optimization and its Applications to Graph Algorithms. The event will take place online over the span of three weeks, from July 26 to August 13, with a daily two-hour lecture.
We are delighted to have the following speakers:
Alina Ene (Boston University): Adaptive gradient descent
Rasmus Kyng (ETH Zurich): Graphs, sampling, and iterative methods
Aaron Sidford (Stanford University): Optimization Methods for Maximum Flow
Registration is mandatory but free.
May 24-28, 2021
An advanced school for young researchers featuring three minicourses in vibrant areas of mathematics and computer science. The target audience includes graduate, master's, and senior bachelor students of any mathematical speciality.
August 24-28, 2020
Prague, Czech Republic
Registration deadline: March 22, 2020
The third edition of Prague Summer School on Discrete Mathematics will feature two lecture series:
Subhash Khot (New York University): Hardness of Approximation: From the PCP Theorem to the 2-to-2 Games Theorem, and
Shayan Oveis Gharan (University of Washington): Polynomial Paradigm in Algorithm Design.
The School is primarily intended for PhD students and early career researchers. There is no registration fee and a number of travel stipends will be available.
January 14-16, 2020
University of Technology Sydney, Australia
The Quantum Computer Science School 2020 will consist of three days of lectures and academic activities, targeted at senior undergraduates and graduate students in Australian and Asian universities. The lecturers are Mingsheng Ying, Luming Duan, Michael Bremner, Troy Lee, and Ran Duan, and the topics include quantum algorithms, quantum programming, quantum supremacy, and matrix multiplication algorithms. This school is co-organised by the University of Technology Sydney and Tsinghua University.
August 24-28, 2020
Registration deadline: March 15, 2020
A one-week summer school primarily for PhD students and postdocs. Lecture courses given by Subhash Khot (New York University) and Shayan Oveis Gharan (University of Washington).
September 2-4, 2019
Trinity College Dublin, Dublin, Ireland
The Summer School will include a technical track with lectures and workshops on various aspects of Machine Learning, as well as associated training in areas such as research ethics, career development, communicating research, innovation, and public engagement. It will also include social and networking events. Speakers will include Prof Marco Di Renzo (CentraleSupélec/CNRS), Dr Jakob Hoydis (Nokia Bell Labs France), Dr Irene Macaluso (Trinity College Dublin), Dr Panayotis Mertikopoulos (CNRS, Grenoble), Michaela Blott (Xilinx Research Labs, Ireland), Prof Yong Li (Tsinghua University, Beijing) and Prof Paul Patras (University of Edinburgh).
December 15-19, 2019
IIAS @ Hebrew University of Jerusalem, Israel
Registration deadline: August 23, 2019
The school will introduce TCS and math students and faculty who are interested in the theoretical aspects of quantum computation to the beautiful and fascinating mathematical and computational open questions in the area, starting from scratch. No prior knowledge of quantum computing will be assumed. We hope to reach a point where participants gain the initial tools and basic perspective needed to start working in this area.
The school will consist of several mini-courses, each two or three hours long, on central topics in the area. These include quantum algorithms, quantum error correction, quantum supremacy, delegation and verification, interactive proofs, cryptography, and Hamiltonian complexity. We will emphasize concepts, open questions, and links to mathematics. We will hold daily TA sessions with hands-on exercises to support serious learning.
Main speakers: Adam Bouland, Sergey Bravyi, Matthias Christandl, Sandy Irani, Avishay Tal, Thomas Vidick. Additional speakers: Dorit Aharonov, Zvika Brakerski, Or Sattath. (Lists are subject to changes and additions.)
October 3-11, 2019
Registration deadline: October 3, 2019
The school will be organized by the International Black Sea University with the support of Shota Rustaveli National Science Foundation of Georgia (SRNSFG). The intended audience of the autumn school includes BSc, MSc and PhD students, researchers as well as industry professionals from the fields of computer science and mathematics.
August 12-15, 2019
UC San Diego
Registration deadline: July 31, 2019
Robust statistics and related topics offer ways to stress test estimators against the assumptions they make, and give insight into why some estimators behave well in the face of model misspecification while others do not. In this summer school, we will revisit classic topics in robust statistics from an algorithmic perspective. We will cover recent progress on provably robust and computationally efficient parameter estimation in high dimensions. We will compare this to other popular models, such as agnostic learning and outlier detection. With the foundations in hand, we will explore modern topics like federated learning, semi-random models, and connections to decision theory, where robustness is formulated in alternative ways. We hope to have time for discussion of open questions, such as adversarial examples in deep learning, and invite the audience to help us muse about the right definitions to adopt in the first place.