August 24-28, 2020
Prague, Czech Republic
Registration deadline: March 22, 2020
The third edition of Prague Summer School on Discrete Mathematics will feature two lecture series:
Subhash Khot (New York University): Hardness of Approximation: From the PCP Theorem to the 2-to-2 Games Theorem, and
Shayan Oveis Gharan (University of Washington): Polynomial Paradigm in Algorithm Design.
The School is primarily intended for PhD students and early career researchers. There is no registration fee and a number of travel stipends will be available.
January 14-16, 2020
University of Technology Sydney, Australia
The Quantum Computer Science School 2020 will consist of three days of lectures and academic activities, targeted at senior undergraduates and graduate students at Australian and Asian universities. The lecturers are Mingsheng Ying, Luming Duan, Michael Bremner, Troy Lee, and Ran Duan, and the topics include quantum algorithms, quantum programming, quantum supremacy, and matrix multiplication algorithms. The school is co-organised by the University of Technology Sydney and Tsinghua University.
August 24-28, 2020
Registration deadline: March 15, 2020
A one-week summer school primarily for PhD students and postdocs. Lecture courses given by Subhash Khot (New York University) and Shayan Oveis Gharan (University of Washington).
September 2-4, 2019
Trinity College Dublin, Dublin, Ireland
The Summer School will include a technical track with lectures and workshops on various aspects of Machine Learning, as well as associated training in areas such as research ethics, career development, communicating research, innovation, and public engagement. It will also include social and networking events. Speakers will include Prof Marco Di Renzo (CentraleSupélec/CNRS), Dr Jakob Hoydis (Nokia Bell Labs France), Dr Irene Macaluso (Trinity College Dublin), Dr Panayotis Mertikopoulos (CNRS, Grenoble), Michaela Blott (Xilinx Research Labs, Ireland), Prof Yong Li (Tsinghua University, Beijing) and Prof Paul Patras (University of Edinburgh).
December 15-19, 2019
IIAS @ Hebrew University of Jerusalem, Israel
Registration deadline: August 23, 2019
The school will introduce TCS and math students and faculty who are interested in the theoretical aspects of quantum computation to the beautiful and fascinating mathematical and computational open questions in the area, starting from scratch. No prior knowledge of quantum computation will be assumed. We hope to reach a point where participants gain the initial tools and basic perspective needed to start working in this area.
The school will consist of several mini-courses, each of two or three hours, on central topics in the area. These include quantum algorithms, quantum error correction, quantum supremacy, delegation and verification, interactive proofs, cryptography, and Hamiltonian complexity. We will emphasize concepts, open questions, and links to mathematics. We will hold daily TA sessions with hands-on exercises to support a serious process of learning.
Main speakers: Adam Bouland, Sergey Bravyi, Matthias Christandl, Sandy Irani, Avishay Tal, Thomas Vidick. Additional speakers: Dorit Aharonov, Zvika Brakerski, Or Sattath. (Lists are subject to changes and additions.)
October 3-11, 2019
Registration deadline: October 3, 2019
The school will be organized by the International Black Sea University with the support of the Shota Rustaveli National Science Foundation of Georgia (SRNSFG). The intended audience of the autumn school includes BSc, MSc, and PhD students, researchers, and industry professionals from the fields of computer science and mathematics.
August 12-15, 2019
UC San Diego
Registration deadline: July 31, 2019
Robust statistics and related topics offer ways to stress-test estimators against the assumptions they make, and give insight into why some estimators behave well in the face of model misspecification while others do not. In this summer school, we will revisit classic topics in robust statistics from an algorithmic perspective. We will cover recent progress on provably robust and computationally efficient parameter estimation in high dimensions, and compare this to other popular models, such as agnostic learning and outlier detection. With the foundations in hand, we will explore modern topics like federated learning, semi-random models, and connections to decision theory, where robustness is formulated in alternative ways. We hope to have time to discuss open questions like adversarial examples in deep learning, and invite the audience to help us muse about the right definitions to adopt in the first place.