STOC 2022 Workshops: Call for Proposals

June 20-24, 2022
Rome, Italy

Submission deadline: January 15, 2022

STOC/TheoryFest 2022 will hold workshops during the conference week, June 20–24, 2022. We invite groups of interested researchers to submit workshop proposals. The due date for proposals is January 15, 2022. Submission instructions can be found on the STOC’22 website.

Quarterly Theory Workshop: 2021 Junior Theorists Workshop

December 9-10, 2021
Virtual (on Gather.Town). Please register (free) to get access to the login information; we will send it the day before the event.

The 2021 Junior Theorists Workshop is part of the Northwestern CS Quarterly Theory Workshop Series. The focus of this workshop will be on junior researchers in all areas of theoretical computer science.

IDEAL mini-workshop on “New Directions on Robustness in ML”

November 16, 2021

As machine learning systems are deployed in almost every aspect of decision-making, it is vital for them to be reliable and secure against adversarial corruptions and perturbations of various kinds. This workshop will explore newer notions of robustness and the different challenges that arise in designing reliable ML algorithms. Topics include test-time robustness, adversarial perturbations, and distribution shifts, as well as connections between robustness and other areas. The workshop speakers are Aleksander Madry, Gautam Kamath, Kamalika Chaudhuri, Pranjal Awasthi and Sebastien Bubeck. Please register for free at the webpage given below to participate in the virtual workshop.

IDEAL mini-workshop on “Statistical and Computational Aspects of Robustness in High-dimensional Estimation”

October 19, 2021

Registration deadline: October 18, 2021

Today’s data pose unprecedented challenges to statisticians. They may be incomplete, corrupted, or exposed to some unknown source of contamination or adversarial attack. Robustness is a revived concept in statistics and machine learning that can accommodate such complexity and glean useful information from modern datasets. This virtual workshop will address several aspects of robustness, such as statistical estimation and computational efficiency, in the context of modern high-dimensional data analysis. The workshop speakers are Ankur Moitra, Ilias Diakonikolas, Jacob Steinhardt and Po-Ling Loh. Please register for free at the webpage given below to participate in the virtual workshop.

IDEAL Special Quarter Fall 2021 (Robustness in High-dimensional Statistics and Machine Learning)

September 21 – December 10, 2021

IDEAL – The Institute for Data, Econometrics, Algorithms, and Learning (an NSF-funded collaborative institute across Northwestern, TTIC and U Chicago) is organizing a Fall 2021 special quarter on “Robustness in High-dimensional Statistics and Machine Learning”.
The special-quarter activities include mini-workshops, seminars, graduate courses, and a reading group. The research goal is to explore several theoretical frameworks and directions towards designing learning algorithms and estimators that are tolerant to errors, contamination, and misspecification in data. Many of these activities will be virtual.

The kick-off event for this quarter will be held on Tuesday, September 21, 2021 at 3 pm Central time. We will briefly introduce the institute, its key personnel, and the various activities planned during the special quarter. To find out more, or to participate in the special quarter, please visit the webpage.

School on Modern Directions in Discrete Optimization

September 13-17, 2021

Aims and Scope:
The school provides an introduction to some of the main topics of the trimester program on discrete optimization. The lectures will address the interface between tropical geometry and discrete optimization; recent developments in continuous optimization with applications to combinatorial problems; topics in approximation algorithms; and fixed-parameter tractability. The lectures are directed mainly towards PhD students and junior researchers.


Michał Pilipczuk (Warsaw University): Introduction to parameterized algorithms and applications in discrete optimization

Aaron Sidford (Stanford University): Introduction to interior point methods for discrete optimization

Ngoc Mai Tran (UT Austin): Tropical solutions to hard problems in auction theory and neural networks, semigroups and extreme value statistics

Rico Zenklusen (ETH Zürich): Approximation algorithms for hard augmentation problems

Abstracts and the schedule can be found on the school webpage.

Interested in attending the School?

Here is the link for the online (and free) registration!

This Summer School is part of the HIM trimester program in Discrete Optimization.

Organizers: Daniel Dadush (Amsterdam), Jesper Nederlof (Utrecht), Neil Olver (London), Laura Sanità (Eindhoven), László Végh (London)

Workshop on Algorithms for Large Data (Online) 2021

August 23-25, 2021

Registration deadline: August 20, 2021

This workshop aims to foster collaborations between researchers across multiple disciplines through a set of central questions and techniques for algorithm design for large data. We will focus on topics such as sublinear algorithms, randomized numerical linear algebra, streaming and sketching, and learning and testing.

5th SIAM Symposium on Simplicity in Algorithms

January 10-11, 2022
Alexandria, Virginia, U.S.

Paper registration deadline: August 9, 2021

The Symposium on Simplicity in Algorithms (SOSA) is a conference in theoretical computer science dedicated to advancing algorithms research by promoting simplicity and elegance in the design and analysis of algorithms. The benefits of simplicity are manifold: simpler algorithms reflect a better understanding of the problem at hand; they are more likely to be implemented and trusted by practitioners; they can serve as benchmarks, as an initialization step, or as the basis for a state-of-the-art algorithm; they are more easily taught and are more likely to be included in algorithms textbooks; and they attract a broader set of researchers to difficult algorithmic problems. Co-located with SODA 2022.

Paper registration: August 9; Submission deadline: August 16.

ADFOCS 2021: Convex Optimization and Graph Algorithms

July 26 – August 13, 2021

ADFOCS is an international summer school held annually at the Max Planck Institute for Informatics (MPII). The topic of this year’s edition is Convex Optimization and its applications to graph algorithms. The event will take place online over the span of three weeks, from July 26 to August 13, with a daily two-hour lecture.

We are delighted to have the following speakers:

Alina Ene (Boston University): Adaptive gradient descent
Rasmus Kyng (ETH Zurich): Graphs, sampling, and iterative methods
Aaron Sidford (Stanford University): Optimization methods for maximum flow

Registration is mandatory but free.

Annual Symposium on Combinatorial Pattern Matching (summer school + conference)

July 4-7, 2021
Wrocław, Poland and online

The Annual Symposium on Combinatorial Pattern Matching (CPM) has over 30 years of tradition and is considered the leading conference for the community working on Stringology. The objective of the annual CPM meetings is to provide an international forum for research in combinatorial pattern matching and related applications such as computational biology, data compression and data mining, coding, information retrieval, natural language processing, and pattern recognition. The conference will be preceded by a student summer school on July 4, 2021. Registration is free but mandatory.