IDEAL mini-workshop on “New Directions on Robustness in ML”

November 16, 2021
Virtual
https://www.ideal.northwestern.edu/events/mini-workshop-on-new-directions-on-robustness-in-ml/

As machine learning systems are being deployed in almost every aspect of decision-making, it is vital for them to be reliable and robust to adversarial corruptions and perturbations of various kinds. This workshop will explore newer notions of robustness and the different challenges that arise in designing reliable ML algorithms. Topics include test-time robustness, adversarial perturbations, and distribution shifts, as well as connections between robustness and other areas. The workshop speakers are Aleksander Madry, Gautam Kamath, Kamalika Chaudhuri, Pranjal Awasthi, and Sebastien Bubeck. Please register for free at the webpage above to participate in the virtual workshop.

IDEAL mini-workshop on “Statistical and Computational Aspects of Robustness in High-dimensional Estimation”

October 19, 2021
Virtual
https://www.ideal.northwestern.edu/events/mini-workshop-on-statistical-and-computational-aspects-of-robustness-in-high-dimensional-estimation/

Registration deadline: October 18, 2021

Today’s data pose unprecedented challenges to statisticians. They may be incomplete, corrupted, or exposed to unknown sources of contamination or adversarial attack. Robustness is one of the revived concepts in statistics and machine learning that can accommodate such complexity and glean useful information from modern datasets. This virtual workshop will address several aspects of robustness, such as statistical estimation and computational efficiency, in the context of modern high-dimensional data analysis. The workshop speakers are Ankur Moitra, Ilias Diakonikolas, Jacob Steinhardt, and Po-Ling Loh. Please register for free at the webpage above to participate in the virtual workshop.

IDEAL Special Quarter Fall 2021 (Robustness in High-dimensional Statistics and Machine Learning)

September 21 – December 10, 2021
Virtual
https://www.ideal.northwestern.edu/special-quarters/fall-2021/

IDEAL – The Institute for Data, Econometrics, Algorithms, and Learning (an NSF-funded collaborative institute across Northwestern, TTIC and U Chicago) is organizing a Fall 2021 special quarter on “Robustness in High-dimensional Statistics and Machine Learning”.
The special-quarter activities include mini-workshops, seminars, graduate courses, and a reading group. The research goal is to explore several theoretical frameworks and directions towards designing learning algorithms and estimators that are tolerant to errors, contamination, and misspecification in data. Many of these activities will be virtual.

The kick-off event ( https://www.ideal.northwestern.edu/events/fall-2021-kickoff-event/ ) for this quarter will be held on Tuesday, September 21, 2021 at 3 pm Central time. We will briefly introduce the institute, the key personnel, and the various activities planned for the special quarter. For more information, or to participate in the special quarter, please visit the webpage above.

School on Modern Directions in Discrete Optimization

September 13-17, 2021
Online
https://www.him.uni-bonn.de/programs/future-programs/future-trimester-programs/discrete-optimization/discrete-optimization-school/

Aims and Scope:
The school provides an introduction to some of the main topics of the trimester program on discrete optimization. The lectures will address the interface between tropical geometry and discrete optimization; recent developments in continuous optimization with applications to combinatorial problems; topics in approximation algorithms; and fixed parameter tractability. The lectures will be mainly directed towards PhD students and junior researchers.

Lecturers:

Michał Pilipczuk (Warsaw University): Introduction to parameterized algorithms and applications in discrete optimization

Aaron Sidford (Stanford University): Introduction to interior point methods for discrete optimization

Ngoc Mai Tran (UT Austin): Tropical solutions to hard problems in auction theory and neural networks, semigroups and extreme value statistics

Rico Zenklusen (ETH Zürich): Approximation algorithms for hard augmentation problems

Abstracts can be found here:
https://www.him.uni-bonn.de/fileadmin/him/Workshops/template_abstracts_him_School_Sept_I.pdf

Schedule can be found here:
https://www.him.uni-bonn.de/fileadmin/him/Workshops/template_schedule_him_School_Sept_I.pdf

Interested in attending the School?

Here is the link for the online (and free) registration!
https://www.him.uni-bonn.de/index.php?id=4847

This Summer School is part of the HIM trimester in Discrete Optimization
https://www.him.uni-bonn.de/programs/future-programs/future-trimester-programs/discrete-optimization/description/

Organizers: Daniel Dadush (Amsterdam), Jesper Nederlof (Utrecht), Neil Olver (London), Laura Sanità (Eindhoven), László Végh (London)

Workshop on Algorithms for Large Data (Online) 2021

August 23-25, 2021
Online
https://waldo2021.github.io/

Registration deadline: August 20, 2021

This workshop aims to foster collaborations between researchers across multiple disciplines through a set of central questions and techniques for algorithm design for large data. We will focus on topics such as sublinear algorithms, randomized numerical linear algebra, streaming and sketching, and learning and testing.

5th SIAM Symposium on Simplicity in Algorithms

January 10-11, 2022
Alexandria, Virginia, U.S.
https://www.siam.org/conferences/cm/conference/sosa22

Paper registration deadline: August 9, 2021

The Symposium on Simplicity in Algorithms is a conference in theoretical computer science dedicated to advancing algorithms research by promoting simplicity and elegance in the design and analysis of algorithms. The benefits of simplicity are manifold: simpler algorithms manifest a better understanding of the problem at hand; they are more likely to be implemented and trusted by practitioners; they can serve as benchmarks, as an initialization step, or as the basis for a “state of the art” algorithm; they are more easily taught and are more likely to be included in algorithms textbooks; and they attract a broader set of researchers to difficult algorithmic problems. Co-located with SODA 2022.

Paper registration: August 9; Submission deadline: August 16.

ADFOCS 2021: Convex Optimization and Graph Algorithms

July 26 – August 13, 2021
Online
https://conferences.mpi-inf.mpg.de/adfocs-22/

ADFOCS is an international summer school held annually at the Max Planck Institute for Informatics (MPII). The topic of this year’s edition is Convex Optimization and its applications to graph algorithms. The event will take place online over the span of three weeks, from July 26 to August 13, with a daily two-hour lecture.

We are delighted to have the following speakers:

Alina Ene (Boston University): Adaptive gradient descent
Rasmus Kyng (ETH Zurich): Graphs, sampling, and iterative methods
Aaron Sidford (Stanford University): Optimization Methods for Maximum Flow

Registration is mandatory but free.

Annual Symposium on Combinatorial Pattern Matching (summer school + conference)

July 4, 2021 – July 7, 2021
Wrocław, Poland and online
https://cpm2021.ii.uni.wroc.pl/

The Annual Symposium on Combinatorial Pattern Matching (CPM) now has over 30 years of tradition and is considered the leading conference for the community working on Stringology. The objective of the annual CPM meetings is to provide an international forum for research in combinatorial pattern matching and related applications such as computational biology, data compression and data mining, coding, information retrieval, natural language processing, and pattern recognition. The conference will be preceded by a student summer school on July 4, 2021. Registration is free but mandatory.

Workshop on Machine Learning for Algorithms

July 13-14, 2021
Online (organized by the Foundations of Data Science Institute, FODSI)
https://fodsi.us/ml4a.html

In recent years there has been increasing interest in using machine learning to improve the performance of classical algorithms in computer science, by fine-tuning their behavior to adapt to the properties of the input distribution. This “data-driven” or “learning-based” approach to algorithm design has the potential to significantly improve the efficiency of some of the most widely used algorithms. For example, it has been used to design better data structures, online algorithms, streaming and sketching algorithms, market mechanisms and algorithms for combinatorial optimization, similarity search and inverse problems. This virtual workshop will feature talks from experts at the forefront of this exciting area.

International School-conference in Algorithms, Combinatorics, and Complexity

May 24-28, 2021
Online
https://indico.eimi.ru/event/199/

An advanced school for young researchers featuring three minicourses in vibrant areas of mathematics and computer science. The target audience includes graduate, master’s, and senior bachelor’s students of any mathematical speciality.