Algorithmic Robust Statistics, presented by Ilias Diakonikolas
Abstract: The field of Robust Statistics studies the problem of designing estimators that perform well even when the data deviate significantly from the idealized modeling assumptions. Classical statistical theory, going back to the pioneering works of Tukey and Huber in the 1960s, characterizes the information-theoretic limits of robust estimation for a number of statistical tasks. On the other hand, until fairly recently, the computational aspects of this field were poorly understood. Specifically, no scalable robust estimation methods were known in high dimensions, even for the most basic task of mean estimation.
A recent line of work in computer science developed the first computationally efficient robust estimators in high dimensions for a range of learning tasks. This talk will provide an overview of these algorithmic developments and discuss some open problems in the area.
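To make the mean-estimation problem mentioned above concrete, here is a minimal sketch (not from the talk itself) of why the empirical mean is not robust: a small fraction of adversarially placed points can pull it arbitrarily far from the true mean, while a simple robust baseline such as the coordinate-wise median is far more stable. The Gaussian data model, the 10% contamination rate, and the specific outlier placement are illustrative assumptions; note that the coordinate-wise median still incurs error growing with the dimension in the worst case, which is precisely the gap the recent algorithmic work addresses.

```python
# Illustrative sketch (assumptions: standard Gaussian data with true mean 0,
# 10% adversarial contamination). Not the algorithm from the talk; it only
# demonstrates the failure of the naive mean versus a simple robust baseline.
import numpy as np

rng = np.random.default_rng(0)
d, n = 50, 1000

# Clean samples from a standard Gaussian in d dimensions (true mean = 0).
clean = rng.standard_normal((n, d))

# Adversarial contamination: replace 10% of the points with a far-away value.
corrupted = clean.copy()
corrupted[: n // 10] = 100.0

# The empirical mean is dragged toward the outliers; the coordinate-wise
# median barely moves.
mean_err = np.linalg.norm(corrupted.mean(axis=0))
median_err = np.linalg.norm(np.median(corrupted, axis=0))

print(f"error of empirical mean:      {mean_err:.2f}")
print(f"error of coordinatewise median: {median_err:.2f}")
```

With 10% of the mass at distance 100 in every coordinate, the mean's error is on the order of 10 per coordinate, while the median's error stays near the sampling noise, which is the qualitative gap that motivates dimension-independent robust estimators.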