Talk: The computational cost of detecting hidden structures: from random to deterministic
Tim Kunisky: Postdoctoral Associate, Computer Science, Yale University
Event Details
LIVE STREAM: https://uwmadison.zoom.us/j/97650281911?pwd=RjZLalBDV25YVm84Wnp6WEtNQTNtZz09
Abstract: I will present a line of work on the computational complexity of algorithmic tasks on random inputs, including hypothesis testing, sampling, and certifying bounds on optimization problems. Surprisingly, these diverse tasks admit a unified analysis involving the same two main ingredients. The first is the study of algorithms that output low-degree polynomial functions of their inputs. Such algorithms are believed to be optimal for many statistical tasks and can be analyzed using the theory of orthogonal polynomials, leading to strong evidence for the difficulty of certain hypothesis testing problems. The second is a strategy of "planting" unusual structures in problem instances, which shows that algorithms for sampling and certification can be interpreted as implicitly performing hypothesis testing. I will focus on examples of hypothesis testing related to principal component analysis (PCA) and their connections with problems motivated by statistical physics: (1) sampling from Ising models, and (2) certifying bounds on random functions associated with models of spin glasses.
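For readers who want a concrete picture of a low-degree polynomial test, the following Python sketch simulates one in a spiked Wigner model, a standard toy version of the PCA-type detection problems mentioned above. The model, the signal strength `lam`, and the degree-2 statistic are illustrative assumptions made for this announcement, not constructions taken from the talk.

```python
# Minimal sketch (assumptions: spiked Wigner toy model with a Rademacher
# spike and a degree-2 test statistic; parameters chosen only for illustration).
import numpy as np

rng = np.random.default_rng(0)

def sample_matrix(n, lam, planted):
    # Symmetric Gaussian noise matrix, scaled so its spectrum is of order one.
    G = rng.normal(size=(n, n))
    W = (G + G.T) / np.sqrt(2 * n)
    if not planted:
        return W
    x = rng.choice([-1.0, 1.0], size=n)   # hidden +/-1 signal vector
    return W + lam * np.outer(x, x) / n   # rank-one planted structure

def degree2_statistic(Y):
    # A polynomial of degree 2 in the entries of the observed matrix.
    return np.sum(Y ** 2)

n, lam, trials = 300, 2.0, 50
null_vals = [degree2_statistic(sample_matrix(n, lam, False)) for _ in range(trials)]
plant_vals = [degree2_statistic(sample_matrix(n, lam, True)) for _ in range(trials)]
print(f"null mean    : {np.mean(null_vals):7.2f} (std {np.std(null_vals):.2f})")
print(f"planted mean : {np.mean(plant_vals):7.2f} (std {np.std(plant_vals):.2f})")
```

In this toy setting the planted model shifts the statistic's mean by roughly lam**2 while its sample-to-sample fluctuations remain of constant order; comparing such mean shifts against fluctuations, over all polynomials of bounded degree, is the basic mechanism behind the low-degree evidence for hardness described above.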
Next, I will describe more recent results probing the computational cost of certification not just in random settings under strong distributional assumptions, but also for more generic problem instances. As an extreme example, by considering the sum-of-squares hierarchy of semidefinite programs, I will show how some of the above ideas may be completely derandomized and applied in a deterministic setting. Using as a testbed the long-standing open problem of computing the clique number of the number-theoretic Paley graph, I will give an analysis of semidefinite programming that leads both to new approaches to this combinatorial optimization problem and to refined notions of pseudorandomness capturing deterministic versions of phenomena from random matrix theory and free probability.
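To make the certification setting concrete, the sketch below computes a classical degree-2 semidefinite relaxation of the clique problem, namely the Lovász theta function of the complement graph, for a small Paley graph. This is a minimal illustration using the cvxpy modeling library with an off-the-shelf SDP solver; the choice p = 13 and the particular relaxation are assumptions made for this announcement, not the refined analysis described in the abstract.

```python
# Minimal sketch (assumptions: p = 13, cvxpy with an SDP-capable solver such
# as SCS installed; the Lovasz theta bound is used here as a simple stand-in
# for the degree-2 level of the sum-of-squares hierarchy).
import numpy as np
import cvxpy as cp

def paley_adjacency(p):
    # Paley graph on Z/pZ (p prime, p % 4 == 1): i ~ j iff i - j is a
    # nonzero quadratic residue mod p.
    residues = {(x * x) % p for x in range(1, p)}
    A = np.zeros((p, p), dtype=int)
    for i in range(p):
        for j in range(p):
            if i != j and (i - j) % p in residues:
                A[i, j] = 1
    return A

def clique_number_sdp_bound(A):
    # Lovasz theta of the complement graph upper-bounds the clique number:
    #   theta(H) = max <J, X>  s.t.  tr(X) = 1,  X_ij = 0 for edges of H,  X PSD,
    # where H is the complement, so its edges are the non-adjacent pairs of G.
    n = A.shape[0]
    X = cp.Variable((n, n), symmetric=True)
    constraints = [X >> 0, cp.trace(X) == 1]
    for i in range(n):
        for j in range(i + 1, n):
            if A[i, j] == 0:
                constraints.append(X[i, j] == 0)
    prob = cp.Problem(cp.Maximize(cp.sum(X)), constraints)
    prob.solve()
    return prob.value

p = 13
bound = clique_number_sdp_bound(paley_adjacency(p))
print(f"SDP bound on the clique number of the Paley graph on {p} vertices: {bound:.3f}")
```

For Paley graphs on p vertices this bound evaluates to the square root of p (about 3.61 for p = 13, versus a clique number of 3), and that gap is precisely the kind of question that motivates studying higher levels of the sum-of-squares hierarchy on deterministic instances.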
Bio: Tim Kunisky is a postdoctoral associate at Yale University, hosted by Dan Spielman in the Department of Computer Science. He previously graduated with a bachelor's degree in Mathematics from Princeton University, worked on machine learning for ranking problems and natural language processing at Google, and received his PhD in Mathematics from the Courant Institute of Mathematical Sciences at New York University, where he was advised by Afonso Bandeira and Gérard Ben Arous. His main research interests concern how probability theory and mathematical statistics interact with computational complexity and the theory of algorithms.