CS Theory Seminar: How fast can you find a good hypothesis? (Anders Aamand, Copenhagen and Justin Chen, MIT)

Event Details

Date
Wednesday, November 12, 2025
Time
2:30-3:30 p.m.
Location
Description

In the hypothesis selection problem, we are given a finite set of candidate distributions (hypotheses), H_1, …, H_n, and samples from an unknown distribution P. Our goal is to find a hypothesis H_i whose total variation distance to P is comparable to that of the nearest hypothesis among H_1, …, H_n. Specifically, if the minimum distance is OPT, we aim to output an H_i such that, with probability at least 1 − δ, its total variation distance to P is at most C · OPT + ε. Key aspects of this problem remain unresolved, especially regarding the computational efficiency of statistically optimal hypothesis selection. In this talk, we will highlight recent progress on two questions: (a) Does there exist an algorithm that simultaneously runs in near-linear time, achieves the best possible approximation factor, and succeeds with high probability? Intriguingly, such algorithms exist if we relax any one of these three criteria. (b) Can the approximation factor be improved in expectation (or, similarly, if we are allowed to output a mixture of the hypotheses)? This question is closely related to the challenge of achieving the optimal approximation for improper hypothesis selection in finite time on real-valued domains. This talk is based on joint work by Anders Aamand, Maryam Aliakbarpour, Justin Y. Chen, and Sandeep Silwal.
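To make the setup concrete, here is a minimal sketch of the problem on a small discrete domain: we draw samples from an unknown P, form the empirical distribution, and pick the candidate hypothesis closest to it in total variation distance. This naive selector is only an illustration of the problem statement (the specific domain, hypotheses, and sample size are invented for the example); it is not one of the algorithms discussed in the talk, and it does not achieve their time or approximation guarantees.

```python
import random
from collections import Counter

def tv_distance(p, q, domain):
    """Total variation distance between two distributions on a finite domain."""
    return 0.5 * sum(abs(p.get(x, 0.0) - q.get(x, 0.0)) for x in domain)

domain = [0, 1, 2]

# Candidate hypotheses H_1, ..., H_n (here n = 3), each a distribution on {0, 1, 2}.
H = [
    {0: 0.8, 1: 0.1, 2: 0.1},
    {0: 0.1, 1: 0.8, 2: 0.1},
    {0: 1/3, 1: 1/3, 2: 1/3},
]

# Unknown distribution P; in this toy example it is close to H[1] in TV distance.
P = {0: 0.15, 1: 0.75, 2: 0.10}

# Draw i.i.d. samples from P and build the empirical distribution.
random.seed(0)
samples = random.choices(domain, weights=[P[x] for x in domain], k=5000)
counts = Counter(samples)
emp = {x: counts[x] / len(samples) for x in domain}

# Naive selection: output the hypothesis minimizing TV distance to the empirical distribution.
best = min(range(len(H)), key=lambda i: tv_distance(H[i], emp, domain))
print("selected hypothesis index:", best)
```

With a fixed seed and 5000 samples, the empirical distribution concentrates near P, so the selector picks H[1], whose TV distance to P is small compared to the other candidates.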

Cost
Free