Statistics Seminar

Characterizing the Type 1-Type 2 Error Trade-off for SLOPE by Cynthia Rush

Event Details

Date
Wednesday, September 7, 2022
Time
4-5 p.m.
Description

Date: September 7, 2022

Speaker: Cynthia Rush

Website: http://www.columbia.edu/~cgr2130/

Title: Characterizing the Type 1-Type 2 Error Trade-off for SLOPE

Abstract: Sorted L1 regularization has been incorporated into many methods for solving high-dimensional statistical estimation problems, including the SLOPE estimator in linear regression. In this talk, we study how this relatively new regularization technique improves variable selection by characterizing the optimal SLOPE trade-off between the false discovery proportion (FDP) and true positive proportion (TPP) or, equivalently, between measures of type I and type II error. Additionally, we show that on any problem instance, SLOPE with a certain regularization sequence outperforms the Lasso in the sense of having a smaller FDP, larger TPP, and smaller L2 estimation risk simultaneously. Our proofs are based on a novel technique that reduces a variational calculus problem to a class of infinite-dimensional convex optimization problems, together with a very recent result from approximate message passing (AMP) theory. Taking SLOPE as a particular example, we discuss these results in the context of a general program for systematically deriving, via AMP, exact expressions for the asymptotic risk of estimators that solve a broad class of convex optimization problems. Collaborators on this work include Zhiqi Bu, Jason Klusowski, and Weijie Su (https://arxiv.org/abs/1907.07502 and https://arxiv.org/abs/2105.13302), and Oliver Feng, Ramji Venkataramanan, and Richard Samworth (https://arxiv.org/abs/2105.02180).
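For orientation, the sketch below illustrates the two ingredients named in the abstract: the sorted L1 (SLOPE) penalty, which applies a non-increasing regularization sequence to the coefficients ranked by magnitude, and the FDP/TPP measures used to quantify type I and type II error in variable selection. This is a minimal illustrative example, not material from the talk; the function names, tolerance threshold, and toy data are assumptions made here for clarity.

```python
import numpy as np

def sorted_l1_penalty(beta, lam):
    """Sorted L1 (SLOPE) penalty: sum_i lam_i * |beta|_(i),
    where |beta|_(1) >= |beta|_(2) >= ... and lam is non-increasing."""
    abs_sorted = np.sort(np.abs(beta))[::-1]  # |beta| in decreasing order
    return float(np.dot(lam, abs_sorted))

def fdp_tpp(beta_hat, beta_true, tol=0.0):
    """False discovery proportion and true positive proportion of the
    support selected by beta_hat, relative to the true support."""
    selected = np.abs(beta_hat) > tol
    truly_nonzero = beta_true != 0
    fdp = (selected & ~truly_nonzero).sum() / max(selected.sum(), 1)
    tpp = (selected & truly_nonzero).sum() / max(truly_nonzero.sum(), 1)
    return fdp, tpp

# Toy example with p = 5 coefficients and a non-increasing lambda sequence.
rng = np.random.default_rng(0)
beta_true = np.array([3.0, 2.0, 0.0, 0.0, 0.0])
beta_hat = beta_true + 0.1 * rng.standard_normal(5)
lam = np.linspace(1.0, 0.2, 5)  # lambda_1 >= ... >= lambda_p
print(sorted_l1_penalty(beta_hat, lam))
print(fdp_tpp(beta_hat, beta_true, tol=0.5))
```

The Lasso corresponds to taking all entries of the lambda sequence equal; the talk's results concern how a genuinely decreasing sequence can simultaneously improve FDP, TPP, and L2 risk.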

Cost
Free
