Edgeworth expansions for network moments
Presented by Yuan Zhang, Assistant Professor, Department of Statistics, The Ohio State University
Abstract: The network method of moments is an important tool for nonparametric inference on relational data, but a long-standing open challenge is fast and accurate approximation of the sampling distributions of network moments. In this paper, we present the first result with provable higher-order accuracy. In sharp contrast to the classical setting of noiseless U-statistics, we discover, surprisingly, that in the network setting two typically unwelcome factors -- sparsity and observational errors -- can jointly contribute a beneficial "self-smoothing" effect that restores the validity of Edgeworth expansions under much weaker assumptions. For practitioners, our easy-to-implement empirical method is faster and more accurate than other state-of-the-art methods. It is also versatile, making no substantial assumption on network structure apart from exchangeability and (conditionally) independent edge generation.
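For readers unfamiliar with the classical tool behind the talk, the following is a minimal generic sketch of a one-term Edgeworth expansion for the standardized mean of i.i.d. draws; it is not the paper's network-specific expansion, and the function name and the exponential-sample illustration are this sketch's own choices. The point it demonstrates is the O(1/sqrt(n)) skewness correction that a plain normal approximation discards.

```python
import numpy as np
from math import sqrt, pi, exp, erf

def edgeworth_cdf(x, gamma, n):
    """One-term Edgeworth approximation to the CDF of the standardized
    mean of n i.i.d. draws with skewness gamma:

        P(sqrt(n) * (Xbar - mu) / sigma <= x)
            ~= Phi(x) - phi(x) * gamma * (x**2 - 1) / (6 * sqrt(n))

    The correction term is the O(1/sqrt(n)) skewness adjustment that
    the plain CLT normal approximation throws away.
    """
    phi = exp(-0.5 * x * x) / sqrt(2.0 * pi)   # standard normal density
    Phi = 0.5 * (1.0 + erf(x / sqrt(2.0)))     # standard normal CDF
    return Phi - phi * gamma * (x * x - 1.0) / (6.0 * sqrt(n))

# Illustration: standardized mean of n = 10 Exp(1) draws (skewness 2),
# compared with a Monte Carlo estimate of the true CDF at x0 = 0.5.
rng = np.random.default_rng(0)
n, gamma, reps = 10, 2.0, 200_000
xbar = rng.exponential(1.0, size=(reps, n)).mean(axis=1)
s = sqrt(n) * (xbar - 1.0)                     # mu = sigma = 1 for Exp(1)
x0 = 0.5
empirical = (s <= x0).mean()
normal = 0.5 * (1.0 + erf(x0 / sqrt(2.0)))     # CLT approximation
edgeworth = edgeworth_cdf(x0, gamma, n)
```

For this skewed example the Edgeworth value lands much closer to the Monte Carlo CDF than the normal approximation does, which is the kind of higher-order accuracy the talk's results establish in the network setting.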
We showcase several applications of our results to inference on network moments: 1. providing the first proof that some popular network bootstrap schemes enjoy higher-order accuracy; 2. explicitly formulating Cornish-Fisher confidence intervals and one-sample tests, both with accurate level control. If time permits, I will also discuss an application to the two-sample network method of moments.
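To illustrate the Cornish-Fisher idea in its simplest classical form (not the paper's network-moment construction), here is a sketch of a first-order skewness-corrected 95% interval for a mean; the function name and the toy edge-density example are this sketch's own, and the toy treats upper-triangle entries as conditionally independent, as in the abstract's edge-generation assumption.

```python
import numpy as np
from math import sqrt

Z975 = 1.959964  # standard normal 0.975 quantile

def cornish_fisher_ci(x, z=Z975):
    """First-order Cornish-Fisher 95% interval for the mean of x.

    Skewness-corrected quantile:  q_p = z_p + (z_p**2 - 1) * g / 6,
    where g is the estimated skewness of the standardized mean
    (sample skewness divided by sqrt(n)).
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    m, sd = x.mean(), x.std(ddof=1)
    se = sd / sqrt(n)
    g = np.mean((x - m) ** 3) / sd**3 / sqrt(n)
    q_hi = z + (z * z - 1.0) * g / 6.0       # corrected upper quantile
    q_lo = -z + (z * z - 1.0) * g / 6.0      # corrected lower quantile
    return m - se * q_hi, m - se * q_lo

# Toy use: interval for the edge density of one sparse random graph.
rng = np.random.default_rng(1)
n_nodes, p = 200, 0.05
A = (rng.random((n_nodes, n_nodes)) < p).astype(int)
A = np.triu(A, 1)
A = A + A.T                                  # symmetric adjacency matrix
edges = A[np.triu_indices(n_nodes, 1)]       # upper-triangle indicators
lo, hi = cornish_fisher_ci(edges)
```

Unlike a symmetric normal interval, the Cornish-Fisher interval shifts its endpoints to account for skewness, which is what delivers the accurate level control mentioned above.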