The Statistics Seminar speaker for Wednesday, March 14, 2018 is Jelena Bradic, assistant professor in the department of mathematics at the University of California, San Diego. She directs the Statistical Lab for Learning Large-Scale and Complex Data. Her interests are in machine learning, high-dimensional statistics, ensemble learning, robust statistics, and survival analysis. Her application areas are in big-data analysis, time-to-event analysis, econometrics, and computational biology.
Talk: Minimax testing and adaptivity in high-dimensional models with non-sparse structures (high-dimensional learning without sparsity)
Abstract: In this paper, we focus on hypothesis testing and confidence interval construction in high-dimensional linear models. We develop new concepts of uniform and essentially uniform non-testability that allow the study of limitations of tests across a broad set of alternatives. Uniform non-testability identifies an extensive collection of alternatives such that the power of any test, against any alternative in this group, is asymptotically at most equal to the nominal size. Whereas minimax analysis proves the existence of one particularly “bad” alternative, we explicitly identify a large set of “bad” alternatives. Implications of the new constructions include new minimax testability results that are in sharp contrast to existing results and do not depend on the sparsity of the model parameters. We identify new tradeoffs between testability and feature correlation. In particular, we show that in models with weak feature correlations a minimax lower bound can be attained by a confidence interval whose width has the parametric rate regardless of the size of the model sparsity. We also discover that whenever the feature correlation is known, inference at the parametric rate is achievable, irrespective of how big that correlation is or how sparse the model is.
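The abstract's final claim, that known feature correlation enables parametric-rate inference with no sparsity assumption, can be illustrated with a toy simulation. This sketch is not from the paper; the setup (identity feature covariance, a simple moment estimator of one coefficient) is a hypothetical special case chosen for clarity.

```python
# Toy illustration (hypothetical, not the paper's method): with a known
# feature covariance (here Sigma = I), a simple moment estimator of a
# single coefficient attains the 1/sqrt(n) parametric rate even though
# the coefficient vector beta is fully dense (non-sparse) and p > n.
import numpy as np

rng = np.random.default_rng(0)
n, p = 500, 1000
beta = np.ones(p) / np.sqrt(p)           # dense coefficient vector, no sparsity
X = rng.standard_normal((n, p))          # features with known covariance Sigma = I
y = X @ beta + rng.standard_normal(n)    # linear model with unit noise variance

# When Sigma = I, E[x_{i1} * y_i] = beta_1, so averaging x_{i1} y_i gives
# an unbiased estimate of beta_1 regardless of how dense beta is.
scores = X[:, 0] * y
est = scores.mean()
se = scores.std(ddof=1) / np.sqrt(n)     # plug-in standard error, O(1/sqrt(n))
ci = (est - 1.96 * se, est + 1.96 * se)  # approximate 95% confidence interval

print(f"beta_1 = {beta[0]:.4f}, estimate = {est:.4f}, "
      f"CI = ({ci[0]:.4f}, {ci[1]:.4f})")
```

The interval width shrinks like 1/sqrt(n) even though p > n and every coordinate of beta is nonzero, which is the phenomenon the abstract describes; with unknown or strong feature correlation, the abstract's non-testability results show such guarantees can fail.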