Statistics Seminar

Spring 2019

Seminars are held from 4:00 p.m. – 5:00 p.m. in Griffin-Floyd 100 unless otherwise noted. Refreshments are available before the seminars from 3:30 p.m. – 4:00 p.m. in Griffin-Floyd Hall 103.

Date    Speaker                                 Title
Mar 14  Lingzhou Xue (Penn State University)    Fisher’s Combined Probability Tests for Complex, High Dimensional Data
Mar 21  Anindya Bhadra (Purdue University)      Horseshoe Regularization for Prediction and Inverse Covariance Estimation
Apr 16  Joseph W Hogan (Brown University)       TBD

Abstracts


Fisher’s Combined Probability Tests for Complex, High Dimensional Data
Lingzhou Xue, Penn State University


In the past decade, two families of test statistics have been widely used for high-dimensional hypothesis tests: 1) extreme-value form statistics to test against sparse alternatives, and 2) quadratic form statistics to test against dense alternatives with small disturbances. However, quadratic form statistics suffer from low power against sparse alternatives, and extreme-value form statistics suffer from low power against dense alternatives. For many real-world applications, it is important to develop power enhancement testing procedures. In this talk, we provide a new perspective by studying the asymptotic joint distribution of quadratic form statistics and extreme-value form statistics. Based on their explicit joint limiting laws, we follow the philosophy of Fisher’s method to develop power enhancement tests for high-dimensional means, banded covariances, spiked covariances, and multi-factor pricing models. We prove that Fisher’s combined probability tests boost the power against more general alternatives while retaining the correct asymptotic size. We demonstrate the finite-sample performance of the proposed testing procedures in both simulation studies and real applications.
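The Fisher combination underlying the talk can be sketched as follows. This is the classical version, which assumes the combined p-values are independent; the talk's contribution is the explicit joint limiting law that justifies combining these particular statistics. The function name and the two input p-values below are illustrative, not from the talk.

```python
import numpy as np
from scipy import stats

def fisher_combine(pvalues):
    """Fisher's combined probability test.

    Under the null, with k independent p-values,
    T = -2 * sum(log p_i) follows a chi-squared
    distribution with 2k degrees of freedom.
    """
    pvalues = np.asarray(pvalues, dtype=float)
    stat = -2.0 * np.sum(np.log(pvalues))
    combined_p = stats.chi2.sf(stat, df=2 * len(pvalues))
    return stat, combined_p

# Illustrative only: combine a p-value from a quadratic-form (dense)
# test with one from an extreme-value (sparse) test.
stat, p = fisher_combine([0.04, 0.20])
```

Note that neither individual p-value is below 0.05 here, yet the combined test rejects at that level; this pooling of evidence across the two test types is the power-enhancement idea the abstract describes.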

Horseshoe Regularization for Prediction and Inverse Covariance Estimation
Anindya Bhadra, Purdue University

Since the advent of the horseshoe priors for regularization, global-local shrinkage methods have proved to be a fertile ground for the development of Bayesian theory and methodology in high-dimensional problems. They have achieved remarkable success in computation and enjoy strong theoretical support. Much of the existing literature, however, has focused on estimation and variable selection results in the sparse normal means model. The purpose of the current talk is to demonstrate that the horseshoe priors are useful more broadly, by considering two different directions. First, we consider finite sample (finite n, finite p > n) prediction risk results in linear regression and explicitly point out under what circumstances horseshoe regression can be expected to outperform global shrinkage methods, such as ridge regression, in prediction. Second, we develop a multivariate extension of the horseshoe for estimating the precision matrix in graphical models, which we term the graphical horseshoe estimator.
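As a minimal sketch of the global-local hierarchy the abstract refers to (not the speaker's code), the horseshoe prior draws each coefficient as beta_j | lambda_j, tau ~ N(0, lambda_j^2 tau^2), with local scales lambda_j ~ half-Cauchy(0, 1) and a global scale tau. Holding tau fixed and the function name are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def horseshoe_draws(p, tau=1.0, size=10000):
    """Sample coefficient vectors from the horseshoe prior.

    beta_j | lambda_j, tau ~ N(0, lambda_j^2 * tau^2)
    lambda_j ~ half-Cauchy(0, 1)   (local shrinkage)
    tau is the global shrinkage scale, held fixed here
    for illustration rather than given its own prior.
    """
    # |standard Cauchy| is a half-Cauchy(0, 1) draw
    lam = np.abs(rng.standard_cauchy((size, p)))
    beta = rng.standard_normal((size, p)) * lam * tau
    return beta

draws = horseshoe_draws(p=5)
```

The heavy-tailed half-Cauchy local scales let a few coefficients escape shrinkage while the bulk are pulled hard toward zero, which is the "global-local" behavior that distinguishes the horseshoe from a single-scale (global) method such as ridge.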