Ou Zhao

Central Limit Theorem -- as I know it

In this talk I will quickly survey some recent progress on (conditional) central limit questions for stationary processes. I will focus on one recent Markov chain example, whose analysis leads to an interesting phenomenon that we apparently do not yet understand well. In vague terms, the partial sums of this Markov chain have variance growing faster than n, which makes the conditional CLT challenging to prove. When we did manage to show a CLT, we found a `mass escaping' phenomenon in the limiting variance.
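One possible reading of the `mass escaping' phenomenon, in schematic form (my notation, not the speaker's):

```latex
% Schematic only: S_n is the partial sum of the chain and Var(S_n) grows
% faster than n; "mass escaping" suggests the normalized sums still
% converge, but to a variance strictly below 1:
\[
  \frac{S_n}{\sqrt{\operatorname{Var}(S_n)}} \;\xrightarrow{d}\; \mathcal{N}(0,\sigma^2),
  \qquad \sigma^2 < 1 .
\]
```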

Thursday, September 30, 2010 - 3:30pm

Thomas A. Louis

Why Bother With Bayes?

The use of Bayesian designs and analyses in biomedical and many other applications has burgeoned, even though its use entails additional overhead. Consequently, it is evident that statisticians and collaborators are increasingly finding the approach worth the bother. To help explain this increase in prevalence, I highlight a subset of the potential advantages of the Bayesian formalism and Bayesian philosophy in study design (“Everyone is a Bayesian in the design phase”), conduct, analysis and reporting.

Thursday, October 14, 2010 - 3:30pm

Greg Rempala

Statistical Inference for Markov Jump Processes and Stochastic Epidemic Models

The theory of Markov jump processes has broad applications in molecular biology and population dynamics modeling. One of the most important practical aspects of analyzing models based on Markov jump processes under so-called "mass-action" kinetics is inference on the reaction rate constants. The presentation will describe conditions on the data process and the underlying likelihood function that guarantee identifiability of the process parameters and consistency of the maximum likelihood estimates.
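As a toy illustration of such inference (my sketch, not the speaker's method), consider a single mass-action reaction A -> 0 observed completely on [0, T]; the rate-constant MLE then has a closed form, the event count divided by the time-integrated copy number.

```python
# Toy sketch (my illustration, not the speaker's method): for a pure-death
# reaction A -> 0 with propensity c * A(t), observed completely on [0, T],
# the likelihood in c is maximized at
#   c_hat = (# reaction events) / integral_0^T A(t) dt.
import numpy as np

rng = np.random.default_rng(0)

def simulate_death(c, a0, T):
    """Gillespie simulation; returns the event count and integrated copy number."""
    t, a, n_events, integral = 0.0, a0, 0, 0.0
    while a > 0:
        wait = rng.exponential(1.0 / (c * a))
        if t + wait > T:
            integral += (T - t) * a          # propensity stays active until T
            return n_events, integral
        t += wait
        integral += wait * a
        a -= 1
        n_events += 1
    return n_events, integral                # population extinct before T

n_events, integral = simulate_death(c=0.5, a0=200, T=10.0)
print("MLE of c:", n_events / integral)      # should be close to 0.5
```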

Thursday, September 23, 2010 - 3:30pm

Dennis D. Boos

Variable Selection in Second-Order Models using the False Selection Rate (FSR) Approach

Variable selection of main effects is often a first step of model building, followed by consideration of interactions and nonlinearities. We consider selection of second-order models under various hierarchy restrictions between main effects and second-order terms (squares and interactions) using the FSR approach of Wu et al. (2007, JASA) and Boos et al. (2009, Biometrics). The basic idea is to control the proportion of uninformative variables in the final model.
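A hedged sketch of the pseudo-variable idea behind FSR (my simplification; the greedy `forward_select`, the sample sizes, and the phony-variable counts are all illustrative): append random "phony" predictors, run selection, and use the entry rate of the phony columns to estimate the false selection rate.

```python
# Hedged sketch of the FSR pseudo-variable idea (my simplification of the
# Wu et al. / Boos et al. approach): the entry proportion of known-useless
# phony columns roughly estimates the false selection rate.
import numpy as np

rng = np.random.default_rng(1)
n, p_real, p_phony = 100, 10, 10
X = rng.standard_normal((n, p_real))
y = X[:, 0] - 2 * X[:, 1] + rng.standard_normal(n)   # only columns 0, 1 informative

X_aug = np.hstack([X, rng.standard_normal((n, p_phony))])  # append phony columns

def forward_select(X, y, k):
    """Greedy forward selection of k columns by correlation with the residual."""
    selected = []
    resid = y - y.mean()
    for _ in range(k):
        candidates = [j for j in range(X.shape[1]) if j not in selected]
        scores = [abs(X[:, j] @ resid) / np.linalg.norm(X[:, j]) for j in candidates]
        selected.append(candidates[int(np.argmax(scores))])
        beta, *_ = np.linalg.lstsq(X[:, selected], y, rcond=None)
        resid = y - X[:, selected] @ beta
    return selected

chosen = forward_select(X_aug, y, k=6)
phony_rate = np.mean([j >= p_real for j in chosen])   # entry rate of phony columns
print("selected:", chosen, "| estimated false selection rate:", phony_rate)
```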

Thursday, September 16, 2010 - 3:30pm

Hui Zou

Non-concave Penalized Composite Likelihood Estimation of Sparse Ising Models

The Ising model is a useful tool for studying complex interactions within a system. Estimating such a model, however, is challenging, especially in the presence of high-dimensional parameters. In this work, we propose efficient procedures for learning a sparse Ising model based on a penalized composite likelihood with non-concave penalties.
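A hedged sketch of a closely related, simpler estimator (nodewise L1-penalized logistic pseudolikelihood, not the paper's non-concave composite-likelihood procedure; the spin data here are synthetic stand-ins):

```python
# Hedged sketch: regress each spin on all others with an L1-penalized
# logistic regression and read the sparse graph off the nonzero
# coefficients.  The paper instead penalizes a composite likelihood with
# non-concave penalties (e.g. SCAD); this is the simpler L1 cousin.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n, d = 500, 8
X = rng.choice([-1, 1], size=(n, d))          # stand-in for Ising samples

edges = set()
for j in range(d):
    others = np.delete(np.arange(d), j)
    fit = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
    fit.fit(X[:, others], X[:, j])
    for k, coef in zip(others, fit.coef_.ravel()):
        if abs(coef) > 1e-8:
            edges.add(tuple(sorted((j, k))))  # "OR" rule to symmetrize
print("estimated edges:", sorted(edges))
```

In the actual procedure a non-concave penalty such as SCAD would replace the L1 penalty, reducing the bias that L1 induces on large coefficients.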

Thursday, September 2, 2010 - 3:30pm

Yongdai Kim

Seoul National University, South Korea

Sparse Regression with Incentive

Sparse regularization methods for high-dimensional regression have received much attention recently as an alternative to subset selection methods. Examples include the lasso (Tibshirani, 1996), bridge regression (Frank and Friedman, 1993), and the SCAD (Fan and Li, 2001), to name just a few. An advantage of sparse regularization methods is that they give stable estimators with automatic variable selection, and hence the resulting estimators perform well in prediction. Sparse regularization methods also have many desirable properties when the true model is sparse.
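A minimal coordinate-descent sketch of the first example, the lasso (standard algorithm, not from the talk; all dimensions and the penalty level are illustrative):

```python
# Minimal lasso via cyclic coordinate descent: each coordinate update is a
# closed-form soft-thresholding step, which is what produces exact zeros.
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Minimize (1/2n)||y - Xb||^2 + lam * ||b||_1 by cyclic coordinate descent."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            resid_j = y - X @ beta + X[:, j] * beta[j]   # partial residual
            rho = X[:, j] @ resid_j / n
            beta[j] = soft_threshold(rho, lam) / col_sq[j]
    return beta

rng = np.random.default_rng(3)
X = rng.standard_normal((100, 20))
y = 3 * X[:, 0] - 2 * X[:, 5] + rng.standard_normal(100)
print(np.round(lasso_cd(X, y, lam=0.1), 2))  # sparse: most coefficients exactly 0
```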

Thursday, August 26, 2010 - 3:30pm

Pritam Ranjan

Tikhonov Regularization for Emulating Deterministic Computer Simulators

For many expensive computer simulators, the outputs are deterministic, and thus the desired statistical surrogate (emulator) is an interpolator of the observed data. Gaussian spatial process (GP) models are commonly used for such simulator outputs. Fitting a GP model to n data points requires numerous inversions of an n x n correlation matrix R, which becomes computationally unstable when R is near-singular. The popular approach to overcoming near-singularity introduces over-smoothing of the data.
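A hedged numerical sketch of the instability and of the popular nugget fix the abstract alludes to (all values illustrative; the talk's Tikhonov-regularization proposal itself is not reproduced here):

```python
# Hedged sketch (not the speaker's proposal): the Gaussian correlation
# matrix R of close-together design points is near-singular, and the popular
# fix adds a small nugget delta to the diagonal, a Tikhonov-type regularized
# solve that stabilizes the inversion at the cost of some smoothing.
import numpy as np

rng = np.random.default_rng(4)
x = np.sort(rng.uniform(0.0, 1.0, 30))                 # design points
theta = 50.0
R = np.exp(-theta * (x[:, None] - x[None, :]) ** 2)    # Gaussian correlation
print("condition number of R:", np.linalg.cond(R))     # typically enormous

y = np.sin(2 * np.pi * x)                              # deterministic output
delta = 1e-8
weights = np.linalg.solve(R + delta * np.eye(len(x)), y)

x_new = 0.5                                            # predict at a new input
r_new = np.exp(-theta * (x - x_new) ** 2)
print("prediction at 0.5:", r_new @ weights)           # truth: sin(pi) = 0
```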

Thursday, August 19, 2010 - 3:30pm

Yichuan Zhao

Empirical Likelihood Intervals for the Difference of Two Quantile Functions with Right Censoring

In this talk, we study two independent samples under right censoring. Using a smoothed empirical likelihood method, we investigate the difference of quantiles between the two samples and construct pointwise confidence intervals for it. An empirical log-likelihood ratio is proposed, and its asymptotic distribution is shown to be a standard chi-squared distribution. In simulation studies, we compare the empirical likelihood method with the normal approximation method in terms of coverage accuracy and average length of confidence intervals.
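Schematically, the standard empirical-likelihood calibration behind such intervals (my notation):

```latex
% With R(theta) the smoothed empirical likelihood ratio for the quantile
% difference theta at a fixed probability p, the chi-squared limit
\[
  -2\log R(\theta_0) \;\xrightarrow{d}\; \chi^2_1
\]
% yields the pointwise (1 - alpha) confidence interval as a level set:
\[
  \bigl\{\, \theta : -2\log R(\theta) \le \chi^2_{1,\,1-\alpha} \,\bigr\}.
\]
```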

Thursday, October 8, 2009 - 3:30pm

Ian Dryden

University of South Carolina

Non-Euclidean statistics for covariance matrices, with applications to diffusion tensor imaging

The statistical analysis of covariance matrices occurs in many important applications, e.g. in diffusion tensor imaging or longitudinal data analysis. Methodology is discussed for estimating covariance matrices which takes into account the non-Euclidean nature of the space of positive semi-definite symmetric matrices. We make connections with the use of Procrustes methods in shape analysis, and comparisons are made with other estimation techniques, including using the matrix logarithm, Riemannian metric, matrix square root and Cholesky decomposition.
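A hedged sketch of one of the estimators mentioned, the mean based on the matrix logarithm (the log-Euclidean mean; numpy-only, with synthetic stand-in data):

```python
# Hedged sketch: the log-Euclidean mean of covariance matrices averages the
# matrix logarithms and maps back with the matrix exponential, respecting
# the non-Euclidean geometry of the positive definite cone.
import numpy as np

rng = np.random.default_rng(5)

def spd_log(S):
    w, V = np.linalg.eigh(S)                  # S symmetric positive definite
    return V @ np.diag(np.log(w)) @ V.T

def spd_exp(L):
    w, V = np.linalg.eigh(L)
    return V @ np.diag(np.exp(w)) @ V.T

def random_spd(d):
    A = rng.standard_normal((d, d))
    return A @ A.T + d * np.eye(d)

mats = [random_spd(3) for _ in range(10)]
log_euclidean_mean = spd_exp(np.mean([spd_log(S) for S in mats], axis=0))
euclidean_mean = np.mean(mats, axis=0)        # ordinary average, for contrast
print(np.round(log_euclidean_mean, 2))
print(np.round(euclidean_mean, 2))
```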

Thursday, October 1, 2009 - 3:30pm
