We present a Bayesian approach for modeling multivariate, dependent functional data. To account for the three dominant structural features in the data (functional, time-dependent, and multivariate components), we extend hierarchical dynamic linear models for multivariate time series to the functional data setting. We also develop Bayesian spline theory in a more general constrained optimization framework.
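As a point of reference for the dynamic linear model building block (a minimal univariate sketch, not the multivariate functional extension described in the abstract), a forward Kalman filter for the local-level DLM might look like:

```python
import numpy as np

def local_level_filter(y, sigma_v2, sigma_w2, m0=0.0, c0=1e6):
    """Forward (Kalman) filter for the local-level DLM:
       y_t = mu_t + v_t,  v_t ~ N(0, sigma_v2)
       mu_t = mu_{t-1} + w_t,  w_t ~ N(0, sigma_w2)."""
    m, c = m0, c0
    means = []
    for yt in y:
        r = c + sigma_w2            # prior variance of mu_t
        q = r + sigma_v2            # one-step forecast variance of y_t
        k = r / q                   # Kalman gain
        m = m + k * (yt - m)        # filtered mean of mu_t
        c = (1 - k) * r             # filtered variance of mu_t
        means.append(m)
    return np.array(means)

y = np.full(50, 5.0)               # constant synthetic series
means = local_level_filter(y, sigma_v2=1.0, sigma_w2=0.1)
print(means[-1])                   # converges toward the data level
```

The function names and the diffuse prior `c0=1e6` are illustrative choices, not from the paper.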
The Statistics Department hosts weekly colloquia on a variety of statistical subjects, bringing in speakers from around the world.
We propose a learning algorithm for a class of random field models of natural image patterns, where the energy functions of the random fields are in the form of linear combinations of rectified filter responses from subsets of wavelets selected from a given over-complete dictionary. The algorithm consists of the following two components. (1) We propose to induce the wavelets into the random field model by a generative version of the epsilon-boosting algorithm.
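The energy functions described above can be sketched concretely. The following is a minimal illustration (with a hypothetical random dictionary standing in for wavelets, and ReLU as the rectification), showing an energy defined as a linear combination of rectified filter responses:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical over-complete dictionary of K filters for flattened 8x8 patches
K, D = 32, 64
filters = rng.standard_normal((K, D))
weights = np.zeros(K)      # boosting would gradually induce nonzero weights

def energy(patch, weights, filters):
    """Energy = sum_k w_k * max(0, <F_k, patch>): a linear combination
    of rectified filter responses."""
    responses = np.maximum(filters @ patch, 0.0)   # rectification
    return weights @ responses

patch = rng.standard_normal(D)
print(energy(patch, weights, filters))  # 0.0 before any filter is induced
```

The epsilon-boosting step itself (selecting which wavelet to induce next) is not shown; this only fixes the form of the energy that the selection operates on.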
Understanding the complex dynamics of Earth's climate system is a grand scientific challenge. Projecting climate 50 or 100 years into the future is complicated, however, by the fact that the behavior of the Earth system over such time scales is not well characterized by the modern instrumental record, which extends back only about 100-150 years with global coverage.
We propose a nonparametric estimator of the dynamics of monotonically increasing or decreasing trajectories defined on a finite time interval. Such trajectories can be described as solutions of autonomous ODEs. Under suitable regularity conditions, we derive the optimal rate of convergence for the proposed estimator and show that it is the same as that for estimating the derivative of a trajectory. We also show that commonly used two-stage estimation schemes are typically inefficient.
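The two-stage scheme mentioned above can be sketched on synthetic data (a toy illustration with a logistic trajectory and a quadratic fit standing in for a nonparametric smoother, not the proposed estimator): stage one estimates the derivative along the trajectory, and stage two recovers the autonomous dynamics f in dx/dt = f(x) by regressing the derivative on the level.

```python
import numpy as np

# Synthetic monotone trajectory: logistic growth, dx/dt = x(1 - x)
t = np.linspace(0.0, 5.0, 200)
x = 0.1 * np.exp(t) / (1 + 0.1 * (np.exp(t) - 1))

# Stage 1: estimate the derivative from the observed trajectory
dx = np.gradient(x, t)

# Stage 2: recover f in dx/dt = f(x) by smoothing dx against x
# (a quadratic least-squares fit as a stand-in for a kernel smoother)
coefs = np.polyfit(x, dx, deg=2)
f_hat = np.poly1d(coefs)

print(f_hat(0.5))   # true value is f(0.5) = 0.5 * (1 - 0.5) = 0.25
```

The abstract's point is that this kind of plug-in pipeline, while natural, is typically inefficient relative to the proposed estimator.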
We propose a general theory and estimation procedures for nonlinear sufficient dimension reduction where the predictor, the response, or both are random functions. The relation between the response and predictor can be arbitrary, and the sets of observed time points can vary from subject to subject. The functional and nonlinear nature of the problem leads naturally to two levels of functional spaces: the first consisting of functions of time, and the second consisting of functions defined on the first space.
Much of forensic laboratory work is based on comparison of evidence from a crime scene with analogous material associated with a suspect.
Complex diseases such as cancer often show heterogeneous responses to treatment, which has attracted much interest in developing individualized treatment rules that tailor therapy to an individual patient according to patient-specific characteristics. In this talk, we discuss how to use Bayesian neural networks to achieve this goal, including how to select disease-related features.
First, I will review the lasso method and show an example of its utility in cancer diagnosis via mass spectrometry. Then I will consider testing the significance of the terms in a regression fitted via the lasso or forward stepwise selection. I will present a novel statistical framework for this problem, one that provides p-values and confidence intervals that properly account for the inherent selection in the fitting procedure. I will give other examples of this procedure, including graphical models and PCA, and describe an R language package for its computation.
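For the first part of the talk, a minimal self-contained lasso fit illustrates the selection step whose randomness the selective-inference framework must account for (coordinate descent on synthetic data; the variable names and penalty level are illustrative, and the talk's p-value machinery is not reproduced here):

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent lasso: min (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).mean(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]          # partial residual
            rho = X[:, j] @ r / n
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return b

rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]                    # sparse truth
y = X @ beta_true + 0.3 * rng.standard_normal(n)

b = lasso_cd(X, y, lam=0.1)
print(np.flatnonzero(np.abs(b) > 1e-8))             # selected variables
```

Naive p-values computed by refitting on the selected variables ignore this data-dependent selection; the framework described in the talk is designed to correct exactly that.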
Statistics has played a key role in the development and validation of forensic methods, as well as in the inferences (conclusions) obtained from forensic evidence. Further, statisticians have been important contributors to many areas of science, such as chemistry (chemometrics), biology (genomics), medicine (clinical trials), and agriculture (crop yield), leading to valuable advances that extend to multiple fields (spectral analysis, penalized regression, sequential analysis, experimental design). The involvement of statistics specifically in forensic science has demonstrated its value in
William Brenneman is a Research Fellow at Procter & Gamble in the Global Statistics and Data Management Department and an Adjunct Professor at Georgia Tech in the Industrial and Systems Engineering Department. Since joining P&G in 2000, he has worked on a wide range of projects that deal with statistics applications in his areas of expertise: design and analysis of experiments, robust parameter design, reliability engineering, statistical process control, computer experiments, and general statistical thinking. He was also instrumental in the development of an in-house statistics c