Longitudinal High-Dimensional Data Analysis
Vadim Zipunnikov
Tuesday, January 31, 2012 - 3:30pm

We introduce a flexible inferential framework for the longitudinal analysis of ultra-high-dimensional data. Typical examples of such data structures include, but are not limited to, observational studies that collect imaging data longitudinally on large cohorts of subjects. The approach decomposes the observed variability into three high-dimensional components: a subject-specific random intercept that quantifies the cross-sectional variability, a subject-specific slope that quantifies the dynamic irreversible deformation over multiple visits, and a subject-visit-specific imaging deviation that quantifies exchangeable or reversible visit-to-visit changes. The model can be viewed as the ultra-high-dimensional counterpart of the random intercept/random slope mixed effects model. The proposed inferential method is very fast, scales to studies with ultra-high-dimensional data, and can easily be adapted to and executed on modest computing infrastructures. The method is applied to the longitudinal analysis of diffusion tensor imaging (DTI) data of the corpus callosum of multiple sclerosis (MS) subjects. The study includes 176 subjects observed at a total of 466 visits. For each subject and visit, the study contains a registered DTI scan of the corpus callosum at roughly 30,000 voxels.
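The three-component decomposition described in the abstract can be sketched in the familiar random intercept/random slope form. The notation below is my own labeling of the quantities the abstract names, not the speaker's:

```latex
% Sketch of the decomposition, with assumed notation:
% Y_{ij}(v): observed image for subject i at visit j, voxel v
% T_{ij}: time of visit j for subject i
Y_{ij}(v) = \eta(v) + X_{i,0}(v) + X_{i,1}(v)\,T_{ij} + W_{ij}(v)
```

Here \(\eta(v)\) is a population mean image, \(X_{i,0}(v)\) is the subject-specific random intercept (cross-sectional variability), \(X_{i,1}(v)\) is the subject-specific random slope (irreversible deformation over visits), and \(W_{ij}(v)\) is the subject-visit-specific deviation (exchangeable, reversible visit-to-visit changes). With roughly 30,000 voxels, each of these components is itself ultra-high-dimensional, which is what distinguishes this setting from the classical scalar mixed effects model.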