Data processing and source identification using lower dimensional hidden structure play an essential role in many fields of application, including image processing, neural networks, genome studies, signal processing, and other areas where large datasets are often encountered. Representing a higher dimensional random vector in terms of a lower dimensional vector provides a statistical framework for the identification and separation of sources. One common method for source separation using lower dimensional structure is Independent Component Analysis (ICA), which is based on a linear representation of the observed data in terms of independent hidden sources. A distinguishing feature of ICA compared with other source separation methods is that the lower dimensional random variables are extracted as independent sources rather than merely uncorrelated variables (e.g., as in PCA). The problem thus involves estimating the linear mixing matrix and the densities of the independent latent sources. However, the solution depends on the identifiability of the sources. First, a set of sufficient conditions is established to resolve the identifiability of the sources using moment restrictions on the hidden source variables. Under these sufficient conditions, a semi-parametric maximum likelihood estimate of the mixing matrix and the source densities is derived. The consistency of the proposed estimate is established under additional mild regularity conditions. The proposed method is illustrated and compared with existing methods using simulated data scenarios as well as brain imaging datasets commonly used in practice.
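As background, the linear ICA model described above (observed data expressed as a mixing matrix applied to independent hidden sources) can be sketched with a textbook symmetric FastICA estimator. This is a standard ICA method for illustration only, not the semi-parametric maximum likelihood procedure of the talk, and all function and variable names are illustrative:

```python
import numpy as np

def fastica(X, n_iter=200, tol=1e-8, seed=0):
    """Textbook symmetric FastICA with a tanh nonlinearity (illustrative sketch).

    X : (n_samples, d) observed mixtures, assumed x = A s with independent,
        non-Gaussian sources s. Returns (estimated sources, unmixing matrix).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Xc = X - X.mean(axis=0)
    # Whiten the data: transform so the sample covariance is the identity
    cov = Xc.T @ Xc / n
    eigval, eigvec = np.linalg.eigh(cov)
    K = eigvec @ np.diag(eigval ** -0.5) @ eigvec.T   # symmetric whitening matrix
    Z = Xc @ K
    W = rng.standard_normal((d, d))
    for _ in range(n_iter):
        WZ = Z @ W.T                                  # projections onto current rows
        G = np.tanh(WZ)                               # contrast nonlinearity g
        G_prime = 1.0 - G ** 2                        # its derivative g'
        # Fixed-point update: w <- E[z g(w'z)] - E[g'(w'z)] w, row-wise
        W_new = (G.T @ Z) / n - np.diag(G_prime.mean(axis=0)) @ W
        # Symmetric decorrelation: W <- (W W')^{-1/2} W keeps rows orthonormal
        s_vals, u = np.linalg.eigh(W_new @ W_new.T)
        W_new = u @ np.diag(s_vals ** -0.5) @ u.T @ W_new
        converged = np.max(np.abs(np.abs(np.diag(W_new @ W.T)) - 1.0)) < tol
        W = W_new
        if converged:
            break
    # Sources are recovered only up to permutation and sign, reflecting the
    # identifiability issues that the talk's sufficient conditions address.
    return Z @ W.T, W @ K
```

A quick usage check is to mix two known non-Gaussian sources with a fixed matrix and verify that the recovered components match the true sources up to sign and permutation.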
(This is a joint work with Dr. Ani Eloyan, Johns Hopkins University)
More information about Sujit Ghosh may be found at http://www4.stat.ncsu.edu/~ghosh/