

Yuanzhe Xi

Department of Mathematics
Emory University
Caldwell Building Room 204

Accelerating Gaussian process regressions with preconditioning

Kernel matrices help handle nonlinearities in the data in many machine learning applications. The entries of a kernel matrix are the values of a kernel function at all pairs of points from a given dataset. Since the spectrum of the kernel matrix associated with the same dataset can vary dramatically as the parameters of the kernel function change, developing solution schemes that are robust across a wide range of parameters is challenging. In this talk we focus on Gaussian kernel functions and propose an adaptive structured block preconditioning technique. The proposed method is built on the Nyström approximation to the Gaussian kernel matrix. We show that the condition number of the preconditioned matrix decreases as the Nyström approximation accuracy increases. We also show the relation between the geometry of the landmark points used in the Nyström approximation and the resulting approximation error. Based on this relation, we design an efficient rank estimation algorithm that automatically matches the selected number of landmark points to the numerical rank of the kernel matrix. Experiments on various synthetic and real datasets, ranging from low to high dimensions, verify the effectiveness and robustness of the proposed method.
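The talk's adaptive structured block preconditioner is not spelled out in this abstract, so the following is only a minimal sketch of the general idea it builds on: form a low-rank Nyström approximation K ≈ L Lᵀ from a set of landmark points, and apply its regularized inverse (via the Woodbury identity) as a preconditioner inside conjugate gradients for the system (K + σ²I)x = b. All function names, parameter values, and the random landmark selection below are illustrative assumptions, not the speaker's method.

```python
import numpy as np

def gaussian_kernel(X, Y, gamma=1.0):
    """Gaussian kernel matrix: K[i, j] = exp(-gamma * ||x_i - y_j||^2)."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def nystrom_prec(X, landmarks, gamma, sigma2):
    """Return a function applying M^{-1}, where M = L L^T + sigma2*I and
    L L^T = K_nm K_mm^+ K_mn is the Nystrom approximation to K.
    Woodbury: M^{-1} v = (v - L (sigma2*I + L^T L)^{-1} L^T v) / sigma2.
    """
    Knm = gaussian_kernel(X, landmarks, gamma)          # n x m
    Kmm = gaussian_kernel(landmarks, landmarks, gamma)  # m x m
    w, V = np.linalg.eigh(Kmm)
    keep = w > 1e-10 * w.max()                # drop tiny eigenvalues for stability
    L = Knm @ (V[:, keep] / np.sqrt(w[keep])) # n x r factor of the Nystrom approximation
    C = np.linalg.cholesky(sigma2 * np.eye(L.shape[1]) + L.T @ L)
    def apply(v):
        t = np.linalg.solve(C.T, np.linalg.solve(C, L.T @ v))
        return (v - L @ t) / sigma2
    return apply

def pcg(A, b, Minv, tol=1e-8, maxiter=500):
    """Preconditioned conjugate gradient for a symmetric positive definite A."""
    x = np.zeros_like(b)
    r = b.copy()
    z = Minv(r)
    p = z.copy()
    rz = r @ z
    for it in range(maxiter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            return x, it + 1
        z = Minv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, maxiter

# Illustrative setup: solve (K + sigma2*I) x = b with and without the preconditioner.
rng = np.random.default_rng(0)
X = rng.standard_normal((400, 3))
gamma, sigma2 = 0.5, 1e-2
A = gaussian_kernel(X, X, gamma) + sigma2 * np.eye(400)
b = rng.standard_normal(400)

landmarks = X[rng.choice(400, size=100, replace=False)]  # naive uniform sampling
Minv = nystrom_prec(X, landmarks, gamma, sigma2)
x_prec, it_prec = pcg(A, b, Minv)
x_plain, it_plain = pcg(A, b, lambda v: v)  # identity preconditioner for comparison
```

As the abstract notes, the quality of this preconditioner hinges on how well the landmark points capture the numerical rank of the kernel matrix; the uniform sampling above stands in for the talk's adaptive landmark/rank selection.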


Dr. Yuanzhe Xi is currently an assistant professor in the Department of Mathematics at Emory University. He received his Ph.D. in Mathematics from Purdue University under the guidance of Dr. Jianlin Xia and worked as a postdoc in the Department of Computer Science and Engineering at the University of Minnesota under the guidance of Dr. Yousef Saad. His research interests lie primarily in numerical linear algebra, scientific computing, and data science. Recently, he has been working on fast algorithms for kernel methods (Gaussian processes, kernel ridge regression) in data science and on nonlinear acceleration methods for tensor decomposition, deep generative models, and neural networks.
