Latent variable models lay the statistical foundation for data science problems with unstructured, incomplete and heterogeneous information. Spectral methods extract low-dimensional geometric structures for downstream tasks in a computationally efficient way. Despite their conceptual simplicity and wide applicability, theoretical understanding lags far behind, which hinders the development of principled approaches. In this talk, I will first discuss the bias and variance of PCA, and apply the results to distributed estimation of principal eigenspaces. Then I will present an $\ell_p$ theory of eigenvector analysis that yields optimal recovery guarantees for spectral methods in many challenging problems. The results find applications in dimensionality reduction, mixture models, network analysis, recommendation systems, ranking and beyond.
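The distributed estimation of principal eigenspaces mentioned above can be illustrated by a minimal one-shot averaging sketch: each machine computes the top-$k$ eigenspace of its local sample covariance, and a center averages the resulting projection matrices and re-extracts $k$ directions. The data-generating setup, dimensions, and signal strength below are illustrative assumptions, not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy setup: n samples in R^d with a low-dimensional "spike",
# so the top-k population eigenspace U is well defined.
n, d, k, n_machines = 3000, 20, 2, 5
U = np.linalg.qr(rng.standard_normal((d, k)))[0]          # true eigenspace
X = rng.standard_normal((n, k)) * 5.0 @ U.T + rng.standard_normal((n, d))

def top_eigenspace(S, k):
    """Top-k eigenvectors (as columns) of a symmetric matrix S."""
    vals, vecs = np.linalg.eigh(S)                        # ascending order
    return vecs[:, np.argsort(vals)[::-1][:k]]

# One-shot distributed PCA: average the local projectors U_l U_l^T,
# then take the top-k eigenspace of the average.
P_bar = np.zeros((d, d))
for X_l in np.array_split(X, n_machines):
    U_l = top_eigenspace(X_l.T @ X_l / len(X_l), k)
    P_bar += U_l @ U_l.T / n_machines
U_hat = top_eigenspace(P_bar, k)

# Subspace error: spectral norm of the difference of projectors.
err = np.linalg.norm(U_hat @ U_hat.T - U @ U.T, 2)
print(round(err, 3))
```

Averaging projectors rather than raw eigenvectors sidesteps the sign and rotation ambiguity of individual eigenvector estimates, which is why the error is measured between projection matrices.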
20 Mar 2020
9:30am - 10:30am
Where
https://hkust.zoom.com.cn/j/5616960008
Speakers/Performers
Dr. Kaizheng WANG
Princeton University
Organizer(S)
Department of Mathematics
Contact/Enquiries
mathseminar@ust.hk
Audience
Alumni, Faculty and Staff, PG Students, UG Students
Language(s)
English