Zihang Liu
This thesis project will investigate the theory and practice of efficiently representing and recovering high-dimensional data with hidden low-dimensional structure, in the setting of large-scale optimization. Centering on sparse and low-rank models, from classical compressed sensing and matrix completion to modern overparameterized architectures, the work will derive rigorous error bounds and sample complexity guarantees, probe how sparsity and rank constraints shape optimization landscapes, and design efficient methods for practical settings. Drawing on random matrix theory, high-dimensional probability, and numerical optimization, the project will study how structural constraints influence spectral properties and approximation power, while promoting algorithms with provable convergence. By unifying these analytic tools, the research aims to clarify why existing methods succeed, to guide the design of numerically stable and computationally efficient techniques, and to deepen our understanding of the interplay between dimension, structure, and recoverability in large-scale optimization.
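As a concrete anchor for the two classical settings mentioned above, a minimal sketch of their standard convex formulations (using generic symbols $A$, $y$, $M$, $\Omega$ not defined in the abstract): compressed sensing recovers a sparse vector via $\ell_1$ minimization, and matrix completion recovers a low-rank matrix via nuclear norm minimization.

```latex
% Compressed sensing: recover sparse x from linear measurements y = Ax
\min_{x \in \mathbb{R}^n} \|x\|_1 \quad \text{subject to} \quad Ax = y

% Matrix completion: recover low-rank M from entries observed on index set \Omega
\min_{X \in \mathbb{R}^{m \times n}} \|X\|_* \quad \text{subject to} \quad X_{ij} = M_{ij}, \ (i,j) \in \Omega
```

In both problems, the error bounds and sample complexity guarantees the project targets take the form: with roughly $O(s \log n)$ random measurements (sparsity $s$) or $O(r(m+n)\,\mathrm{polylog})$ observed entries (rank $r$), exact or stable recovery holds with high probability.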