Final ACO Doctoral Examination and Defense of Dissertation of Mengqi Lou: April 22, 2026

Title: Two Aspects of Statistical Learning in High Dimensions: Iterative Algorithms and Average-case Reductions

Mengqi Lou
ACO PhD student
School of Industrial and Systems Engineering

Date: April 22, 2026
Time: 12:00 PM – 2:00 PM (ET)
Location: Groseclose 226, Georgia Tech Campus
Zoom Link: TBA

Thesis: https://drive.google.com/file/d/1dUQuAif01CY-8o2ccTlNoDsVtQHR-Pqx/view?u...

Committee:
Dr. Ashwin Pananjady (Advisor), Schools of Industrial and Systems Engineering & Electrical and Computer Engineering, Georgia Tech
Dr. Cheng Mao (Reader), School of Mathematics, Georgia Tech
Dr. Will Perkins, School of Computer Science, Georgia Tech
Dr. Justin Romberg, School of Electrical and Computer Engineering, Georgia Tech
Dr. Guy Bresler, Department of Electrical Engineering and Computer Science, MIT

Abstract:
The task of learning the underlying parameters of a statistical model from noisy samples is ubiquitous in modern signal processing and data science. Both computational and statistical challenges arise, especially in high-dimensional settings where the number of parameters is comparable to (or exceeds) the sample size. On the computational side, iterative algorithms are commonly used to fit complex models to random data, but their design and analysis are often guided by worst-case upper bounds that may not reflect practical performance. On the statistical side, classical information-theoretic limits on sample complexity or signal-to-noise ratio may be unattainable by any polynomial-time procedure, making these limits an impractical benchmark for modern high-dimensional problems.

This thesis develops two general frameworks that address these computational and statistical challenges. In the first part, I will present a toolkit that yields sharp, iterate-by-iterate characterizations of solution quality for complex iterative algorithms applied to several non-convex model-fitting problems with random data. In the second part, I will present a toolkit for deriving average-case "reductions" between different statistical models, illustrating how such reductions reveal the computational limits of several structured high-dimensional problems.