This will be a one-quarter (Winter 2024) seminar course discussing a recent result (I, II) by Jonathan Shi and myself. The main contribution of this work is a new SoS hierarchy (the HES hierarchy) for optimizing certain Gaussian processes and efficiently certifying the value of their expected supremum up to a (constant) approximation threshold, a threshold widely believed to be optimal among polynomial-time algorithms. The paper also conjectures that this hierarchy is optimal for a broad class of Gaussian processes, and supports the conjecture with examples of modified Gaussian processes where the HES hierarchy provably outperforms local-iterative algorithms such as Hessian ascent (or provides strong evidence of doing so). Proving (or disproving) this conjecture would reveal interesting connections between high-dimensional probability, sum-of-squares, random matrices, and free probability.

Meeting Time: Tuesdays, 11:40 AM - 1:40 PM PST.

Location: Social Sciences Building 2, Room 137. (Meetings will also be streamed over Zoom; contact me for the link and/or access to the recordings.)

Lectures: Video recordings



As the lectures progress, I will upload PDFs of my (poorly) handwritten notes from my e-notebook. I will also soon add relevant papers and other SoS and spin-glass resources.

Note: The second paper will be available on arXiv starting February 15, 2024. Until then, I will share a (nearly) camera-ready draft with enrolled students.