Seminars and Colloquia by Series

TBA: Hung Nguyen

Series
Stochastics Seminar
Time
Thursday, March 19, 2026 - 15:30 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Hung Nguyen, University of Tennessee, Knoxville

Joint parameter estimation of spin glasses

Series
Stochastics Seminar
Time
Thursday, March 12, 2026 - 15:30 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Qiang Wu, University of Minnesota

Spin glasses are disordered statistical physics systems with both ferromagnetic and anti-ferromagnetic spin interactions. The Gibbs measure belongs to the exponential family with parameters such as the inverse temperature $\beta>0$ and the external field $h\in\mathbb{R}$. A fundamental statistical problem is to estimate the system parameters from a single sample of the ground truth. In 2007, Chatterjee first proved that, under reasonable conditions, for spin glass models with $h=0$ the maximum pseudo-likelihood estimator for $\beta$ is $\sqrt{N}$-consistent. This is in contrast to the existing estimation results for classical non-disordered models. However, Chatterjee's approach has been restricted to the single-parameter setting, and the joint estimation of $(\beta,h)$ for spin glasses has remained open since then. In this talk, I will introduce a new idea to show that, under some easily verifiable conditions, the bivariate maximum pseudo-likelihood estimator is jointly $\sqrt{N}$-consistent for a large collection of spin glasses, including the Sherrington-Kirkpatrick model and its diluted variants. Based on joint work with Wei-Kuo Chen and Arnab Sen.
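
To make the estimator concrete, here is a minimal numerical sketch of bivariate maximum pseudo-likelihood estimation for an SK-type model. The Glauber sampler, the grid search, and all parameter choices below are illustrative assumptions, not the construction or proof technique of the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def sk_couplings(n):
    """Symmetric SK couplings J_ij ~ N(0, 1/n) with zero diagonal."""
    g = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, n))
    J = (g + g.T) / np.sqrt(2.0)
    np.fill_diagonal(J, 0.0)
    return J

def glauber_sample(J, beta, h, sweeps=200):
    """Rough Gibbs sample via Glauber dynamics (illustrative only)."""
    n = J.shape[0]
    s = rng.choice([-1.0, 1.0], size=n)
    for _ in range(sweeps):
        for i in range(n):
            field = beta * (J[i] @ s) + h
            s[i] = 1.0 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * field)) else -1.0
    return s

def log_pseudo_likelihood(beta, h, m, s):
    """log PL = sum_i [ s_i (beta m_i + h) - log 2cosh(beta m_i + h) ],
    where m_i = sum_j J_ij s_j are the local fields."""
    f = beta * m + h
    return float(np.sum(s * f - np.log(2.0 * np.cosh(f))))

n = 200
J = sk_couplings(n)
s = glauber_sample(J, beta=0.3, h=0.1)   # one sample at the "ground truth"
m = J @ s                                # local fields, fixed once s is drawn

# crude joint MPLE by grid search over (beta, h)
betas = np.linspace(0.0, 1.0, 41)
hs = np.linspace(-0.5, 0.5, 41)
ll = np.array([[log_pseudo_likelihood(b, hh, m, s) for hh in hs] for b in betas])
bi, hj = np.unravel_index(np.argmax(ll), ll.shape)
beta_hat, h_hat = betas[bi], hs[hj]
print(f"MPLE: beta_hat={beta_hat:.3f}, h_hat={h_hat:.3f}")
```

Since each term of the log pseudo-likelihood is concave in $(\beta,h)$, the grid search can be swapped for any convex optimizer; at $n=200$ one only begins to see the $\sqrt{N}$ rate.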

Spectral gaps and measure decompositions

Series
Stochastics Seminar
Time
Thursday, March 5, 2026 - 15:30 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
March Boedihardjo, Michigan State University

I'll introduce a new set of computable and orthogonally invariant quantities for a given probability measure on a Euclidean space. We show how these quantities can determine the extent to which the given probability measure can be decomposed as an equal weight mixture of two probability measures with significantly different second order statistics. Joint work with Joe Kileel and Vandy Tombs.

 

Continuous directed polymers in a Gaussian environment

Series
Stochastics Seminar
Time
Thursday, February 26, 2026 - 15:30 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Le Chen, Auburn University

We study a broad class of space–time continuous directed polymers in a Gaussian environment that is white in time and spatially correlated (in the Itô sense). Under Dalang's condition, we prove key properties of the partition function (positivity, stationarity, scaling, homogeneity, and a Chapman–Kolmogorov relation) and establish pathwise regularity of the polymer (Hölder continuity and quadratic variation). We give a sharp singularity criterion: the polymer measure is singular with respect to Wiener measure if and only if the spectral measure has infinite total mass. Finally, for $d\geq 3$, we prove diffusive large-time behavior in the high-temperature regime, providing a unified framework for polymers driven by singular Gaussian noise.

 

Joint work with Cheng Ouyang (UIC), Samy Tindel (Purdue), and Panqiu Xia (Cardiff).
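
As background, the continuum partition function has a well-known lattice caricature: a 1+1-dimensional directed polymer in i.i.d. Gaussian disorder, whose point-to-line partition function satisfies an exact Chapman–Kolmogorov-type recursion. The normalization and names below are illustrative assumptions; the talk's model is continuous in space and time with spatially correlated noise.

```python
import numpy as np

rng = np.random.default_rng(1)

def partition_function(n_steps, beta):
    """Point-to-line partition function Z_n = E[exp(beta * sum_t omega(t, S_t))]
    of a 1+1D lattice polymer over simple-random-walk paths, computed exactly
    by the transfer-matrix (Chapman-Kolmogorov) recursion."""
    width = 2 * n_steps + 1               # sites reachable in n_steps steps
    omega = rng.normal(size=(n_steps + 1, width))   # i.i.d. Gaussian disorder
    Z = np.zeros(width)
    Z[n_steps] = 1.0                      # walk starts at the origin
    for t in range(1, n_steps + 1):
        # each site inherits mass from its two neighbors, reweighted by disorder
        Z = 0.5 * (np.roll(Z, 1) + np.roll(Z, -1)) * np.exp(beta * omega[t])
    return Z.sum()

print(partition_function(100, 0.5))
```

Positivity is immediate from the recursion, and at $\beta=0$ the weights trivialize so $Z_n=1$ exactly, a useful sanity check.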

Beyond propagation of chaos: A stochastic algorithm for mean-field optimization

Series
Stochastics Seminar
Time
Thursday, February 5, 2026 - 15:30 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Chandan Tankala, University of Oregon

Sampling and mean-field optimization can be viewed as optimization in the space of probability distributions. Stochastic optimization algorithms like stochastic gradient descent have been immensely successful for optimization over Euclidean spaces. However, the infinite-dimensional space of probability distributions poses unique challenges. In this talk, I will discuss my recent work on the design and analysis of a stochastic algorithm for mean-field optimization with applications to the increasingly studied area of mean-field neural networks.
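
The abstract does not specify the algorithm, so the following is only standard background: a noisy interacting-particle scheme for an entropy-regularized mean-field objective, in which each particle's update depends on the empirical law of the whole system. The objective, the quadratic interaction, and all step sizes are toy assumptions, not the speaker's method.

```python
import numpy as np

rng = np.random.default_rng(2)

def mean_field_langevin(n_particles=2000, n_steps=2000, eta=0.01, tau=0.1):
    """Particle approximation of a mean-field Langevin dynamics for the toy
    objective F(mu) = E_mu[0.5*(x-1)^2] + 0.5*E_mu[(x - mean(mu))^2] + tau*Ent(mu).
    The empirical mean stands in for the (infinite-dimensional) law of the system."""
    x = rng.normal(size=n_particles)
    for _ in range(n_steps):
        m = x.mean()                          # interaction through the empirical law
        drift = -(x - 1.0) - (x - m)          # toy gradient of the energy terms
        x = x + eta * drift + np.sqrt(2.0 * tau * eta) * rng.normal(size=n_particles)
    return x

x = mean_field_langevin()
print(x.mean(), x.var())
```

As the particle number grows, propagation of chaos says the empirical measure tracks the mean-field law; the talk's point is precisely how to go beyond this classical picture with a stochastic algorithm.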

Partial identification with Schrödinger bridges

Series
Stochastics Seminar
Time
Thursday, January 29, 2026 - 15:30 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Florian Gunsilius, Emory University

Partial identification provides an alternative to point identification: instead of pinning down a unique parameter estimate, the goal is to characterize a set guaranteed to contain the true parameter value. Many partial identification approaches take the form of linear optimization problems, which seek the "best- and worst-case scenarios" of a proposed model subject to the constraint that the model replicates correct observable information. However, such linear programs become intractable in settings with multivalued or continuous variables. This paper introduces a novel method to overcome this computational and statistical curse of cardinality: an entropy penalty transforms these potentially infinite-dimensional linear programs into general versions of multi-marginal Schrödinger bridges, enabling efficient approximation of their solutions. In the process, we establish novel statistical and mathematical properties of such multi-marginal Schrödinger bridges, including an analysis of the asymptotic distribution of entropic approximations to infinite-dimensional linear programs. We illustrate this approach by analyzing instrumental variable models with continuous variables, a setting that has been out of reach for existing methods.
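
The two-marginal special case of the Schrödinger bridges mentioned above is entropic optimal transport, which Sinkhorn's alternating-scaling algorithm solves; the toy marginals and cost below are illustrative assumptions, not the paper's multi-marginal construction.

```python
import numpy as np

def sinkhorn(mu, nu, C, eps, n_iter=2000):
    """Entropy-penalized linear program (two-marginal Schrodinger bridge):
    minimize <P, C> + eps * KL(P | mu x nu) over couplings P with marginals
    mu and nu, solved by Sinkhorn's alternating scaling of the Gibbs kernel."""
    K = np.exp(-C / eps)              # Gibbs kernel from the cost
    u = np.ones_like(mu)
    for _ in range(n_iter):
        v = nu / (K.T @ u)            # match the second marginal
        u = mu / (K @ v)              # match the first marginal
    return u[:, None] * K * v[None, :]

# toy instance: uniform source, peaked target, squared-distance cost
n = 50
x = np.linspace(0.0, 1.0, n)
C = (x[:, None] - x[None, :]) ** 2
mu = np.ones(n) / n
nu = np.exp(-((x - 0.7) ** 2) / 0.02)
nu /= nu.sum()
P = sinkhorn(mu, nu, C, eps=0.05)
```

The entropy penalty `eps` is exactly the regularization the abstract describes: as `eps` shrinks, the bridge approaches the unregularized linear program, at the price of slower convergence.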

Similarities and Differences between the Longest Common and Longest Common and Increasing Subsequences in Random Words

Series
Stochastics Seminar
Time
Thursday, January 22, 2026 - 15:30 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Christian Houdré, Georgia Institute of Technology

Let $LC_n$ be the length of the longest common subsequences of two independent random words whose letters are drawn from a finite alphabet, and, when the alphabet is totally ordered, let $LCI_n$ be the length of the longest common and increasing subsequences of the words. Results on the asymptotic means, variances, and limiting laws of these well-known random objects will be described and compared.
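
For readers who want to experiment, both quantities are computable by classical $O(n^2)$ dynamic programs; the variable names below are illustrative.

```python
import random

def lcs_length(a, b):
    """Length of the longest common subsequence (classical O(mn) DP)."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            dp[i + 1][j + 1] = (dp[i][j] + 1 if a[i] == b[j]
                                else max(dp[i][j + 1], dp[i + 1][j]))
    return dp[m][n]

def lcis_length(a, b):
    """Length of the longest common *increasing* subsequence, where letters
    are compared in the alphabet's total order (standard O(mn) DP)."""
    n = len(b)
    dp = [0] * n   # dp[j]: best LCIS ending with b[j]
    for x in a:
        best = 0   # best LCIS using letters strictly below x seen so far
        for j in range(n):
            if b[j] == x and best + 1 > dp[j]:
                dp[j] = best + 1
            if b[j] < x and dp[j] > best:
                best = dp[j]
    return max(dp, default=0)

# two independent random words over a totally ordered 4-letter alphabet
word_rng = random.Random(0)
a = "".join(word_rng.choice("abcd") for _ in range(200))
b = "".join(word_rng.choice("abcd") for _ in range(200))
print(lcs_length(a, b), lcis_length(a, b))
```

Since every common increasing subsequence is in particular common, $LCI_n \le LC_n$ always, which the simulation lets one check on random inputs.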

How trustworthy AI enables a paradigm shift in classical statistics for particle physics

Series
Stochastics Seminar
Time
Thursday, January 15, 2026 - 15:30 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Aishik Ghosh, Georgia Tech

Particle physics research relies on making statistical statements about Nature. The field is one of the last bastions of classical statistics and certainly among its most rigorous users, relying on a worldwide computing grid to process zettabyte-scale data. Recent AI-enabled developments have reinvigorated research in classical statistics, particularly by removing the need for asymptotic approximations in many calculations.

 

In this talk, I will discuss how AI has allowed us to question core assumptions in our statistical inference techniques. Neural networks enable high-dimensional statistical inference, avoiding aggressive data reduction or the use of unnecessary assumptions. However, they also introduce new sources of systematic uncertainty that require novel uncertainty quantification tools. AI further enables more robust statistical inference by accelerating Neyman inversion and confidence-interval calibration. These advances allow the design of new test statistics that leverage Bayesian mathematical tools while still guaranteeing frequentist coverage, an approach that was previously considered computationally infeasible. These new techniques raise questions about practical methods for handling nuisance parameters, the definition of point estimators, and the computationally efficient implementation of mathematical solutions. If time permits, I will also introduce the emerging challenge of non-nestable hypothesis testing in particle physics.
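
As background for the Neyman inversion mentioned above: a confidence set is built by inverting a family of hypothesis tests, keeping every parameter value whose level-$\alpha$ test accepts the observed data. The toy below does this exactly for a Poisson mean with central ordering (the likelihood-ratio orderings used in particle physics are more involved); the function names and grid are illustrative.

```python
import math

def poisson_cdf(k, mu):
    """P(N <= k) for N ~ Poisson(mu), via the recursive term ratio."""
    term = total = math.exp(-mu)
    for i in range(1, k + 1):
        term *= mu / i
        total += term
    return total

def neyman_interval(n_obs, alpha=0.05, mu_max=50.0, step=0.01):
    """Invert central tests: mu belongs to the confidence set iff n_obs falls
    outside both alpha/2 rejection tails of the test of H0: mean = mu."""
    accepted = []
    mu = step
    while mu <= mu_max:
        lower_tail = poisson_cdf(n_obs, mu)                         # P(N <= n_obs)
        upper_tail = 1.0 - poisson_cdf(n_obs - 1, mu) if n_obs > 0 else 1.0
        if lower_tail > alpha / 2 and upper_tail > alpha / 2:
            accepted.append(mu)
        mu += step
    return accepted[0], accepted[-1]

lo, hi = neyman_interval(7)
print(f"95% interval for the Poisson mean given n=7: [{lo:.2f}, {hi:.2f}]")
```

By construction the resulting random interval covers the true mean with probability at least 95% for every parameter value, which is the frequentist coverage guarantee referred to above; the computational bottleneck the abstract targets is running such an inversion over high-dimensional data and nuisance parameters.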

 

My group is among the teams leading this revitalization of classical statistical research in particle physics, and I look forward to connecting with students and senior colleagues at Georgia Tech who are interested in contributing to this emerging field.

 

Bio: Aishik Ghosh is an assistant professor in the School of Physics at Georgia Tech with a focus on developing AI methods to accelerate fundamental physics and astrophysics. His group works on theoretical physics, statistical methods, and experiment design. For robust scientific applications, Dr. Ghosh focuses on uncertainty quantification, interpretability, and verifiability of AI algorithms, targeting publications in physics journals and ML conferences.

Precise Error Rates for Computationally Efficient Testing

Series
Stochastics Seminar
Time
Thursday, November 20, 2025 - 15:30 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Alex Wein, UC Davis

We consider one of the most basic high-dimensional testing problems: detecting the presence of a rank-1 "spike" in a random Gaussian (GOE) matrix. When the spike has structure such as sparsity, inherent statistical-computational tradeoffs are expected. I will discuss some precise results about the computational complexity, arguing that the so-called "linear spectral statistics" achieve the best possible tradeoff between type I and type II errors among all polynomial-time algorithms, even though an exponential-time algorithm can do better. This is based on https://arxiv.org/abs/2311.00289 with Ankur Moitra, which uses a version of the low-degree polynomial heuristic, as well as forthcoming work with Ansh Nagda that gives a stronger form of reduction-based hardness.
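
To see the objects in the abstract concretely, the sketch below draws a GOE matrix, plants a sparse rank-one spike, and evaluates one linear spectral statistic (the cubic one). The test function, sparsity level, and signal strength are illustrative assumptions, not the optimal statistic from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def goe(n):
    """GOE matrix normalized so the bulk spectrum fills [-2, 2]."""
    g = rng.normal(size=(n, n))
    return (g + g.T) / np.sqrt(2 * n)

def sparse_spike(n, k):
    """Unit-norm k-sparse spike direction with random signs."""
    v = np.zeros(n)
    idx = rng.choice(n, size=k, replace=False)
    v[idx] = rng.choice([-1.0, 1.0], size=k) / np.sqrt(k)
    return v

def linear_spectral_statistic(Y, f):
    """LSS: sum of f over the eigenvalues of Y."""
    return float(np.sum(f(np.linalg.eigvalsh(Y))))

n, k, lam = 400, 20, 3.0
W = goe(n)                                   # null: pure noise
v = sparse_spike(n, k)
Y = W + lam * np.outer(v, v)                 # alternative: rank-1 spike added
null_stat = linear_spectral_statistic(W, lambda x: x ** 3)
spiked_stat = linear_spectral_statistic(Y, lambda x: x ** 3)
print(null_stat, spiked_stat)
```

Any such statistic is computable in polynomial time from the spectrum alone, which is what makes the class a natural candidate for the optimal polynomial-time tradeoff the abstract describes.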
