NO.186 Markov Chain Monte Carlo 2.0
May 30 - June 2, 2022 (Check-in: May 29, 2022)
- Heng Guo
- University of Edinburgh, UK
- Yitong Yin
- Nanjing University, China
Markov chains and the related Monte Carlo method are ubiquitous algorithmic techniques across various scientific areas. Despite significant practical successes, theoretically analyzing the convergence rate (technically known as the mixing rate) remains a challenging mathematical question for many fundamental problems, especially in discrete settings. In theoretical computer science, these problems are studied as computational counting and sampling problems. In recent years, a number of new techniques have emerged to tackle long-standing counting and sampling challenges. These techniques include both new ways to analyze Markov chains and alternative algorithmic approaches to MCMC. In this meeting, we aim to bring together people working in these directions to uncover deeper connections among the exciting new advances.
To be more specific, the typical goal is to sample from a complicated distribution, such as the Gibbs distribution, or to estimate the normalizing constant (known as the partition function), marginal probabilities, and other quantities of interest for said distribution. In most situations, counting and sampling are equivalent under polynomial-time reductions. One may approach this goal via probabilistic methods, such as Markov chains and other resampling-based techniques, or via algebraic methods, such as Barvinok's interpolation method, especially when combined with cluster expansions. One aim of the meeting is to find unifying viewpoints and hidden links between the probabilistic and algebraic approaches.
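As a concrete illustration of sampling from a Gibbs distribution, the following is a minimal sketch of the Metropolis chain for the Ising model on a cycle; all function and parameter names here are our own, chosen for illustration only.

```python
import math
import random

def metropolis_ising(n, beta, steps, seed=0):
    """Metropolis chain for the Ising model on a cycle of length n.

    The Gibbs distribution is pi(sigma) proportional to
    exp(beta * sum_i sigma_i * sigma_{i+1}).  Returns the configuration
    reached after `steps` single-site updates.
    """
    rng = random.Random(seed)
    sigma = [rng.choice([-1, 1]) for _ in range(n)]
    for _ in range(steps):
        i = rng.randrange(n)
        # Energy change from flipping spin i (its two neighbours on the cycle).
        delta = 2 * sigma[i] * (sigma[(i - 1) % n] + sigma[(i + 1) % n])
        # Accept the flip with the Metropolis probability min(1, e^{-beta*delta}).
        if delta <= 0 or rng.random() < math.exp(-beta * delta):
            sigma[i] = -sigma[i]
    return sigma
```

Running enough steps (past the mixing time) yields an approximate sample from the Gibbs distribution, and repeated sampling can then be used to estimate marginals or, via standard self-reducibility arguments, the partition function.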
Next we list a few main research topics to be discussed at the workshop.
High dimensional expanders
The notion of high-dimensional expanders is a generalisation of expander graphs to simplicial complexes and hypergraphs. One important aspect of this perspective is that it makes local-to-global arguments possible, allowing us to analyse random walks over a global structure via the spectral gaps and other properties of local structures. This technique has led to the breakthrough result that the basis-exchange walk for matroids mixes rapidly.
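The basis-exchange walk can be sketched generically given oracle access to the matroid: drop a random element of the current basis, add a random outside element, and keep the move only if the result is again a basis. Below is a toy sketch (all names are our own) run on the uniform matroid U(3, 6), where every 3-subset of the ground set is a basis.

```python
import random

def basis_exchange_walk(start, ground_set, is_basis, steps, seed=0):
    """One trajectory of the basis-exchange walk on a matroid.

    At each step, drop a uniformly random element of the current basis and
    add a uniformly random element outside it; the move is kept only if the
    result is again a basis (checked via the `is_basis` oracle).
    """
    rng = random.Random(seed)
    B = set(start)
    ground = set(ground_set)
    for _ in range(steps):
        out = rng.choice(sorted(B))
        inn = rng.choice(sorted(ground - B))
        candidate = (B - {out}) | {inn}
        if is_basis(candidate):
            B = candidate
    return B

# Toy example: the uniform matroid U(3, 6), whose bases are all 3-subsets.
final = basis_exchange_walk({0, 1, 2}, range(6), lambda S: len(S) == 3, 200)
```

The local-to-global analysis alluded to above bounds the spectral gap of this global walk through the spectral gaps of small local walks on the associated simplicial complex.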
Resampling methods
Resampling methods provide alternative sampling techniques to Markov chains. In particular, recent progress in resampling methods has strong connections to the Lovász local lemma, particularly the constructive version due to Moser and Tardos. Resampling methods have led to the first polynomial-time approximation algorithm for network reliability, and have other unique features compared with Markov chain methods. For example, they are usually dynamic and distributed by design.
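As a sketch of the Moser-Tardos style of resampling, the toy implementation below repeatedly picks a violated clause of a k-SAT instance and resamples its variables uniformly at random; the literal encoding and all names are our own assumptions. Under the local-lemma condition, the expected number of resampling steps is polynomial.

```python
import random

def moser_tardos_ksat(n, clauses, seed=0):
    """Moser-Tardos resampling for a k-SAT instance on variables 1..n.

    A clause is a list of literals: +i means variable i must be True,
    -i means variable i must be False.  While some clause is violated,
    resample the variables of one violated clause uniformly at random.
    """
    rng = random.Random(seed)
    x = [rng.random() < 0.5 for _ in range(n)]

    def violated(c):
        # A clause is violated exactly when every one of its literals is false.
        return all(x[abs(l) - 1] != (l > 0) for l in c)

    while True:
        bad = [c for c in clauses if violated(c)]
        if not bad:
            return x
        for l in rng.choice(bad):            # resample one violated clause
            x[abs(l) - 1] = rng.random() < 0.5
```

Note the dynamic flavour mentioned above: the algorithm only ever touches the variables of currently violated clauses, which also makes it natural to distribute.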
Zeros of partition functions
Zeros of partition functions are a traditional topic in statistical physics, where the motivation is to study phase transitions via the analyticity of partition functions. More recently, Barvinok introduced an algorithmic technique which exploits the absence of zeros of the partition function in certain regions of the complex plane. In many cases, this algorithm is known to be efficient under conditions equivalent or similar to the rapid-mixing conditions of Markov chains. It is an interesting direction to uncover conditions that unify both algorithmic techniques.
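In its simplest form, Barvinok's method truncates the Taylor series of log Z at low order; zero-freeness of Z in a disk around the origin guarantees that the truncation error decays rapidly. The sketch below (with our own, hypothetical function names) computes the Taylor coefficients of log p from those of p and applies this to the independence polynomial of a single edge, Z(lam) = 1 + 2*lam, which is zero-free in the disk |lam| < 1/2.

```python
import math

def log_series(a, m):
    # Taylor coefficients b[1..m] of log p(z), given p's coefficients a
    # (requires a[0] == 1); derived from the identity p'(z) = p(z)*(log p)'(z),
    # which yields k*b_k = k*a_k - sum_{j=1}^{k-1} j*b_j*a_{k-j}.
    a = a + [0.0] * (m + 1 - len(a))
    b = [0.0] * (m + 1)
    for k in range(1, m + 1):
        b[k] = a[k] - sum(j * b[j] * a[k - j] for j in range(1, k)) / k
    return b

def barvinok_estimate(a, lam, m):
    """Approximate log p(lam) by the degree-m truncated Taylor series.

    Accurate when p has no zeros in a disk of radius larger than |lam|,
    which is the zero-freeness hypothesis behind Barvinok's method.
    """
    b = log_series(a, m)
    return sum(b[k] * lam ** k for k in range(1, m + 1))

# Independence polynomial of a single edge: Z(lam) = 1 + 2*lam,
# zero-free for |lam| < 1/2, so we evaluate well inside, at lam = 1/4.
approx = barvinok_estimate([1.0, 2.0], 0.25, 30)
```

For genuine counting problems the Taylor coefficients are not read off a known polynomial but computed from low-order subgraph statistics; the truncation step above is the same.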
Cluster expansion
Cluster expansion is a technique in statistical physics for analyzing systems in the low-temperature regime. On the algorithmic side, cluster expansion can be combined with both Markov chains and Barvinok's method to obtain efficient algorithms. While most other techniques only work in the high-temperature regime, cluster expansion provides unique algorithmic applications in the low-temperature regime. It is especially powerful when the underlying graphs have special structure that can be exploited, such as expanders or lattices.
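As a small numerical illustration (a sketch with our own names, not any standard implementation), the second-order cluster expansion for the hard-core model approximates log Z by n*lam - (n/2 + m)*lam^2, where n and m are the numbers of vertices and edges; first-order clusters are single vertices, and second-order clusters pair a vertex with itself or with an adjacent vertex. We compare this against brute force on a triangle at small fugacity.

```python
import math
from itertools import combinations

def hardcore_logZ_exact(n, edges, lam):
    """Exact log of the independence polynomial by brute force (small n only)."""
    edge_set = set(map(frozenset, edges))
    total = 0.0
    for r in range(n + 1):
        for S in combinations(range(n), r):
            # S contributes lam^|S| iff it is an independent set.
            if all(frozenset(e) not in edge_set for e in combinations(S, 2)):
                total += lam ** r
    return math.log(total)

def cluster_expansion_order2(n, m, lam):
    # Truncated cluster expansion: log Z ~ n*lam - (n/2 + m)*lam**2.
    # Sanity check: with no edges (m = 0) this matches the expansion of
    # log (1 + lam)^n = n*(lam - lam**2/2 + ...).
    return n * lam - (n / 2 + m) * lam ** 2

edges = [(0, 1), (1, 2), (0, 2)]   # a triangle: n = 3 vertices, m = 3 edges
lam = 0.05
```

At lam = 0.05 the two values agree up to the O(lam^3) truncation error; algorithmic uses of the expansion control such errors to all needed orders.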