Seminars


Locally Stationary Distributions: A Framework for Analyzing Slow-Mixing Markov Chains

Series: Bangalore Theory Seminars

Speaker: Sidhanth Mohanty, UC Berkeley

Date/Time: Jun 14, 21:00

Location: Online Talk (See Teams link below)

Abstract:
A common algorithmic task is to sample from a probability distribution D of one's choice. The most popular approach is to run a Markov chain with stationary distribution D, and when the Markov chain mixes rapidly, it does indeed produce samples from D. But what if the Markov chain does not mix fast? It is empirically observed that the output produced by the Markov chain is sometimes nevertheless a "good" solution for the downstream applications of sampling, even though it is not a faithful sample from the correct distribution. For example, it appears to find planted cuts in random graphs, or planted solutions in random CSPs, despite the Markov chain having bottlenecks to mixing. In this talk, we will see some generic methods for analyzing the structure of the distributions that non-mixing Markov chains sample from, along with some applications.
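To make the setup concrete, here is a minimal illustrative sketch (not taken from the talk or the paper) of a standard single-site Metropolis chain on {0,1}^n whose stationary distribution is proportional to exp(beta * f(x)); the objective f, the inverse temperature beta, and the step budget below are placeholder choices. When such a chain has bottlenecks, the state returned after a fixed number of steps need not be a faithful sample from the stationary distribution.

# Illustrative Metropolis chain targeting pi(x) proportional to exp(beta * f(x))
# on the hypercube {0,1}^n. Placeholder example, not the construction from the paper.
import math
import random

def metropolis_chain(f, n, beta, steps, seed=0):
    rng = random.Random(seed)
    x = [rng.randrange(2) for _ in range(n)]   # arbitrary start state
    fx = f(x)
    for _ in range(steps):
        i = rng.randrange(n)                   # propose flipping one coordinate
        x[i] ^= 1
        fy = f(x)
        # Accept with probability min(1, exp(beta * (fy - fx))).
        if math.log(rng.random() + 1e-300) <= beta * (fy - fx):
            fx = fy                            # accept the flip
        else:
            x[i] ^= 1                          # reject: undo the flip
    return x

# Toy usage: f counts ones, so the chain drifts toward the all-ones string.
print(metropolis_chain(f=lambda x: sum(x), n=20, beta=1.0, steps=5000))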
Based on joint work with Kuikui Liu, Prasad Raghavendra, Amit Rajaraman, and David X. Wu. Link

Microsoft Teams link: Link
We are grateful to the Kirani family for generously supporting the theory seminar series.
Hosts: Rameesh Paul, KVN Sreenivas, Rahul Madhavan, Debajyoti Kar