We would like to thank you for attending the SMC 2022 Workshop and for presenting your research work. Both poster sessions were a big success, and all the merit is yours for tirelessly explaining your work over and over to the other attendees.
There were two poster sessions on May 4 and May 5. Here you can find the posters presented.
May 4: Poster session #1
Gregoire Aufort (Aix-Marseille Université)
Co-author of the work: Pierre Pudlo
TAMIS: Tempered Anti-truncated Multiple Importance Sampling
We propose a Tempered Anti-truncated Adaptive Multiple Importance Sampling (TAMIS) algorithm to solve the initialization difficulty of adaptive sequential importance sampling algorithms, without introducing too many evaluations of the target density. TAMIS does not require computation of the gradient of the target distribution, allowing its use for black-box problems. Combining a tempering scheme and a new nonlinear transformation of the weights, named anti-truncation, our algorithm introduces a sequence of automatically tuned auxiliary targets that form a bridge between the initial and the target distribution. These auxiliary targets are used only to stabilize the update of the proposal distribution, a Gaussian mixture model. As a result, our proposal is an automatically tuned sequential algorithm that is robust to many initial proposals and scales linearly with the dimension. We will show numerical results on well-known toy problems in dimensions up to 1000.
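As a rough illustration of the general ingredients (not the authors' TAMIS: the anti-truncation transform and the Gaussian-mixture proposal are omitted, and all numbers are invented), here is a minimal pure-Python sketch of tempered adaptive importance sampling on a 1-D bimodal toy target:

```python
import math
import random

random.seed(0)

def log_target(x):
    # Toy bimodal target: equal mixture of N(-3, 1) and N(3, 1), unnormalised.
    a = -0.5 * (x + 3) ** 2
    b = -0.5 * (x - 3) ** 2
    m = max(a, b)  # log-sum-exp for numerical stability
    return m + math.log(0.5 * math.exp(a - m) + 0.5 * math.exp(b - m))

def tempered_ais(n=2000, betas=(0.1, 0.3, 0.6, 1.0)):
    """Generic tempered adaptive importance sampling: the tempered targets
    pi^beta bridge a flat density and the target, and the Gaussian proposal
    is moment-matched to the weighted sample at each stage."""
    mu, sigma = 0.0, 10.0  # deliberately poor initial proposal
    for beta in betas:
        xs = [random.gauss(mu, sigma) for _ in range(n)]
        # Self-normalised importance weights w.r.t. the tempered target.
        logw = [beta * log_target(x)
                + 0.5 * ((x - mu) / sigma) ** 2 + math.log(sigma)
                for x in xs]
        m = max(logw)
        w = [math.exp(lw - m) for lw in logw]
        s = sum(w)
        w = [wi / s for wi in w]
        # Adapt the proposal by weighted moment matching.
        mu = sum(wi * xi for wi, xi in zip(w, xs))
        var = sum(wi * (xi - mu) ** 2 for wi, xi in zip(w, xs))
        sigma = math.sqrt(max(var, 1e-6))
    return mu, sigma

mu, sigma = tempered_ais()
```

The fitted proposal should end up near the target's overall mean (0) and spread (roughly sqrt(10) for this mixture), even from the deliberately poor start.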
Xiongjie Chen (University of Surrey)
Co-author of the work: Yunpeng Li
Normalising Flow-based Differentiable Particle Filters
Differentiable particle filters provide a data-driven mechanism to adaptively learn the state dynamics and the observation likelihood through neural networks. However, most existing approaches are within the bootstrap particle filtering framework and also do not admit valid probability densities in constructing measurement models. In this poster, we present our recent work on incorporating conditional normalising flows into differentiable particle filtering frameworks to address these issues. We show how to construct 1) an expressive dynamic model with a normalising flow, 2) a valid probability density in a measurement model with a conditional normalising flow, and 3) a flexible proposal distribution through a conditional normalising flow. We evaluate the empirical performance of the proposed methods in visual object tracking numerical experiments.
Adrien Corenflos (Aalto University)
Co-authors of the work: Nicolas Chopin, Simo Särkkä
De-Sequentialized Monte Carlo: A parallel-in-time Particle Smoother
Particle smoothers are SMC (Sequential Monte Carlo) algorithms designed to approximate the joint distribution of the states given observations from a state-space model. We propose dSMC (de-Sequentialized Monte Carlo), a new particle smoother that is able to process T observations in O(log T) time on parallel architectures. This compares favorably with standard particle smoothers, the complexity of which is linear in T. We derive Lp convergence results for dSMC, with an explicit upper bound, polynomial in T. We then discuss how to reduce the variance of the smoothing estimates computed by dSMC by (i) designing good proposal distributions for sampling the particles at the initialization of the algorithm, and (ii) using lazy resampling to increase the number of particles used in dSMC. Finally, we design a particle Gibbs sampler based on dSMC, which is able to perform parameter inference in a state-space model at an O(log T) cost on parallel hardware.
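dSMC itself is beyond a short sketch, but the O(log T) idea, combining associative per-step operators pairwise rather than scanning left to right, can be illustrated on a finite-state HMM, whose exact likelihood factorises into a product of matrices. The model and numbers below are invented for illustration only:

```python
def mat_mul(A, B):
    # Product of two square matrices given as nested lists.
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def tree_reduce(mats):
    """Pairwise (balanced-tree) reduction of an associative product.
    Executed sequentially here, but every product within a round is
    independent, so T operators reduce in O(log T) parallel depth
    instead of the O(T) of a left-to-right scan."""
    while len(mats) > 1:
        nxt = [mat_mul(mats[i], mats[i + 1])
               for i in range(0, len(mats) - 1, 2)]
        if len(mats) % 2:
            nxt.append(mats[-1])
        mats = nxt
    return mats[0]

# Two-state HMM: the per-step operator is transition * diag(obs. likelihood).
P = [[0.9, 0.1], [0.2, 0.8]]

def step(obs_lik):
    return [[P[i][j] * obs_lik[j] for j in range(2)] for i in range(2)]

mats = [step([0.7, 0.2]), step([0.6, 0.3]), step([0.1, 0.8]), step([0.2, 0.9])]
M = tree_reduce(mats)
# Likelihood of the 4 observations under a uniform initial distribution.
lik_tree = sum(0.5 * M[i][j] for i in range(2) for j in range(2))
```

Because matrix multiplication is associative, the tree order gives exactly the same likelihood as the sequential forward scan; the point is the reduced parallel depth.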
Francesca R. Crucinio
On the backward sampling step in the smoothing of SSM
Co-authors: Sylvain Le Corff and Yohan Petetin
Variance estimation for Sequential Monte Carlo algorithms through backward sampling
This paper deals with the problem of online asymptotic variance estimation for particle filtering and smoothing. Current solutions for the particle filter that rely on the particle genealogy are either unstable or hard to tune. We propose to mitigate these limitations by incorporating the so-called backward weights into the existing estimator. The resulting estimator is weakly consistent and trades computational cost for more stability and reduced variance. A more computationally efficient estimator inspired by the PaRIS algorithm of Olsson & Westerborn is also introduced. As an application, particle smoothing is considered and an estimator of the asymptotic variance of the FFBS applied to additive functionals is provided.
Mary Llewellyn (University of Edinburgh)
Co-authors of the work: Prof. Ruth King, Dr. Víctor Elvira, and Dr. Gordon Ross
Discretising a Continuous World: Accelerated Inference for State-Space Models via Hidden Markov Models
State-space models (SSMs) are often used to model time series data where the observations depend on an unobserved latent process. However, inference on the process parameters of an SSM can be challenging, especially when the likelihood of the data given the parameters is not available in closed form. In the Bayesian framework, a variety of approaches to model fitting have been applied, including MCMC using Bayesian data augmentation, sequential Monte Carlo (SMC) approximation, and particle MCMC algorithms, which combine SMC approximations and MCMC steps. However, such methods can be inefficient because of sample impoverishment in the sequential Monte Carlo approximations and/or poor mixing in the MCMC steps. This poster details an approach that borrows ideas from discrete hidden Markov models (HMMs) to provide an efficient MCMC with data augmentation approach, imputing the latent states within the algorithm. Our approach deterministically approximates the SSM by a discrete HMM, which is subsequently used as an MCMC proposal distribution for the latent states. We illustrate our approach with several examples.
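As a concrete (hypothetical, not the authors') illustration of the discretisation idea: collapse a one-dimensional AR(1)-plus-noise SSM onto a fixed grid and run the standard HMM forward recursion; the resulting HMM can then serve as a cheap approximation or proposal. All parameter values below are invented:

```python
import math

def gauss(x, m, s):
    return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

def discretised_loglik(ys, grid, phi=0.9, q=1.0, r=0.5):
    """Approximate log-likelihood of the SSM
        x_t = phi * x_{t-1} + N(0, q^2),   y_t = x_t + N(0, r^2)
    by restricting the state to `grid` and running the HMM forward
    recursion (approximate up to a constant tied to the grid spacing)."""
    K = len(grid)
    # Row i of the transition matrix: Gaussian around phi*grid[i], renormalised.
    P = []
    for i in range(K):
        row = [gauss(g, phi * grid[i], q) for g in grid]
        s = sum(row)
        P.append([p / s for p in row])
    alpha = [1.0 / K] * K  # flat initial distribution on the grid
    ll = 0.0
    for y in ys:
        # Predict, then update with the observation density on the grid.
        pred = [sum(alpha[i] * P[i][j] for i in range(K)) for j in range(K)]
        upd = [pred[j] * gauss(y, grid[j], r) for j in range(K)]
        c = sum(upd)  # per-step predictive likelihood
        ll += math.log(c)
        alpha = [u / c for u in upd]
    return ll

grid = [-5 + 0.5 * k for k in range(21)]  # 21-point grid on [-5, 5]
ll = discretised_loglik([0.2, 0.5, -0.1, 0.4], grid)
```

Refining the grid trades computation for a tighter deterministic approximation; in the poster's setting the approximating HMM is used as an MCMC proposal rather than as the final answer.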
Alessandro Mastrototaro (KTH Royal Institute of Technology)
Co-authors of the work: Jimmy Olsson (Department of Mathematics, KTH Royal Institute of Technology, Stockholm) and Johan Alenlöv (Department of Computer and Information Science, Linköping University, Linköping).
Fast and numerically stable particle-based online additive smoothing: the AdaSmooth algorithm
Abstract: This poster will deal with a novel SMC approach to online smoothing of additive functionals in a very general class of path-space models. Hitherto, the solutions proposed in the literature suffer from either long-term numerical instability due to particle-path degeneracy or, when degeneracy is remedied by particle approximation of the so-called backward kernel, high computational demands. The poster will present a new, function-specific additive smoothing algorithm, AdaSmooth, which is computationally fast, numerically stable and easy to implement. To optimally balance computational speed against numerical stability, AdaSmooth combines a (fast) naive particle smoother, propagating recursively a sample of particles and associated smoothing statistics, with an adaptive backward-sampling-based updating rule which allows the number of (costly) backward samples to be kept at a minimum. The algorithm comes with rigorous theoretical results guaranteeing its consistency, asymptotic normality and long-term stability, as well as numerical results demonstrating empirically the clear superiority of AdaSmooth over existing algorithms.
Rui Min (Université de Lille)
Co-authors of the work: Christelle Garnier, François Septier and John Klein
State space partitioning based on constrained spectral clustering for block particle filtering
The particle filter (PF) is a powerful tool to estimate the filtering distribution in non-linear and/or non-Gaussian problems. To overcome the curse of dimensionality, the block PF (BPF) inserts a blocking step to partition the state space into several blocks of smaller dimension so that the correction and resampling steps can be performed independently on each block. Using blocks of small size can reduce the variance of the filtering distribution estimate, but the correlation between blocks is broken and a bias is introduced.
When dependence relationships between state variables are unknown and the partition is chosen arbitrarily, a significant error overhead may arise from a poor partitioning. In this paper, we formulate the partitioning problem as a clustering problem and propose a partitioning method based on spectral clustering (SC). We design a generalized BPF algorithm that contains two new steps: (i) estimation of the state vector correlation matrix from predicted particles, (ii) SC using this estimate as similarity matrix to determine an appropriate partition. A constraint is also imposed on the maximal cluster size to avoid too large blocks. The proposed method succeeds in bringing the most correlated state variables into the same blocks while escaping the curse of dimensionality.
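To make the two new steps concrete, here is a toy pure-Python sketch: the correlation matrix is estimated from particles as in step (i), while a simple greedy merge under a maximal block size stands in for the authors' constrained spectral clustering in step (ii). Everything here is an invented illustration:

```python
import random

def estimate_correlation(particles):
    """Empirical correlation matrix of the state, estimated from predicted
    particles (rows: particles, columns: state dimensions)."""
    n, d = len(particles), len(particles[0])
    mean = [sum(p[j] for p in particles) / n for j in range(d)]
    cov = [[sum((p[i] - mean[i]) * (p[j] - mean[j]) for p in particles) / n
            for j in range(d)] for i in range(d)]
    sd = [cov[i][i] ** 0.5 for i in range(d)]
    return [[cov[i][j] / (sd[i] * sd[j]) for j in range(d)] for i in range(d)]

def greedy_blocks(corr, max_size=2, threshold=0.5):
    """Greedy stand-in for constrained spectral clustering: repeatedly merge
    the two most correlated blocks, subject to a maximal block size, so that
    strongly coupled state variables end up in the same block."""
    blocks = [[i] for i in range(len(corr))]
    while True:
        best, pair = threshold, None
        for a in range(len(blocks)):
            for b in range(a + 1, len(blocks)):
                if len(blocks[a]) + len(blocks[b]) > max_size:
                    continue
                link = max(abs(corr[i][j])
                           for i in blocks[a] for j in blocks[b])
                if link > best:
                    best, pair = link, (a, b)
        if pair is None:
            break
        a, b = pair
        blocks[a] += blocks[b]
        del blocks[b]
    return blocks

# Toy demo: dimensions 0 and 1 are strongly correlated, dimension 2 is not.
random.seed(0)
particles = []
for _ in range(300):
    z = random.gauss(0, 1)
    particles.append([z, z + 0.1 * random.gauss(0, 1), random.gauss(0, 1)])
blocks = greedy_blocks(estimate_correlation(particles), max_size=2)
```

On this toy data the partition groups the two coupled dimensions and leaves the independent one in its own block, which is the behaviour the blocking step is after.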
Kostas Tsampourakis (University of Edinburgh)
Co-author: Víctor Elvira
Approximating the likelihood ratio in Linear-Gaussian state-space models for change detection
Abstract: Change-point detection methods are widely used in signal processing, primarily for detecting and locating changes in a considered model. An important family of algorithms for this problem relies on the likelihood ratio (LR) test. In state-space models (SSMs), the time series is modeled through a Markovian latent process.
In this paper, we focus on the linear-Gaussian (LG) SSM, in which LR-based methods require running a Kalman filter for every candidate change point. Since the number of candidates grows with the length of the time series, this strategy is inefficient for short time series and infeasible for long ones. We propose an approximation to the LR which uses a constant number of filters. The approximated LR relies on the Markovian property of the filter, which forgets errors at an exponential rate. We present theoretical results that justify the approximation, and we bound its error. We demonstrate its good performance in two numerical examples.
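As a toy illustration of the baseline the abstract starts from (one Kalman filter per candidate change point, i.e. the cost the proposed approximation removes, not the approximation itself), here is a 1-D sketch with invented parameter values, using the prediction-error decomposition for the log-likelihood:

```python
import math

def kalman_loglik(ys, shift=0.0, tau=None, phi=0.95, q=0.3, r=0.5):
    """Log-likelihood of a 1-D linear-Gaussian SSM via the Kalman filter's
    prediction-error decomposition. If tau is given, the observation mean is
    shifted by `shift` from time tau onwards (the change hypothesis)."""
    m, P, ll = 0.0, 1.0, 0.0
    for t, y in enumerate(ys):
        m_pred, P_pred = phi * m, phi * phi * P + q * q
        mean = m_pred + (shift if tau is not None and t >= tau else 0.0)
        S = P_pred + r * r  # innovation variance
        ll += -0.5 * (math.log(2 * math.pi * S) + (y - mean) ** 2 / S)
        K = P_pred / S
        m = m_pred + K * (y - mean)
        P = (1 - K) * P_pred
    return ll

# Data with a jump of +2 at t = 5; the LR test runs one filter per candidate
# change point and compares each against the no-change model.
ys = [0.1, -0.2, 0.3, 0.0, -0.1, 2.1, 1.9, 2.2, 2.0, 2.1]
ll0 = kalman_loglik(ys)
lrs = {tau: kalman_loglik(ys, shift=2.0, tau=tau) - ll0 for tau in range(1, 10)}
best = max(lrs, key=lrs.get)
```

The LR curve peaks at the true change point, and the loop makes the linear-in-candidates cost visible: each candidate tau requires a full filter pass.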
Alessandro Viani (University of Genova)
Co-Author: Alberto Sorrentino (University of Genova)
Free hyper-parameter selection and averaging in Bayesian inverse problems by SMC samplers
Abstract: We present an innovative method for exploiting the structure of Sequential Monte Carlo (SMC) samplers in order to obtain free hyper-parameter selection and averaging for a large class of inverse problems where the likelihood depends on a scalar hyper-parameter for which a hyper-prior is available. Indeed, with a wise choice of the SMC sampler's sequence of distributions, we can exploit the approximation of the normalizing constant provided by the algorithm to obtain either selection or marginalization of the hyper-parameter. This approach differs substantially from the standard alternative, in which the hyper-parameter is itself sampled and estimated by the SMC sampler, because particles from all iterations are effectively utilized in the final estimate, avoiding the waste of computational time typical of SMC samplers. The most straightforward application is to inverse problems with additive Gaussian noise, for which we compare the performance of the proposed method and the standard alternative on two examples.
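A minimal sketch of the final selection/averaging step, assuming per-hyper-parameter SMC runs have already produced estimates of the log normalising constants (the SMC samplers themselves are not shown, and all numbers below are invented):

```python
import math

def select_and_average(log_Z, log_prior, estimates):
    """Given, for each hyper-parameter value, an SMC-sampler estimate of
    log Z(lambda), the log hyper-prior value, and a per-lambda posterior
    estimate of some quantity of interest, return (i) the index selected by
    maximum posterior evidence and (ii) the model-averaged estimate, with
    weights proportional to Z(lambda) * p(lambda)."""
    logw = [lz + lp for lz, lp in zip(log_Z, log_prior)]
    m = max(logw)  # subtract the max before exponentiating, for stability
    w = [math.exp(l - m) for l in logw]
    s = sum(w)
    w = [wi / s for wi in w]
    selected = max(range(len(w)), key=lambda j: w[j])
    averaged = sum(wj * ej for wj, ej in zip(w, estimates))
    return selected, averaged, w

# Three candidate hyper-parameter values with a flat hyper-prior.
log_Z = [-10.0, -8.0, -12.0]          # hypothetical SMC evidence estimates
log_prior = [math.log(1 / 3)] * 3
estimates = [1.0, 2.0, 3.0]           # hypothetical per-lambda estimates
selected, averaged, w = select_and_average(log_Z, log_prior, estimates)
```

Selection keeps the single best-supported hyper-parameter, while averaging marginalizes it, weighting each run by its estimated evidence times the hyper-prior.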
May 5: Poster Session #2
Alaa Amri (University of Edinburgh)
Co-authors: Amy Wilson, Chris Dent, Gail Robertson
A Poisson model with spatially and temporally varying coefficients
Abstract: We propose a Dynamic Generalized Linear Model, where the coefficients vary both spatially and temporally, to account for spatial and temporal correlation in responses. An inference algorithm based on Particle Gibbs is described. Our approach is illustrated with both simulated and real datasets where observations are count data observed at different points in space and time. The first dataset concerns cycling counts in the City of Glasgow, collected from Strava and automated bike counters, while the second relates to the population of primary school pupils in the City of Edinburgh.
Nicola Branchini (University of Edinburgh)
Co-author: Víctor Elvira
Optimized Auxiliary Particle Filters: adapting mixture proposals via convex optimization
Auxiliary particle filters (APFs) are a class of sequential Monte Carlo (SMC) methods for Bayesian inference in state-space models. In their original derivation, APFs operate in an extended state space using an auxiliary variable to improve inference. In this work, we propose optimized auxiliary particle filters, a framework where the traditional APF auxiliary variables are interpreted as weights in an importance sampling mixture proposal. Under this interpretation, we devise a mechanism for proposing the mixture weights that is inspired by recent advances in multiple and adaptive importance sampling. In particular, we propose to select the mixture weights by formulating a convex optimization problem, with the aim of approximating the filtering posterior at each timestep. Further, we propose a weighting scheme that generalizes previous results on the APF (Pitt et al. 2012), proving unbiasedness and consistency of our estimators. Our framework demonstrates significantly improved estimates on a range of metrics compared to state-of-the-art particle filters at similar computational complexity in challenging and widely used dynamical models.
Mauro Camara Escudero (University of Bristol)
Co-authors of the work: Christophe Andrieu, Mark Beaumont
Sequential Monte Carlo for Approximate Manifold Sampling
Abstract: Sampling from a probability density constrained to a manifold is of importance in numerous applications arising in statistical physics, statistics, or machine learning. Sampling from such constrained densities, in particular using an MCMC approach, poses significant challenges and it is only recently that correct solutions have been proposed. The resulting algorithm can however be computationally expensive. We propose a relaxation of the problem and construct a bespoke and efficient parametrized family of MCMC kernels to sample from a small neighbourhood around the manifold, which we use as the transition kernel of SMC-THUG, an adaptive SMC sampler. We show the superior performance brought by this kernel against HMC and RWM in a number of Bayesian inverse problems.
Co-authors of the work: Paul J. Birrell, Daniela De Angelis, Trevelyan J. McKinley, Anne Presanis
Lifebelt particle filter for under-dispersed models
Our work is motivated by the real-world problem of inferring severity parameters (namely the intensive-care fatality risk) from influenza surveillance data. The assumed state process is a chain-multinomial process (hence discrete and under-dispersed) and is assumed to be highly observed; consequently, particle degeneracy is extremely common and standard methods such as the BPF are unsuccessful.
To enable the use of particle methods in this context, we introduce one or more ‘lifebelt particles’ which have the role of exploring safer parts of the state-space. We provide the derivations and properties of our particle filtering method, and we present results showing that, in simulation settings, it outperforms current gold-standard estimates of the intensive-care fatality risk. We conclude by discussing similarities and contrasts between our proposed method and the existing literature.
Co-authors of the work: Sumeetpal S. Singh and Matti Vihola
Title: Conditional particle filters with bridge backward sampling
Abstract: The celebrated backward sampling conditional particle filter (CPF) often performs very well with hidden Markov models (HMMs) involving informative observations, but its benefits vanish when the observations become less informative relative to the dynamic model. Such settings arise in particular with discretisations of continuous-time path integral models, which we focus on. To avoid these problems, we propose a new CPF that replaces backward sampling with a generalisation that involves auxiliary ‘bridging’ CPF steps and is parameterised by a blocking sequence. We develop a tuning strategy for choosing an appropriate blocking. Our experiments demonstrate that the proposed CPF with backward bridge sampling is stable with respect to refined discretisation, and offers efficient inference for HMMs with weakly informative observations.
Oskar Kviman (KTH Royal Institute of Technology)
Co-authors: Hazal Koptagel, Harald Melin, Jens Lagergren
KL/TV Resampling: Statistical Distance Based Offspring Selection in SMC Methods
The research interest in combining the variational inference (VI) and sequential Monte Carlo (SMC) methodologies is rapidly increasing, especially in domains allowing for model parameter learning via amortized VI. Yet, utilizing VI in order to directly design resampling/offspring selection schemes — one of the defining elements of particle filters — has, to the best of our knowledge, received little attention so far. In light of this, we propose two novel offspring selection schemes which multiply/discard particles in order to minimize the Kullback-Leibler (KL) divergence or the total variation (TV) distance with respect to the real-valued particle distribution (prior to resampling). The reference distribution can either be the rational-valued particle distribution (post resampling), or the model’s target distribution. By regarding offspring selection as a problem of minimizing statistical distances, we further bridge the gap between optimization-based density estimation and SMC theory. Our proposed methods outperform or compare favorably with the multinomial, systematic and stratified resampling schemes on common density-estimation benchmark datasets in terms of estimating the true sequence of latent variables.
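For the TV case, a natural reading of "minimize the TV distance to the real-valued weights" has a simple closed form: largest-remainder rounding of N times the weights minimises the L1 (hence TV) distance among integer allocations summing to N. The sketch below is our illustration of that idea, not the authors' code:

```python
import math

def tv_offspring(weights, N):
    """Integer offspring counts c (sum c = N) minimising the total-variation
    distance between c/N and `weights`: take floors of N*w, then hand the
    remaining offspring to the largest fractional parts (largest-remainder
    rounding, which is L1/TV-optimal among allocations summing to N)."""
    scaled = [N * w for w in weights]
    counts = [math.floor(s) for s in scaled]
    remainder = N - sum(counts)
    # Indices sorted by descending fractional part get the leftover copies.
    order = sorted(range(len(weights)),
                   key=lambda i: scaled[i] - counts[i], reverse=True)
    for i in order[:remainder]:
        counts[i] += 1
    return counts

counts = tv_offspring([0.42, 0.33, 0.15, 0.10], 10)
```

Unlike multinomial resampling, the offspring counts here are deterministic given the weights, which is in the spirit of treating offspring selection as a distance-minimisation problem.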
Caroline Lawless (University of Oxford)
Co-authors of the work: Judith Rousseau (University of Oxford), Robin Ryder (Université Paris Dauphine)
Inference of grammar complexity by Bayes factors using sequential Monte Carlo
The class of context-free grammars is believed to be too restrictive to fully describe all features of natural language. The class of context-sensitive grammars, on the other hand, is too complex: modelling with them would require an unrealistic amount of computational time. Various mildly context-free grammar formalisms, which may be placed between context-free grammars and context-sensitive grammars in terms of complexity, have thus been proposed in the last few decades. We will be interested in the class of 2-multiple context-free grammars (2-MCFGs), which properly include the class of context-free grammars. We propose a Bayesian non-parametric model for 2-MCFGs within which a model for context-free grammars is naturally embedded. We carry out model choice by Bayes factors using sequential Monte Carlo in the Birch probabilistic programming language.
Sara Pérez-Vieites (Universidad Carlos III de Madrid)
Co-authors: Harold Molina-Bulla, Joaquín Míguez
Nested Smoothing Algorithms for Inference and Tracking of Heterogeneous Multi-scale State-space Systems
Abstract: Multi-scale problems, where variables of interest evolve in different time-scales and live in different state-spaces, can be found in many fields of science where complex series of data have to be analyzed. Here, we introduce a new recursive methodology for Bayesian inference that aims at estimating the static parameters and tracking the dynamic variables of this kind of system. Although the proposed approach works in rather general multi-scale systems, for clarity we analyze the case of a homogeneous multi-scale model with three time-scales (static parameters, slow dynamic state variables and fast dynamic state variables). The proposed scheme, based on the nested filtering methodology of [S. Pérez-Vieites, I. P. Mariño, J. Míguez. Probabilistic scheme for joint parameter estimation and state prediction in complex dynamical systems. Physical Review E, 98(6), 063305, 2017], combines three intertwined layers of filtering techniques that approximate recursively the joint posterior probability distribution of the parameters and both sets of dynamic state variables given a sequence of noisy data. We explore the use of sequential Monte Carlo schemes and Gaussian filtering techniques in the different layers of computation. Some numerical results are presented for a stochastic two-scale Lorenz 96 model with unknown parameters.
Iñigo Urteaga (Columbia University)
Co-author of the work: Chris H. Wiggins
Sequential Monte Carlo for Multi-Armed Bandit Agents
We extend state-of-the-art multi-armed bandit (MAB) algorithms beyond their original settings by leveraging advances in sequential Monte Carlo (SMC) methods. We study dynamic bandits, where the unknown world the bandit agent interacts with evolves over time, and devise flexible SMC-MAB algorithms that leverage SMC for posterior sampling and estimation of sufficient statistics in Bayesian MAB algorithms.
The proposed SMC-MAB agents, which implement Thompson sampling and upper confidence bound (UCB) algorithms, provide a flexible approach to solving a rich class of bandit problems, including restless MABs modeled with linear dynamical systems (where unknown parameters are marginalized via Rao-Blackwellization) and nonlinear, stateless or context-dependent, reward distributions.
Empirical results demonstrate good regret performance of the proposed algorithms in practice, showcasing the promise of SMC-based Thompson sampling and UCB. Ongoing work studies how we may gain theoretical understanding of the dependency between bandit arm dynamics and regret in the case of these SMC-based MAB policies.
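A hypothetical minimal sketch of the ingredients (not the authors' SMC-MAB implementation): a dynamic two-armed Gaussian bandit where each arm's drifting mean reward is tracked by a bootstrap particle filter, and Thompson sampling draws a single particle as a posterior sample. All parameter values are invented:

```python
import math
import random

random.seed(1)

class ParticleThompsonArm:
    """Tracks one arm's drifting mean reward (a Gaussian random walk) with a
    bootstrap particle filter; a single particle drawn at random acts as a
    posterior sample of the current mean for Thompson sampling."""
    def __init__(self, n=200, q=0.05, r=0.5):
        self.particles = [random.gauss(0, 1) for _ in range(n)]
        self.q, self.r = q, r

    def sample_mean(self):
        # Propagate the random-walk dynamics, then draw a posterior sample.
        self.particles = [p + random.gauss(0, self.q) for p in self.particles]
        return random.choice(self.particles)

    def update(self, reward):
        # Weight by the Gaussian reward likelihood, then resample (multinomial).
        w = [math.exp(-0.5 * ((reward - p) / self.r) ** 2)
             for p in self.particles]
        self.particles = random.choices(self.particles, weights=w,
                                        k=len(self.particles))

def run(T=300):
    true_means = [0.0, 1.0]  # arm 1 is better throughout
    arms = [ParticleThompsonArm() for _ in true_means]
    pulls = [0, 0]
    for _ in range(T):
        a = max(range(2), key=lambda i: arms[i].sample_mean())
        arms[a].update(random.gauss(true_means[a], 0.5))
        pulls[a] += 1
    return pulls

pulls = run()
```

Because the unpulled arm's particles keep diffusing, its posterior widens and the agent occasionally re-explores it, which is exactly the behaviour needed when the world the agent interacts with evolves over time.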
Marginalized particle Gibbs for multiple state-space models coupled through shared parameters