New preprint on model calibration and source inversion in atmospheric dispersion

We just uploaded a new preprint, “Simultaneous model calibration and source inversion in atmospheric dispersion models,” co-authored with Juan Garcia and John Stockie. The paper grew mainly out of Juan’s work on his master’s thesis, and I am very proud of him. Here’s the abstract:

We present a cost-effective method for model calibration and solution of source inversion problems in atmospheric dispersion modelling. We use Gaussian process emulators of atmospheric dispersion models within a Bayesian framework for the solution of inverse problems. The model and source parameters are treated as unknowns, and we obtain point estimates and uncertainty approximations for the sources while simultaneously calibrating the forward model. The method is validated in the context of an industrial case study involving emissions from a smelting operation, for which cumulative monthly measurements of zinc particulate depositions are available.
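The basic idea can be sketched in a toy setting: train a cheap Gaussian process emulator on a handful of runs of an expensive forward model, then let an MCMC sampler query only the emulator when inverting for the source. Everything below — the one-parameter “dispersion model,” the kernel choices, the flat prior on the emission rate — is an illustrative assumption for this sketch, not the actual setup of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scalar "dispersion model": deposition at one receptor as a
# function of emission rate q. Stands in for an expensive PDE solve.
def forward(q):
    return q * np.exp(-0.2 * q)

# A handful of expensive model runs at design points.
X = np.linspace(0.0, 6.0, 8)
y = forward(X)

# Squared-exponential kernel for the Gaussian process emulator.
def kernel(a, b, ell=1.0, amp=1.0):
    return amp**2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

K = kernel(X, X) + 1e-8 * np.eye(len(X))  # small jitter for stability
alpha = np.linalg.solve(K, y)

def emulate(q):
    """Cheap GP posterior-mean prediction replacing the forward model."""
    return (kernel(np.atleast_1d(q), X) @ alpha)[0]

# Synthetic observation, then random-walk Metropolis on the source
# strength q with a flat prior on [0, 6], using only the emulator.
q_true, noise = 2.5, 0.05
obs = forward(q_true) + noise * rng.normal()

def log_post(q):
    if not 0.0 <= q <= 6.0:
        return -np.inf
    return -0.5 * ((obs - emulate(q)) / noise) ** 2

q, lp = 3.0, log_post(3.0)
samples = []
for _ in range(5000):
    qp = q + 0.3 * rng.normal()
    lpp = log_post(qp)
    if np.log(rng.random()) < lpp - lp:  # Metropolis accept/reject
        q, lp = qp, lpp
    samples.append(q)

q_hat = np.mean(samples[1000:])  # posterior mean estimate of the source
```

The point of the surrogate is the cost profile: the expensive model is run only at the design points, while the sampler makes thousands of cheap emulator calls.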

A day of celebration at SFU

I had the opportunity to celebrate my graduation with some of the loveliest people in my life. My two supervisors, John Stockie and Nilima Nigam, hooded me at the convocation ceremony. I was also awarded the Governor General’s Gold Medal by President Andrew Petter. This is a huge honor, which I credit to my supervisors as well as the wonderful people at SFU who supported me over the past six years.

Preprint on Metropolis-Hastings for self-decomposable priors

A new preprint titled “A Metropolis-Hastings algorithm for posterior measures with self-decomposable priors” is now available on arXiv:

We introduce a new class of Metropolis-Hastings algorithms for sampling target measures that are absolutely continuous with respect to an underlying self-decomposable prior measure on infinite-dimensional Hilbert spaces. We particularly focus on measures that are highly non-Gaussian and cannot be sampled effectively using conventional algorithms. We utilize the self-decomposability of the prior to construct an autoregressive proposal kernel that preserves the prior measure and satisfies detailed balance. We then introduce an entirely new class of self-decomposable prior measures, called Bessel-K priors, as a generalization of the gamma density to infinite dimensions. The Bessel-K priors interpolate between well-known priors such as the gamma distribution and Besov priors and can model sparse or compressible parameters. We present example applications of our algorithm in inverse problems ranging from finite-dimensional denoising to deconvolution on L^2.
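A familiar special case gives the flavor of the prior-preserving autoregressive proposal: Gaussian measures are themselves self-decomposable, and for a Gaussian prior the proposal reduces to the classical preconditioned Crank–Nicolson (pCN) step u' = sqrt(1 − β²) u + β ξ, with ξ drawn from the prior. Since this step leaves the prior invariant, the accept/reject ratio involves only the likelihood. The denoising setup and all numbers below are illustrative assumptions, not an example from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10

# Toy denoising problem: observe y = u_true + noise, with a standard
# Gaussian prior N(0, I) on u. Gaussians are self-decomposable, and for
# them the prior-preserving autoregressive proposal is the pCN step.
u_true = np.sin(np.linspace(0.0, 2.0 * np.pi, n))
sigma = 0.3
y = u_true + sigma * rng.normal(size=n)

def log_like(u):
    return -0.5 * np.sum((y - u) ** 2) / sigma**2

beta = 0.2          # pCN step size
u = np.zeros(n)
ll = log_like(u)
samples = []
for i in range(20000):
    # Autoregressive proposal: leaves the prior N(0, I) invariant, so
    # the Metropolis-Hastings ratio depends only on the likelihood.
    up = np.sqrt(1.0 - beta**2) * u + beta * rng.normal(size=n)
    llp = log_like(up)
    if np.log(rng.random()) < llp - ll:
        u, ll = up, llp
    if i >= 5000:   # discard burn-in
        samples.append(u)

post_mean = np.mean(samples, axis=0)  # shrinks y toward the prior mean
```

The same structure carries over to the non-Gaussian, infinite-dimensional setting: the proposal is built so that only the likelihood enters the acceptance ratio, which is what makes the algorithm well defined independently of discretization dimension.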