Blog posts


Talk Slides

less than 1 minute read


First post of the year :-) Below are two talks I gave in a seminar on machine learning.


Fixed Points of SVGD

1 minute read


The following note is inspired by a discussion on the properties of linear kernel functions. Although linear kernels are not expected to perform well in SVGD, they can provide exact estimates of certain statistics, including the mean and variance. Three kinds of kernel functions (constant, linear, and polynomial) are explored here to see for which functions they yield exact estimates, and how well the particles approximate the target distribution.
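To make the claim concrete, here is a minimal sketch (not taken from the note itself; the target, kernel, and step size are illustrative assumptions) of SVGD with a linear kernel k(x, y) = xy + 1 on a 1D Gaussian target. At a fixed point of this update, the particles' sample mean and variance match the target exactly, even though the particles need not otherwise resemble the distribution:

```python
import numpy as np

# Illustrative 1D SVGD with a linear kernel, targeting N(mu, sigma^2).
rng = np.random.default_rng(0)
mu, sigma = 2.0, 1.5

def score(x):
    # grad log p(x) for the Gaussian target
    return -(x - mu) / sigma**2

x = rng.normal(0.0, 0.5, size=50)  # initial particles
eps = 0.05                          # step size (assumed, not tuned)
for _ in range(2000):
    s = score(x)                      # score at each particle
    K = np.outer(x, x) + 1.0          # K[j, i] = k(x_j, x_i) = x_j * x_i + 1
    grad_K = np.tile(x, (len(x), 1))  # d/dx_j k(x_j, x_i) = x_i
    # SVGD direction: phi(x_i) = mean_j [ k(x_j, x_i) s_j + grad_{x_j} k(x_j, x_i) ]
    phi = (K * s[:, None]).mean(axis=0) + grad_K.mean(axis=0)
    x = x + eps * phi

print(x.mean(), x.var())  # converges close to mu = 2.0 and sigma^2 = 2.25
```

Setting phi to zero shows why: it forces the sample mean of the score to vanish (matching the mean) and the sample covariance between particles and scores to equal -1 (matching the variance).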

Reading: Stein method in machine learning

4 minute read


Reading notes for the paper Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm (Liu & Wang, 2016), covering an introduction to Stein's identity, the Stein discrepancy, and finally Stein variational gradient descent.
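For quick reference, the two central formulas from Liu & Wang (2016) that the notes build on (standard statements, reproduced here rather than quoted from the notes) are Stein's identity and the SVGD update direction:

```latex
% Stein's identity: for a smooth test function f satisfying mild
% boundary conditions,
\mathbb{E}_{x \sim p}\!\left[ f(x)\,\nabla_x \log p(x)^{\top} + \nabla_x f(x) \right] = 0 .

% SVGD transports particles x_i \sim q along the direction
\phi^{*}(x) = \mathbb{E}_{x' \sim q}\!\left[ k(x', x)\,\nabla_{x'} \log p(x') + \nabla_{x'} k(x', x) \right],
\qquad x_i \leftarrow x_i + \epsilon\, \phi^{*}(x_i) .
```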