Short notes on five papers I read recently on applications of Stein's method, including wild variational inference, reinforcement learning, and sampling. Most of the papers are from the project Stein's method for practical machine learning, which I am quite interested in.
The following note is inspired by a discussion of the properties of linear kernel functions. Although they are not expected to perform well in SVGD, they can provide exact estimates of certain statistics, including the mean and variance. Three kinds of kernel functions (constant, linear, and polynomial) are explored here, to see which functions they can estimate exactly and how well the particles approximate the target distribution.
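As a quick illustration of how the kernel choice enters the SVGD update, here is a minimal NumPy sketch (not from the post itself; the function names `svgd_phi`, `constant_kernel`, etc. are my own) of the update direction phi(x_i) = (1/n) sum_j [k(x_j, x_i) * grad log p(x_j) + grad_{x_j} k(x_j, x_i)] with the constant, linear, and polynomial kernels plugged in:

```python
import numpy as np

# Hypothetical kernel definitions: k(x, y) maps two d-vectors to a scalar,
# and the matching *_grad gives the gradient with respect to the first argument.
def constant_kernel(x, y):
    return 1.0

def constant_kernel_grad(x, y):
    return np.zeros_like(x)          # d/dx 1 = 0

def linear_kernel(x, y):
    return x @ y + 1.0

def linear_kernel_grad(x, y):
    return y                         # d/dx (x.y + 1) = y

def polynomial_kernel(x, y, p=2):
    return (x @ y + 1.0) ** p

def polynomial_kernel_grad(x, y, p=2):
    return p * (x @ y + 1.0) ** (p - 1) * y

def svgd_phi(x, grad_logp, k, grad_k):
    """SVGD update direction for particles x of shape (n, d):
    phi(x_i) = (1/n) sum_j [k(x_j, x_i) score(x_j) + grad_{x_j} k(x_j, x_i)]."""
    n = len(x)
    scores = grad_logp(x)            # (n, d) array of score values
    phi = np.zeros_like(x)
    for i in range(n):
        for j in range(n):
            phi[i] += k(x[j], x[i]) * scores[j] + grad_k(x[j], x[i])
    return phi / n
```

Note that with the constant kernel the gradient term vanishes, so every particle receives the same displacement, namely the average score over all particles; repulsion between particles comes entirely from the kernel gradient term.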
Reading notes on the paper Stein variational gradient descent: A general purpose Bayesian inference algorithm (Liu & Wang, 2016), covering Stein's identity, the Stein discrepancy, and finally Stein variational gradient descent itself.