Posts by Tags

concentration inequality

gradient descent

Fixed Points of SVGD

1 minute read

Published:

The following note was inspired by a discussion of the properties of linear kernel functions. Although they are not expected to perform well in SVGD, they can provide exact estimates of certain quantities, including the mean and variance. Three kinds of kernel functions (constant, linear, and polynomial) are explored here to see which functions they estimate exactly and how well the particles approximate the target distribution.

reading

Talk Slides

less than 1 minute read

Published:

First post of the year :-) Following are two talks I gave in a seminar on machine learning.

Reading: Stein method in machine learning

4 minute read

Published:

Reading notes for the paper Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm (Liu & Wang, 2016), covering Stein's identity, the Stein discrepancy, and finally Stein variational gradient descent.
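To make the algorithm concrete, below is a minimal sketch of the SVGD particle update from Liu & Wang (2016). The RBF kernel, fixed bandwidth, step size, particle count, and standard-normal target are all illustrative assumptions, not choices from the notes:

```python
import numpy as np

def rbf_kernel(X, h=1.0):
    # Pairwise RBF kernel for 1-D particles: K[j, i] = k(x_j, x_i).
    diff = X[:, None] - X[None, :]           # diff[j, i] = x_j - x_i
    K = np.exp(-diff**2 / (2 * h**2))
    gradK = -diff / h**2 * K                 # d k(x_j, x_i) / d x_j
    return K, gradK

def svgd_step(X, grad_logp, eps=0.1, h=1.0):
    # One SVGD update: phi(x_i) = (1/n) sum_j [k(x_j,x_i) grad log p(x_j)
    #                                           + grad_{x_j} k(x_j,x_i)]
    n = len(X)
    K, gradK = rbf_kernel(X, h)
    phi = (K @ grad_logp(X) + gradK.sum(axis=0)) / n
    return X + eps * phi

# Illustrative target: standard normal, so grad log p(x) = -x.
rng = np.random.default_rng(0)
X = rng.uniform(-5, 5, size=50)              # 50 particles, arbitrary init
for _ in range(500):
    X = svgd_step(X, lambda x: -x, eps=0.1)
```

After the iterations the particles should roughly match the target's mean and spread; the first term of `phi` drives particles toward high-density regions while the kernel-gradient term keeps them from collapsing onto the mode.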

reinforcement learning

statistical machine learning

Talk Slides

less than 1 minute read

Published:

First post of the year :-) Following are two talks I gave in a seminar on machine learning.

Fixed Points of SVGD

1 minute read

Published:

The following note was inspired by a discussion of the properties of linear kernel functions. Although they are not expected to perform well in SVGD, they can provide exact estimates of certain quantities, including the mean and variance. Three kinds of kernel functions (constant, linear, and polynomial) are explored here to see which functions they estimate exactly and how well the particles approximate the target distribution.

Reading: Stein method in machine learning

4 minute read

Published:

Reading notes for the paper Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm (Liu & Wang, 2016), covering Stein's identity, the Stein discrepancy, and finally Stein variational gradient descent.

variational inference

Fixed Points of SVGD

1 minute read

Published:

The following note was inspired by a discussion of the properties of linear kernel functions. Although they are not expected to perform well in SVGD, they can provide exact estimates of certain quantities, including the mean and variance. Three kinds of kernel functions (constant, linear, and polynomial) are explored here to see which functions they estimate exactly and how well the particles approximate the target distribution.