Stein Variational Gradient Descent
A central challenge in modern Bayesian inference is handling intractable posteriors. In many models, especially deep neural networks, the posterior is defined over billions of parameters linked by highly nonlinear, complex relationships. Traditional algorithms are often ill-suited to this setting: Markov chain Monte Carlo (MCMC) lacks scalability, while variational inference (VI) lacks expressive power.
Stein variational gradient descent (SVGD), proposed by Liu & Wang (2016), is an inference algorithm for Bayesian models that departs from both MCMC and VI. Instead of drawing stochastic samples or fitting a fixed parametric family, it maintains a set of particles and transports them toward the posterior with deterministic, gradient-like updates, aiming to combine the flexibility of sampling with the efficiency of optimization.
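To make the particle-based picture concrete, below is a minimal NumPy sketch of the SVGD update from Liu & Wang (2016) with an RBF kernel. Each particle moves along a direction with two parts: a kernel-weighted average of the score (which pulls particles toward high-density regions) and a kernel-gradient term (which pushes particles apart so they do not collapse). The fixed bandwidth `h`, the step size, and the toy Gaussian target are illustrative choices, not a reference implementation; the paper itself sets the bandwidth with a median heuristic.

```python
import numpy as np

def rbf_kernel(X, h=1.0):
    """RBF kernel matrix and the kernel-gradient term needed by SVGD."""
    diff = X[:, None, :] - X[None, :, :]          # diff[j, i] = x_j - x_i
    K = np.exp(-np.sum(diff ** 2, axis=-1) / h)   # K[j, i] = k(x_j, x_i)
    # grad_K[i] = sum_j grad_{x_j} k(x_j, x_i) = sum_j -(2/h)(x_j - x_i) k(x_j, x_i)
    grad_K = -(2.0 / h) * np.sum(K[:, :, None] * diff, axis=0)
    return K, grad_K

def svgd_step(X, grad_log_p, step=0.1, h=1.0):
    """One SVGD update moving all particles along the kernelized Stein direction."""
    n = X.shape[0]
    K, grad_K = rbf_kernel(X, h)
    # phi[i] = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    phi = (K.T @ grad_log_p(X) + grad_K) / n
    return X + step * phi

# Illustrative target: a standard 2-D Gaussian, so grad log p(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, scale=1.0, size=(100, 2))  # particles start far from the target
for _ in range(1000):
    X = svgd_step(X, lambda x: -x)
print(X.mean(axis=0), X.std(axis=0))  # should approach [0, 0] and [1, 1]
```

Note that the update needs only the gradient of the log-density, not the normalizing constant, which is what makes the method applicable to unnormalized posteriors.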