Learning in integer latent variable models with nested automatic differentiation

Published in the Proceedings of the 35th International Conference on Machine Learning (ICML), 2018

We develop nested automatic differentiation (AD) algorithms for exact inference and learning in integer latent variable models. Recently, Winner, Sujono, and Sheldon showed how to reduce marginalization in a class of integer latent variable models to evaluating a probability generating function which contains many levels of nested high-order derivatives. We contribute faster and more stable AD algorithms for this challenging problem and a novel algorithm to compute exact gradients for learning. These contributions lead to significantly faster and more accurate learning algorithms, and are the first AD algorithms whose running time is polynomial in the number of levels of nesting.
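To make the underlying idea concrete: for a nonnegative integer random variable with probability generating function F(s), the probability p(k) is the k-th Taylor coefficient of F at s = 0, i.e. F⁽ᵏ⁾(0)/k!. The sketch below is not the paper's nested AD algorithm, only a minimal single-level illustration of this derivative-extraction idea in Python; the helper name `series_exp` and the truncation scheme are assumptions for illustration.

```python
import numpy as np

def series_exp(a):
    """Given truncated Taylor coefficients a of A(s) at s = 0,
    return coefficients of B(s) = exp(A(s)) to the same order.
    Uses the standard recurrence from B' = A' * B:
        k * b_k = sum_{j=1..k} j * a_j * b_{k-j}.
    (Hypothetical helper, not from the paper.)"""
    n = len(a)
    b = np.zeros(n)
    b[0] = np.exp(a[0])
    for k in range(1, n):
        b[k] = sum(j * a[j] * b[k - j] for j in range(1, k + 1)) / k
    return b

# Poisson(lam) has PGF F(s) = exp(lam * (s - 1)).
lam, order = 3.0, 10
a = np.zeros(order + 1)
a[0], a[1] = -lam, lam        # Taylor coefficients of lam*(s-1) at s = 0
pmf = series_exp(a)           # k-th coefficient of F at 0 equals p(k)

# Sanity check: matches the Poisson pmf e^{-lam} lam^k / k!.
ks = np.arange(order + 1)
print(np.allclose(pmf, np.exp(-lam) * lam**ks / np.vectorize(np.math.factorial)(ks)))
```

The paper's contribution is to carry this kind of derivative propagation through many levels of nesting, with running time polynomial in the nesting depth and with better numerical stability; the sketch above only shows the flat, single-level case.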

[Download paper here](https://people.cs.umass.edu/~sheldon/papers/pgf-backprop.pdf)

Recommended citation: Daniel Sheldon, Kevin Winner, and Debora Sujono. Learning in integer latent variable models with nested automatic differentiation. In Proceedings of the 35th International Conference on Machine Learning, ICML 2018, Stockholm, Sweden, July 10–15, 2018, pages 4622–4630. https://people.cs.umass.edu/~sheldon/papers/pgf-backprop.pdf