NeurIPS 2020

Latent Template Induction with Gumbel-CRFs


Meta Review

This paper considers text generation in a VAE framework where the latent variables of a CRF serve as templates for generation. The paper uses Gumbel-Softmax as the gradient estimator for the posterior distribution. As a reparameterized gradient estimator, the Gumbel-CRF gives more stable gradients than alternatives such as REINFORCE and PM-MRF. The proposed method is tested on a variety of text modelling tasks. Reviewers agree this is an important and difficult problem; the paper offers a reasonable solution, and the experiments demonstrate its effectiveness. Although R3 and R4 scored the paper slightly below the acceptance threshold, their objections do not seem strong, and the authors' response answered most of the questions. I would like to see the paper accepted.
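
For context on the core technique the review refers to, the Gumbel-Softmax trick replaces non-differentiable sampling from a categorical distribution with a differentiable "soft" sample, which is what makes reparameterized gradients (rather than REINFORCE-style score-function estimates) possible. The sketch below illustrates the basic trick only, not the paper's full Gumbel-CRF construction; the function name and NumPy implementation are illustrative choices, not from the paper.

```python
import numpy as np

def gumbel_softmax_sample(logits, tau=1.0, rng=None):
    """Draw a relaxed (soft) one-hot sample via the Gumbel-Softmax trick.

    tau is the temperature: small tau -> near one-hot samples,
    large tau -> near-uniform samples.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Gumbel(0, 1) noise: -log(-log(U)) with U ~ Uniform(0, 1)
    u = rng.uniform(low=1e-10, high=1.0, size=np.shape(logits))
    g = -np.log(-np.log(u))
    # perturb logits with Gumbel noise, scale by temperature
    y = (np.asarray(logits, dtype=float) + g) / tau
    # softmax over the last axis yields a point on the probability simplex
    y = np.exp(y - np.max(y, axis=-1, keepdims=True))
    return y / np.sum(y, axis=-1, keepdims=True)

# Example: a soft sample over 3 categories
sample = gumbel_softmax_sample([2.0, 0.5, -1.0], tau=0.5)
```

Because the noise enters through a fixed transformation of uniform randomness, gradients can flow through `logits`, which is the source of the stability the review contrasts with REINFORCE.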