NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
Paper ID: 5275
Title: Wide Feedforward or Recurrent Neural Networks of Any Architecture are Gaussian Processes

The paper presents a method for collapsing a wide range of operations (convolution, pooling, batch normalization, attention, gating, as well as the inner products used in the actual GP kernel computation) into a matrix multiplication / nonlinearity / linear combination framework, together with a mean-field theory of tied weights, which allows a rigorous extension to RNNs as well as a rigorous treatment of the forward and backward passes together. The results are novel and interesting. This paper had strong overlap with another paper (clearly identified by the authors in both submissions), and so the discussion of the two papers took place together.
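For context on the GP kernel computation the review refers to, below is a minimal illustrative sketch (not the paper's general tensor-program construction, which handles arbitrary architectures) of the standard NNGP kernel recursion for a plain fully connected ReLU network, checked against a finite but wide random network. All function names, hyperparameters, and inputs here are illustrative assumptions.

```python
# Sketch: limiting GP kernel of a deep ReLU MLP vs. a Monte Carlo estimate
# from wide random networks. Not the paper's method; an assumed minimal example.
import numpy as np

def relu_expectation(kxx, kxy, kyy):
    """E[relu(u) relu(v)] for (u, v) ~ N(0, [[kxx, kxy], [kxy, kyy]]) (arc-cosine formula)."""
    c = np.clip(kxy / np.sqrt(kxx * kyy), -1.0, 1.0)
    theta = np.arccos(c)
    return np.sqrt(kxx * kyy) * (np.sin(theta) + (np.pi - theta) * c) / (2.0 * np.pi)

def nngp_kernel(x, y, depth=3, sigma_w2=2.0, sigma_b2=0.0):
    """Limiting kernel K^L(x, y) of a depth-L ReLU MLP as width -> infinity."""
    d = len(x)
    kxx = sigma_w2 * np.dot(x, x) / d + sigma_b2
    kyy = sigma_w2 * np.dot(y, y) / d + sigma_b2
    kxy = sigma_w2 * np.dot(x, y) / d + sigma_b2
    for _ in range(depth - 1):
        kxx, kyy, kxy = (
            sigma_w2 * relu_expectation(kxx, kxx, kxx) + sigma_b2,
            sigma_w2 * relu_expectation(kyy, kyy, kyy) + sigma_b2,
            sigma_w2 * relu_expectation(kxx, kxy, kyy) + sigma_b2,
        )
    return kxy

def empirical_kernel(x, y, depth=3, width=2048, n_nets=20, sigma_w2=2.0, seed=0):
    """Monte Carlo check: the same quantity estimated from finite wide random ReLU nets."""
    rng = np.random.default_rng(seed)
    estimates = []
    for _ in range(n_nets):
        hx, hy, fan_in = x, y, len(x)
        for _ in range(depth - 1):
            W = rng.normal(0.0, np.sqrt(sigma_w2 / fan_in), size=(width, fan_in))
            hx, hy = np.maximum(W @ hx, 0.0), np.maximum(W @ hy, 0.0)
            fan_in = width
        # Output-layer pre-activation covariance, averaged over readout weights analytically.
        estimates.append(sigma_w2 * np.mean(hx * hy))
    return float(np.mean(estimates))

if __name__ == "__main__":
    x = np.array([1.0, -0.5, 0.3])
    y = np.array([0.2, 0.8, -1.0])
    print("limiting kernel  :", nngp_kernel(x, y))
    print("wide-net estimate:", empirical_kernel(x, y))
```

The two printed numbers should agree closely; the paper's contribution is to make this kind of limit rigorous for essentially any architecture built from matrix multiplications, nonlinearities, and linear combinations, including with tied weights.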