NeurIPS 2019
Sun, Dec 8 – Sat, Dec 14, 2019, Vancouver Convention Center
Paper ID: 9358
Title: Self-attention with Functional Time Representation Learning

The main contribution of the work is a novel approach to embedding continuous time into a differentiable functional domain in a way that is compatible with modern self-attention models. All reviewers agree that the derivation of these embeddings from functional analysis results is quite interesting and contrasts with the more intuitive derivations presented in the literature. In their response, the authors included improved experimental results obtained by tuning a hyperparameter that was previously set to a fixed value (the degree of the Fourier basis under each frequency in the Mercer embedding). Reviewers 1 and 3 still point out that the experimental section could be stronger. While the AC agrees that more experiments would make the paper stronger, the paper contains several interesting ideas and shows enough empirical evidence that the method has practical relevance. The AC encourages the authors to follow the recommendation of reviewer 2 and include the missing references, as well as some insight into why three of the proposed methods do not perform well (along the lines of what was included in the response).
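For context on the hyperparameter discussed above, the following is a hypothetical sketch (not the authors' implementation) of a Mercer-style time embedding: each timestamp is expanded in a truncated Fourier basis of a given degree under each frequency, with coefficients weighting the basis functions. The function name, argument names, and the use of fixed rather than learned coefficients are all illustrative assumptions.

```python
import numpy as np

def mercer_time_embedding(t, omegas, degree, coeffs):
    """Embed timestamps `t` (shape [n]) using a truncated Fourier basis
    of the given `degree` under each frequency in `omegas`.

    `coeffs` has shape [len(omegas), 2 * degree] and stands in for the
    learnable Mercer coefficients (fixed here for illustration).
    Returns an array of shape [n, len(omegas) * 2 * degree].
    """
    t = np.asarray(t, dtype=float)[:, None, None]        # [n, 1, 1]
    j = np.arange(1, degree + 1)[None, None, :]          # [1, 1, d]
    w = np.asarray(omegas, dtype=float)[None, :, None]   # [1, K, 1]
    phase = j * w * t                                    # [n, K, d]
    # Degree-d Fourier basis per frequency: cos and sin components.
    basis = np.concatenate([np.cos(phase), np.sin(phase)], axis=-1)  # [n, K, 2d]
    feats = basis * np.asarray(coeffs)[None, :, :]       # weight each basis function
    return feats.reshape(t.shape[0], -1)

# Usage: 3 frequencies, degree-2 Fourier basis under each frequency.
rng = np.random.default_rng(0)
omegas = [0.1, 1.0, 10.0]
coeffs = rng.standard_normal((3, 4))
emb = mercer_time_embedding([0.0, 0.5, 2.0], omegas, degree=2, coeffs=coeffs)
print(emb.shape)  # (3, 12)
```

Tuning `degree` (instead of fixing it) enlarges the function class each frequency can represent, which is consistent with the improved results the authors reported in their response.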