NeurIPS 2020

Meta-Learning through Hebbian Plasticity in Random Networks


Meta Review

The paper proposes using evolution to find the parameters of a parametrized Hebbian plasticity learning rule, so that the network adapts from random weights at the start of each episode rather than relying on learned, fixed weights. The authors experiment with car racing and simulated robot tasks. The paper is well written, and the idea is interesting, with motivations from neuroscience. Reviewers generally found the work encouraging and suggested improvements and directions for future work, such as examining its generalization abilities.

R3 also made a good point that while the work is motivated by neuroscience, it is “difficult to relate the Hebbian plasticity rules that are considered in this paper to rules for synaptic plasticity in the brain that have been found in neuroscience. Synaptic plasticity in the brain appears to rely often on a multitude of gating signals, and on the relative timing of pre- and postsynaptic activity. Also the recent history of a synapse appears to play a role. Hence I find it difficult to convince myself that this paper provide[s] new insight into synaptic plasticity or the organization of learning in the brain.” A study of why the proposed method works will be required to understand whether it relies overly on the particular optimization method used, and such a study may shed new light on the role of synaptic plasticity in meta-learning.
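For readers unfamiliar with the setup, below is a minimal sketch of the kind of parametrized Hebbian update the summary above describes. It assumes a commonly used form in which each synapse has its own coefficients A, B, C, D and learning rate eta; these names and the exact parametrization are illustrative assumptions rather than the authors' code, and the evolutionary outer loop that searches over the coefficients is omitted.

    import numpy as np

    def hebbian_update(w, pre, post, A, B, C, D, eta):
        # One step of an illustrative parametrized Hebbian rule (assumed form).
        # w:    (n_pre, n_post) weight matrix, initialized randomly and never
        #       trained directly; it changes only through this rule.
        # pre:  (n_pre,) presynaptic activations
        # post: (n_post,) postsynaptic activations
        # A, B, C, D, eta: per-synapse coefficient arrays, same shape as w;
        #       these are the parameters an evolutionary search would optimize.
        dw = eta * (A * np.outer(pre, post)   # correlation (Hebbian) term
                    + B * pre[:, None]        # presynaptic-only term
                    + C * post[None, :]       # postsynaptic-only term
                    + D)                      # constant drift term
        return w + dw

Under this reading, the weights start random in each episode and are updated by the rule at every timestep; only the coefficients (A, B, C, D, eta) persist across episodes and are what evolution optimizes.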