Synaptic plasticity mechanisms are believed to govern the development of cortical sensory networks, with connections continuously shaped by natural inputs. While excitatory connections define the drive of each neuron, developing selectivity to specific input patterns, inhibitory connections enforce decorrelation of activity, preventing neurons from representing redundant information. Previous developmental modeling efforts have addressed how some of these observations may arise, but many pieces of the puzzle remain elusive. Notably, the simultaneous development of excitatory and inhibitory connections may lead to instability, frustrating the development of receptive fields in spiking balanced networks. We present here a theory for the properties of LTP, LTD and homeostasis in synaptic plasticity that leads to a robust functional model of visual development.
We consider higher-order feature learning as a fundamental principle for the plasticity of excitatory connections, defined as the optimization of higher moments of a neuron's activity while remaining invariant to its second-order statistics. This simple principle is shown to lead to a plasticity rule with a clear functional interpretation. Specifically, the LTP term has a quadratic dependency on the post-synaptic activity, thus performing nonlinear Hebbian learning, which has been proposed as a universal feature learning mechanism [1]. A linear LTD mechanism is shown to implement invariance to second-order statistics, thus guaranteeing sensitivity to higher-order features under very general conditions. A weight-dependent homeostatic mechanism provides a robust balance between LTP and LTD, seamlessly implementing a constraint on total synaptic strength and eliminating any possible instability, while maintaining all properties predicted by the feature learning theory. The resulting learning rule is readily implemented in spiking networks by a weight-dependent version of the triplet spike-timing dependent plasticity rule [2]. While functionally motivated, our learning rule is remarkably consistent with known phenomenological properties of cortical plasticity.
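The rule described above can be sketched in a few lines of rate-based code. This is a minimal illustration, not the exact rule of the abstract: the linear neuron model, the coefficients, and the precise form of the homeostatic term are all assumptions made for the sketch.

```python
import numpy as np

def plasticity_update(w, x, eta=1e-3, alpha=1.0):
    """One hypothetical update of the rate-based rule sketched in the text.

    LTP: quadratic in postsynaptic activity y (nonlinear Hebbian learning).
    LTD: linear in y (implements invariance to second-order statistics).
    Homeostasis: weight-dependent term bounding total synaptic strength.
    Coefficients and the homeostatic form are illustrative assumptions.
    """
    y = w @ x                    # postsynaptic rate (linear unit for simplicity)
    ltp = x * y**2               # quadratic Hebbian potentiation
    ltd = x * y                  # linear depression
    homeo = alpha * w * y**2     # weight-dependent homeostatic decay
    return w + eta * (ltp - ltd - homeo)
```

In this sketch the weight-dependent homeostatic term plays the stabilizing role described in the text: it grows with both the weight and the postsynaptic activity, counteracting runaway potentiation without a hard normalization step.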
We demonstrate how our plasticity model is invariant to any linear transformation of the neuron's input, enabling robustness to heterogeneities such as dendritic EPSP attenuation, varying pre-synaptic firing rates and, importantly, non-whitened sensory inputs. By implementing a network of spiking neurons, we show that feed-forward connections develop simple cell-like receptive fields for natural image inputs, while recurrent inhibition diversifies the receptive fields of different neurons, generating a detailed balance between inhibition and excitation. Additionally, recurrent excitatory connections strengthen between neurons with overlapping receptive fields, reproducing known characteristics of the developing visual cortex.
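At the level of a single synapse in such a spiking network, the update can be sketched with the trace dynamics of the triplet STDP model of Pfister & Gerstner (2006), using their published visual-cortex parameter set. The multiplicative weight dependence of depression (the `w ** mu` factor) is an illustrative assumption, not the exact form used in the abstract.

```python
import math

def triplet_stdp_step(w, pre, post, traces, dt=1e-3,
                      tau_plus=16.8e-3, tau_x=101e-3,
                      tau_minus=33.7e-3, tau_y=125e-3,
                      a2p=5e-10, a3p=6.2e-3,
                      a2m=7e-3, a3m=2.3e-4, mu=1.0):
    """One Euler step of a weight-dependent triplet STDP synapse.

    Trace dynamics and parameters follow Pfister & Gerstner (2006);
    the weight dependence of depression (w**mu) is an assumption.
    """
    r1, r2, o1, o2 = traces
    # exponential decay of the pre- (r1, r2) and postsynaptic (o1, o2) traces
    r1 *= math.exp(-dt / tau_plus)
    r2 *= math.exp(-dt / tau_x)
    o1 *= math.exp(-dt / tau_minus)
    o2 *= math.exp(-dt / tau_y)
    if pre:   # presynaptic spike: depression, gated by postsynaptic trace o1
        w -= (w ** mu) * o1 * (a2m + a3m * r2)
    if post:  # postsynaptic spike: potentiation, gated by presynaptic trace r1
        w += r1 * (a2p + a3p * o2)
    # spikes increment their own traces only after the weight update
    if pre:
        r1 += 1.0
        r2 += 1.0
    if post:
        o1 += 1.0
        o2 += 1.0
    return w, (r1, r2, o1, o2)
```

Repeated pre-before-post pairings potentiate the synapse while post-before-pre pairings depress it, and the weight factor on the depression term makes LTD grow with synaptic strength, providing the soft bound on weights described in the text.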
Our results give a precise functional interpretation of cortical plasticity, which appears optimally designed for robust feature learning, and elucidate how neurons may self-organize into stable balanced sensory networks.
[1] C.S.N. Brito and W. Gerstner, Cosyne Abstracts (2014).
[2] J.P. Pfister and W. Gerstner, J. Neurosci. 26:9673-9682 (2006).