
Walter Senn
Institute of Physiology, University of Bern, Switzerland
Wednesday 13 December 2006, 16:00
Seminar Room B10 (Basement)
Alexandra House, 17 Queen Square, London, WC1N 3AR
Stochastic learning with discrete-valued synapses
Accumulating experimental evidence shows that biological synapses may encode information through digital, or even binary-valued, strengths, in analogy to information storage in electronic devices. While digital encoding makes any storage device robust against thermal noise, the same feature also makes the design of learning algorithms difficult. In fact, the learning capacity of the classical perceptron shrinks dramatically when its analog-valued synapses are replaced by binary ones, unless additional mechanisms are introduced. However, when stochastic LTP/LTD transitions and global inhibition are introduced, the learning capacity is essentially restored. I will review these results and show that the theory also applies to a spike-driven learning rule with binary synapses which achieves top performance on a benchmark classification problem.

I will also present a generalization to reward-maximizing reinforcement learning with binary synapses and stochastic Hebbian transitions, which applies to any network topology. It turns out that the noise introduced by the stochastic LTP/LTD transitions speeds up learning compared with other reinforcement learning rules. The benefit of stochastic LTP/LTD becomes particularly pronounced in complex networks. For instance, a multilayer network endowed with stochastic Hebbian transitions can reproduce the learning speed of monkeys during an association task. In contrast, other reinforcement algorithms do not scale well with the number of hidden layers and therefore cannot reproduce these data, even when the synaptic strengths are unbounded.
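To make the flavour of such a rule concrete, here is a minimal sketch, in Python, of a perceptron with binary synapses trained with stochastic LTP/LTD transitions. The specific update probabilities, the threshold used as a stand-in for global inhibition, and all function and variable names are illustrative assumptions, not the speaker's exact rule:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_stochastic_binary_perceptron(X, y, q=0.1, epochs=50, theta=None):
    """Sketch of stochastic learning with binary synapses (assumed details).

    Synapses take values in {0, 1}. On a misclassified pattern, each
    synapse with an active input is a candidate for LTP (desired output 1)
    or LTD (desired output 0), but actually switches state only with a
    small probability q -- the stochastic transitions that help restore
    the learning capacity lost by discretizing the weights.
    """
    n = X.shape[1]
    w = rng.integers(0, 2, size=n)           # binary synaptic strengths
    if theta is None:
        # Crude stand-in for global inhibition: a firing threshold set
        # relative to the mean input activity.
        theta = 0.5 * X.sum(axis=1).mean()
    for _ in range(epochs):
        errors = 0
        for x, target in zip(X, y):
            out = 1 if w @ x >= theta else 0
            if out == target:
                continue                     # no plasticity on correct trials
            errors += 1
            flip = rng.random(n) < q         # stochastic transition mask
            if target == 1:                  # LTP: active synapses may go 0 -> 1
                w = np.where((x == 1) & flip, 1, w)
            else:                            # LTD: active synapses may go 1 -> 0
                w = np.where((x == 1) & flip, 0, w)
        if errors == 0:
            break
    return w, theta
```

In this sketch the transition probability q trades off learning speed against the stability of previously stored associations: deterministic updates (q = 1) overwrite memories aggressively, while small q changes only a few synapses per error.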
In collaboration with S. Fusi, E. Vasilaki and B. Vladimirski