GATSBY COMPUTATIONAL NEUROSCIENCE UNIT

Kenneth D Harris

Rutgers, USA

 

Friday 11 April 2008

14.00

 

Seminar Room B10 (Basement)

Alexandra House, 17 Queen Square, London, WC1N 3AR

 

 

Axonal backpropagation in real neuronal networks

 

Classically, neurons communicate by anterograde conduction of action potentials. However, information can also pass backward along axons, a process that is well characterized during the development of the nervous system. Recent experiments have shown that changes to a neuron's output synapses may "backpropagate" along the axon to cause changes in the same neuron's inputs. Here we suggest a computational role for such "retroaxonal" signals in adult learning. Although a neural implementation of the original error backpropagation algorithm would require retroaxonal signals to travel faster than is physiologically possible, the proposed scheme requires only realistic speeds of retroaxonal communication. We hypothesize that strengthening of a neuron's output synapses stabilizes recent changes in the same neuron's inputs. During learning, the input synapses of many neurons undergo transient changes, resulting in altered spiking activity. If this in turn promotes strengthening of output synapses, the recent synaptic changes will be stabilized; otherwise they will decay. A representation of sensory stimuli therefore evolves that is tailored to the demands of behavioral tasks. We describe experimental evidence in support of this hypothesis, and outline a candidate molecular mechanism involving the activation of CREB by retrograde neurotrophin signals.
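The stabilize-or-decay rule described above can be caricatured in a few lines of code. This is only an illustrative sketch of the qualitative logic (the function name, the consolidation step, and the decay factor are all assumptions for illustration, not the speaker's actual model): transient input-synapse changes are folded into the weights when the neuron's output synapses strengthen, and otherwise fade away.

```python
def retroaxonal_step(w_in, delta, output_strengthened, decay=0.5):
    """One hypothetical consolidation step (illustrative only).

    w_in  -- current input-synapse weights
    delta -- recent transient changes to those weights
    output_strengthened -- did the neuron's output synapses strengthen?
    """
    if output_strengthened:
        # Retroaxonal signal: stabilize the recent input changes
        # by folding them permanently into the weights.
        w_in = [w + d for w, d in zip(w_in, delta)]
        delta = [0.0] * len(delta)
    else:
        # No retroaxonal signal: the transient changes decay.
        delta = [d * decay for d in delta]
    return w_in, delta

# A neuron whose altered activity proved useful: changes are kept.
w, d = retroaxonal_step([1.0, 0.5], [0.2, -0.1], output_strengthened=True)

# A neuron whose changes were not reinforced: they fade instead.
w2, d2 = retroaxonal_step([1.0, 0.5], [0.2, -0.1], output_strengthened=False)
```

Iterating such steps across a population, representations whose transient changes repeatedly earn output strengthening would accumulate, matching the abstract's claim that a task-tailored sensory representation evolves.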