Computational and Biological Learning Lab, Department of Engineering, University of Cambridge
Wednesday 19 May 2010
Seminar Room B10 (Basement)
Alexandra House, 17 Queen Square, London, WC1N 3AR
Single cell computations: smarter than you think, dumber than you hope
The range of operations that individual neurons can perform on their synaptic inputs bears heavily on our understanding of how computations are performed in neural circuits. The canonical picture of neuronal operations is that they can be well approximated by some non-linear transformation of presynaptic firing rates. Two key assumptions underlying this approximation are that (1) spatial or temporal averaging of inputs is necessary for input spikes to be treated effectively as firing rates, and (2) individual neurons only integrate their inputs on a time scale that is set by their membrane and synaptic time constants. I will present recent work challenging these assumptions.
We have shown that short-term synaptic plasticity may operate as an adaptive filter so that the local postsynaptic potential at individual synapses can closely approximate the dynamically changing membrane potential of the presynaptic neuron. This theory predicts a match between synaptic dynamics and the natural statistics of presynaptic membrane potential fluctuations that can be directly tested in experiments.
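The filtering role of short-term plasticity described above can be illustrated with the standard Tsodyks-Markram model of a dynamic synapse, in which each spike's efficacy depends on a depleting resource variable and a facilitating utilisation variable. This is a generic sketch of that model, not the adaptive-filter theory of the talk itself; the parameter values (U, tau_rec, tau_facil) are illustrative assumptions.

```python
import numpy as np

def tm_synapse(spike_times, U=0.5, tau_rec=0.8, tau_facil=0.05):
    """Per-spike synaptic efficacies under Tsodyks-Markram short-term plasticity.

    U         -- baseline utilisation of synaptic resources
    tau_rec   -- recovery time constant of depleted resources (s)
    tau_facil -- decay time constant of facilitation (s)
    (illustrative values, not fitted to any particular synapse)
    """
    u, x = U, 1.0          # utilisation and available resources
    t_prev = None
    efficacies = []
    for t in spike_times:
        if t_prev is not None:
            dt = t - t_prev
            # resources recover toward 1, utilisation decays back toward U
            x = 1.0 - (1.0 - x) * np.exp(-dt / tau_rec)
            u = U + (u - U) * np.exp(-dt / tau_facil)
        efficacies.append(u * x)   # amplitude of this spike's PSP (arbitrary units)
        x -= u * x                 # resource depletion by the release
        u += U * (1.0 - u)         # facilitation for subsequent spikes
        t_prev = t
    return np.array(efficacies)

# A sustained high-rate train depresses strongly; a low-rate train barely does,
# so the synapse transmits changes in presynaptic activity rather than its mean.
eff_fast = tm_synapse(np.arange(0, 1, 0.02))   # 50 Hz
eff_slow = tm_synapse(np.arange(0, 10, 1.0))   # 1 Hz
```

With these (depressing) parameters the steady-state efficacy at 50 Hz falls far below both the first-spike response and the 1 Hz steady state, which is the basic sense in which synaptic dynamics can be matched to the statistics of presynaptic activity.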
We have also shown that membrane potential oscillations in the dendritic tree of neurons can integrate inputs on time scales that are orders of magnitude longer than those set by basic biophysical time constants. Analysing how synaptic inputs are integrated in this dynamical regime, and in particular finding a trade-off between dendritic democracy and independence, yielded interesting (and somewhat disappointing) implications for current theories of entorhinal grid cell firing.
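How oscillations can stretch integration far beyond biophysical time constants can be seen in a toy calculation in the spirit of oscillatory-interference models of grid cells: two theta-band oscillators with slightly different frequencies produce a beat envelope whose period is set by their frequency *difference*, not by the membrane time constant or the oscillation period. The specific frequencies and time constant below are illustrative assumptions, not figures from the talk.

```python
import numpy as np

tau_m = 0.02                 # assumed membrane time constant: 20 ms
f_soma, f_dend = 8.0, 8.5    # Hz: somatic and (input-shifted) dendritic oscillators

t = np.arange(0.0, 4.0, 1e-4)
# superposition of the two oscillations; its envelope beats at |f_dend - f_soma|
v = np.cos(2 * np.pi * f_soma * t) + np.cos(2 * np.pi * f_dend * t)

beat_period = 1.0 / abs(f_dend - f_soma)   # 2 s
ratio = beat_period / tau_m                # two orders of magnitude above tau_m
```

The interference pattern thus carries information about inputs over seconds, even though each underlying oscillation cycles in ~125 ms and the membrane itself forgets within tens of milliseconds, illustrating the kind of slow integration relevant to grid cell theories.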