Spikes, synapses, learning and memory in simple networks
Surya Ganguli
UCSF
Multiple time scales underlie neural computation, coding and dynamics.
The interactions between biophysical dynamics occurring on these
multiple time scales present a significant challenge to understanding
their collective contribution to neural computation and coding. For
example, how might fast spiking dynamics, intermediate-range
oscillations, and slower changes in synaptic efficacies collectively
give rise to even longer-lasting effects, such as long-term
memory traces? We will explore this type of question in simple model
neural networks. We will first show how to explicitly compute the
statistical properties of multineuronal spike trains directly in terms
of such networks' synaptic efficacies and latencies. We then apply
this result to generate and analyze an effective dynamics on the space
of synaptic parameters that captures the slow evolution of synaptic
patterns in response to spiking inputs. Such an effective dynamics
can aid in bridging the time scales between fast spiking dynamics and
slower changes in long-term memory traces due to changes in synaptic
patterns. Finally, we comment on how simple models of this form might
also aid in solving an inverse problem: given multineuronal
spiking statistics, how can one infer underlying synaptic efficacies
and latencies?
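As a toy illustration of the inverse problem described above (this sketch is not from the talk itself; the model, parameters, and variable names are hypothetical), consider a two-neuron discrete-time spiking model in which a presynaptic neuron raises the postsynaptic firing probability by an efficacy w after a latency of d time steps. The empirical cross-correlogram of the two spike trains then peaks at lag d with height roughly w, so both synaptic parameters can be read off from the spiking statistics:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters, chosen for illustration only:
T = 200_000   # number of time steps
p_a = 0.02    # presynaptic baseline firing probability per step
p_b = 0.02    # postsynaptic baseline firing probability per step
w = 0.3       # synaptic efficacy: added firing probability
d = 5         # synaptic latency, in time steps

# Simulate: neuron A fires at random; neuron B's firing probability
# rises by w whenever A fired exactly d steps earlier.
a = rng.random(T) < p_a
drive = np.zeros(T)
drive[d:] = w * a[:-d]
b = rng.random(T) < (p_b + drive)

# Empirical cross-correlogram:
#   P(B fires at t + lag | A fired at t) - P(B fires)
lags = np.arange(0, 11)
xcorr = np.array([b[lag:][a[:T - lag]].mean() - b.mean() for lag in lags])

# The correlogram peaks at the synaptic latency, with a height
# close to the synaptic efficacy.
est_latency = int(lags[np.argmax(xcorr)])
est_efficacy = float(xcorr.max())
print(est_latency, est_efficacy)
```

In this simple setting the peak lag recovers d and the peak height approximates w; the abstract's point is that for networks of many neurons with recurrent connections, relating the full multineuronal spiking statistics to the matrix of efficacies and latencies requires an explicit calculation rather than pairwise correlograms alone.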