Evaluating information transfer between neurons: a different view on causality and integration within neuron assemblies

Boris Gourevitch and Jos J. Eggermont
Departments of Physiology and Biophysics, and Psychology, University of Calgary, Alberta, Canada

Neuron assemblies are often viewed as simultaneous activations of a set of neurons that lead to a certain representation, or even a "coding", of the incoming information. Cross-correlation between pairs of neurons has therefore remained the most widely used tool for investigating such assemblies for decades. However, although the number of simultaneous recordings keeps increasing, there is no chance that multiple spike trains fully represent even one assembly. We only have access to a heavily subsampled representation of brain states, with no clear knowledge of the physiological links between the neurons we record. Under these conditions, notions of causality (neurons driving other neurons) and analyses of synchrony or integration between neurons are not well defined.

With this in mind, we present transfer entropy as a tool for investigating neural assemblies. Transfer entropy quantifies the fraction of information (in Shannon's sense) in one neuron's activity that is found in the past history of another neuron. Maximizing the transfer entropy over several intervals defining the past history of a neuron gives insight into the memory span involved in information transmission between neurons. This tool therefore has potential applications in investigating windows of temporal integration and stimulus-induced modulation of firing rate. The asymmetry of the measure allows feedback to be evaluated. Transfer entropy can also eliminate some effects of common history in spike trains, and thus yields results that differ from cross-correlation. Moreover, measuring the information transferred from one neuron to another avoids the difficulty of giving a precise definition of what causality between neurons means.
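To make the quantity concrete, the following is a minimal sketch (not the authors' implementation) of a plug-in transfer-entropy estimator for binned binary spike trains. It computes TE(source → target) = Σ p(x_next, x_past, y_past) · log2 [ p(x_next | x_past, y_past) / p(x_next | x_past) ], with assumed history lengths `k` (target) and `l` (source); the function name and parameters are illustrative.

```python
import numpy as np
from collections import Counter


def transfer_entropy(source, target, k=1, l=1):
    """Plug-in estimate (in bits) of the transfer entropy from `source`
    to `target`, given as equal-length binary spike-count sequences.

    k : number of past bins of `target` used as its own history
    l : number of past bins of `source` used as the driving history
    """
    source = np.asarray(source)
    target = np.asarray(target)
    m = max(k, l)

    # Count joint occurrences of (next target bin, target history, source history).
    joint = Counter()
    for t in range(m, len(target)):
        x_next = int(target[t])
        x_past = tuple(int(v) for v in target[t - k:t])
        y_past = tuple(int(v) for v in source[t - l:t])
        joint[(x_next, x_past, y_past)] += 1
    total = sum(joint.values())

    # Marginal counts needed for the two conditional probabilities.
    n_xpy = Counter()   # counts of (x_past, y_past)
    n_xnxp = Counter()  # counts of (x_next, x_past)
    n_xp = Counter()    # counts of x_past
    for (xn, xp, yp), c in joint.items():
        n_xpy[(xp, yp)] += c
        n_xnxp[(xn, xp)] += c
        n_xp[xp] += c

    # Sum p(xn, xp, yp) * log2[ p(xn | xp, yp) / p(xn | xp) ].
    te = 0.0
    for (xn, xp, yp), c in joint.items():
        p_joint = c / total
        p_cond_full = c / n_xpy[(xp, yp)]
        p_cond_self = n_xnxp[(xn, xp)] / n_xp[xp]
        te += p_joint * np.log2(p_cond_full / p_cond_self)
    return te
```

Sweeping `k` and `l` and keeping the maximizing values would correspond to the search over past-history intervals described above; note that this naive plug-in estimator is positively biased for short recordings, which is one motivation for the normalized variant mentioned later.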

The basic properties of transfer entropy are illustrated with simulations. The information transfer through a network of 16 simultaneous multiunit recordings in cat auditory cortex was examined for a large number of acoustic stimulus types. Applying transfer entropy to a large database of multiple single-unit activity in cat primary auditory cortex revealed that most windows of temporal integration found during spontaneous activity range between 2 and 15 ms. The normalized transfer entropy shows both similarities and differences with the strength of cross-correlation; these form the basis for revisiting the neural assembly concept.