An information-theoretic analysis of neural coding schemes
Alexander G. Dimitrov and John P. Miller
Center for Computational Biology
Montana State University
We present a novel analytical approach for studying neural encoding.
As a first step we model a neural sensory system as a communication
channel. Using the method of typical sequences in this context, we show
that a coding scheme is an almost bijective relation between
equivalence classes of stimulus/response pairs. The analysis allows a
quantitative determination of the type of information encoded in
neural activity patterns and, at the same time, an identification of the
code
with which that information is represented. Due to the high
dimensionality of the sets involved, such a relation is extremely
difficult to quantify. To circumvent this problem, and to use whatever
limited data set is available most efficiently, we quantize the neural
responses to a reproduction set of a small finite size. We optimize
the quantization to minimize an information-based distortion function.
This method allows us to study coarse but highly informative models of
a coding scheme, and then to refine them automatically as more data
become available.
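The quantization idea above can be illustrated with a minimal sketch. This is not the authors' implementation; the toy joint distribution and the brute-force search are assumptions made for illustration. It quantizes the responses Y into a small reproduction set Z by choosing the deterministic quantizer that maximizes I(X;Z), which is equivalent to minimizing the information distortion D = I(X;Y) - I(X;Z):

```python
# Minimal sketch (hypothetical example, not the authors' code): find the
# deterministic quantizer of responses Y into n_classes reproduction
# classes Z that maximizes I(X; Z), i.e. minimizes the information
# distortion D = I(X;Y) - I(X;Z).
import itertools
import numpy as np

def mutual_information(pxy):
    """I(X;Y) in bits for a joint distribution given as a 2-D array."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0  # avoid log(0) on zero-probability cells
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

def quantize(pxy, n_classes):
    """Brute-force search over assignments of each y to a class z."""
    n_y = pxy.shape[1]
    best_assign, best_info = None, -1.0
    for assign in itertools.product(range(n_classes), repeat=n_y):
        # Joint distribution of X and the reproduction variable Z.
        pxz = np.zeros((pxy.shape[0], n_classes))
        for y, z in enumerate(assign):
            pxz[:, z] += pxy[:, y]
        info = mutual_information(pxz)
        if info > best_info:
            best_assign, best_info = assign, info
    return best_assign, best_info

# Toy joint distribution: 2 stimulus classes, 4 response patterns.
pxy = np.array([[0.20, 0.05, 0.15, 0.10],
                [0.05, 0.20, 0.10, 0.15]])
assign, info = quantize(pxy, n_classes=2)
print("assignment:", assign)
print("I(X;Z) =", round(info, 3), " I(X;Y) =",
      round(mutual_information(pxy), 3))
```

Brute force is only feasible for tiny reproduction sets; for realistic response spaces an annealing or greedy optimization of the quantizer would replace the exhaustive search, while the coarse-to-fine refinement corresponds to increasing n_classes.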
We apply this method to the analysis of coding in the cricket cercal
sensory system. To cope with the high-dimensional stimulus space, we
use a model-based approach and minimize an upper bound of the
information distortion.
We compare the results to the linear stimulus reconstruction approach.
For a single neuron, a reproduction with two classes completely
recovered the stimulus reconstruction results. A 3-class reproduction
uncovered additional structure not present in the stimulus
reconstruction results. Further structure was found in the
class-conditioned covariance matrices.