European Biophysics Journal 29:245.
Information theory is a popular tool for determining the role of cortical processes. It has been used to explain the desirability of V1 receptive fields and to determine the time over which rates are averaged in information processing. Information theory is useful because it quantifies how much signal can be extracted in a noisy environment. However, this depends crucially on the model of the noise. Describing neurons by Poisson processes or Gaussian channels, as is commonly done, yields results that do not generalize to more detailed models. I will discuss information theoretic results for model neurons described by renewal processes. Unlike Poisson processes, these can have a refractory period. The Poisson model has led people to equate rate encoding with a code for which a spike counter is an optimal decoder. This is not true for a generalized renewal process: the precise timing of the spikes yields more information than the total number of spikes alone. For Gaussian channels the output variance is independent of the mean rate, and an extra constraint of sparseness has to be imposed to obtain realistic network properties. Because for renewal neurons the variance depends on the mean, models using renewal neurons automatically yield sparse representations when information is maximized.
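The contrast between Poisson and renewal spiking can be illustrated with a short simulation, a minimal sketch not taken from the source: drawing interspike intervals from a gamma distribution gives a renewal process whose shape parameter controls regularity (shape 1 recovers a Poisson process; larger shapes mimic a refractory period), and the spike-count Fano factor drops well below 1 for regular spiking, so the count statistics depend on the model in exactly the way the abstract describes. The function name `simulate_counts` and all parameter values are illustrative choices, not from the original.

```python
import random
import statistics

def simulate_counts(shape, rate, window, trials, rng):
    """Spike counts in a window for a gamma renewal process.

    shape=1 reduces to a Poisson process; shape>1 gives more
    regular spiking, mimicking an effective refractory period.
    The mean interspike interval is 1/rate in both cases.
    """
    counts = []
    for _ in range(trials):
        t, n = 0.0, 0
        while True:
            # gamma-distributed interspike interval, mean 1/rate
            t += rng.gammavariate(shape, 1.0 / (shape * rate))
            if t > window:
                break
            n += 1
        counts.append(n)
    return counts

rng = random.Random(0)
for shape in (1.0, 4.0):
    c = simulate_counts(shape, rate=20.0, window=1.0, trials=2000, rng=rng)
    fano = statistics.variance(c) / statistics.mean(c)
    print(f"shape={shape}: mean count={statistics.mean(c):.1f}, Fano factor={fano:.2f}")
```

For the Poisson case the Fano factor stays near 1, while for shape 4 it falls to roughly 1/4: the count variance is tied to the mean rate rather than being a free parameter, which is the property the abstract exploits to obtain sparseness without an extra constraint.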