Networks of neurons are often tasked with representing and performing computations on their inputs, and with relaying this information to downstream networks; however, they must do so with limited resources. Using a top-down approach, we build a neural code that minimizes both its representation error and its metabolic cost [1]. The result is a spiking network with excitatory/inhibitory (E/I) balance whose neurons exhibit spike-frequency adaptation. This framework allows one to investigate and make predictions about the structure of neural networks, as well as the roles that specific biophysical mechanisms may play in neural computation.
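The core idea can be sketched in a few lines. In this toy simulation (all parameters and the one-dimensional signal are hypothetical, not taken from [1]), a neuron fires only when its spike reduces the readout error by more than a fixed metabolic cost per spike, in the spirit of the greedy spike rule of Boerlin, Machens & Deneve:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters for a minimal sketch of an efficient
# spike-coding network: a neuron spikes only when doing so lowers
# the combined readout error plus a per-spike metabolic cost.
N, T, dt = 20, 2000, 1e-3
lam = 10.0                               # readout decay rate (1/s)
mu = 1e-3                                # metabolic cost per spike
G = rng.normal(size=N) / np.sqrt(N)      # fixed decoding weights

x = np.sin(2 * np.pi * np.arange(T) * dt)   # 1-D target signal
xhat, sq_err, n_spikes = 0.0, [], 0
for xt in x:
    err = xt - xhat
    # loss change if neuron i spikes: (err - G_i)^2 + mu - err^2,
    # so 'gain' > 0 means the spike reduces the total loss
    gain = 2 * G * err - G**2 - mu
    i = np.argmax(gain)
    if gain[i] > 0:                      # greedy rule: fire best neuron
        xhat += G[i]
        n_spikes += 1
    xhat -= dt * lam * xhat              # leaky decoder dynamics
    sq_err.append((xt - xhat) ** 2)

mse = float(np.mean(sq_err))             # readout tracks x with sparse spikes
```

Because spikes fire only when they pay for themselves, the readout tracks the signal with relatively few spikes, and the spike cost `mu` sets the trade-off between accuracy and firing rate.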
We use this approach to investigate encoding and decoding in a model of orientation discrimination. While spike-frequency adaptation leads to efficient and biologically realistic spiking activity, it also produces a variable population code that depends on recent spiking history. This can lead to drastic changes in neural tuning curves depending on the statistics of the stimuli. Despite these changes, an accurate representation can still be obtained without adjusting the decoder. These results predict that population code variability is not simply due to noise; rather, it is a consequence of the cost/accuracy tradeoff inherent in the neural code.
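A hedged sketch of this effect (again with hypothetical parameters and a one-dimensional stand-in for the stimulus, not the orientation model itself): spike-frequency adaptation is modeled as a per-neuron spike cost that grows with recent firing and decays over time. Which neurons fire then depends on stimulus history, yet the same fixed decoder still reads out the signal accurately:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical parameters: greedy spike-coding rule with adaptation,
# modeled as a spike cost that rises with each spike and decays.
N, dt = 20, 1e-3
lam, mu, beta, tau_a = 10.0, 1e-3, 2e-3, 0.3
G = rng.normal(size=N) / np.sqrt(N)      # fixed decoding weights

def run(x, xhat=0.0, a=None):
    """Encode signal x; 'a' is the adaptation state carried across calls."""
    a = np.zeros(N) if a is None else a
    counts, sq_err = np.zeros(N), []
    for xt in x:
        err = xt - xhat
        # adapted cost mu + beta*a_i raises each neuron's effective threshold
        gain = 2 * G * err - G**2 - (mu + beta * a)
        i = np.argmax(gain)
        if gain[i] > 0:
            xhat += G[i]
            counts[i] += 1
            a[i] += 1.0                  # adaptation builds with each spike
        a -= dt * a / tau_a              # adaptation decays over time
        xhat -= dt * lam * xhat          # leaky decoder dynamics
        sq_err.append((xt - xhat) ** 2)
    return xhat, a, counts, float(np.mean(sq_err))

t = np.arange(2000) * dt
x = np.sin(2 * np.pi * t)                # one "stimulus", presented twice
xhat, a, c1, mse1 = run(x)
xhat, a, c2, mse2 = run(x, xhat, a)      # second presentation, adapted state
```

After the first presentation, adapted neurons carry an elevated cost, so the division of labor across the population shifts on the second presentation; the readout nevertheless remains accurate with the unchanged decoder `G`, illustrating the claim that this variability reflects the cost/accuracy tradeoff rather than noise.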
[1] M. Boerlin, C.K. Machens, and S. Deneve, Predictive coding of dynamical variables in balanced spiking networks. PLoS Comput Biol 9(11): e1003258 (2013).