Bayes-optimal inference, decision making, and learning with
Probabilistic Population Codes
Jeff Beck
University of Rochester
Human behavior has been shown to take uncertainty into account when
combining ambiguous cues, and to do so in a Bayes-optimal way. This
computation requires a neural code that represents entire probability
distributions rather than mere point estimates. Moreover, this code
must be structured so that the operations available to neural circuits
are, in fact, capable of implementing (and learning to implement)
optimal cue combination and action selection.
Here, we will show how the Probabilistic Population Coding (PPC)
framework naturally links biological constraints on neural operations
with an optimal form of variability that leads to specific predictions
regarding stimulus-conditioned neural statistics. As an example, we
will show how the requirement that optimal cue combination be
performed by linear operations implies that neural variability should
exhibit tuning-curve-like behavior with arbitrary correlations and
fixed (but not necessarily unit) Fano factors. We will then show that
this particular form of neural variability makes it possible to
optimally implement other useful probabilistic operations such as
posterior diffusion/saliency, maximum likelihood estimation, and
information maximization via the neural operations of divisive
normalization, dynamic attraction, and delta rule learning
respectively.
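The linear cue-combination claim above can be illustrated with a toy numerical sketch (an assumption-laden example, not material from the talk): if two populations report independent Poisson spike counts through shared Gaussian tuning curves whose summed activity is approximately constant in the stimulus, then simply adding the two population responses yields the same posterior as multiplying the two single-cue posteriors. All tuning parameters and names below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative parameters): a stimulus grid, and N neurons with
# Gaussian tuning curves that densely tile the stimulus space, so that the
# summed activity sum_i f_i(s) is approximately constant in s.
s_grid = np.linspace(-10, 10, 201)
centers = np.linspace(-10, 10, 50)
gain, width = 20.0, 2.0
f = gain * np.exp(-0.5 * ((s_grid[None, :] - centers[:, None]) / width) ** 2)

def posterior(r):
    """Posterior over the stimulus grid given independent Poisson counts r,
    with a flat prior; the (assumed constant) sum_i f_i(s) term is dropped."""
    log_p = r @ np.log(f)          # Poisson log-likelihood up to a constant
    log_p -= log_p.max()           # stabilize before exponentiating
    p = np.exp(log_p)
    return p / p.sum()

# Two independent cues about the same stimulus value.
s_true = 1.5
rates = gain * np.exp(-0.5 * ((s_true - centers) / width) ** 2)
r1 = rng.poisson(rates)
r2 = rng.poisson(rates)

# Linear operation on activities: posterior from the summed response ...
p_linear = posterior(r1 + r2)

# ... matches the normalized product of the two single-cue posteriors.
p_product = posterior(r1) * posterior(r2)
p_product /= p_product.sum()
```

Under these assumptions the equality is exact because the Poisson log-likelihood is linear in the spike counts, so `posterior(r1 + r2)` and the product of posteriors differ only by a normalization constant.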