A neural implementation of variational graphical models
Alexandre Hyafil1, Gustavo Deco1 and Ruben Moreno-Bote1
1Universitat Pompeu Fabra

The neural mechanisms that underlie perceptual inference in cortex are a matter of intense research. Neural models implementing Bayesian inference have proved useful for uncovering neural computations in a variety of paradigms. Still, no existing neurocomputational model provides a general framework for understanding perception in naturalistic conditions, that is, for solving inference problems over complex graphical models [1-3]. We propose a new model of perception, the Constrained Entropy Maximization Network (CEMNet), that provides a theoretical framework for probabilistic inference in complex naturalistic environments. CEMNet stores an internal model of the environment by representing regularities across stochastic variables as constraints; these constraints shape the response of the network. The model features Representation neurons whose activity encodes the marginalized likelihood for the presence of a specific object/feature (or combination of features) in the environment. Inference is mediated by maximization of the entropy of the full neural population (or, more precisely, minimization of the KL divergence from the prior distribution) subject to these constraints [4]; this maximization is achieved, at convergence, through simple dynamical rules. The rationale for the algorithm is that the distribution of maximal entropy is the probability distribution that adds the least information while conforming to the known structure of the environment (i.e., the constraints) and to the observed variables.

Constraints represent known linear relations between different features; that is, they encode the internal model of the environment. Normalization constraints ensure that the probabilities over all possible states of one object sum to 1 (essentially enforcing divisive normalization) [5]. Prediction constraints store the predictions about a certain state or sensory event based on estimates of upper-level states, allowing those estimates to be updated in light of actual sensory evidence; they effectively represent an integrated form of the classical prediction-error signal [6]. Dynamical constraints represent knowledge about the evolution of an object's state, allowing the network to maintain a probabilistic representation of that state across time even in the absence of sensory evidence [7]. These constraints are enforced by a second class of neurons (Constraint neurons) projecting to the Representation neurons. Overall, the model performs online approximate variational inference on graphical models of arbitrary complexity.

We will present the general architecture of the model and the underlying mathematical framework, describe how CEMNet can be implemented in a realistic neural network, and finally show through simulations how inference is performed in classical perceptual integration and accumulation tasks. Overall, we believe this work constitutes an innovative step towards understanding the neural computations underlying perception in naturalistic environments.
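The core computation described above can be sketched numerically. The following is a minimal illustration, not the authors' implementation: it minimizes the KL divergence from a prior distribution `p` subject to linear expectation constraints, the variational problem the abstract attributes to CEMNet. Here "Representation neurons" correspond to the distribution `q` and "Constraint neurons" to the Lagrange multipliers `lam`, updated by a simple gradient dynamic on the constraint violation; the function and variable names are illustrative.

```python
import numpy as np

def maxent_inference(p, F, b, lr=0.1, steps=2000, tol=1e-8):
    """Minimize KL(q || p) subject to linear constraints E_q[f_k] = b_k.

    p : (n,) prior over n discrete states
    F : (k, n) matrix of constraint features f_k(x)
    b : (k,) target values for the constrained expectations
    """
    lam = np.zeros(len(b))  # "Constraint neurons": one multiplier per constraint
    for _ in range(steps):
        # Exponential-family solution for the current multipliers
        # ("Representation neurons" encoding the inferred distribution)
        w = p * np.exp(lam @ F)
        q = w / w.sum()
        # Constraint violation: how far the expectations are from their targets
        err = F @ q - b
        if np.max(np.abs(err)) < tol:
            break
        # Dual gradient ascent: multipliers integrate the violation over time
        lam -= lr * err
    return q

# Toy example: uniform prior over 3 states, constrain the mean state to 1.5
p = np.ones(3) / 3
F = np.array([[0.0, 1.0, 2.0]])  # single feature: the state index
b = np.array([1.5])
q = maxent_inference(p, F, b)
```

The resulting `q` sums to 1 and satisfies the expectation constraint while staying as close as possible (in KL) to the uniform prior; the multiplier dynamic mirrors the "simple dynamical rules" of the abstract, with each constraint enforced by a unit that accumulates its own violation signal.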

[1] Ma WJ et al. Nature Neurosci. 9:1432-8 (2006).
[2] Denève S. Neural Comput. 20:91-117 (2008).
[3] Darroch J & Ratcliff D. Ann. Math. Statist. (1972).
[4] Caticha A & Giffin A. Bayesian Inference and Maximum Entropy: 1-13 (2006).
[5] Beck JM et al. J. Neurosci. 31:15310-9 (2011).
[6] Friston KJ. Phil. Trans. R. Soc. Lond. B 360:815-36 (2005).
[7] Rao RPN. Bayesian Brain: Probabilistic Approaches to Neural Coding (2007).