Information Theory of Cognitive Systems Group, Max Planck Institute, Germany
Wednesday 18 June 2008
Seminar Room B10 (Basement)
Alexandra House, 17 Queen Square, London, WC1N 3AR
Towards an integration of infomax concepts
Information theory provides a useful framework for formulating first principles of learning in neural networks (infomax). In feed-forward structures, the interpretation of the corresponding objective functions is usually closely related to the causal concept of information flow. The research on which my talk is based aims at integrating various infomax approaches within one framework. On the one hand, experimental and theoretical work on local information flow maximization for single neurons has been very fruitful; on the other hand, global complexity measures have been proposed that capture the balance between the integration and segregation capabilities of the brain as a whole. However, no attempt has been made to relate these two descriptive scales to each other in a consistent way. In my talk, I demonstrate how local information flows contribute to the global complexity of a system, defined in terms of information geometry. I discuss some theoretical results on the maximization of complexity and illustrate these results with computer simulations. Going one step further towards an integration of infomax principles, one has to incorporate information flows within the perception-action loop. I discuss some initial steps in this direction.
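One standard global complexity quantity of the kind the abstract alludes to is the multi-information (also called integration): the sum of the units' marginal entropies minus the joint entropy, which vanishes exactly when the units are statistically independent. The sketch below, with an arbitrarily chosen toy joint distribution over three binary units (not taken from the talk), is a minimal illustration of how such a measure is computed:

```python
import math

# Toy joint distribution over three binary units, chosen arbitrarily
# for illustration. Keys are system states, values are probabilities
# summing to 1. High weight on (0,0,0) and (1,1,1) makes the units
# strongly dependent.
p = {
    (0, 0, 0): 0.35, (1, 1, 1): 0.35,
    (0, 0, 1): 0.05, (0, 1, 0): 0.05, (1, 0, 0): 0.05,
    (0, 1, 1): 0.05, (1, 0, 1): 0.05, (1, 1, 0): 0.05,
}

def entropy(dist):
    """Shannon entropy (in bits) of a probability dictionary."""
    return -sum(q * math.log2(q) for q in dist.values() if q > 0)

def marginal(dist, i):
    """Marginal distribution of unit i under the joint distribution."""
    m = {}
    for state, q in dist.items():
        m[state[i]] = m.get(state[i], 0.0) + q
    return m

n = 3
# Multi-information: sum of marginal entropies minus the joint entropy.
# It is zero iff the units are independent, and grows with statistical
# dependence among the units.
multi_info = sum(entropy(marginal(p, i)) for i in range(n)) - entropy(p)
print(f"multi-information: {multi_info:.3f} bits")
```

For this distribution each marginal is uniform (1 bit each), so the multi-information is 3 bits minus the joint entropy, a strictly positive value reflecting the units' dependence. Information-geometric treatments of complexity refine this picture by decomposing such dependencies across interaction orders.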