G. Edelman, O. Sporns and G. Tononi have introduced the neural complexity of a family of random variables, defining it as a specific average of mutual information over subfamilies. We show that their choice of weights satisfies two natural properties, namely invariance under permutations and additivity, and we call any functional satisfying these two properties an intricacy. We classify all intricacies in terms of probability laws on the unit interval and study the growth rate of maximal intricacies when the size of the system goes to infinity. For systems of a fixed size, we show that maximizers have small support and exchangeable systems have small intricacy. In particular, maximizing intricacy leads to spontaneous symmetry breaking and lack of uniqueness.
Keywords: entropy, mutual information, complexity, discrete probability, exchangeable random variables
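The abstract's starting point, the Edelman–Sporns–Tononi neural complexity, admits a short computational illustration for a small discrete system. The sketch below is not code from the paper: the uniform average over subset sizes and the 1/(n+1) normalization follow one common form of the definition, and all function names are our own.

```python
from itertools import combinations, product
from math import comb, log2

def entropy(pmf):
    """Shannon entropy (in bits) of a pmf given as {outcome: probability}."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def marginal(joint, idx):
    """Marginal pmf of the coordinates listed in idx."""
    m = {}
    for x, p in joint.items():
        key = tuple(x[i] for i in idx)
        m[key] = m.get(key, 0.0) + p
    return m

def mutual_information(joint, S):
    """MI between the subfamily X_S and its complement X_{S^c}."""
    n = len(next(iter(joint)))
    Sc = tuple(i for i in range(n) if i not in S)
    return entropy(marginal(joint, S)) + entropy(marginal(joint, Sc)) - entropy(joint)

def neural_complexity(joint):
    """Average MI(X_S; X_{S^c}) uniformly over subsets of each size,
    then uniformly over the n+1 possible sizes (illustrative normalization)."""
    n = len(next(iter(joint)))
    total = 0.0
    for k in range(1, n):  # the empty and full subsets contribute MI = 0
        avg = sum(mutual_information(joint, S)
                  for S in combinations(range(n), k)) / comb(n, k)
        total += avg
    return total / (n + 1)
```

For three fair bits with the first two forced equal and the third independent, the splits isolating the third coordinate carry zero mutual information while the others carry one bit each, so the averaged value is strictly between 0 (fully independent system) and the fully synchronized maximum.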
@article{AIHPB_2012__48_2_343_0, author = {Buzzi, J. and Zambotti, L.}, title = {Mean mutual information and symmetry breaking for finite random fields}, journal = {Annales de l'I.H.P. Probabilit\'es et statistiques}, pages = {343--367}, publisher = {Gauthier-Villars}, volume = {48}, number = {2}, year = {2012}, doi = {10.1214/11-AIHP416}, mrnumber = {2954258}, zbl = {1259.94032}, language = {en}, url = {http://www.numdam.org/articles/10.1214/11-AIHP416/} }