Mean mutual information and symmetry breaking for finite random fields
Annales de l'I.H.P. Probabilités et statistiques, Volume 48 (2012) no. 2, pp. 343-367.


G. Edelman, O. Sporns and G. Tononi have introduced the neural complexity of a family of random variables, defining it as a specific average of mutual information over subfamilies. We show that their choice of weights satisfies two natural properties, namely invariance under permutations and additivity, and we call any functional satisfying these two properties an intricacy. We classify all intricacies in terms of probability laws on the unit interval and study the growth rate of maximal intricacies when the size of the system goes to infinity. For systems of a fixed size, we show that maximizers have small support and exchangeable systems have small intricacy. In particular, maximizing intricacy leads to spontaneous symmetry breaking and lack of uniqueness.
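The abstract above describes intricacies as weighted averages of the mutual information I(X_S; X_{S^c}) over subfamilies. As an illustrative sketch only (the function names and the uniform weighting 2^{-n} are our assumptions for this example, not the paper's notation; the paper classifies the full family of admissible weightings), one can compute such an average for a small discrete system:

```python
import itertools
import math

def entropy(p):
    """Shannon entropy (in bits) of a law given as {outcome: probability}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def marginal(joint, coords):
    """Marginal law of the coordinates listed in `coords`; `joint` maps
    tuples (x_1, ..., x_n) to probabilities."""
    m = {}
    for x, q in joint.items():
        key = tuple(x[i] for i in coords)
        m[key] = m.get(key, 0.0) + q
    return m

def mutual_information(joint, subset, n):
    """I(X_S; X_{S^c}) for a subset S of the coordinates {0, ..., n-1}."""
    comp = [i for i in range(n) if i not in subset]
    return (entropy(marginal(joint, list(subset)))
            + entropy(marginal(joint, comp))
            - entropy(joint))

def uniform_intricacy(joint, n):
    """Average of I(X_S; X_{S^c}) over all 2^n subsets S with equal weight
    2^(-n) -- one admissible permutation-invariant, additive weighting."""
    return sum(
        mutual_information(joint, s, n)
        for k in range(n + 1)
        for s in itertools.combinations(range(n), k)
    ) / 2 ** n

# Two fair bits that always agree: every proper subset carries full
# information about its complement.
coupled = {(0, 0): 0.5, (1, 1): 0.5}
print(uniform_intricacy(coupled, 2))  # 0.5 bits

# Two independent fair bits: every mutual information vanishes.
independent = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(uniform_intricacy(independent, 2))  # 0.0 bits
```

The two extreme examples bracket the behavior studied in the paper: fully independent systems have zero intricacy, while the maximizers lie at intermediate, symmetry-broken configurations.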

DOI: 10.1214/11-AIHP416
Classification: 94A17, 92B20, 60C05
Keywords: entropy, mutual information, complexity, discrete probability, exchangeable random variables
@article{AIHPB_2012__48_2_343_0,
     author = {Buzzi, J. and Zambotti, L.},
     title = {Mean mutual information and symmetry breaking for finite random fields},
     journal = {Annales de l'I.H.P. Probabilit\'es et statistiques},
     pages = {343--367},
     publisher = {Gauthier-Villars},
     volume = {48},
     number = {2},
     year = {2012},
     doi = {10.1214/11-AIHP416},
     mrnumber = {2954258},
     zbl = {1259.94032},
     language = {en},
     url = {http://www.numdam.org/articles/10.1214/11-AIHP416/}
}
Buzzi, J.; Zambotti, L. Mean mutual information and symmetry breaking for finite random fields. Annales de l'I.H.P. Probabilités et statistiques, Volume 48 (2012) no. 2, pp. 343-367. doi : 10.1214/11-AIHP416. http://www.numdam.org/articles/10.1214/11-AIHP416/

[1] D. J. Aldous. Exchangeability and related topics. In École d'été de probabilités de Saint-Flour XIII, 1-198. Lecture Notes in Math. 1117. Springer, Berlin, 1985.

[2] P. Bak and M. Paczuski. Complexity, contingency and criticality. Proc. Natl. Acad. Sci. USA 92 (1995) 6689-6696.

[3] L. Barnett, C. L. Buckley and S. Bullock. Neural complexity and structural connectivity. Phys. Rev. E 79 (2009) 051914.

[4] C. Bennett. How to define complexity in physics and why. In Complexity, Entropy and the Physics of Information, Vol. VIII. W. Zurek (Ed.). Addison-Wesley, Redwood City, 1990.

[5] J. Bertoin. Random Fragmentation and Coagulation Processes. Cambridge Univ. Press, Cambridge, 2006.

[6] J. Buzzi and L. Zambotti. Approximate maximizers of intricacy functionals. Probab. Theory Related Fields. To appear. Available at http://arxiv.org/abs/0909.2120.

[7] T. Cover and J. Thomas. Elements of Information Theory. John Wiley & Sons, Hoboken, NJ, 2006.

[8] J. Crutchfield and K. Young. Inferring statistical complexity. Phys. Rev. Lett. 63 (1989) 105-109.

[9] M. De Lucia, M. Bottaccio, M. Montuori and L. Pietronero. A topological approach to neural complexity. Phys. Rev. E 71 (2005) 016114.

[10] G. Edelman and J. Gally. Degeneracy and complexity in biological systems. Proc. Natl. Acad. Sci. USA 98 (2001) 13763-13768.

[11] N. Goldenfeld and L. Kadanoff. Simple lessons from complexity. Science 284 (1999) 87-89.

[12] A. Greven, G. Keller and G. Warnecke. Entropy. Princeton Univ. Press, Princeton, NJ, 2003.

[13] S. Fujishige. Polymatroidal dependence structure of a set of random variables. Information and Control 39 (1978) 55-72.

[14] T. S. Han. Nonnegative entropy measures of multivariate symmetric correlations. Information and Control 36 (1978) 133-156.

[15] K. Holthausen and O. Breidbach. Analytical description of the evolution of neural networks: Learning rules and complexity. Biol. Cybern. 81 (1999) 169-176.

[16] J. Krichmar, D. Nitz, J. Gally and G. Edelman. Characterizing functional hippocampal pathways in a brain-based device as it solves a spatial memory task. Proc. Natl. Acad. Sci. USA 102 (2005) 2111-2116.

[17] M. Madiman and P. Tetali. Information inequalities for joint distributions, with interpretations and applications. IEEE Trans. Inform. Theory 56 (2010) 2699-2713.

[18] A. Seth, E. Izhikevich, G. Reeke and G. Edelman. Theories and measures of consciousness: An extended framework. Proc. Natl. Acad. Sci. USA 103 (2006) 10799-10804.

[19] A. Seth. Models of consciousness. Scholarpedia 2 (2007) 1328.

[20] M. P. Shanahan. Dynamical complexity in small-world networks of spiking neurons. Phys. Rev. E 78 (2008) 041924.

[21] O. Sporns, G. Tononi and G. Edelman. Connectivity and complexity: The relationship between neuroanatomy and brain dynamics. Neural Netw. 13 (2000) 909-922.

[22] O. Sporns. Network analysis, complexity, and brain function. Complexity 8 (2002) 56-60.

[23] O. Sporns. Complexity. Scholarpedia 2 (2007) 1623.

[24] M. Talagrand. Spin Glasses: A Challenge for Mathematicians. Springer, Berlin, 2003.

[26] G. Tononi, O. Sporns and G. Edelman. A measure for brain complexity: Relating functional segregation and integration in the nervous system. Proc. Natl. Acad. Sci. USA 91 (1994) 5033-5037.

[27] G. Tononi, O. Sporns and G. Edelman. A complexity measure for selective matching of signals by the brain. Proc. Natl. Acad. Sci. USA 93 (1996) 3422-3427.

[28] G. Tononi, O. Sporns and G. Edelman. Measures of degeneracy and redundancy in biological networks. Proc. Natl. Acad. Sci. USA 96 (1999) 3257-3262.
