Entropy of network ensembles explained
A set of networks that satisfies given structural characteristics can be treated as a network ensemble.[1] Introduced by Ginestra Bianconi in 2007, the entropy of a network ensemble measures the level of order or uncertainty of a network ensemble.[2]
The entropy of a network ensemble is the logarithm of the number of graphs in the ensemble.[3] Entropy can also be defined for a single network: the basin entropy of a Boolean network is the logarithm of the number of its attractors.[4]
Employing approaches from statistical mechanics, the complexity, uncertainty, and randomness of networks can be described by network ensembles with different types of constraints.[5]
Gibbs and Shannon entropy
By analogy to statistical mechanics, microcanonical ensembles and canonical ensembles of networks are introduced for the implementation. A partition function Z of an ensemble can be defined as

    Z = \sum_{\{a_{ij}\}} \prod_k \delta\big(F_k(\{a_{ij}\}) - C_k\big)\, \exp\Big(\sum_{i<j} h_{ij}\,\Theta(a_{ij}) + r_{ij}\, a_{ij}\Big),

where F_k(\{a_{ij}\}) is the k-th constraint, and a_{ij} (a_{ij} \geq 0) are the elements in the adjacency matrix, with a_{ij} > 0 if and only if there is a link between node i and node j. \Theta(x) is a step function with \Theta(x) = 1 if x > 0, and \Theta(x) = 0 if x = 0. The auxiliary fields h_{ij} and r_{ij} have been introduced as an analogy to the bath in classical mechanics.
For simple undirected networks, the partition function can be simplified as[6]

    Z = \sum_{\{a_{ij}\}} \prod_k \delta\big(F_k(\{a_{ij}\}) - C_k\big)\, \exp\Big(\sum_{i<j} \sum_{\alpha} h_{ij}(\alpha)\, \delta(a_{ij} - \alpha)\Big),

where h_{ij}(\alpha) are the auxiliary fields, \alpha is the index of the weight, and for a simple network a_{ij} \in \{0, 1\}.
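As a sketch of this construction (assuming, purely for illustration, a single constraint fixing the total number of links to L and a uniform auxiliary field h coupled to every link), the partition function of a small simple undirected network can be evaluated by brute-force enumeration of all graphs:

```python
import itertools
import math

def partition_function(n, links, h):
    """Brute-force Z over all simple undirected graphs on n nodes,
    with a delta-function constraint fixing the number of links and
    a uniform auxiliary field h coupled to each present link:
    Z = sum over graphs of delta(#links - links) * exp(h * #links)."""
    pairs = list(itertools.combinations(range(n), 2))
    Z = 0.0
    for edges in itertools.product([0, 1], repeat=len(pairs)):
        present = sum(edges)
        if present == links:  # the delta-function constraint
            Z += math.exp(h * present)
    return Z

# At h = 0 the partition function simply counts the graphs in the
# ensemble: binom(6, 3) = 20 graphs on 4 nodes with 3 links.
print(partition_function(4, 3, 0.0))  # -> 20.0
```

The exponential enumeration (2^(n(n-1)/2) graphs) is only feasible for very small n; it serves to make the delta-constrained sum concrete.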
Microcanonical ensembles and canonical ensembles are demonstrated with simple undirected networks.

For a microcanonical ensemble, the Gibbs entropy \Sigma is defined by

    N\Sigma = \log \mathcal{N},

where \mathcal{N} indicates the cardinality of the ensemble, i.e., the total number of networks in the ensemble.
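Assuming an ensemble constrained only by the numbers of nodes and links (as in the G(N, L) ensemble discussed below, whose cardinality is a binomial coefficient), the definition can be sketched as:

```python
import math

def gibbs_entropy(n, links):
    """Gibbs entropy per node, Sigma = log(cardinality) / n, for the
    microcanonical ensemble of simple undirected graphs on n nodes
    with exactly `links` links; the cardinality of that ensemble is
    binom(n(n-1)/2, links)."""
    cardinality = math.comb(n * (n - 1) // 2, links)
    return math.log(cardinality) / n

print(gibbs_entropy(4, 3))  # log(20) / 4
```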
The probability of having a link between nodes i and j, with weight \alpha, is given by

    \pi_{ij}(\alpha) = \frac{\partial \log Z}{\partial h_{ij}(\alpha)}.
For a canonical ensemble, the entropy is presented in the form of a Shannon entropy:

    S = -\sum_{i<j} \sum_{\alpha} \pi_{ij}(\alpha) \log \pi_{ij}(\alpha).
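For the canonical ensemble of simple networks treated below, the G(N, p) random graph in which each pair of nodes carries a link independently with probability p, this sum reduces to N(N-1)/2 identical per-pair terms; a minimal sketch under that assumption:

```python
import math

def shannon_entropy(n, p):
    """Shannon entropy of the canonical ensemble G(n, p): each of the
    n(n-1)/2 node pairs is linked independently with probability p,
    so each pair contributes -[p log p + (1-p) log(1-p)]."""
    if p <= 0.0 or p >= 1.0:
        return 0.0  # a deterministic ensemble has zero entropy
    per_pair = -(p * math.log(p) + (1.0 - p) * math.log(1.0 - p))
    return n * (n - 1) / 2 * per_pair

print(shannon_entropy(4, 0.5))  # 6 pairs, log(2) each
```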
Relation between Gibbs and Shannon entropy
A network ensemble G(N, L) with given numbers of nodes N and links L, and its conjugate-canonical ensemble G(N, p), are characterized as microcanonical and canonical ensembles, and they have the Gibbs entropy \Sigma and the Shannon entropy S, respectively. The Gibbs entropy in the G(N, L) ensemble is given by[7]

    N\Sigma = \log \binom{\frac{N(N-1)}{2}}{L}.
For the G(N, p) ensemble,

    S = -\frac{N(N-1)}{2}\,\big[p \log p + (1 - p) \log(1 - p)\big].

Inserting p = \frac{2L}{N(N-1)} and applying Stirling's approximation to the binomial coefficient yields[6]

    N\Sigma \simeq S - \frac{1}{2}\,\log\big[2\pi L (1 - p)\big].
The relation indicates that the Gibbs entropy \Sigma and the Shannon entropy per node S/N of random graphs are equal in the thermodynamic limit N \to \infty.
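A quick numerical check of this limit (the parameter choice p = 0.2 is an arbitrary illustration; the log-gamma function handles the large binomial coefficient):

```python
import math

def log_binom(m, k):
    """log of binomial(m, k), computed via log-gamma for large m."""
    return math.lgamma(m + 1) - math.lgamma(k + 1) - math.lgamma(m - k + 1)

def entropy_gap(n, p):
    """Difference between the Shannon entropy per node S/n of G(n, p)
    and the Gibbs entropy Sigma of its conjugate microcanonical
    ensemble G(n, L), with L = p * n(n-1)/2 links."""
    m = n * (n - 1) // 2              # number of node pairs
    links = round(p * m)              # conjugate number of links
    sigma = log_binom(m, links) / n   # Gibbs entropy (per node)
    s_per_node = -m * (p * math.log(p) + (1 - p) * math.log(1 - p)) / n
    return s_per_node - sigma

for n in (10, 100, 1000):
    print(n, entropy_gap(n, 0.2))     # the gap shrinks as n grows
```

The printed gap is positive (the canonical ensemble is broader) and vanishes as N grows, consistent with \Sigma \to S/N in the thermodynamic limit.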
Notes and References
- [1] Levin, E.; Tishby, N.; Solla, S. A. (October 1990). "A statistical approach to learning and generalization in layered neural networks". Proceedings of the IEEE. 78 (10): 1568–1574. doi:10.1109/5.58339.
- [2] Bianconi, Ginestra (2008). "The entropy of randomized network ensembles". EPL (Europhysics Letters). 81 (2): 28005. arXiv:0708.0153. doi:10.1209/0295-5075/81/28005.
- [3] Menichetti, Giulia; Remondini, Daniel (2014). "Entropy of a network ensemble: definitions and applications to genomic data". Theoretical Biology Forum. 107 (1–2): 77–87. PMID 25936214.
- [4] Krawitz, Peter; Shmulevich, Ilya (27 September 2007). "Entropy of complex relevant components of Boolean networks". Physical Review E. 76 (3): 036115. arXiv:0708.1538. doi:10.1103/PhysRevE.76.036115.
- [5] Bianconi, Ginestra (27 March 2009). "Entropy of network ensembles". Physical Review E. 79 (3): 036114. arXiv:0802.2888. doi:10.1103/PhysRevE.79.036114.
- [6] Anand, Kartik; Bianconi, Ginestra (13 October 2009). "Entropy measures for networks: Toward an information theory of complex topologies". Physical Review E. 80 (4): 045102. arXiv:0907.1514. doi:10.1103/PhysRevE.80.045102.
- [7] Bogacz, Leszek; Burda, Zdzisław; Wacław, Bartłomiej (1 July 2006). "Homogeneous complex networks". Physica A: Statistical Mechanics and Its Applications. 366: 587–607. arXiv:cond-mat/0502124. doi:10.1016/j.physa.2005.10.024.