Graph entropy


In information theory, the graph entropy is a measure of the information rate achievable by communicating symbols over a channel in which certain pairs of values may be confused. This measure, first introduced by János Körner in the 1970s, has since proven useful in other settings, including combinatorics.

Definition

Let G = (V, E) be an undirected graph. The graph entropy of G, denoted H(G), is defined as

H(G) = min_{X,Y} I(X; Y)

where X is chosen uniformly from V, Y ranges over independent sets of G, the joint distribution of X and Y is such that X ∈ Y with probability one, and I(X; Y) is the mutual information of X and Y.

That is, if we let ℐ denote the family of independent vertex sets of G, we wish to find the joint distribution (X, Y) on V × ℐ with the lowest mutual information such that (i) the marginal distribution of the first coordinate is uniform and (ii) in samples from the distribution, the second coordinate contains the first almost surely. The mutual information of X and Y is then called the entropy of G.
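The definition can be made concrete for graphs where the minimizing joint distribution is forced. A minimal sketch, in which `mutual_information` is a hypothetical helper (not from the article) that evaluates I(X; Y) for an explicit joint probability table: in the complete graph K_3 the only independent set containing a vertex x is {x}, so Y = {X} is forced and I(X; Y) = H(X) = log_2 3, while in the edgeless graph on three vertices one may always take Y = V, giving mutual information 0.

```python
import math

def mutual_information(joint):
    # joint: dict mapping (x, y) -> probability, where y is a frozenset of vertices.
    # Computes I(X;Y) = sum p(x,y) * log2( p(x,y) / (p(x) * p(y)) ).
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Complete graph K_3: the only independent set containing x is {x},
# so Y = {X} is forced and I(X;Y) = H(X) = log2(3).
k3 = {(x, frozenset([x])): 1 / 3 for x in range(3)}
print(mutual_information(k3))  # log2(3) ≈ 1.585

# Edgeless graph on 3 vertices: taking Y = V always gives I(X;Y) = 0,
# matching the entropy-0 formula for edge-less graphs below.
e3 = {(x, frozenset(range(3))): 1 / 3 for x in range(3)}
print(mutual_information(e3))  # 0.0
```

For general graphs the minimization over joint distributions is a convex optimization problem; the sketch above only evaluates I(X; Y) for a fixed joint table.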

Properties

  • Monotonicity. If G_1 is a subgraph of G_2 on the same vertex set, then H(G_1) ≤ H(G_2).
  • Subadditivity. Given two graphs G_1 = (V, E_1) and G_2 = (V, E_2) on the same set of vertices, the graph union G_1 ∪ G_2 = (V, E_1 ∪ E_2) satisfies H(G_1 ∪ G_2) ≤ H(G_1) + H(G_2).
  • Arithmetic mean of disjoint unions. Let G_1, G_2, …, G_k be a sequence of graphs on disjoint sets of vertices, with n_1, n_2, …, n_k vertices, respectively. Then H(G_1 ∪ G_2 ∪ ⋯ ∪ G_k) = (1 / ∑_{i=1}^k n_i) ∑_{i=1}^k n_i H(G_i).
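The disjoint-union property says the entropy of a disjoint union is the vertex-weighted average of the component entropies. A small numeric sketch (the function name is illustrative, not from the article), using the closed forms given below for complete graphs, H(K_2) = 1 and H(K_4) = log_2 4 = 2:

```python
import math

def disjoint_union_entropy(parts):
    # parts: list of (num_vertices, entropy) pairs, one per component.
    # Returns the vertex-weighted arithmetic mean of the component entropies.
    total = sum(n for n, _ in parts)
    return sum(n * h for n, h in parts) / total

# Disjoint union of K_2 (entropy 1) and K_4 (entropy log2(4) = 2):
h = disjoint_union_entropy([(2, 1.0), (4, math.log2(4))])
print(h)  # (2*1 + 4*2) / 6 = 5/3 ≈ 1.667
```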

Additionally, simple formulas exist for certain families of graphs.

  • Complete balanced k-partite graphs have entropy log_2 k. In particular,
    • Edge-less graphs have entropy 0.
    • Complete graphs on n vertices have entropy log_2 n.
    • Complete balanced bipartite graphs have entropy 1.
  • Complete bipartite graphs with n vertices in one part and m in the other have entropy H(n / (m + n)), where H is the binary entropy function.
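The complete bipartite formula can be evaluated directly. A short sketch (helper names are illustrative): the binary entropy function H(p) = −p log_2 p − (1 − p) log_2(1 − p), applied to the fraction of vertices in one part. Note that the balanced case n = m recovers H(1/2) = 1, consistent with the bullet above.

```python
import math

def binary_entropy(p):
    # H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def complete_bipartite_entropy(n, m):
    # Graph entropy of the complete bipartite graph K_{n,m}: H(n / (m + n)).
    return binary_entropy(n / (m + n))

print(complete_bipartite_entropy(3, 3))  # balanced: 1.0
print(complete_bipartite_entropy(1, 3))  # H(1/4) ≈ 0.811
```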

Example

Here, we use properties of graph entropy to provide a simple proof that a complete graph G on n vertices cannot be expressed as the union of fewer than log_2 n bipartite graphs.

Proof. By monotonicity, no bipartite graph can have graph entropy greater than that of a complete bipartite graph, which is bounded by 1. Thus, by subadditivity, the union of k bipartite graphs cannot have entropy greater than k. Now let G = (V, E) be the complete graph on n vertices. By the properties listed above, H(G) = log_2 n. Therefore, the union of fewer than log_2 n bipartite graphs cannot have the same entropy as G, so G cannot be expressed as such a union. ∎
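This lower bound is tight, by a standard construction not detailed in the article: label the vertices 0, …, n−1 in binary and, for each bit position i, form the complete bipartite graph joining vertices whose i-th bit is 0 to those whose i-th bit is 1. Any two distinct labels differ in some bit, so ⌈log_2 n⌉ bipartite graphs cover every edge of K_n. A sketch verifying this for n = 8 (the function name is illustrative):

```python
import math
from itertools import combinations

def bipartite_cover(n):
    # Cover the edges of K_n with ceil(log2 n) complete bipartite graphs:
    # graph i joins vertices whose i-th binary digit is 0 to those where it is 1.
    k = max(1, math.ceil(math.log2(n)))
    covers = []
    for i in range(k):
        left = {v for v in range(n) if not (v >> i) & 1}
        right = {v for v in range(n) if (v >> i) & 1}
        covers.append((left, right))
    return covers

n = 8
covers = bipartite_cover(n)
assert len(covers) == 3  # matches the log2(8) lower bound exactly
# Every edge {u, v} of K_8 lies in some bipartite graph of the cover.
assert all(any((u in l and v in r) or (u in r and v in l) for l, r in covers)
           for u, v in combinations(range(n), 2))
print("K_8 covered by", len(covers), "bipartite graphs")
```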

Notes

  1. Matthias Dehmer; Abbe Mowshowitz; Frank Emmert-Streib (21 June 2013). Advances in Network Complexity. John Wiley & Sons. pp. 186–. ISBN 978-3-527-67048-2.
  2. Körner, János (1973). "Coding of an information source having ambiguous alphabet and the entropy of graphs". 6th Prague Conference on Information Theory: 411–425.
  3. Niels da Vitoria Lobo; Takis Kasparis; Michael Georgiopoulos (24 November 2008). Structural, Syntactic, and Statistical Pattern Recognition: Joint IAPR International Workshop, SSPR & SPR 2008, Orlando, USA, December 4-6, 2008. Proceedings. Springer Science & Business Media. pp. 237–. ISBN 978-3-540-89688-3.
  4. Bernadette Bouchon; Lorenza Saitta; Ronald R. Yager (8 June 1988). Uncertainty and Intelligent Systems: 2nd International Conference on Information Processing and Management of Uncertainty in Knowledge Based Systems IPMU '88. Urbino, Italy, July 4-7, 1988. Proceedings. Springer Science & Business Media. pp. 112–. ISBN 978-3-540-19402-6.
  5. G. Simonyi, "Perfect graphs and graph entropy. An updated survey," Perfect Graphs, John Wiley and Sons (2001), pp. 293–328, Definition 2.