Methods of information theory and algorithmic complexity for network biology

Hector Zenil*, Narsis A. Kiani, Jesper Tegner

*Corresponding author for this work

Research output: Contribution to journal › Review article › peer-review

32 Scopus citations


We survey and introduce concepts and tools located at the intersection of information theory and network biology. We show that Shannon's information entropy, compressibility and algorithmic complexity quantify different local and global aspects of synthetic and biological data. We show examples such as the emergence of giant components in Erdős–Rényi random graphs, and the recovery of topological properties from numerical kinetic properties simulating gene expression data. We provide exact theoretical calculations, numerical approximations and error estimations of entropy, algorithmic probability and Kolmogorov complexity for different types of graphs, characterizing their variant and invariant properties. We introduce formal definitions of complexity for both labeled and unlabeled graphs and prove that the Kolmogorov complexity of a labeled graph is a good approximation of its unlabeled Kolmogorov complexity and thus a robust definition of graph complexity.
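The giant-component phenomenon mentioned in the abstract can be reproduced numerically: in the Erdős–Rényi model G(n, p), the largest connected component jumps from a vanishing fraction of nodes to a constant fraction as the mean degree np crosses 1. The sketch below is illustrative only (not code from the paper); the function name `largest_component_fraction` and the union-find implementation are assumptions chosen for a self-contained demonstration.

```python
import random

def largest_component_fraction(n, p, seed=0):
    """Fraction of nodes in the largest component of G(n, p), via union-find.

    Illustrative sketch: samples each of the n*(n-1)/2 possible edges
    independently with probability p, merging components as edges appear.
    """
    rng = random.Random(seed)
    parent = list(range(n))
    size = [1] * n

    def find(x):
        # Path-halving find.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                ri, rj = find(i), find(j)
                if ri != rj:
                    # Union by size.
                    if size[ri] < size[rj]:
                        ri, rj = rj, ri
                    parent[rj] = ri
                    size[ri] += size[rj]

    return max(size[find(i)] for i in range(n)) / n

if __name__ == "__main__":
    n = 500
    # Below the threshold np = 1 the largest component stays small;
    # above it, a giant component spanning a constant fraction emerges.
    for c in (0.5, 1.5, 3.0):
        frac = largest_component_fraction(n, c / n)
        print(f"mean degree ~ {c}: largest component holds {frac:.0%} of nodes")
```

For mean degree 3 the observed fraction should sit near the theoretical value S solving S = 1 − e^(−3S) (about 0.94), while below the threshold it stays at a few percent.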

Original language: English (US)
Pages (from-to): 32-43
Number of pages: 12
Journal: Seminars in Cell and Developmental Biology
State: Published - Mar 1 2016


Keywords

  • Algorithmic probability
  • Algorithmic randomness
  • Biological networks
  • Complex networks
  • Information theory
  • Kolmogorov complexity

ASJC Scopus subject areas

  • Developmental Biology
  • Cell Biology


