Network transfer entropy and metric space for causality inference

Phys Rev E Stat Nonlin Soft Matter Phys. 2013 May;87(5):052814. doi: 10.1103/PhysRevE.87.052814. Epub 2013 May 31.

Abstract

A measure is derived to quantify directed information transfer between pairs of vertices in a weighted network, over paths of a specified maximal length. Our approach employs a general, probabilistic model of network traffic, from which the informational distance between dynamics on two weighted networks can be naturally expressed as a Jensen-Shannon divergence. Our network transfer entropy measure is shown to distinguish and quantify causal relationships between network elements, in applications to simple synthetic networks and a biological signaling network. We conclude with a theoretical extension of our framework, in which the square root of the Jensen-Shannon divergence induces a metric on the space of dynamics on weighted networks. We prove a convergence criterion, demonstrating that a form of convergence in the structure of weighted networks in a family of matrix metric spaces implies convergence of their dynamics with respect to the square-root Jensen-Shannon divergence metric.
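The sketch below illustrates the core quantity mentioned in the abstract: a Jensen-Shannon divergence between distributions generated by dynamics on two weighted networks, with its square root used as a distance. This is a minimal illustration assuming a simple random-walk traffic model with paths averaged uniformly over lengths up to a cap; the function names, the example matrices, and that averaging scheme are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def transition_matrix(W):
    """Row-normalize a nonnegative weight matrix into random-walk transition probabilities."""
    row_sums = W.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0  # leave rows of isolated vertices unchanged
    return W / row_sums

def path_distribution(W, source, max_len):
    """Distribution of a walker started at `source`, averaged over path lengths 1..max_len
    (an assumed, simplified stand-in for the paper's probabilistic traffic model)."""
    P = transition_matrix(W)
    p = np.zeros(W.shape[0])
    p[source] = 1.0
    acc = np.zeros_like(p)
    for _ in range(max_len):
        p = p @ P
        acc += p
    return acc / max_len

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence (base 2) between two discrete distributions."""
    p, q = p + eps, q + eps
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log2(a / b))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Distance between the dynamics of two hypothetical weighted networks, seen from vertex 0:
# the square root of the Jensen-Shannon divergence, which the paper shows induces a metric.
W1 = np.array([[0, 2, 1], [0, 0, 3], [1, 0, 0]], dtype=float)
W2 = np.array([[0, 1, 2], [1, 0, 1], [0, 2, 0]], dtype=float)
d = np.sqrt(js_divergence(path_distribution(W1, 0, max_len=3),
                          path_distribution(W2, 0, max_len=3)))
print(f"sqrt-JSD distance between network dynamics: {d:.4f}")
```

Because the base-2 Jensen-Shannon divergence is bounded by 1, the resulting square-root distance lies in [0, 1]; the paper's contribution is to establish the metric property and a convergence criterion on the space of network dynamics, which this toy computation does not attempt to reproduce.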

MeSH terms

  • Algorithms*
  • Animals
  • Computer Simulation
  • Humans
  • Metabolome / physiology*
  • Models, Biological*
  • Models, Statistical*
  • Signal Transduction / physiology*