Entropy and information provide natural measures of correlation among elements in a network. We construct here the information theoretic analog of connected correlation functions: irreducible N-point correlation is measured by the decrease in entropy of the joint distribution of N variables relative to the maximum entropy allowed by all the observed (N-1)-variable distributions. We calculate these "connected information" terms for several examples and show that the same construction also enables a decomposition of the information that a population of elements carries about an outside source.
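As a minimal numerical sketch of the quantity defined above (my own illustration, not code from the paper): the N-point connected information is I_C^(N) = S[P~^(N-1)] - S[P^(N)], where P~^(N-1) is the maximum-entropy distribution consistent with all (N-1)-variable marginals. For the classic XOR triplet of binary variables (x3 = x1 XOR x2, with x1, x2 fair coins), all pairwise correlations vanish, yet one full bit resides in the third-order term. The max-entropy reference is built here with iterative proportional fitting; all function names are hypothetical.

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits of a probability array (any shape)."""
    q = p[p > 0]
    return float(-np.sum(q * np.log2(q)))

def maxent_pairwise(P, iters=200):
    """Max-entropy joint over 3 variables matching all pairwise marginals
    of P, computed by iterative proportional fitting (IPF)."""
    Q = np.full_like(P, 1.0 / P.size)
    for _ in range(iters):
        for i, j in [(0, 1), (0, 2), (1, 2)]:
            k = 3 - i - j                      # axis marginalized out
            target = P.sum(axis=k)
            current = Q.sum(axis=k)
            ratio = np.divide(target, current,
                              out=np.zeros_like(target), where=current > 0)
            Q = Q * np.expand_dims(ratio, axis=k)
    return Q

# XOR triplet: x3 = x1 XOR x2, with x1, x2 independent fair coins.
P = np.zeros((2, 2, 2))
for a in (0, 1):
    for b in (0, 1):
        P[a, b, a ^ b] = 0.25

# Third-order connected information: entropy deficit of the true joint
# relative to the max-entropy distribution with the same pairwise marginals.
Q = maxent_pairwise(P)
I_C3 = entropy_bits(Q) - entropy_bits(P)

# Second-order connected information for a pair is just mutual information;
# for the XOR triplet every pair is independent, so it vanishes.
P12 = P.sum(axis=2)
I_C2 = (entropy_bits(P12.sum(axis=1)) + entropy_bits(P12.sum(axis=0))
        - entropy_bits(P12))
```

Here I_C2 comes out to 0 bits while I_C3 is 1 bit: the joint distribution is strictly more ordered than any pairwise model can express, which is exactly the irreducible three-point structure the connected-information decomposition isolates.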

Schneidman, Elad; Still, Susanne; Berry, Michael J.; Bialek, William. Phys. Rev. Lett. 91, 238701. doi:10.1103/PhysRevLett.91.238701