Synergy, redundancy, and independence in population codes.

Title: Synergy, redundancy, and independence in population codes.
Publication Type: Journal Article
Year of Publication: 2003
Authors: Schneidman, E; Bialek, W; Berry, MJ
Journal: J Neurosci
Volume: 23
Issue: 37
Pagination: 11539-11553
Date Published: 2003 Dec 17
ISSN: 1529-2401
Keywords: Action Potentials; Animals; Models, Neurological; Nerve Net; Neurons; Synaptic Transmission
Abstract

A key issue in understanding the neural code for an ensemble of neurons is the nature and strength of correlations between neurons and how these correlations are related to the stimulus. The issue is complicated by the fact that there is not a single notion of independence or lack of correlation. We distinguish three kinds: (1) activity independence; (2) conditional independence; and (3) information independence. Each notion is related to an information measure: the information between cells, the information between cells given the stimulus, and the synergy of cells about the stimulus, respectively. We show that these measures form an interrelated framework for evaluating contributions of signal and noise correlations to the joint information conveyed about the stimulus and that at least two of the three measures must be calculated to characterize a population code. This framework is compared with others recently proposed in the literature. In addition, we distinguish questions about how information is encoded by a population of neurons from how that information can be decoded. Although information theory is natural and powerful for questions of encoding, it is not sufficient for characterizing the process of decoding. Decoding fundamentally requires an error measure that quantifies the importance of the deviations of estimated stimuli from actual stimuli. Because there is no a priori choice of error measure, questions about decoding cannot be put on the same level of generality as for encoding.
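The three measures named in the abstract can be illustrated on a toy discrete distribution. The sketch below is not from the paper: it uses a hypothetical XOR example in which a binary stimulus equals the exclusive-or of two binary "responses", so the cells are activity-independent yet purely synergistic. Synergy is computed here as I({R1,R2}; S) minus the sum of the single-cell informations, and the conditional information I(R1; R2 | S) via the chain rule.

```python
from collections import defaultdict
from math import log2

# Hypothetical toy distribution p(s, r1, r2): the stimulus s is the XOR
# of two binary responses r1, r2, each (r1, r2) pair equally likely.
p = {(r1 ^ r2, r1, r2): 0.25 for r1 in (0, 1) for r2 in (0, 1)}

def marginal(joint, axes):
    """Marginalize the joint dict onto the given tuple positions."""
    m = defaultdict(float)
    for outcome, prob in joint.items():
        m[tuple(outcome[a] for a in axes)] += prob
    return m

def mutual_info(joint, axes_x, axes_y):
    """I(X;Y) in bits, with X and Y given as tuples of positions."""
    px, py = marginal(joint, axes_x), marginal(joint, axes_y)
    pxy = marginal(joint, axes_x + axes_y)
    return sum(pr * log2(pr / (px[k[:len(axes_x)]] * py[k[len(axes_x):]]))
               for k, pr in pxy.items() if pr > 0)

# Tuple positions: 0 = stimulus S, 1 = response R1, 2 = response R2.
i_r1_r2 = mutual_info(p, (1,), (2,))          # information between cells
i_r1_r2_given_s = (mutual_info(p, (1,), (2, 0))
                   - mutual_info(p, (1,), (0,)))  # chain rule: I(R1;R2|S)
synergy = (mutual_info(p, (1, 2), (0,))
           - mutual_info(p, (1,), (0,))
           - mutual_info(p, (2,), (0,)))
```

In this example the cells are activity-independent (I(R1;R2) = 0 bits) but conditionally dependent given the stimulus (I(R1;R2|S) = 1 bit), and the pair is fully synergistic (synergy = 1 bit, since neither cell alone carries any information about S) — illustrating why at least two of the three measures are needed to characterize the code.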

Alternate Journal: J. Neurosci.
PubMed ID: 14684857