Mutual information


This page is a placeholder, or under current development; it is here principally to establish the logical framework of the site. The material on this page is correct, but incomplete.


A powerful concept within the mathematical theory of information, the Mutual Information of two variables measures how much knowing one variable reduces uncertainty about the other. For example, if two genes either always occur together in a genome or are both absent from it, knowing whether one is present is sufficient to know whether the other is present as well. In biology, genes with high mutual information are invariably either components of physical complexes or functional collaborators, so measuring mutual information in large datasets can be used to infer such relationships.
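To make this concrete: mutual information is defined as I(X;Y) = Σ_x Σ_y p(x,y) · log2[ p(x,y) / (p(x) p(y)) ], the expected log-ratio of the joint distribution to the product of the marginals. The short Python sketch below (not part of the original page; the gene profiles and names are invented for illustration) estimates I(X;Y) from hypothetical presence/absence profiles of genes across eight genomes, in the spirit of the phylogenetic-profile example above.

from math import log2
from collections import Counter

def mutual_information(x, y):
    # I(X;Y) in bits, estimated from empirical joint and marginal frequencies.
    assert len(x) == len(y)
    n = len(x)
    px  = Counter(x)            # marginal counts of X
    py  = Counter(y)            # marginal counts of Y
    pxy = Counter(zip(x, y))    # joint counts of (X, Y)
    mi = 0.0
    for (a, b), c in pxy.items():
        p_joint = c / n
        p_indep = (px[a] / n) * (py[b] / n)
        mi += p_joint * log2(p_joint / p_indep)
    return mi

# Hypothetical presence (1) / absence (0) profiles of three genes across eight genomes.
gene_A = [1, 1, 0, 0, 1, 0, 1, 0]
gene_B = [1, 1, 0, 0, 1, 0, 1, 0]   # always co-occurs with gene_A
gene_C = [1, 0, 1, 0, 0, 1, 1, 0]   # statistically independent of gene_A

print(mutual_information(gene_A, gene_B))   # 1.0 bit: knowing A fully determines B
print(mutual_information(gene_A, gene_C))   # 0.0 bits: knowing A says nothing about C

In practice such profiles would be compiled from ortholog presence across many genomes, and estimates from small samples are biased upward, but the principle is the same one the papers below build on.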



 

Introductory reading

Here is a useful introduction to the use of information theory, in particular mutual information, for the analysis of signal transduction networks.

Waltermann & Klipp (2011) Information theory based approaches to cellular signaling. Biochim Biophys Acta 1810:924-32. (pmid: 21798319)


Mutual information is at the core of a novel approach to quantify non-linear correlations in data. Read the perspective on this recent work here:

Speed (2011) Mathematics. A correlation for the 21st century. Science 334:1502-3. (pmid: 22174235)


The actual paper is here; have a look, but its contents will not be material for the quiz.

Reshef et al. (2011) Detecting novel associations in large data sets. Science 334:1518-24. (pmid: 22174245)



 


Further reading and resources

Wu et al. (2003) Identification of functional links between genes using phylogenetic profiles. Bioinformatics 19:1524-30. (pmid: 12912833)


Wu et al. (2005) Deciphering protein network organization using phylogenetic profile groups. Genome Inform 16:142-9. (pmid: 16362916)


Rao et al. (2008) Using directed information to build biologically relevant influence networks. J Bioinform Comput Biol 6:493-519. (pmid: 18574860)


Luo & Woolf (2010) Reconstructing transcriptional regulatory networks using three-way mutual information and Bayesian networks. Methods Mol Biol 674:401-18. (pmid: 20827604)


Speed (2011) Mathematics. A correlation for the 21st century. Science 334:1502-3. (pmid: 22174235)


Reshef et al. (2011) Detecting novel associations in large data sets. Science 334:1518-24. (pmid: 22174245)
