Revision as of 20:07, 16 November 2012
Machine Learning
Overview of "classical" and current approaches to machine learning.
Contents

- Introductory reading
- Introduction
- Paradigms
  - Neural Networks
  - Hidden Markov Models
  - Support Vector Machines
Further reading and resources
Hinton et al. (2006) A fast learning algorithm for deep belief nets. Neural Comput 18:1527-54. (pmid: 16764513)

We show how to use "complementary priors" to eliminate the explaining-away effects that make inference difficult in densely connected belief nets that have many hidden layers. Using complementary priors, we derive a fast, greedy algorithm that can learn deep, directed belief networks one layer at a time, provided the top two layers form an undirected associative memory. The fast, greedy algorithm is used to initialize a slower learning procedure that fine-tunes the weights using a contrastive version of the wake-sleep algorithm. After fine-tuning, a network with three hidden layers forms a very good generative model of the joint distribution of handwritten digit images and their labels. This generative model gives better digit classification than the best discriminative learning algorithms. The low-dimensional manifolds on which the digits lie are modeled by long ravines in the free-energy landscape of the top-level associative memory, and it is easy to explore these ravines by using the directed connections to display what the associative memory has in mind.
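The greedy, layer-wise procedure described in this abstract is built from restricted Boltzmann machine (RBM) updates. As a minimal illustrative sketch (not the paper's implementation — function names are hypothetical and biases are omitted for brevity), one contrastive-divergence (CD-1) weight update for a binary RBM in plain Python:

```python
import math
import random

random.seed(0)  # for reproducible sampling

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def cd1_step(v0, W, lr=0.1):
    """One CD-1 update for a binary RBM (illustrative; biases omitted).

    v0 : list of binary visible units (the data vector)
    W  : weight matrix as a list of lists, W[i][j] connecting
         visible unit i to hidden unit j (updated in place)
    """
    n_v, n_h = len(v0), len(W[0])
    # Positive phase: hidden probabilities and a sample given the data.
    ph0 = [sigmoid(sum(v0[i] * W[i][j] for i in range(n_v))) for j in range(n_h)]
    h0 = [1 if random.random() < p else 0 for p in ph0]
    # Negative phase: reconstruct visibles, then hidden probabilities again.
    pv1 = [sigmoid(sum(h0[j] * W[i][j] for j in range(n_h))) for i in range(n_v)]
    ph1 = [sigmoid(sum(pv1[i] * W[i][j] for i in range(n_v))) for j in range(n_h)]
    # Approximate gradient: <v h>_data - <v h>_reconstruction.
    for i in range(n_v):
        for j in range(n_h):
            W[i][j] += lr * (v0[i] * ph0[j] - pv1[i] * ph1[j])
    return W

# Usage: one update step from zero weights, 3 visible and 2 hidden units.
W = [[0.0] * 2 for _ in range(3)]
W = cd1_step([1, 0, 1], W)
```

In the full deep-belief-net scheme, such an RBM is trained on the data, its hidden activations become the "data" for the next RBM, and so on, one layer at a time.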
Lodhi, H. (2012) Computational biology perspective: kernel methods and deep learning. WIREs: Computational Statistics 4(5):455-465.

The field of machine learning provides useful means and tools for finding accurate solutions to complex and challenging biological problems. In recent years a class of learning algorithms namely kernel methods has been successfully applied to various tasks in computational biology. In this article we present an overview of kernel methods and support vector machines and focus on their applications to biological sequences. We also describe a new class of approaches that is termed as deep learning. These techniques have desirable characteristics and their use can be highly effective within the field of computational biology.
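As a concrete illustration of a kernel method on biological sequences, here is a minimal k-spectrum string kernel — the inner product of k-mer count vectors, a standard sequence kernel of the kind this review surveys. The function name and parameters below are illustrative, not taken from the article:

```python
from collections import Counter

def spectrum_kernel(s, t, k=3):
    """k-spectrum kernel: inner product of the k-mer count vectors of s and t.

    Similar sequences share more k-mers and therefore get a larger
    kernel value; the result can feed a kernel machine such as an SVM.
    """
    cs = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    ct = Counter(t[i:i + k] for i in range(len(t) - k + 1))
    return sum(cs[m] * ct[m] for m in cs)

# Usage: "GATTACA" has five distinct 3-mers, each occurring once,
# so its kernel value with itself is 5.
k_self = spectrum_kernel("GATTACA", "GATTACA", k=3)
k_diff = spectrum_kernel("AAAA", "CCCC", k=2)  # no shared 2-mers
```

A matrix of such pairwise values is a valid kernel (Gram) matrix and can be passed to an SVM implementation that accepts precomputed kernels.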