
Friday, March 4, 2005, at 10:00 a.m.

Relevance Determination in Learning Vector Quantization

Barbara Hammer (Technical University of Clausthal, Germany)

Abstract: Prototype-based classifiers such as Kohonen's Learning Vector Quantization (LVQ) constitute intuitive and efficient classification mechanisms in machine learning, with application areas ranging from speech recognition to bioinformatics. However, the basic algorithm has several drawbacks: it does not explicitly obey a cost function, so its behavior can be unstable, in particular for overlapping classes; and it depends crucially on the Euclidean metric, failing if dimensions are inappropriately scaled or disrupted. Problems therefore arise in particular for high-dimensional and noisy real-life data. In this talk, we introduce an alternative derivation of LVQ-type algorithms by means of an appropriate cost function. This general formulation allows arbitrary differentiable metrics to be incorporated into the problem. These metrics can be designed in a problem-dependent way, and metric parameters can be adapted automatically during training. We demonstrate the improved behavior on two examples, a technical application and an application from computational biology. In addition, we accompany the experimental results with a theoretical counterpart showing that, for specific metrics, the algorithm can be interpreted as a large-margin algorithm with formal generalization bounds from statistical learning theory.
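To give a concrete feel for metric adaptation in an LVQ-type scheme, the following minimal Python sketch uses a weighted squared Euclidean metric whose per-dimension relevance weights are adapted alongside the prototypes. The function names, learning rates, and the simplified relevance update are illustrative assumptions, not the exact cost-function formulation presented in the talk.

import numpy as np

def weighted_dist(x, w, lam):
    # Squared Euclidean distance with per-dimension relevance weights lam.
    return np.sum(lam * (x - w) ** 2)

def train_step(x, y, prototypes, labels, lam, eta_w=0.05, eta_l=0.01):
    # One online update: attract the closest prototype with the correct label,
    # repel the closest prototype with a wrong label, and adapt the relevances.
    dists = np.array([weighted_dist(x, w, lam) for w in prototypes])
    correct = np.where(labels == y)[0]
    wrong = np.where(labels != y)[0]
    j = correct[np.argmin(dists[correct])]  # closest correct prototype
    k = wrong[np.argmin(dists[wrong])]      # closest wrong prototype

    # LVQ-style prototype attraction / repulsion, scaled by the metric weights.
    prototypes[j] += eta_w * lam * (x - prototypes[j])
    prototypes[k] -= eta_w * lam * (x - prototypes[k])

    # Simplified relevance update (an assumption for illustration): increase the
    # weight of dimensions where the sample lies closer to the correct prototype
    # than to the wrong one, then renormalize.
    lam -= eta_l * ((x - prototypes[j]) ** 2 - (x - prototypes[k]) ** 2)
    lam = np.clip(lam, 0.0, None)
    lam /= max(lam.sum(), 1e-12)
    return prototypes, lam

# Toy usage: the class label depends only on dimension 0, so the relevance
# weight of that dimension should grow during training.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] > 0).astype(int)
prototypes = rng.normal(size=(2, 5))
labels = np.array([0, 1])
lam = np.full(5, 1.0 / 5)
for xi, yi in zip(X, y):
    prototypes, lam = train_step(xi, yi, prototypes, labels, lam)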

See online: Barbara Hammer's website
