Seminar series: Probabilités Statistiques et réseaux de neurones

Friday, March 11, 2005, at 10:00

Supervised Recursive Networks

Barbara Hammer (Technical University of Clausthal, Allemagne)

Abstract: Supervised recursive neural networks (RecNNs) are now well-established neural methods for automatically learning functions on tree-structured data, with applications ranging from natural language parsing, bioinformatics, and chemistry to logo recognition. Their training algorithms and theoretical properties, such as approximation capability and learnability, are well understood. In recent years there has been increasing interest in generalizing these models, originally proposed for recursive structures with a distinct direction of processing, to more general objects such as spatial sequences and graphs; however, results on their capacity in these domains remain rare. In this talk we first give an overview of recursive neural networks, their training methods, applications, and theoretical properties. We then examine one specific generalization of RecNNs to graph structures which includes context. This model is based on the popular Cascade Correlation (CC) architecture, a very powerful and fast learning architecture. We discuss the universal approximation capability of this model in several variants: the original CC architecture for time series, an extension to tree processing, and an extension by means of context which can approximate mappings on acyclic graph structures (including real-valued outputs as well as IO-isomorphic mappings).
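To make the core idea concrete, here is a minimal sketch of a supervised recursive network over binary trees: each node's state is computed bottom-up by combining its label with its children's states, and a readout at the root produces the prediction. All names, dimensions, and the simple tanh transition are illustrative assumptions, not the speaker's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)
STATE_DIM, LABEL_DIM = 4, 3

# Parameters (untrained, random): W mixes a node's label with its
# (left, right) child states; V is a linear readout at the root.
W = rng.normal(scale=0.1, size=(STATE_DIM, LABEL_DIM + 2 * STATE_DIM))
V = rng.normal(scale=0.1, size=(1, STATE_DIM))

def encode(tree):
    """Bottom-up encoding; a tree is (label, left, right) or None."""
    if tree is None:
        return np.zeros(STATE_DIM)            # state of the empty tree
    label, left, right = tree
    x = np.concatenate([label, encode(left), encode(right)])
    return np.tanh(W @ x)                     # recursive transition

def predict(tree):
    return float(V @ encode(tree))            # real-valued output at the root

leaf = (np.ones(LABEL_DIM), None, None)
root = (np.zeros(LABEL_DIM), leaf, leaf)
print(predict(root))
```

Training such a model amounts to fitting W and V by backpropagation through the tree structure; the generalizations discussed in the talk replace this strictly directed, acyclic recursion with schemes that also carry context over more general graph structures.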

See online: Barbara Hammer's website
