
Friday, November 9, 2007, 11:00 am

Non-parametric Residual Variance Estimation in Supervised Learning with Applications

Elia Liitiäinen (Helsinki University of Technology, Finland)

Abstract: The residual variance estimation problem is well known in statistics and machine learning, with many applications, for example in nonlinear modelling. In this presentation I formulate the problem in a general supervised learning context. Three different statistical estimators are introduced and their properties are analyzed. It is shown that a simple 1-NN estimator allows a tractable theoretical analysis but tends to be inaccurate. Two improvements are presented: the Gamma test and a modified 1-NN estimator. The Gamma test is a well-established method, while the modified 1-NN estimator is novel. I then discuss concrete applications in variable selection, model structure selection and ICA. As a special case of model structure selection, a method for choosing the number of neurons in an MLP is demonstrated. Moreover, I show how residual variance estimators can be used to select relevant variables and avoid the curse of dimensionality. Finally, a recently published application in ICA is briefly presented.
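To illustrate the basic idea behind the simplest estimator mentioned in the abstract, here is a minimal sketch of a first-nearest-neighbour (Delta test) residual variance estimator. It is an assumption-laden illustration, not a reproduction of the speaker's estimators: the function name `delta_test`, the brute-force neighbour search, and the toy data are all illustrative, and the Gamma test and modified 1-NN estimator discussed in the talk are not implemented here.

```python
# Minimal sketch of a 1-NN (Delta test) residual variance estimator.
# Assumption: data follow y = f(x) + eps with smooth f and i.i.d. noise eps;
# the squared output difference between nearest neighbours then has
# expectation close to 2 * Var(eps).
import numpy as np

def delta_test(X, y):
    """Estimate the noise variance Var(eps) in y = f(X) + eps
    from output differences of first nearest neighbours."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    # Pairwise squared Euclidean distances (brute force, O(n^2) memory).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    np.fill_diagonal(d2, np.inf)      # exclude self-matches
    nn = d2.argmin(axis=1)            # index of each point's nearest neighbour
    # E[(y_i - y_NN(i))^2] ~ 2 * sigma^2 when f is smooth, hence the 1/2.
    return 0.5 * np.mean((y - y[nn]) ** 2)

# Toy usage: noisy sine with true noise variance 0.1^2 = 0.01.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(500, 1))
y = np.sin(2 * np.pi * X[:, 0]) + rng.normal(0.0, 0.1, size=500)
print(delta_test(X, y))               # should be roughly 0.01
```

In a variable-selection setting of the kind mentioned in the abstract, such an estimator could in principle be evaluated on different input subsets and the subset with the lowest estimated residual variance retained, though the talk's own selection procedures may differ.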
