Density estimation with minimization of U-divergence

Speaker:

Assoc. Professor Kanta Naito

Affiliation:

Shimane University (Japan)

Date:

Fri, 04/03/2011 - 4:00pm to 5:00pm

OMB-145

Abstract:

Recently, there has been renewed widespread interest in supervised learning problems such as regression, classification, and pattern recognition.
Boosting is known as a promising technique with computationally feasible algorithms, and it has received a great deal of attention.
In contrast to supervised learning, boosting approaches to unsupervised learning, such as density estimation, appear to be less developed.
Although it is understood that unsupervised learning is more difficult than supervised learning, there is a need for an effective learning method for density estimation.
The purpose of this study is to develop a general but practical learning method for multivariate density estimation.
In particular, the proposed method for density estimation is based on stagewise minimization of the $U$-divergence.
The $U$-divergence is a general divergence measure involving a convex function $U$ which includes the Kullback-Leibler divergence and the $L_{2}$ norm as special cases.
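For reference, the $U$-divergence between densities $g$ and $f$ is typically written as follows (notation here follows Eguchi's formulation and is illustrative, not necessarily that of the talk):

```latex
D_U(g, f) = \int \Bigl\{ U\bigl(\xi(f(x))\bigr) - U\bigl(\xi(g(x))\bigr)
  - g(x)\bigl[\xi(f(x)) - \xi(g(x))\bigr] \Bigr\}\, dx,
\qquad \xi = (U')^{-1}.
```

By convexity of $U$, $D_U(g,f) \ge 0$ with equality iff $f = g$. Taking $U(t) = \exp(t)$, so that $\xi(s) = \log s$, recovers the (extended) Kullback-Leibler divergence; taking $U(t) = t^2/2$, so that $\xi(s) = s$, gives $\tfrac{1}{2}\int (f-g)^2\,dx$, half the squared $L_2$ distance.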
The algorithm that yields the density estimator is closely related to the boosting algorithm, and it is shown that the usual kernel density estimator arises as a special case of the proposed estimator.
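For context, the usual kernel density estimator mentioned above has the familiar form $\hat{f}(x) = (nh)^{-1}\sum_i K((x - X_i)/h)$. A minimal sketch with a Gaussian kernel (this illustrates only the standard baseline estimator, not the proposed stagewise method):

```python
import numpy as np

def gaussian_kde(data, x, bandwidth):
    """Standard Gaussian kernel density estimator:
    f_hat(x) = (1 / (n * h)) * sum_i phi((x - X_i) / h)."""
    data = np.asarray(data, dtype=float)
    n = data.size
    # Pairwise standardized distances between grid points and observations.
    u = (x[:, None] - data[None, :]) / bandwidth
    phi = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
    return phi.sum(axis=1) / (n * bandwidth)

# Example: estimate the density of a standard normal sample on a grid.
rng = np.random.default_rng(0)
sample = rng.normal(size=200)
grid = np.linspace(-4.0, 4.0, 801)
f_hat = gaussian_kde(sample, grid, bandwidth=0.4)
```

The bandwidth here is a fixed scalar chosen for illustration; the talk's numerical comparisons concern multivariate estimators with a full bandwidth matrix, where such tuning is considerably harder.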
Non-asymptotic error bounds for the proposed estimators are developed, and numerical experiments show that the proposed estimators often outperform a kernel density estimator with a sophisticated bandwidth matrix.
This research is joint work with Shinto Eguchi of The Institute of Statistical Mathematics.