|Title:||Laplacian margin distribution boosting for learning from sparsely labeled data|
|Citation:||Proceedings of the International Conference on Digital Image Computing: Techniques and Applications: DICTA 2011: pp. 209-216|
|Conference Name:||Digital Image Computing Techniques and Applications (2011 : Noosa, Qld.)|
|Author:||Tao Wang, Xuming He, Chunhua Shen, and Nick Barnes|
|Abstract:||Boosting algorithms attract much attention in computer vision and image processing because of their strong performance in a variety of applications. Recent progress on the theory of boosting suggests a close link between good generalization and the margin distribution of the classifier w.r.t. a dataset. In this paper, we propose a novel data-dependent margin distribution learning criterion for boosting, termed Laplacian MDBoost, which exploits the intrinsic geometric structure of the dataset. One key aspect of our method is that it can seamlessly incorporate unlabeled data by including a graph Laplacian regularizer. We derive a dual formulation of the learning problem that can be efficiently solved by column generation. Experiments on various datasets validate the effectiveness of the new graph Laplacian based learning criterion in both supervised and semi-supervised learning settings. We also show that our algorithm outperforms state-of-the-art semi-supervised learning algorithms on a variety of inductive inference tasks, including real-world video segmentation.|
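As background for the abstract's central idea, the sketch below shows how a graph Laplacian regularizer lets unlabeled data constrain a classifier: a smoothness penalty fᵀLf is small when the classifier's scores vary slowly over a similarity graph built from all samples, labeled or not. This is a minimal illustration of the general technique, not the paper's Laplacian MDBoost implementation; the Gaussian affinity and the function names are assumptions made for the example.

```python
import numpy as np

def graph_laplacian(X, sigma=1.0):
    # Dense Gaussian-affinity graph over the samples (rows of X):
    # W[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2)), zero diagonal.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    D = np.diag(W.sum(axis=1))
    return D - W  # unnormalized graph Laplacian L

def laplacian_penalty(f, L):
    # f^T L f = (1/2) * sum_ij W_ij * (f_i - f_j)^2,
    # i.e. the penalty grows when similar (strongly connected)
    # points receive very different classifier scores f.
    return float(f @ L @ f)
```

Note that building the graph uses only the feature vectors X, never the labels; this is how the unlabeled portion of a sparsely labeled dataset contributes to the learning criterion.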
|Keywords:||margin distribution boosting; Laplacian eigenmaps; semi-supervised learning|
|Rights:||Copyright © 2011 by The Institute of Electrical and Electronics Engineers, Inc.|
|Appears in Collections:||Computer Science publications|