Please use this identifier to cite or link to this item: https://hdl.handle.net/2440/60934
Full metadata record
DC Field: Value
dc.contributor.author: Wang, H.
dc.contributor.author: Mirota, D.
dc.contributor.author: Hager, G.
dc.date.issued: 2010
dc.identifier.citation: IEEE Transactions on Pattern Analysis and Machine Intelligence, 2010; 32(1):178-184
dc.identifier.issn: 0162-8828
dc.identifier.issn: 1939-3539
dc.identifier.uri: http://hdl.handle.net/2440/60934
dc.description.abstract: In this paper, we present a new adaptive-scale kernel consensus (ASKC) robust estimator as a generalization of popular, state-of-the-art robust estimators such as random sample consensus (RANSAC), adaptive scale sample consensus (ASSC), and the maximum kernel density estimator (MKDE). The ASKC framework is grounded on and unifies these robust estimators using nonparametric kernel density estimation theory. In particular, we show that each of these methods is a special case of ASKC with a specific kernel. Like these methods, ASKC can tolerate more than 50 percent outliers, but it can also automatically estimate the scale of inliers. We apply ASKC to two important areas in computer vision, robust motion estimation and pose estimation, and show comparative results on both synthetic and real data.
dc.description.statementofresponsibility: Hanzi Wang, Daniel Mirota, Gregory D. Hager
dc.language.iso: en
dc.publisher: IEEE Computer Soc
dc.rights: © 2010 IEEE
dc.source.uri: http://dx.doi.org/10.1109/tpami.2009.148
dc.subject: Robust statistics
dc.subject: kernel density estimation
dc.subject: model fitting
dc.subject: motion estimation
dc.subject: pose estimation
dc.title: A generalized kernel consensus-based robust estimator
dc.type: Journal article
dc.identifier.doi: 10.1109/TPAMI.2009.148
pubs.publication-status: Published
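The abstract describes scoring model hypotheses by the kernel density of their residuals, so that the inlier scale need not be supplied as a threshold. As a rough illustration of that idea only — this is a minimal sketch of a generic kernel-consensus line fit, not the authors' ASKC algorithm, and the bandwidth rule and function name here are assumptions for the example — one might write:

```python
import numpy as np

def fit_line_kernel_consensus(points, n_iters=500, seed=0):
    """Illustrative kernel-consensus line fit (not the ASKC algorithm itself).

    points: (n, 2) array of (x, y) samples.
    Hypotheses are generated RANSAC-style from minimal two-point samples,
    but instead of counting inliers against a fixed threshold, each
    hypothesis is scored by a Gaussian-kernel density of its residuals
    evaluated at zero, with a data-driven bandwidth.
    """
    rng = np.random.default_rng(seed)
    n = len(points)
    best_score, best_model = -np.inf, None
    for _ in range(n_iters):
        i, j = rng.choice(n, size=2, replace=False)
        (x1, y1), (x2, y2) = points[i], points[j]
        if x1 == x2:                       # skip degenerate vertical sample
            continue
        a = (y2 - y1) / (x2 - x1)          # slope hypothesis
        b = y1 - a * x1                    # intercept hypothesis
        r = points[:, 1] - (a * points[:, 0] + b)   # residuals of all points
        # Data-driven bandwidth from the median absolute residual
        # (1.4826 scales MAD to a normal-consistent sigma; floor avoids 0).
        h = max(1.4826 * np.median(np.abs(r)), 1e-6)
        # Gaussian-kernel density of the residual distribution at zero:
        # high when many residuals cluster tightly around the model.
        score = np.mean(np.exp(-0.5 * (r / h) ** 2)) / h
        if score > best_score:
            best_score, best_model = score, (a, b)
    return best_model
```

Because the bandwidth is recomputed from each hypothesis's own residuals, a model that explains a tight majority of the data concentrates its density at zero and wins, even when a large fraction of the points are gross outliers.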
Appears in Collections: Aurora harvest 5
Computer Science publications

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.