Please use this identifier to cite or link to this item: https://hdl.handle.net/2440/56429
Type: Conference paper
Title: Robust Fitting by Adaptive-Scale Residual Consensus
Author: Wang, H.
Suter, D.
Citation: Computer vision, ECCV 2004: Proceedings of the 8th European Conference on Computer Vision, Part III. May 11-14, 2004 / Tomáš Pajdla and Jiří Matas (eds.): pp.107-118
Publisher: Springer
Publisher Place: Berlin
Issue Date: 2004
Series/Report no.: Lecture Notes in Computer Science, Computer Vision - ECCV 2004 ; v. 3023
ISBN: 354021982X
ISSN: 0302-9743 (print); 1611-3349 (online)
Conference Name: European Conference on Computer Vision (8th : 2004 : Prague, Czech Republic)
Editor: Pajdla, T.
Matas, J.
Statement of Responsibility: Hanzi Wang and David Suter
Abstract: Computer vision tasks often require the robust fit of a model to some data. In a robust fit, two major steps should be taken: i) robustly estimate the parameters of a model, and ii) differentiate inliers from outliers. We propose a new estimator called Adaptive-Scale Residual Consensus (ASRC). ASRC scores a model based on both the residuals of inliers and the corresponding scale estimate determined by those inliers. ASRC is very robust to multiple-structural data containing a high percentage of outliers. Compared with RANSAC, ASRC requires no pre-determined inlier threshold as it can simultaneously estimate the parameters of a model and the scale of inliers belonging to that model. Experiments show that ASRC has better robustness to heavily corrupted data than other robust methods. Our experiments address two important computer vision tasks: range image segmentation and fundamental matrix calculation. However, the range of potential applications is much broader than these.
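The abstract's central idea, scoring a candidate model by both the residuals of its inliers and a scale estimated from those same residuals, rather than against a fixed RANSAC threshold, can be illustrated with a minimal sketch. This is a hedged, simplified illustration for 2D line fitting, not the paper's exact ASRC objective: the MAD-based scale estimate, the inlier cutoff `k`, and the score `n_inliers - sum(residuals)/scale` are all assumptions chosen for clarity.

```python
import random

def fit_line(p, q):
    # Line y = a*x + b through two sample points (None if vertical).
    (x1, y1), (x2, y2) = p, q
    if x2 == x1:
        return None
    a = (y2 - y1) / (x2 - x1)
    return a, y1 - a * x1

def adaptive_scale_fit(points, iters=500, k=2.5, seed=0):
    # RANSAC-style sampling, but the inlier scale is estimated per
    # hypothesis instead of being a user-supplied threshold.
    rng = random.Random(seed)
    best = None
    for _ in range(iters):
        model = fit_line(*rng.sample(points, 2))
        if model is None:
            continue
        a, b = model
        residuals = sorted(abs(y - (a * x + b)) for x, y in points)
        # Robust scale from the median absolute residual (MAD-like;
        # 1.4826 makes it consistent for Gaussian noise). Epsilon
        # guards the exact-fit case.
        scale = 1.4826 * residuals[len(residuals) // 2] + 1e-9
        inliers = [r for r in residuals if r < k * scale]
        # Hypothetical score: reward many inliers whose residuals are
        # small relative to the estimated scale.
        score = len(inliers) - sum(inliers) / scale
        if best is None or score > best[0]:
            best = (score, a, b, scale)
    return best

# Synthetic data: 20 inliers on y = 2x + 1 plus 5 gross outliers
# from a second structure, y = 40 - 3x.
pts = [(x, 2 * x + 1) for x in range(20)] + [(x, 40 - 3 * x) for x in range(5)]
_, a, b, s = adaptive_scale_fit(pts)
```

Because the scale shrinks when a hypothesis fits one structure tightly, the dominant structure wins the score without any pre-set inlier threshold, which mirrors the advantage over RANSAC claimed in the abstract.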
DOI: 10.1007/b97871
Published version: http://www.springerlink.com/content/6lxbwmqgatb4/
Appears in Collections: Aurora harvest
Computer Science publications

Files in This Item:
There are no files associated with this item.