Please use this identifier to cite or link to this item:
Type: Conference paper
Title: A general two-step approach to learning-based hashing
Author: Lin, G.
Shen, C.
Suter, D.
Van Den Hengel, A.
Citation: Proceedings, 2013 IEEE International Conference on Computer Vision, ICCV 2013: pp.2552-2559
Publisher: IEEE Computer Society
Publisher Place: USA
Issue Date: 2013
Series/Report no.: IEEE International Conference on Computer Vision
ISBN: 9781479928392
ISSN: 1550-5499
Conference Name: IEEE International Conference on Computer Vision (14th : 2013 : Sydney, Australia)
Statement of Responsibility: Guosheng Lin, Chunhua Shen, David Suter, Anton van den Hengel
Abstract: Most existing approaches to hashing apply a single form of hash function, and an optimization process which is typically deeply coupled to this specific form. This tight coupling restricts the flexibility of the method to respond to the data, and can result in complex optimization problems that are difficult to solve. Here we propose a flexible yet simple framework that is able to accommodate different types of loss functions and hash functions. This framework allows a number of existing approaches to hashing to be placed in context, and simplifies the development of new problem-specific hashing methods. Our framework decomposes the hashing learning problem into two steps: hash bit learning and hash function learning based on the learned bits. The first step can typically be formulated as binary quadratic problems, and the second step can be accomplished by training standard binary classifiers. Both problems have been extensively studied in the literature. Our extensive experiments demonstrate that the proposed framework is effective, flexible and outperforms the state-of-the-art.
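The two-step decomposition described in the abstract can be illustrated with a small sketch. This is not the authors' exact formulation: it stands in a spectral-relaxation solver for the Step 1 binary quadratic problem and per-bit linear least-squares classifiers for Step 2; the data, affinity construction, and all variable names are illustrative assumptions.

```python
# Hedged sketch of a generic two-step hashing pipeline (not the paper's
# exact method). Step 1: learn binary codes from a pairwise affinity
# matrix via a spectral relaxation of the binary quadratic problem.
# Step 2: train one classifier per bit so the hash function applies to
# unseen points.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two Gaussian clusters in 5-D.
X = np.vstack([rng.normal(-1.0, 0.3, (20, 5)),
               rng.normal(+1.0, 0.3, (20, 5))])
n, n_bits = X.shape[0], 4

# Pairwise RBF affinity, doubly centred so eigenvectors separate clusters.
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
A = np.exp(-D2 / D2.mean())
A -= A.mean(axis=0, keepdims=True)
A -= A.mean(axis=1, keepdims=True)

# Step 1 (relaxed): maximise tr(B^T A B) over B in {-1,+1}^{n x k} by
# taking the sign of the top-k eigenvectors of A as the learned bits.
_, V = np.linalg.eigh(A)
B = np.sign(V[:, -n_bits:])
B[B == 0] = 1

# Step 2: fit one least-squares linear classifier per bit to predict the
# learned bits from the features; this is the out-of-sample hash function.
W, *_ = np.linalg.lstsq(X, B, rcond=None)
hash_codes = np.sign(X @ W)
hash_codes[hash_codes == 0] = 1
```

Because the two steps are decoupled, either component can be swapped independently: a different loss in Step 1 changes only the binary quadratic problem, and any standard binary classifier (e.g. boosted trees or kernel machines) can replace the linear fit in Step 2.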
Rights: © 2013 IEEE
RMID: 0020137399
DOI: 10.1109/ICCV.2013.317
Appears in Collections: Computer Science publications

Files in This Item:
File: RA_hdl_83879.pdf
Description: Restricted Access
Size: 321.49 kB
Format: Adobe PDF

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.