Please use this identifier to cite or link to this item: http://hdl.handle.net/2440/115918
Type: Conference paper
Title: Active learning from noisy tagged images
Author: Abbasnejad, M.E.
Dick, A.R.
Shi, Q.
Hengel, A.V.D.
Citation: Proceedings of BMVC 2018 and Workshops, 2018 / pp.1-13
Publisher: BMVA Press
Issue Date: 2018
Conference Name: British Machine Vision Conference 2018 (BMVC 2018) (03 Sep 2018 - 06 Sep 2018 : Newcastle upon Tyne)
Statement of Responsibility: M. Ehsan Abbasnejad, Anthony Dick, Qinfeng Shi, Anton van den Hengel
Abstract: Learning successful image classification models requires large quantities of labelled examples, which are generally hard to obtain. On the other hand, the web provides an abundance of loosely labelled images, i.e. images tagged on websites such as Flickr. Although these images are cheap and available in vast quantities, their tags are typically noisy and unreliable. In an attempt to use such images for training a classifier, we propose a simple probabilistic model that learns a latent semantic space from which deep vector representations of the images and tags are generated. This latent space is subsequently used in an active learning framework, based on adaptive submodular optimisation, that selects informative images to be labelled. Afterwards, we update the classifier according to the importance of each labelled image, to best capture the information it provides. Through this simple approach, we are able to train a classifier that performs well using a fraction of the effort typically required for image labelling and classifier training.
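The selection step the abstract describes can be illustrated with a minimal sketch. Note this is not the paper's method: it substitutes plain entropy-based uncertainty sampling for the paper's adaptive submodular criterion, and every name below (`predict`, the toy pool, the budget) is illustrative, not taken from the paper.

```python
import math

def entropy(probs):
    """Shannon entropy of a predictive distribution (uncertainty score)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def select_informative(pool, predict, budget):
    """Greedily pick the `budget` most uncertain images from the unlabelled pool.

    A crude stand-in for adaptive submodular selection: rank images by the
    entropy of the classifier's predicted class distribution and label the
    top `budget` of them. `predict` maps an image to class probabilities
    (hypothetical interface).
    """
    scored = sorted(pool, key=lambda x: entropy(predict(x)), reverse=True)
    return scored[:budget]

# Toy pool: each "image" is just an id; predictions are fixed for illustration.
fake_predictions = {
    "img_a": [0.98, 0.01, 0.01],   # confident prediction -> uninformative
    "img_b": [0.34, 0.33, 0.33],   # near-uniform -> most informative
    "img_c": [0.70, 0.20, 0.10],
}
picked = select_informative(list(fake_predictions), fake_predictions.get, budget=2)
print(picked)  # -> ['img_b', 'img_c']
```

In the paper's framework, this greedy loop would additionally re-weight the classifier update by each labelled image's importance; the sketch only covers the selection step.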
Rights: © 2018. The copyright of this document resides with its authors. It may be distributed unchanged freely in print or electronic forms.
RMID: 0030099027
Published version: http://bmvc2018.org/programmedetail.html
Appears in Collections: Australian Institute for Machine Learning publications
Computer Science publications

Files in This Item:
File: hdl_115918.pdf | Description: Published version | Size: 3.93 MB | Format: Adobe PDF

