|Title:||Clustering with Hypergraphs: The Case for Large Hyperedges|
|Citation:||IEEE Transactions on Pattern Analysis and Machine Intelligence, 2017; 39(9):1697-1711|
|Author(s):||Pulak Purkait, Tat-Jun Chin, Alireza Sadri, and David Suter|
|Abstract:||The extension of conventional clustering to hypergraph clustering, which involves higher-order similarities instead of pairwise similarities, is gaining increasing attention in computer vision. This is because many clustering problems require an affinity measure defined over data subsets of size greater than two. In the context of hypergraph clustering, the calculation of such higher-order similarities on data subsets gives rise to hyperedges. Almost all previous work on hypergraph clustering in computer vision, however, has considered only the smallest possible hyperedge size, due to a lack of study into the potential benefits of large hyperedges and of effective algorithms to generate them. In this paper, we show that large hyperedges are better from both a theoretical and an empirical standpoint. We then propose a novel guided sampling strategy for large hyperedges, based on the concept of random cluster models. Our method can generate large pure hyperedges that significantly improve grouping accuracy without exponential increases in sampling costs. We demonstrate the efficacy of our technique on various higher-order grouping problems. In particular, we show that our approach improves the accuracy and efficiency of motion segmentation from dense, long-term trajectories.|
|Keywords:||Higher order grouping; hypergraph clustering; motion segmentation|
|Description:||Date of publication 3 Oct. 2016; date of current version 11 Aug. 2017.|
|Rights:||© 2016 IEEE|
|Appears in Collections:||Computer Science publications|