Title: Relative order analysis and optimization for unsupervised deep metric learning
Authors: Kan, S.
Cen, Y.
Li, Yidong
Mladenovic, Vladimir
He, Z.
Journal: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
Issue Date: 1-Jan-2021
Abstract: In unsupervised learning of image features without labels, especially on datasets with fine-grained object classes, it is often very difficult to tell if a given image belongs to one specific object class or another, even for human eyes. However, we can reliably tell if image C is more similar to image A than image B. In this work, we propose to explore how this relative order can be used to learn discriminative features with an unsupervised metric learning method. Instead of resorting to clustering or self-supervision to create pseudo labels for an absolute decision, which often suffers from high label error rates, we construct reliable relative orders for groups of image samples and learn a deep neural network to predict these relative orders. During training, this relative order prediction network and the feature embedding network are tightly coupled, providing mutual constraints to each other to improve overall metric learning performance in a cooperative manner. During testing, the predicted relative orders are used as constraints to optimize the generated features and refine their feature distance-based image retrieval results using a constrained optimization procedure. Our experimental results demonstrate that the proposed relative orders for unsupervised learning (ROUL) method is able to significantly improve the performance of unsupervised deep metric learning.
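The abstract describes mining reliable relative orders (image C is more similar to A than B is) and training against them with a margin-based objective. As a rough illustration only — the function names, the margin formulation, and the distance-gap heuristic below are assumptions, not the paper's actual coupled order-prediction/embedding networks — a minimal sketch of mining confident triples and scoring them might look like:

```python
import numpy as np

def relative_order_loss(anchor, closer, farther, margin=0.2):
    """Hinge loss encouraging d(anchor, closer) + margin <= d(anchor, farther).
    A simplified triplet-style stand-in for the paper's relative-order
    objective (assumption: Euclidean distances in embedding space)."""
    d_close = np.linalg.norm(anchor - closer)
    d_far = np.linalg.norm(anchor - farther)
    return max(0.0, d_close - d_far + margin)

def mine_relative_orders(X, gap=0.5):
    """Emit (anchor, closer, farther) index triples only when the distance
    gap is large enough to treat the relative order as reliable, avoiding
    the absolute pseudo-label decisions the abstract argues against.
    `gap` is a hypothetical confidence threshold."""
    n = len(X)
    # Pairwise Euclidean distance matrix.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    triples = []
    for a in range(n):
        order = np.argsort(D[a])          # order[0] is the anchor itself
        near, far = order[1], order[-1]   # nearest and farthest neighbors
        if D[a, far] - D[a, near] > gap:  # keep only confident orderings
            triples.append((a, near, far))
    return triples
```

In the actual method, such mined orders would supervise a relative-order prediction network trained jointly with the embedding network; this sketch only shows the ordering constraint itself.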
Type: Conference Paper
DOI: 10.1109/CVPR46437.2021.01378
ISSN: 1063-6919
SCOPUS: 85123215318
Appears in Collections:Faculty of Technical Sciences, Čačak
Files in This Item:
File: Restricted Access (29.86 kB, Adobe PDF)

Items in SCIDAR are protected by copyright, with all rights reserved, unless otherwise indicated.