Image search by graph-based label propagation with image representation from DNN

Yingwei Pan, Ting Yao, Kuiyuan Yang, Houqiang Li, Chong-Wah Ngo, Jingdong Wang, Tao Mei

Research output: Chapters, Conference Papers, Creative and Literary Works; RGC 32 - Refereed conference paper (with host publication); peer-review

Abstract

Our objective is to estimate the relevance of an image to a query for image search purposes. We address two limitations of existing image search engines in this paper. First, there is no straightforward way of bridging the gap between semantic textual queries, as well as users' search intents, and image visual content. Image search engines therefore primarily rely on static and textual features; visual features are mainly used to identify potentially useful recurrent patterns or relevant training examples for complementing search by image reranking. Second, image rankers are trained on query-image pairs labeled by human experts, making the annotation intellectually expensive and time-consuming. Furthermore, the labels may be subjective when the queries are ambiguous, resulting in difficulty in predicting the search intention. We demonstrate that the aforementioned two problems can be mitigated by exploring the use of click-through data, which can be viewed as the footprints of user searching behavior, as an effective means of understanding queries.

The correspondence between an image and a query is determined by whether the image was searched for and clicked by users under that query in a commercial image search engine. We therefore hypothesize that the click counts an image receives in response to a query serve as indications of its relevance. For each new image, our proposed graph-based label propagation algorithm employs neighborhood graph search to find the nearest neighbors on an image similarity graph built with visual representations from deep neural networks, and further aggregates their clicked queries/click counts to obtain the labels of the new image. We conduct experiments on the MSR-Bing Grand Challenge, and the results show consistent performance gains over various baselines. In addition, the proposed approach is very efficient, completing the annotation of each query-image pair within just 15 milliseconds on a regular PC. Copyright © 2013 ACM.
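The core idea sketched in the abstract, estimating a new image's query relevance by aggregating the clicked queries and click counts of its nearest neighbors in DNN feature space, can be illustrated roughly as follows. This is a minimal sketch, not the authors' implementation: exact k-nearest-neighbor search stands in for the paper's approximate neighborhood graph search, and the function name and similarity-weighted aggregation are assumptions for illustration.

```python
import numpy as np

def propagate_labels(new_feat, index_feats, index_clicks, k=5):
    """Estimate query relevance for a new image by aggregating the
    clicked queries/click counts of its k nearest neighbors in a
    DNN feature space (cosine similarity).

    new_feat     : 1-D feature vector of the new image
    index_feats  : 2-D array, one feature vector per indexed image
    index_clicks : list of {query: click_count} dicts, one per image
    """
    # Cosine similarity between the new image and every indexed image.
    a = new_feat / np.linalg.norm(new_feat)
    B = index_feats / np.linalg.norm(index_feats, axis=1, keepdims=True)
    sims = B @ a

    # Pick the k most similar images (exact search here; the paper
    # uses neighborhood graph search for efficiency at scale).
    top = np.argsort(-sims)[:k]

    # Aggregate each neighbor's click counts, weighted by similarity.
    scores = {}
    for i in top:
        for query, count in index_clicks[i].items():
            scores[query] = scores.get(query, 0.0) + sims[i] * count
    return scores
```

Queries whose click counts are concentrated among visually similar indexed images thus receive the highest relevance scores for the new image.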
Original language: English
Title of host publication: MM 2013 - Proceedings of the 2013 ACM Multimedia Conference
Pages: 397-400
Publication status: Published - 2013
Event: 21st ACM International Conference on Multimedia, MM 2013 - Barcelona, Spain
Duration: 21 Oct 2013 - 25 Oct 2013

Conference

Conference: 21st ACM International Conference on Multimedia, MM 2013
Place: Spain
City: Barcelona
Period: 21/10/13 - 25/10/13

Research Keywords

  • Click-through data
  • Deep neural networks
  • Image search
  • Neighborhood graph search
