Please use this identifier to cite or link to this item: https://une.intersearch.com.au/unejspui/handle/1959.11/307
Title: On a relaxation-labeling algorithm for real-time contour-based image similarity retrieval
Contributor(s): Kwan, PH (author); Kameyama, K (author); Toraichi, K (author)
Publication Date: 2003
DOI: 10.1016/S0262-8856(02)00159-2
Handle Link: https://hdl.handle.net/1959.11/307
Abstract: In this paper, we propose a relaxation-labeling algorithm for real-time contour-based image similarity retrieval that treats the matching between two images as a consistent labeling problem. To satisfy real-time response, our algorithm works by reducing the size of the labeling problem, thus decreasing the processing required. This is accomplished by adding compatibility constraints on contour segments between the images to reduce the size of the relational network and the order of the compatibility coefficient matrix. In particular, a relatively strong type constraint based on approximating contour segments by straight lines, arcs, and smooth curves is introduced. A distance metric, defined using the negative of an objective function maximized by the relaxation labeling process, is used in computing the similarity ranking. Experiments are conducted on 700 trademark images from the Japan Patent Office for evaluation.
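The abstract describes maximizing a consistency objective by relaxation labeling and using its negative as a distance. As context, the standard Rosenfeld–Hummel update is sketched below; this is a generic illustration, not the authors' exact formulation, and the compatibility tensor `r` and the assumption of nonnegative coefficients are hypothetical choices for the sketch.

```python
import numpy as np

def relaxation_labeling(r, n_iter=50):
    """Generic Rosenfeld-Hummel relaxation labeling (illustrative sketch).

    r: compatibility coefficients, shape (n, L, n, L), where
       r[i, a, j, b] >= 0 is the compatibility of giving label a
       to object i while object j has label b.
    Returns the final label-probability matrix p (n x L) and the
    average-consistency objective the iteration climbs; a distance
    could then be taken as the negative of that objective.
    """
    n, L = r.shape[0], r.shape[1]
    p = np.full((n, L), 1.0 / L)           # uniform initial labeling
    for _ in range(n_iter):
        # support q[i, a] = sum_j sum_b r[i, a, j, b] * p[j, b]
        q = np.einsum('iajb,jb->ia', r, p)
        p = p * q                          # reinforce well-supported labels
        p /= p.sum(axis=1, keepdims=True)  # renormalize per object
    objective = float(np.einsum('ia,ia->', p,
                                np.einsum('iajb,jb->ia', r, p)))
    return p, objective
```

Shrinking the label set per object (as the paper's type constraint on straight-line/arc/curve segments does) directly shrinks `r` and the cost of each `einsum`, which is the source of the real-time speed-up.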
Publication Type: Journal Article
Source of Publication: Image and Vision Computing, 21(3), p. 285-294
Publisher: Elsevier
Place of Publication: Netherlands
ISSN: 0262-8856
Field of Research (FOR): 080109 Pattern Recognition and Data Mining
Peer Reviewed: Yes
HERDC Category Description: C1 Refereed Article in a Scholarly Journal
Statistics to Oct 2018: Visitors: 205
Views: 220
Downloads: 0
Appears in Collections: Journal Article

Files in This Item:
2 files

SCOPUS™ Citations: 10 (checked on Nov 26, 2018)

Page view(s): 42 (checked on Dec 29, 2018)
Items in Research UNE are protected by copyright, with all rights reserved, unless otherwise indicated.