%0 Journal Article %@ 21693536 %A Minh, P.V.T. %A Viet, N.D.D. %A Son, N.T. %A Anh, B.N. %A Jaafar, J. %D 2023 %F scholars:19334 %I Institute of Electrical and Electronics Engineers Inc. %J IEEE Access %K Binary codes; Convolution; Deep learning; Hash functions; Image representation; Image retrieval; Nearest neighbor search; Network coding; Semantic segmentation; Semantics; Supervised learning; Code; Convolutional neural network; Loss measurement; Neural networks; Optimisations; Quantisation; Quantization (signal); Supervised deep hashing %P 30094-30108 %R 10.1109/ACCESS.2023.3259104 %T RelaHash: Deep Hashing with Relative Position %U https://khub.utp.edu.my/scholars/19334/ %V 11 %X Deep hashing has been widely used to encode binary hash codes for the approximate nearest neighbor problem. It has shown superior performance in indexing high-level features by learning compact binary codes. Many recent state-of-the-art deep hashing methods use multiple loss terms at once, introducing optimization difficulties that may result in sub-optimal hash codes. OrthoHash was proposed to replace those losses with a single loss function. However, the quantization error minimization problem in OrthoHash is still not addressed effectively. In this paper, we take one step further and propose a single-loss model that can effectively minimize the quantization error without explicit loss terms. Specifically, we introduce a new way to measure the similarity between the relaxed codes and the centroids, called relative similarity. The relative similarity is the similarity between the relative position representation of the continuous codes and the normalized centroids. The resulting model outperforms many state-of-the-art deep hashing models on popular benchmark datasets. © 2013 IEEE. %Z cited By 2