Image retrieval has seen great breakthroughs with the development of computer vision. Feature extraction is crucial to image retrieval: a good method not only simplifies image recognition but also improves the performance of the retrieval system. Many traditional methods extract only shallow features, such as brightness statistics, color components, and texture measures, which miss contextual semantic information and perform poorly in image retrieval. Feature extraction based on deep learning, by contrast, can obtain better features that carry semantic information, and much work has shown that integrating neural networks with hash codes improves retrieval performance. Encouraged by this, we propose an image hash retrieval algorithm that optimizes structures in the deep layer aggregation network (DlaNet). The model consists of an improved network, DlaNet-V, and a four-valued hash code scheme. DlaNet-V is optimized on the basis of DlaNet to improve efficiency, and the binary hash codes are replaced with quaternary hash codes to make the model more robust and efficient. Experiments are conducted on the CIFAR-10 data set and a medical device image data set collected by the authors. Results show that image retrieval based on DlaNet-V and quaternary hash codes is more accurate and stable.
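The abstract does not spell out how the quaternary hash codes are formed; a minimal sketch, assuming each feature dimension is quantized against three thresholds so that it carries two bits (one of four symbols) instead of the single bit of a binary hash, could look like this. The threshold values and the symbol-wise distance below are illustrative assumptions, not the paper's exact scheme:

```python
def quaternary_hash(features, thresholds=(-0.5, 0.0, 0.5)):
    # Map each real-valued feature to a symbol in {0, 1, 2, 3} by counting
    # how many thresholds it exceeds; each dimension thus encodes two bits.
    # The threshold values here are illustrative placeholders.
    return [sum(x > t for t in thresholds) for x in features]

def code_distance(a, b):
    # Symbol-wise mismatch count: a Hamming-style distance for base-4 codes,
    # usable for ranking candidates in a retrieval index.
    return sum(x != y for x, y in zip(a, b))

# Toy example: quantize two feature vectors and compare their codes.
q = quaternary_hash([-0.9, 0.1, 0.7, -0.2])   # -> [0, 2, 3, 1]
d = quaternary_hash([-0.8, 0.6, 0.8, -0.3])   # -> [0, 3, 3, 1]
print(q, d, code_distance(q, d))               # distance 1
```

In a real system the thresholds would typically be learned or derived from the feature distribution rather than fixed constants.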
Original language: English (US)
Title of host publication: 2019 15th International Conference on Semantics, Knowledge and Grids (SKG)
Number of pages: 6
State: Published - Mar 23 2020
Bibliographical note: KAUST Repository Item: Exported on 2022-06-30
Acknowledgements: This research was sponsored by the Natural Science Foundation of Nanjing University of Posts and Telecommunications NUPTSF (grant no. NY219080) and the China Postdoctoral Science Foundation (grant no. 2018T110531). The authors also thank King Abdullah University of Science and Technology for the support on computational resources.
This publication acknowledges KAUST support, but has no KAUST affiliated authors.