A Novel Approach for Robust Perceptual Image Hashing

  •  Azhar Hadmi    
  •  Awatif Rouijel    


A perceptual image hashing system generates a short signature, called a perceptual hash, that is attached to an image before transmission and acts as side information for assessing the trustworthiness of the received image. In this paper, we propose a novel approach that improves the robustness of perceptual image hashing: the generated hash should be resistant to content-preserving manipulations, such as JPEG compression and additive white Gaussian noise (AWGN), while still distinguishing a maliciously tampered image from its original version. Our algorithm first constructs a robust image, derived from the original input, by analyzing the stability of the extracted features and improving their robustness. From this robust image, which perceptually resembles the original input, we extract the final robust features. These features are then suitably quantized, allowing the generation of the final perceptual hash using the cryptographic hash function SHA-1. The main idea of this paper is to transform the original image into a more robust one from which robust features can be extracted. Generating the robust image turns out to be quite important, since it introduces further robustness into the perceptual image hashing system. The paper can be seen as an attempt to propose a general methodology for more robust perceptual image hashing. The experimental results presented in this paper reveal that the proposed scheme offers good robustness against JPEG compression and AWGN.
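The pipeline described above (extract features, quantize them so that content-preserving changes fall within a quantization bin, then apply a cryptographic hash) can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the coarse block-mean features and the quantization step `q_step` are assumptions standing in for the robust features the paper derives from its robust image.

```python
import hashlib
import numpy as np

def perceptual_hash(image, q_step=16):
    """Hedged sketch of a quantize-then-hash perceptual hashing pipeline.

    Coarse 8x8 block means stand in for the paper's robust features;
    quantization absorbs small content-preserving perturbations
    (e.g. mild JPEG artifacts or low-power AWGN) before SHA-1 is applied.
    """
    img = np.asarray(image, dtype=np.float64)
    h, w = img.shape
    bh, bw = h // 8, w // 8
    # Split the image into an 8x8 grid of blocks and average each block.
    feats = img[: bh * 8, : bw * 8].reshape(8, bh, 8, bw).mean(axis=(1, 3))
    # Quantize: perturbations smaller than q_step usually stay in the same bin,
    # so the final hash is unchanged; large (malicious) changes cross bins.
    q = np.floor(feats / q_step).astype(np.int64)
    # Cryptographic hash of the quantized feature vector yields the final hash.
    return hashlib.sha1(q.tobytes()).hexdigest()
```

A small global brightness shift leaves the quantized features, and hence the hash, unchanged, while replacing the content produces a different hash; this illustrates why the quantization step is what buys robustness before the (intentionally sensitive) cryptographic hash is applied.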

This work is licensed under a Creative Commons Attribution 4.0 License.
  • ISSN(Print): 1913-8989
  • ISSN(Online): 1913-8997
  • Started: 2008
  • Frequency: quarterly
