Image Quality Assessment for Gaussian Blur Using a Siamese Network Combined with ResNet-18
Abstract
This paper presents a novel Image Quality Assessment (IQA) framework, SNR (Siamese Network with ResNet-18), designed specifically for Gaussian blur detection. The approach combines a Siamese network architecture with a ResNet-18 backbone to process image pairs (one blurred, one reference) and predicts image quality from the differences between them. The model effectively captures the high-frequency features lost to blur, such as edges and gradients. Extensive experiments on the TID2013 dataset show that SNR outperforms other full-reference methods on blur-specific IQA tasks. Data augmentation significantly improves generalization, yielding a test accuracy of 97.37% with the ResNet-18 backbone. The proposed method correlates strongly with human judgment and generalizes robustly across varied image content; future work will extend the framework to other distortion types and optimize its computational efficiency.
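To make the paired-input design concrete, the following is a minimal PyTorch sketch of a Siamese IQA model with a weight-shared ResNet-18 encoder. It is illustrative only: the absolute-difference feature fusion, the two-layer quality head, and the module names are assumptions introduced here, since the abstract does not specify these implementation details.

```python
import torch
import torch.nn as nn
import torchvision.models as models

class SiameseResNet18(nn.Module):
    """Illustrative Siamese IQA model (not the authors' exact architecture):
    two weight-shared ResNet-18 branches encode the reference and the blurred
    image, and their feature difference is mapped to a scalar quality score."""

    def __init__(self):
        super().__init__()
        backbone = models.resnet18(weights=None)
        # Drop the ImageNet classification head; keep the 512-d feature extractor.
        self.encoder = nn.Sequential(*list(backbone.children())[:-1])
        # Assumed quality head: a small MLP regressing a single score.
        self.head = nn.Sequential(
            nn.Linear(512, 128),
            nn.ReLU(inplace=True),
            nn.Linear(128, 1),
        )

    def forward(self, reference: torch.Tensor, blurred: torch.Tensor) -> torch.Tensor:
        # Weight sharing: the same encoder processes both inputs.
        f_ref = self.encoder(reference).flatten(1)
        f_blur = self.encoder(blurred).flatten(1)
        # Absolute feature difference emphasizes the high-frequency content
        # (edges, gradients) attenuated by Gaussian blur.
        return self.head(torch.abs(f_ref - f_blur)).squeeze(1)

if __name__ == "__main__":
    model = SiameseResNet18()
    ref = torch.randn(2, 3, 224, 224)    # reference images
    blur = torch.randn(2, 3, 224, 224)   # Gaussian-blurred counterparts
    print(model(ref, blur).shape)        # torch.Size([2])
```

The scalar-regression head shown here is one plausible choice; the reported 97.37% test accuracy suggests the authors may instead formulate the task as classification over blur levels, which would replace the final linear layer with a multi-class output.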