Non-reference underwater video quality assessment method for small size samples
2020, Vol. 25, No. 9, Pages: 1787-1799
Print publication date: 2020-09-16
Accepted: 2020-03-16
DOI: 10.11834/jig.200025
Wei Song, Shimeng Liu, Dongmei Huang, Wenjuan Wang, Jian Wang. Non-reference underwater video quality assessment method for small size samples[J]. Journal of Image and Graphics, 2020, 25(9): 1787-1799.
Objective
Video quality assessment is one of the key topics in video technology research. Underwater environments are more complex than other natural environments: natural light is completely absorbed in deep water, and the artificial light sources used for shooting undergo absorption, dispersion, and scattering as they propagate through water. Combined with the effects of water turbidity and imaging equipment, these factors leave underwater videos with very low spatial visibility and strong temporal instability, so conventional video quality assessment methods cannot evaluate them accurately and effectively. Considering the characteristics of underwater video, this paper proposes an underwater video quality assessment method suitable for small samples that combines spatial-domain statistical characteristics with coding parameters.
Method
Based on the imaging characteristics of underwater video, a new underwater video database is established, and a subjective quality assessment procedure is designed to annotate all videos with quality scores from 1 to 5. Frame images are extracted from the underwater videos and their distortion statistics are computed in the spatial domain; these statistics are then combined with video coding parameters, and the weight coefficients of a linear model are trained to complete the quality assessment of underwater videos.
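As an illustrative sketch of the spatial-domain statistics step, BRISQUE-style mean-subtracted contrast-normalized (MSCN) coefficients of a frame can be computed as below; the window size, sigma, file name, and the summary statistics chosen here are assumptions for illustration, not the exact feature set used in the paper.

```python
import cv2
import numpy as np

def mscn(gray, sigma=7 / 6):
    """Mean-subtracted contrast-normalized (MSCN) coefficients of a grayscale frame."""
    gray = gray.astype(np.float64)
    mu = cv2.GaussianBlur(gray, (7, 7), sigma)
    sigma_map = np.sqrt(np.abs(cv2.GaussianBlur(gray * gray, (7, 7), sigma) - mu * mu))
    return (gray - mu) / (sigma_map + 1.0)

frame = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)  # placeholder frame path
coeffs = mscn(frame)
# Simple distortion statistics of the MSCN distribution; the full BRISQUE feature set
# additionally fits generalized Gaussian models to these coefficients and to their
# pairwise products at two scales.
features = [coeffs.mean(), coeffs.var(), float((coeffs < 0).mean())]
print(features)
```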
Result
Experiments show that, compared with several mainstream quality assessment methods, the proposed underwater video quality assessment method has the highest correlation with human visual perception: the Pearson's correlation coefficient (PCC) between the model's predictions and the subjective quality scores is 0.840 8, and the Spearman's rank order correlation coefficient (SROCC) is 0.832 2. Comparing the mean square error (MSE) between each method's predictions and the ground-truth scores, the proposed method has the smallest MSE of 0.113 1, indicating that its quality assessment results are more stable.
Conclusion
By fusing the natural scene statistics of single frames in the spatial domain with video coding parameters, the proposed no-reference underwater video quality assessment method can use a small-sample underwater video dataset to build an assessment model that is highly correlated with human visual perception, and thus evaluates underwater videos more accurately.
Objective
The application of underwater video technology has a history of more than 60 years. This technology plays an important role in promoting research on marine bioecology, fish species, and underwater object detection and tracking. Video quality assessment is one of the key areas in video technology research. Such assessment is especially vital for underwater videos because underwater environments are more complex than atmospheric ones. On the one hand, natural sunlight is strongly absorbed in deep water, and the artificial light used in video shooting suffers from absorption, dispersion, and scattering due to water turbidity and submarine topography. As a result, underwater videos have blurred pictures, low contrast, color cast, and uneven lighting. On the other hand, underwater video quality is affected by the limitations of photography equipment and the influence of water flow. When shooting a moving object, the lens can hardly be kept stable and its motion becomes unsmooth. Compared with videos shot in natural above-water scenes, underwater videos are characterized by large lens movement, shaking, and severe defocus. These characteristics make it difficult for conventional video quality assessment (VQA) methods to evaluate underwater videos accurately and effectively. Thus, the "quality" of underwater videos must be redefined, and an effective quality assessment method must be established. In this study, we establish an underwater video dataset by considering underwater video imaging characteristics, annotate its video quality via subjective quality assessment, and propose an objective underwater video quality assessment model on the basis of spatial naturalness and a video compression index.
Method
First, a new underwater video dataset is established: 1) several underwater videos captured in real deep-sea environments are collected as source data; 2) these videos are preliminarily filtered to include different underwater scenes; 3) the preliminarily screened videos are cut into clips at intervals of 10 seconds; 4) the short video sequences are refiltered to cover different shooting characteristics and color diversity, thus generating 25 video sequences with rich color information, different video contents, and different underwater video features; and 5) the dataset is expanded using different frame rates and bit rates as compression parameters. A total of 250 (25+25×3×3) video sequences are obtained. Then, subjective quality assessment is conducted: the absolute category rating (ACR) method is used by 20 participants to annotate all 250 videos with scores ranging from 1 to 5.
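A minimal sketch of turning the raw ACR ratings into per-video subjective scores is given below; the ratings matrix and the simple averaging into mean opinion scores (MOS) are assumptions for illustration, since the paper does not detail the score aggregation.

```python
import numpy as np

# Hypothetical ratings matrix: 20 participants x 250 videos, ACR scores in {1, ..., 5}.
ratings = np.random.randint(1, 6, size=(20, 250)).astype(float)

# Mean opinion score (MOS) per video: average over the 20 participants.
mos = ratings.mean(axis=0)
# Per-video standard deviation, useful for spotting videos with inconsistent ratings.
mos_std = ratings.std(axis=0)
print(mos.shape, mos_std.shape)  # (250,) (250,)
```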
Then, we consider influences on underwater video quality from three aspects: spatial, temporal, and compression features. The spatial features are expressed by natural scene statistics distortion characteristics in the spatial domain and are calculated with the blind/referenceless image spatial quality evaluator (BRISQUE) algorithm. The temporal features are expressed by optical flow motion features: we first compute the dense optical flow field between adjacent frames and then extract the mean and variance of the overall optical flow and the mean and variance of the optical flow of the main objects in the video.
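For illustration, these dense optical flow statistics could be computed along the lines of the following sketch (OpenCV's Farneback estimator with common default parameters; the clip path and the percentile-based approximation of the "main object" region are assumptions, since the paper does not describe the exact segmentation).

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("underwater_clip.mp4")  # placeholder path
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

magnitudes = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Dense optical flow (Farneback); returns an HxWx2 displacement field.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    magnitudes.append(mag)
    prev_gray = gray
cap.release()

mags = np.stack(magnitudes)                 # (num_frame_pairs, H, W)
overall_mean, overall_var = mags.mean(), mags.var()
# "Main object" motion is approximated here by the largest-magnitude pixels;
# the actual object region used in the paper is not specified.
obj_mask = mags > np.percentile(mags, 90)
object_mean, object_var = mags[obj_mask].mean(), mags[obj_mask].var()
```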
The compression features are resolution, frame rate, and bit rate, which are easy-to-access video coding parameters.
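These coding parameters can be read directly from the video files; the sketch below (an assumption about tooling, not necessarily how the dataset metadata were collected) uses OpenCV and the file size.

```python
import os
import cv2

path = "underwater_clip.mp4"  # placeholder path
cap = cv2.VideoCapture(path)
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
fps = cap.get(cv2.CAP_PROP_FPS)
n_frames = cap.get(cv2.CAP_PROP_FRAME_COUNT)
cap.release()

duration = n_frames / fps if fps else 0.0
# Approximate average bit rate from the file size; container overhead is ignored.
bitrate_kbps = os.path.getsize(path) * 8 / duration / 1000 if duration else 0.0
print(width, height, fps, round(bitrate_kbps), "kbit/s")
```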
Considering the redundancy and relevancy of these candidate features, we analyze the correlations among the features and between the features and the subjective quality scores. We then select 21 features as influence factors, comprising only the 18 spatial natural scene statistics features and the three compression indexes; the temporal features are excluded. Lastly, we establish a linear model on the selected features to evaluate underwater video quality objectively through linear regression with cross validation.
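A minimal sketch of this last step, assuming the 250×21 feature matrix X and the subjective score vector y are already available (scikit-learn and a 5-fold split are used here for convenience; the paper only states linear regression with cross validation):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold

# Placeholder data: 250 videos, 18 spatial NSS features + 3 coding parameters.
X = np.random.rand(250, 21)
y = 1 + 4 * np.random.rand(250)  # placeholder subjective scores in [1, 5]

kfold = KFold(n_splits=5, shuffle=True, random_state=0)
preds = np.empty_like(y)
for train_idx, test_idx in kfold.split(X):
    model = LinearRegression().fit(X[train_idx], y[train_idx])
    preds[test_idx] = model.predict(X[test_idx])
# `preds` now holds cross-validated objective quality predictions for every video.
```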
Result
Experimental results show that the proposed underwater video quality assessment model based on spatial naturalness and compression indexes obtains the highest correlation with subjective scores in comparison with several mainstream quality assessment models, including two underwater image quality indices (the underwater image quality measure (UIQM) and the underwater color image quality evaluation (UCIQE) metric), a natural image quality distortion index (BRISQUE), and a video quality assessment model (the video intrinsic integrity and distortion evaluation oracle (VIIDEO)). Performance evaluation is based on Pearson's correlation coefficient (PCC), Spearman's rank order correlation coefficient (SROCC), and the mean squared error (MSE) between the predicted video quality scores of each model and the subjective scores. On the test video dataset, our method achieves the highest correlation (PCC=0.840 8, SROCC=0.832 2) and the minimum MSE value of 0.113 1. These results indicate that our method is more stable and predicts video quality more accurately than the other methods. By contrast, the video quality assessment model VIIDEO can hardly provide correct results, whereas UIQM and UCIQE perform poorly, with PCC and SROCC values of 0.3 to 0.4. BRISQUE performs relatively better than these methods but still worse than ours.
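The three evaluation criteria can be computed as in the following sketch; the two score arrays are hypothetical stand-ins for the per-video model predictions and subjective scores.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

# Hypothetical per-video scores: model predictions vs. subjective quality scores.
predicted = np.array([3.1, 2.4, 4.0, 1.8, 3.6])
subjective = np.array([3.0, 2.2, 4.3, 1.5, 3.8])

pcc, _ = pearsonr(predicted, subjective)      # linear correlation (PCC)
srocc, _ = spearmanr(predicted, subjective)   # rank-order correlation (SROCC)
mse = np.mean((predicted - subjective) ** 2)  # mean squared error (MSE)
print(f"PCC={pcc:.4f}  SROCC={srocc:.4f}  MSE={mse:.4f}")
```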
Conclusion
Underwater videos are characterized by blurred pictures, low contrast, color distortion, uneven lighting, large lens movement, and defocus. To achieve an accurate assessment of underwater video quality, we fully consider the characteristics and shooting conditions of underwater videos and establish a labeled underwater video dataset through subjective video quality assessment. By fitting a linear regression model to the subjective quality scores using the natural scene statistics of video frames and the video compression parameters, we propose an objective underwater video quality assessment model. The proposed no-reference underwater video quality assessment method can establish a prediction model that is highly correlated with human visual perception from only a small-sized underwater video dataset.
Keywords: video quality assessment; objective quality assessment model; underwater video; natural scene statistics; compression parameters
De Simone F, Tagliasacchi M, Naccari M, Tubaro S and Ebrahimi T. 2010. A H.264/AVC video database for the evaluation of quality metrics//Proceedings of 2010 IEEE International Conference on Acoustics, Speech and Signal Processing. Dallas: IEEE: 2430-2433[DOI:10.1109/ICASSP.2010.5496296]
Duda R O and Hart P E. 1973. Pattern Classification and Scene Analysis. New York: Wiley
Field D J. 1999. Wavelets, vision and the statistics of natural scenes. Philosophical Transactions of the Royal Society A:Mathematical, Physical and Engineering Sciences, 357(1760):2527-2542[DOI:10.1098/rsta.1999.0446]
Guo J C, Li C Y, Zhang Y and Gu X Y. 2017. Quality assessment method for underwater images. Journal of Image and Graphics, 22(1):1-8[DOI:10.11834/jig.20170101]
ITU-R. 2002. Methodology for the subjective assessment of the quality of television pictures. Recommendation BT.500-11[EB/OL].[2020-01-01]. https://www.itu.int/rec/R-REC-BT.500
ITU-R. 2007. Methodology for the subjective assessment of video quality in multimedia applications. Recommendation BT.1788[EB/OL].[2020-01-01]. https://www.itu.int/rec/R-REC-BT.1788/en
ITU-T. 1999. Subjective video quality assessment methods for multimedia applications. Recommendation P.910[EB/OL].[2020-01-01]. https://www.itu.int/rec/T-REC-P.910/en
Lasmar N E, Stitou Y and Berthoumieu Y. 2009. Multiscale skewed heavy tailed model for texture analysis//Proceedings of the 16th IEEE International Conference on Image Processing. Cairo: IEEE: 2281-2284[DOI:10.1109/ICIP.2009.5414404]
Legge G E and Foley J M. 1980. Contrast masking in human vision. Journal of the Optical Society of America, 70(12):1458-1471[DOI:10.1364/JOSA.70.001458]
Mittal A, Moorthy A K and Bovik A C. 2012. No-reference image quality assessment in the spatial domain. IEEE Transactions on Image Processing, 21(12):4695-4708[DOI:10.1109/TIP.2012.2214050]
Mittal A, Saad M A and Bovik A C. 2016. A completely blind video integrity oracle. IEEE Transactions on Image Processing, 25(1):289-300[DOI:10.1109/TIP.2015.2502725]
Mittal A, Soundararajan R and Bovik A C. 2013. Making a "completely blind" image quality analyzer. IEEE Signal Processing Letters, 20(3):209-212[DOI:10.1109/LSP.2012.2227726]
Moorthy A K and Bovik A C. 2011. Blind image quality assessment:from natural scene statistics to perceptual quality. IEEE Transactions on Image Processing, 20(12):3350-3364[DOI:10.1109/TIP.2011.2147325]
Moreno-Roldán J M, Luque-Nieto M Á, Poncela J and Otero P. 2017. Objective video quality assessment based on machine learning for underwater scientific applications. Sensors, 17(4):664[DOI:10.3390/s17040664]
Moreno-Roldán J M, Poncela J, Otero P and Bovik A C. 2018. A no-reference video quality assessment model for underwater networks. IEEE Journal of Oceanic Engineering, 45(1):342-346[DOI:10.1109/JOE.2018.2869441]
Panetta K, Gao C and Agaian S. 2016. Human-visual-system-inspired underwater image quality measures. IEEE Journal of Oceanic Engineering, 41(3):541-551[DOI:10.1109/JOE.2015.2469915]
Ruderman D L. 1994. The statistics of natural images. Network:Computation in Neural Systems, 5(4):517-548[DOI:10.1088/0954-898X_5_4_006]
Saad M A, Bovik A C and Charrier C. 2014. Blind prediction of natural video quality. IEEE Transactions on Image Processing, 23(3):1352-1365[DOI:10.1109/TIP.2014.2299154]
Seshadrinathan K, Soundararajan R, Bovik A C and Cormack L K. 2010. Study of subjective and objective quality assessment of video. IEEE Transactions on Image Processing, 19(6):1427-1441[DOI:10.1109/TIP.2010.2042111]
Sharifi K and Leon-Garcia A. 1995. Estimation of shape parameter for generalized Gaussian distributions in subband decompositions of video. IEEE Transactions on Circuits and Systems for Video Technology, 5(1):52-56[DOI:10.1109/76.350779]
Wang Y, Li N, Li Z Y, Gu Z R, Zheng H Y, Zheng B and Sun M M. 2018. An imaging-inspired no-reference underwater color image quality assessment metric. Computers & Electrical Engineering, 70:904-913[DOI:10.1016/j.compeleceng.2017.12.006]
Wang Y, Song W, Fortino G, Qi L Z, Zhang W Q, and Liotta A. 2019. An experimental-based review of image enhancement and image restoration methods for underwater imaging. IEEE Access, 7:140233-140251[DOI:10.1109/ACCESS.2019.2932130]
Yang M and Sowmya A. 2015. An underwater color image quality evaluation metric. IEEE Transactions on Image Processing, 24(12):6062-6071[DOI:10.1109/TIP.2015.2491020]
Zhong Y, Richardson I, Sahraie A and McGeorge P. 2004. Influence of task and scene content on subjective video quality//Campilho A, Kamel M, eds. International Conference on Image Analysis and Recognition. Porto: Springer: 295-301[DOI:10.1007/978-3-540-30125-7_37]