Location information-fused estimation method for tropical cyclone intensity
2023, Vol. 28, No. 8, Pages 2522-2535
Print publication date: 2023-08-16
DOI: 10.11834/jig.220348
Liu Yingjie, Zhang Rui, Liu Qingshan, Hang Renlong. 2023. Location information-fused estimation method for tropical cyclone intensity. Journal of Image and Graphics, 28(08): 2522-2535
Objective
Accurate estimation of tropical cyclone intensity helps improve the accuracy of weather forecasting and early warning. With the continuous development of deep learning, methods based on convolutional neural networks (CNNs) have been applied to the intensity estimation task. However, existing methods still have many limitations; for example, they cannot make full use of satellite images from different spectral bands, and the input images must be centered on the located tropical cyclone, which produces large errors and affects real-time estimation. To address these problems, this paper proposes an intensity estimation network fusing location information, IEFL (intensity estimation fusing location), to improve the accuracy of intensity estimation.
Method
The model adopts a two-branch structure that effectively fuses image features from different spectral bands, and the two tasks are optimized jointly so that they promote each other. In addition, the model fuses location features into the intensity estimation task: the obtained location feature maps are concatenated with the intensity feature maps to jointly produce the final intensity output, so that the location information improves the accuracy of intensity estimation.
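As a concrete illustration of this two-branch, two-task design, a minimal PyTorch-style sketch is given below. All module names, layer sizes, and the way location information is injected (here the predicted center is concatenated with the fused image features) are illustrative assumptions, not the published IEFL configuration.

import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # 3 x 3 convolution with stride 2, halving the spatial resolution.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, stride=2, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

class TwoBranchTCNet(nn.Module):
    # Hypothetical two-branch, two-task sketch (not the published IEFL network).
    def __init__(self):
        super().__init__()
        # One feature extractor per satellite channel (e.g., infrared / water vapor).
        self.branch_a = nn.Sequential(conv_block(1, 16), conv_block(16, 32), conv_block(32, 64))
        self.branch_b = nn.Sequential(conv_block(1, 16), conv_block(16, 32), conv_block(32, 64))
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.loc_head = nn.Linear(128, 2)        # center location (e.g., latitude/longitude offsets)
        self.int_head = nn.Linear(128 + 2, 1)    # intensity, conditioned on the location output

    def forward(self, x_a, x_b):
        f = torch.cat([self.pool(self.branch_a(x_a)).flatten(1),
                       self.pool(self.branch_b(x_b)).flatten(1)], dim=1)   # fuse the two channels
        loc = self.loc_head(f)                                             # location task output
        intensity = self.int_head(torch.cat([f, loc], dim=1))              # location-aware intensity
        return intensity.squeeze(1), loc

# Example forward pass with two single-channel 512 x 512 images per sample.
model = TwoBranchTCNet()
v, center = model(torch.randn(2, 1, 512, 512), torch.randn(2, 1, 512, 512))
print(v.shape, center.shape)   # torch.Size([2]) torch.Size([2, 2])

In the actual model the location branch's feature maps, rather than its final output, are concatenated with the intensity feature maps; the sketch only mirrors the overall data flow.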
Result
While estimating tropical cyclone intensity, the method also obtains good tropical cyclone center location results. Multi-channel Himawari-8 satellite images from 2015 to 2018 were collected to train the model, and data from 2019 and 2020 were used for testing. The results show that, after fusing location information, the root mean square error of the intensity estimates is 4.74 m/s and the mean absolute error is 3.52 m/s, reductions of 7% and 9%, respectively, compared with a conventional single-task intensity estimation model.
Conclusion
The IEFL model effectively improves the accuracy of intensity estimation without depending on the accuracy of the location result.
Objective
A tropical cyclone can produce severe weather such as strong winds and heavy precipitation, as well as secondary disasters such as floods, landslides, and mudslides, and it often threatens the safety of coastal communities. Accurate estimation of tropical cyclone (TC) intensity is therefore beneficial for weather forecasting and warning. With the development of deep learning, convolutional neural network (CNN) based methods have shown strong ability in the TC intensity estimation task. However, CNN-based methods still suffer from problems such as insufficient use of multi-channel satellite images and the requirement that the input images be centered on the TC location, which leads to large estimation errors and affects real-time estimation. To address these problems, we develop a network called intensity estimation fusing location (IEFL) to further improve the accuracy of intensity estimation.
Method
The training data are multi-channel Himawari-8 satellite images from 2015 to 2018, and data from 2019 and 2020 are used for testing. The dataset contains 42 028 training images and 5 229 testing images. First, the data are preprocessed by clipping the satellite images to remove non-TC cloud systems. Then, a data augmentation strategy is adopted to alleviate over-fitting and improve the robustness of the model. Moreover, images from different channels reveal different characteristics of TCs, so fusing multi-channel images provides a better basis for intensity estimation. The network adopts a two-branch structure that effectively fuses images from different channels, and the intensity estimation and center location tasks are optimized simultaneously so that they promote each other. In addition, the network feeds the features extracted by the location task into the intensity estimation task: the feature maps of the two tasks are concatenated, and the final intensity estimate is then produced. The experiments fall into two categories. The first uses the intensity estimation model alone and fuses location information for different individual channels to analyze the impact of location information on intensity estimation. The second fuses multiple channels in the model with location information to analyze the effect of channel integration on intensity estimation. The IEFL network is implemented with the PyTorch toolbox. The input images are resized to 512 × 512 pixels for training, the momentum is set to 0.9, the learning rate is set to 0.001, the batch size is set to 5, and the weight decay is 10⁻⁴. The network is trained with stochastic gradient descent (SGD) on an NVIDIA GTX TITAN XP device. The loss function of the intensity estimation regression is the root mean square error (RMSE), which measures the difference between the ground-truth and predicted tropical cyclone intensities; the smaller the RMSE, the better the model performance. The loss function of the location regression is also the RMSE, so the total loss of the model is the sum of the intensity loss and the location loss. The main contributions are as follows: 1) a location information-fused model, called intensity estimation fusing location (IEFL), is developed to estimate tropical cyclone intensity and location; 2) tropical cyclone intensities are estimated using different channel images captured by the Himawari-8 satellite; and 3) the intensity estimation performance of each channel and the effect of integrating different channels are analyzed.
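A minimal sketch of one training step under the hyperparameters reported above (SGD with momentum 0.9, learning rate 0.001, weight decay 10⁻⁴, batch size 5, and a total loss equal to the sum of the intensity and location RMSE losses) is given below; the placeholder linear model and random tensors only stand in for the real network and data.

import torch
import torch.nn as nn

# Placeholder regressor standing in for IEFL: outputs [intensity, lat, lon] (assumption).
model = nn.Linear(128, 3)
optimizer = torch.optim.SGD(model.parameters(), lr=0.001, momentum=0.9, weight_decay=1e-4)

def rmse(pred, target):
    # Root mean square error, used as the loss of both regression tasks.
    return torch.sqrt(torch.mean((pred - target) ** 2))

features = torch.randn(5, 128)                            # one batch of 5 dummy feature vectors
y_intensity, y_center = torch.randn(5), torch.randn(5, 2) # dummy intensity and center labels

out = model(features)
loss = rmse(out[:, 0], y_intensity) + rmse(out[:, 1:], y_center)   # total loss = intensity + location
optimizer.zero_grad()
loss.backward()
optimizer.step()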
Result
Without location information, the root mean square error (RMSE) of intensity estimation is 5.08 m/s, whereas with location information it is 4.74 m/s; compared with the network without the location task, the RMSE is thus reduced by 7%. The estimation error of the IEFL model is further compared with that of six related methods. Compared with the traditional deviation-angle variance technique (DAVT), the accuracy is improved by about 27%. Compared with CNN-based methods, the improvement is 11% over the convolutional neural network-tropical cyclone (CNN-TC) model, 8% over the tropical cyclone intensity estimation net (TCIENet), and 4% over the tropical cyclone intensity classification and estimation net (TCICENet).
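As a quick sanity check on the reported relative improvement, the RMSE reduction follows directly from the two values quoted above:

(5.08 − 4.74) / 5.08 ≈ 0.067, i.e., a reduction of about 7%.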
Conclusion
We develop the IEFL model, which fuses location information to estimate tropical cyclone intensity. IEFL improves the accuracy of intensity estimation without relying on the accuracy of the location result. The experimental results show that fusing location information effectively improves intensity estimation.
intensity estimation; tropical cyclone (TC); convolutional neural network (CNN); center location; Himawari-8
Chen B F, Chen B Y, Lin H T and Elsberry R L. 2019. Estimating tropical cyclone intensity by satellite imagery utilizing convolutional neural networks. Weather and Forecasting, 34(2): 447-465 [DOI: 10.1175/waf-d-18-0136.1]
Ding X H, Zhang X Y, Ma N N, Han J G, Ding G G and Sun J. 2021. RepVGG: making VGG-style ConvNets great again//Proceedings of 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Nashville, USA: IEEE: 13728-13737 [DOI: 10.1109/CVPR46437.2021.01352]
Dvorak V F. 1975. Tropical cyclone intensity analysis and forecasting from satellite imagery. Monthly Weather Review, 103(5): 420-430 [DOI: 10.1175/1520-0493(1975)103<0420:tciaaf>2.0.co;2]
Geng X Q, Li Z W and Yang X F. 2014. Tropical cyclone auto-recognition from stationary satellite imagery. Journal of Image and Graphics, 19(6): 964-970 [DOI: 10.11834/jig.20140618]
Hu H and Weng F Z. 2020. Estimation of location and intensity of tropical cyclones based on microwave sounding instruments//IGARSS 2020 - 2020 IEEE International Geoscience and Remote Sensing Symposium. Waikoloa, USA: IEEE: 5442-5445 [DOI: 10.1109/IGARSS39084.2020.9323785]
Lu X Q, Yu H, Ying M, Zhao B K, Zhang S, Lin L M, Bai L N and Wan R J. 2021. Western North Pacific tropical cyclone database created by the China Meteorological Administration. Advances in Atmospheric Sciences, 38(4): 690-699 [DOI: 10.1007/s00376-020-0211-7]
Olander T L and Velden C S. 2007. The advanced Dvorak technique: continued development of an objective scheme to estimate tropical cyclone intensity using geostationary infrared satellite imagery. Weather and Forecasting, 22(2): 287-298 [DOI: 10.1175/waf975.1]
Olander T L, Velden C S and Kossin J P. 2004. The advanced objective Dvorak technique (AODT): latest upgrades and future directions//Proceedings of the 26th Conference on Hurricanes and Tropical Meteorology. Deauville Beach Resort, USA: AMS: 294-295
Pradhan R, Aygun R S, Maskey M, Ramachandran R and Cecil D J. 2018. Tropical cyclone intensity estimation using a deep convolutional neural network. IEEE Transactions on Image Processing, 27(2): 692-702 [DOI: 10.1109/tip.2017.2766358]
Ritchie E A, Wood K M, Rodríguez-Herrera O G, Piñeros M F and Tyo J S. 2014. Satellite-derived tropical cyclone intensity in the North Pacific Ocean using the deviation-angle variance technique. Weather and Forecasting, 29(3): 505-516 [DOI: 10.1175/waf-d-13-00133.1]
Robusto C C. 1957. The cosine-haversine formula. The American Mathematical Monthly, 64(1): 38-40 [DOI: 10.2307/2309088]
Shimada U, Sawada M and Yamada H. 2016. Evaluation of the accuracy and utility of tropical cyclone intensity estimation using single ground-based Doppler radar observations. Monthly Weather Review, 144(5): 1823-1840 [DOI: 10.1175/mwr-d-15-0254.1]
Simonyan K and Zisserman A. 2015. Very deep convolutional networks for large-scale image recognition//Proceedings of the 3rd International Conference on Learning Representations. San Diego, USA: ICLR
Velden C S, Olander T L and Zehr R M. 1998. Development of an objective scheme to estimate tropical cyclone intensity from digital geostationary satellite infrared imagery. Weather and Forecasting, 13(1): 172-186 [DOI: 10.1175/1520-0434(1998)013<0172:doaost>2.0.co;2]
Wang C, Zheng G, Li X F, Xu Q, Liu B and Zhang J. 2022. Tropical cyclone intensity estimation from geostationary satellite imagery using deep convolutional neural networks. IEEE Transactions on Geoscience and Remote Sensing, 60: #4101416 [DOI: 10.1109/tgrs.2021.3066299]
Wang Y Y, Ye Z and Sun W C. 2002. Typhoon center locating using rotation feature matching method. Journal of Image and Graphics, 7A(5): 491-494 [DOI: 10.3969/j.issn.1006-8961.2002.05.013]
Wimmers A, Velden C and Cossuth J H. 2019. Using deep learning to estimate tropical cyclone intensity from satellite passive microwave imagery. Monthly Weather Review, 147(6): 2261-2282 [DOI: 10.1175/mwr-d-18-0391.1]
Ying M, Zhang W, Yu H, Lu X Q, Feng J X, Fan Y X, Zhu Y T and Chen D Q. 2014. An overview of the China Meteorological Administration tropical cyclone database. Journal of Atmospheric and Oceanic Technology, 31(2): 287-301 [DOI: 10.1175/JTECH-D-12-00119.1]
Zhang C J, Wang X J and Ma L M. 2021. Tropical cyclone intensity classification and estimation using infrared satellite images with deep learning. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 14: 2070-2086 [DOI: 10.1109/JSTARS.2021.3050767]
Zhang C J, Xue L C, Ma L M and Lu X Q. 2018. Infrared brightness-temperature variance method for the objective location of tropical cyclones. Journal of Image and Graphics, 23(3): 450-457 [DOI: 10.11834/jig.170402]
Zhang R, Liu Q S and Hang R L. 2020. Tropical cyclone intensity estimation using two-branch convolutional neural network from infrared and water vapor images. IEEE Transactions on Geoscience and Remote Sensing, 58(1): 586-597 [DOI: 10.1109/tgrs.2019.2938204]
Zhang W L and Cui X P. 2013. Review of the studies on tropical cyclone genesis. Journal of Tropical Meteorology, 29(2): 337-346 [DOI: 10.3969/j.issn.1004-4965.2013.02.019]