A New Classification Approach with Deep Mask R-CNN for Synthetic Aperture Radar Image Segmentation
DOI:
https://doi.org/10.5755/j01.eie.26.6.25849

Keywords:
Image segmentation, Neural networks, Radar imaging, Synthetic aperture radar

Abstract
In this paper, a hybrid classification approach that combines a deeper mask region-based convolutional neural network (Mask R-CNN) with a sparsity-driven despeckling algorithm is proposed for synthetic aperture radar (SAR) image segmentation, as an alternative to classical segmentation methods. In satellite technology, SAR images are widely used in many areas, such as assessing weather conditions, delineating agricultural fields, monitoring climatic change, and target detection in military applications. For a high-quality segmentation process, a SAR image must be partitioned into each of the meaningful regions it contains. At the same time, SAR images suffer from strong speckle noise, which must also be reduced for quality segmentation. Current studies show that deep learning techniques are widely used for segmentation and can deliver high accuracy and fast results. Mask R-CNN can not only separate each meaningful region in the image, but also generate a high-accuracy prediction for each meaningful region of a SAR image. The study shows that smoothed SAR images can be classified into multiple regions with deep neural networks.
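The two-stage pipeline the abstract describes (despeckle first, then run instance segmentation) can be illustrated with a minimal sketch. This is not the authors' implementation: total-variation denoising stands in for the paper's sparsity-driven despeckler, torchvision's COCO-pretrained Mask R-CNN stands in for the deeper, SAR-trained network, and the function name `segment_sar` and the thresholds are hypothetical.

```python
# Minimal sketch of the despeckle-then-segment pipeline (assumptions noted above).
import numpy as np
import torch
import torchvision
from skimage.restoration import denoise_tv_chambolle


def segment_sar(image: np.ndarray, score_threshold: float = 0.5):
    """image: single-channel SAR amplitude image scaled to [0, 1], shape (H, W)."""
    # Stage 1: speckle reduction. TV denoising is a stand-in for the
    # paper's sparsity-driven despeckling algorithm.
    smoothed = denoise_tv_chambolle(image, weight=0.1)

    # Stage 2: instance segmentation with a pretrained Mask R-CNN
    # (COCO weights here; the paper trains on SAR data).
    model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
    model.eval()

    # Replicate the single channel into the 3-channel input the model expects.
    tensor = torch.from_numpy(smoothed).float().unsqueeze(0).repeat(3, 1, 1)
    with torch.no_grad():
        output = model([tensor])[0]

    # Keep only confident detections; each mask is a per-pixel probability map.
    keep = output["scores"] >= score_threshold
    return output["masks"][keep], output["labels"][keep], output["scores"][keep]
```

Thresholding the returned per-pixel mask probabilities (e.g., at 0.5) yields one binary region per detected instance, which corresponds to the "meaningful regions" discussed in the abstract.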
License
The copyright for the paper in this journal is retained by the author(s) with the first publication right granted to the journal. The authors agree to the Creative Commons Attribution 4.0 (CC BY 4.0) agreement under which the paper in the Journal is licensed.
By virtue of their appearance in this open access journal, papers may be used freely, with proper attribution and an acknowledgement of their initial publication in the journal, in educational and other non-commercial settings.