DynaEdge-Net: Dynamic Feature Gating and Edge Enhancement for Precise Road Crack Segmentation
DOI: https://doi.org/10.5755/j02.eie.42747

Keywords: Road crack, Semantic segmentation, GDG, Edge enhancement

Abstract
Automatic detection of road cracks is essential for the long-term safety and maintenance of roads, bridges, and other infrastructure. Although existing deep learning-based crack segmentation methods have improved detection accuracy, challenges remain in terms of high computational complexity and inadequate capture of fine cracks and edge details. To address these issues, this study proposes an enhanced UNet-based architecture, termed DynaEdge-Net. In the encoder and decoder stages, a Residual Detail Enhancement Block (RDEB) and a Cascaded Group Attention (CGA) module are incorporated to strengthen edge feature representation and focus on critical regions, respectively. In the skip connections, a Group-wise Dynamic Gating (GDG) module is introduced to adaptively suppress background noise and optimize feature transmission. During decoding, a Dynamic Upsampling (DySample) strategy replaces conventional interpolation, enabling high-fidelity reconstruction of crack structures. Experimental results show that DynaEdge-Net achieves an IoU of 82.34%, an F1-score of 90.83%, a boundary F1-score of 83.27%, and a recall of 90.12%, outperforming several state-of-the-art crack segmentation algorithms. The proposed method not only improves the continuity and accuracy of crack extraction but also demonstrates strong robustness and generalization capability, providing a reliable solution for intelligent inspection and maintenance of transportation infrastructure.
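The abstract does not give implementation details for the GDG module, so the following PyTorch sketch only illustrates the general idea of group-wise dynamic gating on a skip connection: each channel group of the skip features is scaled by an input-conditioned weight in [0, 1], allowing background-dominated groups to be suppressed before fusion with the decoder. The class name, layer sizes, and the pooling-plus-MLP gate are assumptions made for illustration, not the authors' implementation.

# Hypothetical sketch of group-wise dynamic gating for a skip connection.
# This is NOT the paper's GDG module; it only demonstrates the general mechanism.
import torch
import torch.nn as nn


class GroupwiseDynamicGate(nn.Module):
    """Scale each contiguous channel group of a skip feature map by a learned,
    input-dependent gate in [0, 1], so less informative groups can be suppressed."""

    def __init__(self, channels: int, groups: int = 4):
        super().__init__()
        assert channels % groups == 0, "channels must be divisible by groups"
        self.groups = groups
        # Global context -> one gate value per channel group.
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // 2),
            nn.ReLU(inplace=True),
            nn.Linear(channels // 2, groups),
            nn.Sigmoid(),
        )

    def forward(self, skip: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = skip.shape
        context = self.pool(skip).flatten(1)                       # (B, C)
        gates = self.fc(context)                                   # (B, groups), values in [0, 1]
        gates = gates.repeat_interleave(c // self.groups, dim=1)   # (B, C), one gate per group
        return skip * gates.view(b, c, 1, 1)                       # gate each channel group


if __name__ == "__main__":
    x = torch.randn(2, 64, 56, 56)              # example skip-connection features
    print(GroupwiseDynamicGate(64)(x).shape)    # torch.Size([2, 64, 56, 56])

In a UNet-style network, such a gate would typically be applied to the encoder features just before they are concatenated with the upsampled decoder features, which matches the abstract's description of filtering skip-connection features before fusion.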
License
The copyright for the paper in this journal is retained by the author(s) with the first publication right granted to the journal. The authors agree to the Creative Commons Attribution 4.0 (CC BY 4.0) agreement under which the paper in the Journal is licensed.
By virtue of their appearance in this open access journal, papers are free to use with proper attribution in educational and other non-commercial settings with an acknowledgement of the initial publication in the journal.