Automatic Pixel-Level Segmentation of Multiple Pavement Distresses and Surface Design Features with PDSNet II
- Authors
- Lang, Hong; Qian, Jinsong; Yuan, Ye; Chen, Jiang; Xing, Yingying; Wang, Aidi
- Subjects
- ROAD maintenance; TRAFFIC safety; RANDOM fields; DEEP learning; PAVEMENTS
- Abstract
Effective distress detection and quantitative analysis play a crucial role in road maintenance and driving safety. The pavement distress segmentation network (PDSNet) combines the pyramid scene parsing network (PSPNet) and U-Net, providing both prior global information and local features, thereby overcoming the detection issues that a single network commonly faces on pavement data sets. This paper proposes an efficient, improved PDSNet architecture, called PDSNet II, with enhanced global modeling and fine-detail retrieval capabilities. PDSNet II introduces two major modifications to the original PDSNet. First, a shifted-window fully connected conditional random fields (FC-CRFs) layer is introduced to provide connections among consecutive self-attention layers, significantly enhancing modeling power. Second, PDSNet II adopts multi-head attention to capture diverse interaction information across multiple projection spaces. Accordingly, the output maps from the pyramid pooling module (PPM) head and the U-Net tail are fed into a neural window FC-CRFs layer. PDSNet II was trained on a data set of 12,648 two-dimensional (2D) intensity and three-dimensional (3D) range images depicting various pavement conditions. The experimental results demonstrate that PDSNet II outperforms the original PDSNet in F1-score and intersection over union (IoU). Compared with state-of-the-art networks, PDSNet II detects complex distress patterns more accurately while effectively reducing noise and maintaining robustness. Overall, the proposed PDSNet II framework shows promising results in pavement distress segmentation, highlighting its potential for practical applications. [ABSTRACT FROM AUTHOR]
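The abstract does not specify PDSNet II's attention implementation, but the multi-head mechanism it mentions (each head capturing interactions in its own projection subspace) can be illustrated with a minimal NumPy sketch; the function name, weight shapes, and single-sequence layout below are illustrative assumptions, not the paper's code:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, w_q, w_k, w_v, w_o, num_heads):
    """Illustrative scaled dot-product self-attention split across heads.

    x: (seq_len, d_model); each weight matrix: (d_model, d_model).
    """
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    # Project queries/keys/values, then split features into heads.
    q = (x @ w_q).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    k = (x @ w_k).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    v = (x @ w_v).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    # Each head attends independently in its own projection subspace.
    scores = softmax(q @ k.transpose(0, 2, 1) / np.sqrt(d_head))
    # Concatenate head outputs and apply the output projection.
    out = (scores @ v).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ w_o

rng = np.random.default_rng(0)
seq_len, d_model, heads = 4, 8, 2  # toy sizes for illustration
x = rng.standard_normal((seq_len, d_model))
ws = [rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(4)]
y = multi_head_attention(x, *ws, num_heads=heads)
print(y.shape)  # (4, 8)
```

In PDSNet II this mechanism would operate on feature-map tokens rather than a toy sequence, with the shifted-window scheme restricting which positions attend to each other.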
- Published
- 2024