BSPR-Net: Dual-Branch Feature Extraction Network for LiDAR Place Recognition in Unstructured Environments
DOI: https://doi.org/10.5755/j02.eie.40523

Keywords: Unstructured environments, LiDAR point clouds, Place recognition, Dual-branch features

Abstract
LiDAR point cloud-based place recognition (LPR) in unstructured natural environments remains an open challenge with limited existing research. To address the difficulties posed by unstructured environments, such as sparse structural features, uneven point cloud density, and significant viewpoint variations, we present BSPR-Net, a dual-branch feature extraction approach for point cloud place recognition, consisting of a BEV-projection rotation-invariant convolution branch and a point cloud sparse convolution branch. This design strengthens the representation of geometric structural features while aggregating rotation-invariant characteristics of the point cloud, thereby better handling the large viewpoint disparities that arise in reverse-revisited unstructured environments. The proposed network was evaluated on multiple reverse-revisited sequences of the Wild-Places dataset, a benchmark for place recognition in unstructured natural environments. It achieved a maximum F1 score of 85.46 %, exceeding other classical methods by more than 4 %. Ablation experiments further confirmed the contribution of each module to place recognition performance.
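As a rough illustration of the two signals the branches operate on, the stdlib-only Python sketch below (hypothetical, not the authors' code; function names and parameters are our own) projects 3D LiDAR points into a bird's-eye-view occupancy grid, and computes a range histogram that is unchanged by any rotation about the vertical axis, the property the rotation-invariant branch exploits:

```python
import math

def bev_occupancy(points, grid_size=8, cell=1.0):
    """Project (x, y, z) points onto a sensor-centred BEV occupancy grid."""
    half = grid_size * cell / 2.0
    grid = [[0] * grid_size for _ in range(grid_size)]
    for x, y, _z in points:
        if -half <= x < half and -half <= y < half:
            grid[int((y + half) / cell)][int((x + half) / cell)] = 1
    return grid

def range_histogram(points, n_bins=4, max_range=8.0):
    """Histogram of horizontal ranges: invariant to yaw rotation of the scan."""
    hist = [0] * n_bins
    for x, y, _z in points:
        r = math.hypot(x, y)
        if r < max_range:
            hist[int(r / max_range * n_bins)] += 1
    return hist

pts = [(1.0, 0.0, 0.2), (3.0, 2.0, -0.1), (0.5, 3.5, 0.0)]
rot90 = [(-y, x, z) for x, y, z in pts]  # same scan, rotated 90 deg about z
assert range_histogram(pts) == range_histogram(rot90)
```

The occupancy grid preserves geometric layout (but rotates with the scan), while the range histogram discards layout for rotation invariance; a dual-branch design of the kind described in the abstract can combine both kinds of cue.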
License
The copyright for the paper in this journal is retained by the author(s), with first publication rights granted to the journal. The authors agree to the Creative Commons Attribution 4.0 (CC BY 4.0) agreement, under which the paper in the journal is licensed.
By virtue of their appearance in this open access journal, papers are free to use, with proper attribution, in educational and other non-commercial settings, with acknowledgement of the initial publication in the journal.