Recognition of Road Type and Quality for Advanced Driver Assistance Systems with Deep Learning
Keywords: Driver assistance systems, automatic driving systems, deep learning, road type detection.
To develop effective advanced driver assistance systems, it is important to accurately recognize the current driving environment and make critical decisions about the driving process. Preventing accidents through interaction between the assistance system and the environment, and ensuring optimal driving dynamics, are the main topics in this field. Vehicles need to recognize road type and quality with high accuracy to ensure driving behavior suited to the road; it is equally important that this detection rely on simple and cost-effective systems. In this study, a deep learning-based approach is proposed for automatically recognizing road type and quality in vehicle driver assistance systems. With this approach, both the road type and the road quality can be determined using only driving images as input. A new convolutional neural network model is designed to classify the driving images, and driving images obtained from Google Street View are used to evaluate the recognition system in a realistic driving environment. The proposed approach classified road types with 91.41 % accuracy and distinguished pothole roads from smooth roads with 91.07 % accuracy. These results indicate that the proposed method is an effective structure for advanced driver assistance systems, V2I communication systems, and similar intelligent transportation systems.
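The abstract's pipeline, classifying a driving image into road-type classes with a convolutional neural network, can be illustrated with a minimal forward pass. This is a hedged sketch only: the paper's actual architecture, layer sizes, and class labels are not given here, so the single conv/pool/dense stack, the 32×32 grayscale input, and the four hypothetical road-type classes below are illustrative assumptions, not the authors' model.

```python
import numpy as np

def conv2d(x, k):
    """Valid 2-D cross-correlation of a single-channel image with kernel k."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def relu(x):
    return np.maximum(0.0, x)

def max_pool(x, s=2):
    """Non-overlapping s x s max pooling."""
    H2, W2 = x.shape[0] // s, x.shape[1] // s
    return x[:H2 * s, :W2 * s].reshape(H2, s, W2, s).max(axis=(1, 3))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def classify(image, kernel, weights, bias):
    """Forward pass: conv -> ReLU -> max pool -> flatten -> dense -> softmax."""
    feat = max_pool(relu(conv2d(image, kernel)))
    logits = feat.ravel() @ weights + bias
    return softmax(logits)

# Random weights stand in for trained parameters; a real system would
# learn them from labeled driving images (e.g. Google Street View frames).
rng = np.random.default_rng(0)
img = rng.random((32, 32))                 # stand-in grayscale driving image
kernel = rng.standard_normal((3, 3))
feat_dim = ((32 - 3 + 1) // 2) ** 2        # 30x30 conv output -> 15x15 pooled
W = rng.standard_normal((feat_dim, 4)) * 0.01
b = np.zeros(4)

probs = classify(img, kernel, W, b)        # distribution over 4 hypothetical classes
```

In a trained model `probs` would be read off as the confidence per road class (for example asphalt, gravel, pothole, smooth); here it merely demonstrates that the pipeline yields a valid probability distribution.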
Copyright for papers in this journal is retained by the author(s), with first publication rights granted to the journal. The authors agree to license the paper under the Creative Commons Attribution 4.0 (CC BY 4.0) agreement.
By virtue of their appearance in this open-access journal, papers are free to use, with proper attribution, in educational and other non-commercial settings, provided the initial publication in the journal is acknowledged.