
Target Fusion Detection of LiDAR and Camera Based on the Improved YOLO Algorithm

Jian Han, Yaping Liao, Junyou Zhang, Shufeng Wang and Sixian Li
Additional contact information
Jian Han: College of Transportation, Shandong University of Science and Technology, Huangdao District, Qingdao 266590, China
Yaping Liao: College of Transportation, Shandong University of Science and Technology, Huangdao District, Qingdao 266590, China
Junyou Zhang: College of Transportation, Shandong University of Science and Technology, Huangdao District, Qingdao 266590, China
Shufeng Wang: College of Transportation, Shandong University of Science and Technology, Huangdao District, Qingdao 266590, China
Sixian Li: College of Transportation, Shandong University of Science and Technology, Huangdao District, Qingdao 266590, China

Mathematics, 2018, vol. 6, issue 10, 1-16

Abstract: Target detection plays a key role in the safe driving of autonomous vehicles. At present, most studies rely on a single sensor to collect obstacle information, but a single sensor cannot cope with complex urban road environments and suffers from a high missed-detection rate. This paper therefore presents a detection fusion system that integrates LiDAR and a color camera. Building on the original You Only Look Once (YOLO) algorithm, a second-detection scheme is proposed to improve YOLO for dim targets such as non-motorized vehicles and pedestrians. A large set of image samples is used to train the YOLO algorithm, obtain the relevant parameters, and establish the target detection model. Decision-level sensor fusion is then introduced to combine the color image and the depth image and improve detection accuracy. Finally, test samples are used to verify the decision-level fusion. The results show that the improved YOLO algorithm and the decision-level fusion achieve high detection accuracy, meet real-time requirements, and reduce the missed-detection rate for dim targets such as non-motorized vehicles and pedestrians. Thus, the proposed method, balancing accuracy and real-time performance, offers better performance and broader application prospects.
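The abstract describes fusing detections from the camera's color image and the LiDAR-derived depth image at the decision level. As a rough illustration only (not the authors' published code), the sketch below shows one common way such a fusion could be implemented: detections from the two modalities are matched by bounding-box overlap, confidences are combined when both sensors agree, and single-sensor detections are retained to reduce missed detections. The `Detection` class, the IoU threshold of 0.5, and the noisy-OR score rule are illustrative assumptions.

```python
# Hypothetical sketch of decision-level fusion of detections from a color image
# and a LiDAR depth image. Names, threshold, and score rule are assumptions,
# not the method published in the paper.
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    box: tuple    # (x1, y1, x2, y2) in pixels
    score: float  # detector confidence in [0, 1]
    label: str    # e.g. "pedestrian", "cyclist"

def iou(a: tuple, b: tuple) -> float:
    """Intersection-over-union of two axis-aligned boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def fuse_decisions(color_dets: List[Detection],
                   depth_dets: List[Detection],
                   iou_thresh: float = 0.5) -> List[Detection]:
    """Keep targets confirmed by either sensor; boost confidence when both agree."""
    fused, matched_depth = [], set()
    for c in color_dets:
        best_j, best_iou = -1, 0.0
        for j, d in enumerate(depth_dets):
            if j in matched_depth or d.label != c.label:
                continue
            o = iou(c.box, d.box)
            if o > best_iou:
                best_j, best_iou = j, o
        if best_j >= 0 and best_iou >= iou_thresh:
            d = depth_dets[best_j]
            matched_depth.add(best_j)
            # Both sensors agree: combine confidences with a simple noisy-OR rule.
            fused.append(Detection(c.box, 1 - (1 - c.score) * (1 - d.score), c.label))
        else:
            fused.append(c)  # camera-only detection is kept as-is
    # Depth-only detections help recover targets the camera missed (e.g. dim ones).
    fused.extend(d for j, d in enumerate(depth_dets) if j not in matched_depth)
    return fused
```

In this kind of scheme, the union of the two detection sets lowers the missed-detection rate, while the agreement-based confidence boost limits the extra false positives that a plain union would introduce.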

Keywords: autonomous vehicle; target detection; multi-sensors; fusion; YOLO
JEL-codes: C
Date: 2018

Downloads:
https://www.mdpi.com/2227-7390/6/10/213/pdf (application/pdf)
https://www.mdpi.com/2227-7390/6/10/213/ (text/html)


Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:6:y:2018:i:10:p:213-:d:176873


Mathematics is currently edited by Ms. Emma He

More articles in Mathematics from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.

 
Handle: RePEc:gam:jmathe:v:6:y:2018:i:10:p:213-:d:176873