Using Traffic Light Signal to Enhance Intersection Foreground Detection Based on Video Sensor Networks

Rong Ding, Shunli Wang and Xu Liu

International Journal of Distributed Sensor Networks, 2014, vol. 10, issue 4, 576759

Abstract: Foreground detection plays an important role in traffic surveillance applications, especially at urban intersections. Background subtraction is an efficient approach to separating foreground from background with the static cameras of video sensor networks. When modelling the background, however, most statistical techniques adjust the learning rate, a crucial parameter controlling the updating speed, based only on changes observed in the video sequence. This causes slow adaptation to sudden environmental changes: a stopped car, for example, fuses into the background before it moves again, which degrades segmentation performance. This paper proposes an efficient way to address the problem by exploiting physical-world signals available at traffic junctions. It assigns an adaptive learning rate to each pixel by integrating the traffic light signal obtained from the sensor network. Combined with these abundant physical-world signals, the background subtraction method can adapt to changes in the scene instantly. We test our approach at a real urban traffic intersection; experimental results show that the new method increases detection accuracy and is promising for practical use.
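
A minimal sketch of the idea described in the abstract, assuming a simple running-average background model and a binary red/green phase signal; the function names, thresholds, and scripted light phase below are illustrative assumptions, not the authors' implementation:

    import numpy as np

    def phase_learning_rate(light_is_red, alpha_free=0.05, alpha_red=0.001):
        # Hypothetical rule: throttle the background update while the light is
        # red so that queued vehicles are not absorbed into the background.
        return alpha_red if light_is_red else alpha_free

    def update_background(background, frame, alpha):
        # Running-average background model: B <- (1 - alpha) * B + alpha * I
        return (1.0 - alpha) * background + alpha * frame

    def detect_foreground(background, frame, threshold=25.0):
        # Pixels deviating from the background model by more than `threshold`
        # (8-bit grey levels) are labelled foreground.
        return np.abs(frame - background) > threshold

    # Toy run: synthetic grey-scale frames with a car that stops during a
    # scripted red phase (frames 50-149) and an empty road otherwise.
    rng = np.random.default_rng(0)
    h, w = 120, 160
    background = np.full((h, w), 100.0)
    mask_at_red_end = None

    for t in range(200):
        frame = np.full((h, w), 100.0) + rng.normal(0.0, 2.0, (h, w))
        light_is_red = 50 <= t < 150
        if light_is_red:
            frame[60:90, 40:80] = 200.0      # vehicle waiting at the stop line
        mask = detect_foreground(background, frame)
        if t == 149:
            mask_at_red_end = mask           # last frame of the red phase
        background = update_background(background, frame,
                                       phase_learning_rate(light_is_red))

    # With the signal-aware learning rate the waiting car is still segmented as
    # foreground after ~100 red-phase frames; with a fixed alpha of 0.05 it
    # would have faded into the background model before the light turns green.
    print("foreground pixels at end of red phase:", int(mask_at_red_end.sum()))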

Date: 2014

Downloads:
https://journals.sagepub.com/doi/10.1155/2014/576759 (text/html)


Persistent link: https://EconPapers.repec.org/RePEc:sae:intdis:v:10:y:2014:i:4:p:576759

DOI: 10.1155/2014/576759

Handle: RePEc:sae:intdis:v:10:y:2014:i:4:p:576759