An Efficient Solution for People Detection, Tracking and Counting Using Convolutional Neural Networks

Eduard Cojocea and Traian Rebedea
Additional contact information
Eduard Cojocea: University Politehnica of Bucharest, Open Gov SRL, Bucharest, Romania
Traian Rebedea: University Politehnica of Bucharest, Open Gov SRL, Bucharest, Romania

Journal of Information Systems & Operations Management, 2020, vol. 14, issue 2, 49-56

Abstract: The number of unique people walking near a shop or inside a mall is a relevant metric, since it can indicate the potential margin for expanding that business. Statistics on gender, age group, and similar attributes can also offer key insights into how to better manage and stock a business. In this paper we present a system that detects, tracks, and counts the number of people in a video stream. The results can be visualized in a GUI that provides multiple customizable visualization tools. We use YOLOv3, a convolutional neural network model, for object detection and Deep SORT for tracking. We describe how the system performs on different hardware architectures: on a server with two high-end GPUs and on various edge devices, such as the Raspberry Pi 3, Raspberry Pi 4, and NVIDIA Jetson TX2.
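The abstract describes a pipeline in which a YOLOv3 detector feeds per-frame person detections to Deep SORT, and unique people are counted from the resulting track IDs. The sketch below illustrates that loop; it is a minimal reconstruction, not the authors' code. The deep_sort_realtime package is one common Deep SORT implementation, chosen here as an assumption, and detect_people() is a hypothetical stand-in for a YOLOv3 inference call (e.g. through OpenCV's DNN module).

    import cv2
    from deep_sort_realtime.deepsort_tracker import DeepSort

    def detect_people(frame):
        """Hypothetical stand-in for a YOLOv3 person detector.
        Must return a list of ([left, top, width, height], confidence,
        "person") tuples, the input format deep_sort_realtime expects."""
        raise NotImplementedError("plug in a YOLOv3 inference call here")

    tracker = DeepSort(max_age=30)        # drop tracks unseen for 30 frames
    unique_ids = set()                    # every confirmed track ID ever seen

    cap = cv2.VideoCapture("stream.mp4")  # or a camera index / RTSP URL
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        tracks = tracker.update_tracks(detect_people(frame), frame=frame)
        for track in tracks:
            if track.is_confirmed():
                unique_ids.add(track.track_id)  # count = unique track IDs
    cap.release()

    print(f"Unique people counted: {len(unique_ids)}")

Counting unique track IDs rather than per-frame detections is what keeps the count robust when a person stays in view across many frames.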

Date: 2020

Downloads: http://www.rebe.rau.ro/RePEc/rau/jisomg/WI20/JISOM-WI20-A05.pdf (application/pdf)

Persistent link: https://EconPapers.repec.org/RePEc:rau:jisomg:v:14:y:2020:i:2:p:49-56

More articles in Journal of Information Systems & Operations Management from Romanian-American University.
Bibliographic data for this series maintained by Alex Tabusca.

 
Handle: RePEc:rau:jisomg:v:14:y:2020:i:2:p:49-56