Automatic Extraction of Pedestrian Trajectories from Video Recordings

Maik Boltes, Armin Seyfried, Bernhard Steffen and Andreas Schadschneider
Additional contact information
Maik Boltes: Research Centre Jülich, Jülich Supercomputing Centre
Armin Seyfried: Research Centre Jülich, Jülich Supercomputing Centre
Bernhard Steffen: Research Centre Jülich, Jülich Supercomputing Centre
Andreas Schadschneider: Cologne University, Institute of Theoretical Physics

A chapter in Pedestrian and Evacuation Dynamics 2008, 2010, pp 43-54 from Springer

Abstract: To understand and model pedestrian dynamics, reliable empirical data on pedestrian movement are necessary for analysis and verification, but the existing database is small, inaccurate and highly contradictory. To collect trajectories from extensive experimental series with large numbers of persons, we are developing a software tool named PeTrack, which automatically extracts these trajectories from ordinary video recordings with high accuracy in space and time.

Keywords: Video Recording; Automatic Extraction; Height Class; Colour Marker; Wide Angle Lens
Date: 2010
Citations: View citations in EconPapers (4)

There are no downloads for this item; see the EconPapers FAQ for hints about obtaining it.

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:spr:sprchp:978-3-642-04504-2_3

Ordering information: This item can be ordered from
http://www.springer.com/9783642045042

DOI: 10.1007/978-3-642-04504-2_3


More chapters in Springer Books from Springer
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Page updated 2026-02-19
Handle: RePEc:spr:sprchp:978-3-642-04504-2_3