Multi-Modal Dataset of Human Activities of Daily Living with Ambient Audio, Vibration, and Environmental Data

Thomas Pfitzinger, Marcel Koch, Fabian Schlenke and Hendrik Wöhrle
Additional contact information
Thomas Pfitzinger: Institute of Communication Technology, Department of Information Technology, Dortmund University of Applied Sciences and Arts, Sonnenstraße 96, 44139 Dortmund, Germany
Marcel Koch: Institute of Communication Technology, Department of Information Technology, Dortmund University of Applied Sciences and Arts, Sonnenstraße 96, 44139 Dortmund, Germany
Fabian Schlenke: Institute of Communication Technology, Department of Information Technology, Dortmund University of Applied Sciences and Arts, Sonnenstraße 96, 44139 Dortmund, Germany
Hendrik Wöhrle: Institute of Communication Technology, Department of Information Technology, Dortmund University of Applied Sciences and Arts, Sonnenstraße 96, 44139 Dortmund, Germany

Data, 2024, vol. 9, issue 12, 1-18

Abstract: The detection of human activities is an important step for automated systems to understand the context of a given situation. It can be useful for applications such as healthcare monitoring, smart homes, and energy management systems for buildings. To achieve this, a sufficient basis of data is required. The presented dataset contains labeled recordings of 25 different activities of daily living performed individually by 14 participants. The data were captured by five multisensor devices in supervised sessions in which a participant repeated each activity several times. Flawed recordings were removed, and the different data types were synchronized to provide multi-modal data for each activity instance. Beyond this, the data are presented in raw form, and no further filtering was performed. The dataset comprises ambient audio and vibration as well as infrared array data, light color, and environmental measurements. Overall, 8615 activity instances are included, each captured by the five multisensor devices. These multi-modal and multi-channel data support a variety of machine learning approaches to human activity recognition, for example, federated learning and sensor fusion.
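The listing does not describe the dataset's file layout, so the following minimal Python sketch is purely hypothetical: it only illustrates how multi-modal recordings from the five devices might be grouped per activity instance for downstream machine learning. All class names, modality labels, and array shapes are assumptions, not the dataset's actual schema.

    from dataclasses import dataclass, field
    from typing import Dict
    import numpy as np

    # Hypothetical container for one activity instance; the real dataset's
    # format is not specified in this listing.
    @dataclass
    class ActivityInstance:
        activity: str                     # one of the 25 labeled activities
        participant: int                  # participant ID, 1..14
        # recordings per multisensor device (1..5), keyed by modality name
        sensors: Dict[int, Dict[str, np.ndarray]] = field(default_factory=dict)

        def add_recording(self, sensor_id: int, modality: str, samples) -> None:
            """Attach one synchronized modality (e.g. 'audio', 'vibration',
            'ir_array', 'light_color', 'environment') from one device."""
            self.sensors.setdefault(sensor_id, {})[modality] = np.asarray(samples)

    # Usage: assemble one repetition of an activity from two modalities of device 1.
    inst = ActivityInstance(activity="example_activity", participant=3)
    inst.add_recording(sensor_id=1, modality="audio", samples=np.zeros(16000))
    inst.add_recording(sensor_id=1, modality="vibration", samples=np.zeros(1000))

Keeping each instance keyed by device and by modality in this way would make per-device splits for federated learning, or per-modality combinations for sensor fusion, straightforward to derive.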

Keywords: sensor data; labeled data; internet of things; smart home; assisted living; human activity recognition; machine learning; classification; activities of daily living
JEL-codes: C8 C80 C81 C82 C83
Date: 2024

Downloads: (external link)
https://www.mdpi.com/2306-5729/9/12/144/pdf (application/pdf)
https://www.mdpi.com/2306-5729/9/12/144/ (text/html)

Persistent link: https://EconPapers.repec.org/RePEc:gam:jdataj:v:9:y:2024:i:12:p:144-:d:1539490

The journal Data is currently edited by Ms. Cecilia Yang

More articles in Data from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.

 
Handle: RePEc:gam:jdataj:v:9:y:2024:i:12:p:144-:d:1539490