NPFC-Test: A Multimodal Dataset from an Interactive Digital Assessment Using Wearables and Self-Reports
Luis Fernando Morán-Mirabal,
Luis Eduardo Güemes-Frese,
Mariana Favarony-Avila,
Sergio Noé Torres-Rodríguez and
Jessica Alejandra Ruiz-Ramirez
Author affiliations: All authors: Tecnologico de Monterrey, Institute for the Future of Education, Av. Eugenio Garza Sada 2501 Sur, Tecnológico, Monterrey 64700, Mexico
Data, 2025, vol. 10, issue 7, 1-15
Abstract:
The growing implementation of digital platforms and mobile devices in educational environments has generated the need to explore new approaches for evaluating the learning experience beyond traditional self-reports or instructor presence. In this context, the NPFC-Test dataset was created from an experimental protocol conducted at the Experiential Classroom of the Institute for the Future of Education. The dataset was built by collecting multimodal indicators such as neuronal, physiological, and facial data using a portable EEG headband, a medical-grade biometric bracelet, a high-resolution depth camera, and self-report questionnaires. The participants were exposed to a digital test lasting 20 min, composed of audiovisual stimuli and cognitive challenges, during which synchronized data from all devices were gathered. The dataset includes timestamped records related to emotional valence, arousal, and concentration, offering a valuable resource for multimodal learning analytics (MMLA). The recorded data were processed through calibration procedures, temporal alignment techniques, and emotion recognition models. It is expected that the NPFC-Test dataset will support future studies in human–computer interaction and educational data science by providing structured evidence to analyze cognitive and emotional states in learning processes. In addition, it offers a replicable framework for capturing synchronized biometric and behavioral data in controlled academic settings.
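To make the temporal alignment step described above more concrete, the following is a minimal sketch in Python of aligning timestamped streams from the three device types (EEG headband, biometric bracelet, depth camera with facial coding) onto a common timeline using pandas merge_asof. The column names, sampling rates, and tolerance windows are illustrative assumptions for this sketch only; they are not specifications taken from the NPFC-Test files.

# Hedged sketch: aligning timestamped multimodal streams (EEG, wristband, facial
# coding) onto a common timeline with pandas merge_asof. Column names, sampling
# rates, and tolerances below are illustrative assumptions, not NPFC-Test specs.
import numpy as np
import pandas as pd

start = pd.Timestamp("2025-01-01 10:00:00")

def make_timeline(rate_hz: float, seconds: float) -> pd.DatetimeIndex:
    """Evenly spaced timestamps at a given sampling rate."""
    n = int(rate_hz * seconds)
    return start + pd.to_timedelta(np.arange(n) / rate_hz, unit="s")

# One minute of simulated data per modality (assumed rates: EEG 256 Hz,
# wristband 1 Hz, facial coding 30 Hz); real recordings would be loaded from files.
eeg = pd.DataFrame({"timestamp": make_timeline(256, 60),
                    "concentration": np.random.rand(256 * 60)})
wristband = pd.DataFrame({"timestamp": make_timeline(1, 60),
                          "heart_rate": np.random.randint(60, 100, size=60)})
face = pd.DataFrame({"timestamp": make_timeline(30, 60),
                     "valence": np.random.uniform(-1, 1, size=30 * 60),
                     "arousal": np.random.uniform(0, 1, size=30 * 60)})

# Align on the facial-coding timeline: take the most recent EEG and wristband
# sample at or before each video frame, within a modality-specific tolerance.
aligned = pd.merge_asof(face, eeg, on="timestamp", direction="backward",
                        tolerance=pd.Timedelta("50ms"))
aligned = pd.merge_asof(aligned, wristband, on="timestamp", direction="backward",
                        tolerance=pd.Timedelta("2s"))
print(aligned.head())

Using a backward-looking merge keeps, for each video frame, the closest preceding sample from the slower streams without resampling them; this is one common way to fuse streams recorded at different rates, and other alignment strategies (interpolation, windowed averaging) could be substituted depending on the analysis.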
Keywords: multimodal learning analytics; human–computer interaction; facial coding; emotions; EEG; digital biomarkers; valence; arousal; concentration
JEL-codes: C8 C80 C81 C82 C83
Date: 2025
Downloads:
https://www.mdpi.com/2306-5729/10/7/103/pdf (application/pdf)
https://www.mdpi.com/2306-5729/10/7/103/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jdataj:v:10:y:2025:i:7:p:103-:d:1690665