Controlling your contents with the breath: Interactive breath interface for VR, games, and animations
Jong-Hyun Kim and Jung Lee
PLOS ONE, 2020, vol. 15, issue 10, 1-27
Abstract:
In this paper, we propose a new interface for controlling VR (virtual reality) content, games, and animations in real time using the user's breath and the acceleration sensor of a mobile device. Although interaction techniques are central to VR and physically based animation, UI (user interface) methods that use other types of devices or controllers have received little attention; most proposed interaction techniques have focused on screen touch and motion recognition. In our approach, the direction of the breath is calculated from the position and angle between the user and the mobile device, and the control position used to manipulate the content is determined from the device's built-in acceleration sensor. Finally, to remove noise from the breath input, the magnitude of the wind is filtered with a kernel that models a pattern similar to an actual breath. To demonstrate the effectiveness of this approach, we produced real-time interaction results in which the breath is applied as an external force to VR content, games, and animations.
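The filtering step in the abstract suggests a simple pipeline: smooth the raw breath magnitude with a breath-shaped kernel before applying it as a force. The paper's actual kernel and parameters are not given here, so the sketch below is a minimal illustration under stated assumptions: the names breath_kernel and filter_breath, the Gaussian envelope, and all parameter values are hypothetical stand-ins, not the authors' implementation.

    # Minimal sketch of the kernel-based breath filtering described in the
    # abstract. All names (breath_kernel, filter_breath) and parameter
    # values are assumptions for illustration, not the authors' code.
    import numpy as np

    def breath_kernel(size: int = 15, sigma: float = 3.0) -> np.ndarray:
        # Bell-shaped envelope approximating the gradual rise and fall of
        # a real exhalation; a Gaussian stands in for the paper's pattern.
        x = np.arange(size) - size // 2
        k = np.exp(-(x ** 2) / (2.0 * sigma ** 2))
        return k / k.sum()  # normalize so filtering preserves magnitude

    def filter_breath(magnitudes: np.ndarray, kernel: np.ndarray) -> np.ndarray:
        # Smooth the raw wind-magnitude signal so that short noise spikes,
        # which do not match a breath-like envelope, are suppressed.
        return np.convolve(magnitudes, kernel, mode="same")

    # Example: a noisy magnitude stream (e.g., from the device microphone)
    # containing one genuine breath burst around samples 40-55.
    raw = np.abs(np.random.randn(100)) * 0.2
    raw[40:55] += np.hanning(15) * 2.0
    smoothed = filter_breath(raw, breath_kernel())

The smoothed magnitude, combined with a breath direction derived from the user-device pose, could then drive the external force applied to the scene, as in the paper's demonstrations.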
Date: 2020
Downloads:
https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0241498 (text/html)
https://journals.plos.org/plosone/article/file?id= ... 41498&type=printable (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:plo:pone00:0241498
DOI: 10.1371/journal.pone.0241498