Automated image transcription for perinatal blood pressure monitoring using mobile health technology
Nasim Katebi, Whitney Bremer, Tony Nguyen, Daniel Phan, Jamila Jeff, Kirkland Armstrong, Paula Phabian-Millbrook, Marissa Platner, Kimberly Carroll, Banafsheh Shoai, Peter Rohloff, Sheree L Boulet, Cheryl G Franklin and Gari D Clifford
PLOS Digital Health, 2024, vol. 3, issue 10, 1-17
Abstract:
This paper introduces a novel approach to the challenge of transferring blood pressure (BP) data from the oscillometric devices used in self-measured BP monitoring into medical health records, or a proxy database accessible to clinicians, particularly for low-literacy populations. To this end, we developed an automated image transcription technique that transcribes readings directly from photographs of BP devices, enhancing the accessibility and usability of BP data for monitoring and managing BP during pregnancy and the postpartum period, particularly in low-resource settings and low-literacy populations. Photos of the BP devices were captured as part of perinatal mobile health (mHealth) monitoring programs conducted in four studies across two countries. The Guatemala Set 1 and Guatemala Set 2 datasets comprise data captured by a cohort of 49 lay midwives from 1697 and 584 pregnant women, respectively, carrying singletons in the second and third trimesters in rural Guatemala during routine screening. Additionally, we designed an mHealth system in Georgia for postpartum women to monitor and report their BP at home, with 23 and 49 African American participants contributing to the Georgia I3 and Georgia IMPROVE projects, respectively. We developed a deep learning-based model that operates in two steps: LCD localization using the You Only Look Once (YOLO) object detection model, followed by digit recognition using a convolutional neural network-based model capable of recognizing multiple digits. We applied color correction and thresholding techniques to minimize the impact of reflections and artifacts. Three experiments were conducted, differing in the devices used to train the digit recognition model. Overall, our results demonstrate that the device-specific model with transfer learning and the device-independent model outperformed the device-specific model without transfer learning. The mean absolute errors (MAE) of image transcription on held-out test datasets using the device-independent digit recognition model were 1.2 and 0.8 mmHg for systolic and diastolic BP in the Georgia IMPROVE dataset, and 0.9 and 0.5 mmHg in the Guatemala Set 2 dataset. These errors, far below the FDA-recommended limit of 5 mmHg, make the proposed automatic image transcription model suitable for general use when paired with appropriate low-error BP devices.
Author summary: Monitoring blood pressure (BP) is critical during pregnancy and the postpartum period, especially in low-resource settings. Transferring BP data from devices to medical records poses significant challenges, particularly for low-literacy populations. To address this, we developed an automated image transcription technique that accurately transcribes BP readings from photos of BP devices, making these data more accessible to healthcare providers. Our research involved capturing BP device photos as part of mobile health (mHealth) programs in rural Guatemala and Georgia, USA. Data were collected from pregnant and postpartum women, supported by local midwives in these regions. We designed a deep learning model that first locates the BP reading on the device screen using the YOLO object detection model and then recognizes the digits using a convolutional neural network. The model demonstrated high accuracy, with a mean absolute error well below the FDA's recommended limit, supporting its suitability for general use. This approach enhances the integration of BP data into health records, improving BP monitoring and management in low-resource and low-literate populations, ultimately contributing to better maternal health outcomes.
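To make the two-step pipeline in the abstract concrete (YOLO-based LCD localization, color correction and thresholding, then CNN-based multi-digit recognition), the sketch below outlines how such a system could be wired together. It is a minimal, illustrative sketch only: it assumes the Ultralytics YOLO API, OpenCV, and PyTorch, and the weights file "lcd_detector.pt", the gray-world color correction, the 64x128 input size, and the MultiDigitCNN architecture are placeholder assumptions, not the authors' actual implementation or trained models.

# Illustrative sketch of the two-step transcription pipeline described above.
# Placeholders (not from the paper): "lcd_detector.pt", "bp_photo.jpg",
# the CNN architecture, and the preprocessing parameters.
import cv2
import numpy as np
import torch
import torch.nn as nn
from ultralytics import YOLO


def locate_lcd(image_path: str, detector: YOLO) -> np.ndarray:
    """Step 1: detect the LCD panel with YOLO and return the cropped region."""
    image = cv2.imread(image_path)
    result = detector(image)[0]
    if len(result.boxes) == 0:
        raise ValueError("No LCD detected")
    # Take the highest-confidence box as the LCD screen.
    x1, y1, x2, y2 = result.boxes.xyxy[result.boxes.conf.argmax()].int().tolist()
    return image[y1:y2, x1:x2]


def preprocess(lcd: np.ndarray) -> torch.Tensor:
    """Gray-world color correction plus Otsu thresholding to suppress glare."""
    means = lcd.reshape(-1, 3).mean(axis=0)
    balanced = np.clip(lcd * (means.mean() / means), 0, 255).astype(np.uint8)
    gray = cv2.cvtColor(balanced, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    resized = cv2.resize(binary, (128, 64)) / 255.0
    return torch.tensor(resized, dtype=torch.float32)[None, None]  # (1, 1, 64, 128)


class MultiDigitCNN(nn.Module):
    """Step 2: a small CNN with one softmax head per digit position (hypothetical)."""

    def __init__(self, n_digits: int = 3, n_classes: int = 11):  # digits 0-9 plus "blank"
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),
        )
        self.heads = nn.ModuleList(
            [nn.Linear(64 * 16 * 32, n_classes) for _ in range(n_digits)]
        )

    def forward(self, x: torch.Tensor) -> list[torch.Tensor]:
        feats = self.features(x)
        return [head(feats) for head in self.heads]


if __name__ == "__main__":
    detector = YOLO("lcd_detector.pt")   # placeholder weights
    recognizer = MultiDigitCNN()         # untrained, for illustration only
    lcd_crop = locate_lcd("bp_photo.jpg", detector)
    logits = recognizer(preprocess(lcd_crop))
    digits = [int(l.argmax(dim=1)) for l in logits]
    print("Transcribed reading:", digits)

In practice, one such recognizer would be run on each field of the display (systolic, diastolic, heart rate), and transcription error would be summarized as the mean absolute difference between predicted and manually annotated readings, which is how the MAE values quoted above are interpreted.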
Date: 2024
Downloads:
https://journals.plos.org/digitalhealth/article?id=10.1371/journal.pdig.0000588 (text/html)
https://journals.plos.org/digitalhealth/article/fi ... 00588&type=printable (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:plo:pdig00:0000588
DOI: 10.1371/journal.pdig.0000588