EconPapers    

Emotional-Health-Oriented Urban Design: A Novel Collaborative Deep Learning Framework for Real-Time Landscape Assessment by Integrating Facial Expression Recognition and Pixel-Level Semantic Segmentation

Xuan Zhang, Haoying Han, Lin Qiao, Jingwei Zhuang, Ziming Ren, Yang Su and Yiping Xia
Additional contact information
Xuan Zhang: Institute of Urban and Rural Planning Theories and Technologies, College of Civil Engineering and Architecture, Zhejiang University, Hangzhou 310058, China
Haoying Han: Institute of Urban and Rural Planning Theories and Technologies, College of Civil Engineering and Architecture, Zhejiang University, Hangzhou 310058, China
Lin Qiao: Institute of Landscape Architecture, College of Agriculture and Biotechnology, Zhejiang University, Hangzhou 310058, China
Jingwei Zhuang: Institute of Landscape Architecture, College of Agriculture and Biotechnology, Zhejiang University, Hangzhou 310058, China
Ziming Ren: Department of Landscape Architecture, School of Civil Engineering and Architecture, Zhejiang Sci-Tech University, Hangzhou 310018, China
Yang Su: The Architectural Design & Research Institute of Zhejiang University Co., Ltd., Hangzhou 310030, China
Yiping Xia: Institute of Landscape Architecture, College of Agriculture and Biotechnology, Zhejiang University, Hangzhou 310058, China

IJERPH, 2022, vol. 19, issue 20, 1-20

Abstract: Emotional responses are significant for understanding public perceptions of urban green space (UGS) and can inform urban design strategies that enhance public emotional health in the era of COVID-19. However, most empirical studies neglect emotion-oriented landscape assessment from a dynamic perspective, even though the scenery an individual observes changes with viewing angle. To close this gap, a real-time sentiment-based landscape assessment framework is developed that integrates facial expression recognition with semantic segmentation of changing landscapes. A case study using panoramic videos converted from Google Street View images to simulate changing scenes was used to test the viability of this framework, yielding five million data points. The results show that, through the collaboration of deep learning algorithms, finer visual variables were classified, subtle emotional responses were tracked, and better regression results for valence and arousal were obtained. Among all predictors, the proportion of grass was the most significant predictor of emotional perception. The proposed framework is adaptable and human-centric; it enables instantaneous emotional perception of the built environment by the general public and can serve as a feedback survey tool to help urban planners create UGS that promote emotional well-being.
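
As an illustration of the regression step described in the abstract, the sketch below shows one way per-frame visual variables could be derived from a semantic segmentation mask and regressed against valence scores. This is a minimal Python sketch, not the authors' pipeline: the class IDs, the synthetic data, and the plain linear model are assumptions for demonstration only.

    # Minimal sketch (assumed setup, not the authors' code):
    # compute per-class pixel proportions from a segmentation mask and
    # regress frame-level valence on those proportions.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Hypothetical label IDs for a street-scene segmentation model
    CLASS_IDS = {"grass": 9, "tree": 8, "sky": 10, "building": 2}

    def element_proportions(mask):
        """Fraction of pixels assigned to each visual variable in one frame."""
        return {name: float(np.sum(mask == cid)) / mask.size
                for name, cid in CLASS_IDS.items()}

    # Synthetic demo: 100 frames of 64x64 label masks and toy valence scores
    rng = np.random.default_rng(0)
    masks = rng.integers(0, 12, size=(100, 64, 64))
    X = np.array([[element_proportions(m)[k] for k in CLASS_IDS] for m in masks])
    valence = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.05, 100)

    model = LinearRegression().fit(X, valence)
    print(dict(zip(CLASS_IDS, model.coef_)))  # e.g., weight on the "grass" share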

Keywords: urban green space; landscape assessment; deep learning; facial expression recognition; semantic segmentation
JEL-codes: I I1 I3 Q Q5
Date: 2022
Citations: 1

Downloads: (external link)
https://www.mdpi.com/1660-4601/19/20/13308/pdf (application/pdf)
https://www.mdpi.com/1660-4601/19/20/13308/ (text/html)


Persistent link: https://EconPapers.repec.org/RePEc:gam:jijerp:v:19:y:2022:i:20:p:13308-:d:943356


IJERPH is currently edited by Ms. Jenna Liu

More articles in IJERPH from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.

 
Handle: RePEc:gam:jijerp:v:19:y:2022:i:20:p:13308-:d:943356