Analysis of an article using Artificial Intelligence tools
Id | 2706 |
Author | Zhang X.; Han H.; Qiao L.; Zhuang J.; Ren Z.; Su Y.; Xia Y. |
Title | Emotional-Health-Oriented Urban Design: A Novel Collaborative Deep Learning Framework for Real-Time Landscape Assessment by Integrating Facial Expression Recognition and Pixel-Level Semantic Segmentation |
Reference | Zhang X.; Han H.; Qiao L.; Zhuang J.; Ren Z.; Su Y.; Xia Y. Emotional-Health-Oriented Urban Design: A Novel Collaborative Deep Learning Framework for Real-Time Landscape Assessment by Integrating Facial Expression Recognition and Pixel-Level Semantic Segmentation, International Journal of Environmental Research and Public Health, 19(20) |
|
Keywords | COVID-19; Deep Learning; Emotions; Facial Recognition; Humans; Semantics; greenspace; health impact; machine learning; perception; public attitude; urban design; algorithm; arousal; article; big data; built environment; emotional stability; emotional well-being; grass; human experiment; nonhuman; videorecording; physiology |
|
Link to article | https://www.scopus.com/inward/record.uri?eid=2-s2.0-85140760158&doi=10.3390%2fijerph192013308&partnerID=40&md5=4b51783afa13b5498a298348cf04e160 |
|
Abstract | Emotional responses are significant for understanding public perceptions of urban green space (UGS) and can be used to inform proposals for optimal urban design strategies to enhance public emotional health in the time of COVID-19. However, most empirical studies fail to consider emotion-oriented landscape assessment under dynamic perspectives, even though the scenery an individual observes changes with viewing angle. To close this gap, a real-time sentiment-based landscape assessment framework is developed, integrating facial expression recognition with semantic segmentation of changing landscapes. Furthermore, a case study using panoramic videos converted from Google Street View images to simulate changing scenes was used to test the viability of this framework, resulting in five million big data points. The results of this study show that, through the collaboration of deep learning algorithms, finer visual variables were classified, subtle emotional responses were tracked, and better regression results for valence and arousal were obtained. Among all the predictors, the proportion of grass was the most significant predictor of emotional perception. The proposed framework is adaptable and human-centric, and it enables instantaneous emotional perception of the built environment by the general public as a feedback survey tool to aid urban planners in creating UGS that promote emotional well-being. © 2022 by the authors. |
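The abstract describes a pipeline in which pixel-level semantic segmentation yields proportions of visual variables (e.g. grass) that are then regressed against tracked emotion scores (valence, arousal). A minimal sketch of that final regression step is shown below; the class names, toy segmentation masks, and valence values are hypothetical illustrations, not the authors' data or implementation:

```python
# Hedged sketch of the framework's last stage: regress a tracked emotion
# score (valence) on the pixel proportion of one segmentation class (grass).
# All data below is invented for illustration.

def class_proportion(mask, target):
    """Fraction of pixels in a 2-D segmentation mask labelled `target`."""
    pixels = [label for row in mask for label in row]
    return pixels.count(target) / len(pixels)

def ols_fit(x, y):
    """Ordinary least squares with one predictor: returns (slope, intercept)."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

# Toy per-frame segmentation masks (class labels per pixel).
masks = [
    [["grass", "grass"], ["sky", "grass"]],   # 75% grass
    [["road", "sky"], ["grass", "sky"]],      # 25% grass
]
grass = [class_proportion(m, "grass") for m in masks]
valence = [0.8, 0.4]                          # hypothetical emotion scores

slope, intercept = ols_fit(grass, valence)
print(slope, intercept)
```

In the actual study the predictors would be the proportions of every segmented visual variable across millions of frames, with valence and arousal as separate response variables; this sketch only shows why a larger grass proportion would emerge as a significant positive predictor under such a regression.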
|
Methodology | Technique |