Perception
Computer science
Artificial intelligence
Virtual reality
Deep learning
Walkability
Visual perception
Semantics (computer science)
Built environment
Engineering
Psychology
Neuroscience
Civil engineering
Programming language
Authors
Yunqin Li, Nobuyoshi Yabuki, Tomohiro Fukuda
Identifier
DOI: 10.1016/j.scs.2022.104140
Abstract
Highlights
• Six categories of visual walkability perception (VWP) were proposed
• VWP was measured with a VR panoramic-based deep learning framework
• A VWP classification deep multitask learning (VWPCL) model was developed
• Stepwise regression analysis identified contributing visual elements
• VWP was interpreted with gradient-weighted class activation mapping (Grad-CAM)

Measuring perceptions of visual walkability in urban streets, and exploring how visual features of the street built environment make walking attractive to humans, are both theoretically and practically important. Previous studies have relied either on environmental audits and subjective evaluations, which are limited in cost, time, and measurement scale, or on computer-aided audits based on natural street view images (SVIs), which leave a gap between the imagery and real perception. In this study, a virtual reality panoramic image-based deep learning framework is proposed for measuring visual walkability perception (VWP) and then quantifying and visualizing the contributing visual features. A VWP classification deep multitask learning (VWPCL) model was first developed and trained on human ratings of panoramic SVIs in virtual reality to predict VWP in six categories. Second, a regression model was used to determine the degree of correlation of various objects, identified by semantic segmentation, with each of the six VWP categories. Furthermore, an interpretable deep learning model was used to identify and visualize the elements that contribute to VWP. The experiment validated the accuracy of the VWPCL model in predicting VWP. The results represent a further step toward understanding the interplay between VWP and street-level semantics and features.
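The final interpretability step the abstract mentions, gradient-weighted class activation mapping (Grad-CAM), can be sketched as follows. This is a minimal NumPy illustration of the general Grad-CAM computation, not the authors' implementation; the feature maps and gradients here are synthetic stand-ins for what a trained convolutional network would produce for one VWP category.

```python
import numpy as np

def grad_cam(activations, gradients):
    """Compute a Grad-CAM heatmap.

    activations: (K, H, W) feature maps from a convolutional layer
    gradients:   (K, H, W) gradients of the target class score
                 with respect to those feature maps
    Returns an (H, W) heatmap normalized to [0, 1].
    """
    # Channel importance weights: global-average-pool the gradients.
    weights = gradients.mean(axis=(1, 2))  # shape (K,)
    # Weighted sum of the feature maps, then ReLU to keep
    # only features with a positive influence on the class.
    cam = np.maximum((weights[:, None, None] * activations).sum(axis=0), 0.0)
    # Normalize for visualization as an overlay heatmap.
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam

# Synthetic example: 4 channels over an 8x8 spatial map.
rng = np.random.default_rng(0)
acts = rng.random((4, 8, 8))
grads = rng.random((4, 8, 8))
heatmap = grad_cam(acts, grads)
print(heatmap.shape)  # (8, 8)
```

In the paper's setting, the heatmap would be upsampled to the panoramic image size and overlaid on it to show which street elements drive a given walkability category.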