Transparency (behavior)
Neglect
Expectancy theory
Workload
Affect (linguistics)
Psychology
Social psychology
Computer science
Applied psychology
Computer security
Communication
Psychiatry
Operating system
Authors
Lydia Harbarth,Eva Gößwein,Daniel Bodemer,Lenka Schnaubert
Identifier
DOI:10.1080/10447318.2023.2301250
Abstract
Over-trusting AI systems can lead to complacency and decision errors. However, human and system variables may affect complacency, and it is important for HCI to understand their interplay. In our experiment, 90 participants were confronted with traffic route problems guided by AI recommendations and were assigned to either a transparent system providing reasons for its recommendations or a non-transparent system. We found transparent systems to lower the potential to alleviate workload (albeit not to neglect monitoring), but simultaneously to foster actual complacent behavior. In contrast, we found performance expectancy to foster the potential to alleviate workload, but not complacent behavior. Interaction analyses showed that the effects of performance expectancy depend on system transparency. This contributes to our understanding of how system- and person-related variables interact in affecting complacency, stresses the differences between dimensions of complacency, and underlines the need to carefully consider transparency and performance expectancy in AI research and design.