Automation
Cockpit
Commission
Accountability
Aviation
Perception
Computer science
Aviation safety
Poison control
Human–computer interaction
Computer security
Risk analysis (engineering)
Applied psychology
Psychology
Engineering
Aeronautics
Medicine
Database
Medical emergency
Mechanical engineering
Neuroscience
Law
Political science
Aerospace engineering
Authors
Kathleen L. Mosier, Linda J. Skitka, Susan T. Heers, M. D. Burdick
Identifier
DOI:10.1207/s15327108ijap0801_3
Abstract
Automated aids and decision support tools are rapidly becoming indispensable in high-technology cockpits and are assuming increasing control of "cognitive" flight tasks, such as calculating fuel-efficient routes, navigating, or detecting and diagnosing system malfunctions and abnormalities. This study was designed to investigate automation bias, a recently documented factor in the use of automated aids and decision support systems. The term refers to omission and commission errors resulting from the use of automated cues as a heuristic replacement for vigilant information seeking and processing. Glass-cockpit pilots flew flight scenarios involving automation events or opportunities for automation-related omission and commission errors. Although experimentally manipulated accountability demands did not significantly impact performance, post hoc analyses revealed that those pilots who reported an internalized perception of "accountability" for their performance and strategies of interaction with the automation were significantly more likely to double-check automated functioning against other cues and less likely to commit errors than those who did not share this perception. Pilots were also likely to erroneously "remember" the presence of expected cues when describing their decision-making processes.