Keywords
Corporate governance
Medicine
Equity (law)
PsycINFO
Set (abstract data type)
Proxy (statistics)
Health care
MEDLINE
Computer science
Machine learning
Political science
Business
Finance
Programming language
Law
Authors
Benjamin A. Goldstein,Dinushika Mohottige,Sophia Bessias,Michael P. Cary
Identifier
DOI: 10.1053/j.ajkd.2024.04.008
Abstract
There has been a steady rise in the use of clinical decision support (CDS) tools to guide nephrology as well as general clinical care. Through guidance set by federal agencies and concerns raised by clinical investigators, there has been an equal rise in scrutiny of whether such tools exhibit algorithmic bias leading to unfairness. This has spurred the more fundamental question of whether sensitive variables such as race should be included in CDS tools. To properly answer this question, it is necessary to understand how algorithmic bias arises. We break down three sources of bias encountered when using electronic health record data to develop CDS tools: (1) use of proxy variables, (2) observability concerns, and (3) underlying heterogeneity. We discuss how answering the question of whether to include sensitive variables like race often hinges more on qualitative considerations than on quantitative analysis, depending on the function that the sensitive variable serves. Based on our experience with our own institution's CDS governance group, we show how health system-based governance committees play a central role in guiding these difficult and important considerations. Ultimately, our goal is to foster a community of practice among model development and governance teams that emphasizes consciousness about sensitive variables and prioritizes equity.
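The abstract refers to assessing whether a CDS tool exhibits algorithmic bias across sensitive groups. A minimal sketch of one common check, comparing a risk score's discrimination (AUC) within each subgroup, is shown below. This is an illustrative assumption, not the authors' method; the data frame and column names (`risk_score`, `outcome`, `race`) are hypothetical.

```python
# Illustrative sketch only: subgroup-level discrimination check for a CDS risk score.
# Column names and the cohort data frame are hypothetical, not taken from the article.
import pandas as pd
from sklearn.metrics import roc_auc_score

def auc_by_group(df: pd.DataFrame, score_col: str, outcome_col: str, group_col: str) -> pd.Series:
    """Compute the AUC of a risk score within each level of a sensitive variable."""
    return df.groupby(group_col).apply(
        lambda g: roc_auc_score(g[outcome_col], g[score_col])
    )

# Example usage with a hypothetical scored cohort:
# subgroup_auc = auc_by_group(cohort, "risk_score", "outcome", "race")
# Large gaps in subgroup AUC (or in calibration, checked analogously) would flag
# potential algorithmic bias for a governance committee to review.
```

Quantitative checks like this can surface disparities, but as the abstract notes, deciding whether to include a sensitive variable in the model itself usually rests on qualitative considerations about what role that variable plays.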