Commentary

Using human factors methods to mitigate bias in artificial intelligence-based clinical decision support.

Militello LG, Diiulio J, Wilson DL, et al. Using human factors methods to mitigate bias in artificial intelligence-based clinical decision support. J Am Med Inform Assoc. 2024;Epub Nov 21. doi:10.1093/jamia/ocae291.

January 8, 2025

The potential for bias in artificial intelligence (AI) training data is a well-known problem, but the potential for bias resulting from a poorly designed user interface (UI) is less studied. The authors use their experience developing a machine learning-based clinical decision support tool to highlight three considerations in designing UIs for AI applications: (1) bias is not just about the algorithm, (2) it is possible to identify bias and interpretation errors before an application is released, and (3) risk communication strategies can influence bias in unexpected ways.
