Trust in Clinical AI: Expanding the Unit of Analysis

Jacob T. Browne, Saskia Bakker, Bin Yu, Peter Lloyd, Somaya Ben Allouch

Research output: Contribution to conference › Paper › Academic


Abstract

From diagnosis to patient scheduling, AI is increasingly being considered across different clinical applications. Despite increasingly powerful clinical AI, uptake into actual clinical workflows remains limited. One of the major challenges is developing appropriate trust with clinicians. In this paper, we investigate trust in clinical AI from a wider perspective, beyond user interactions with the AI. We identify several points in the clinical AI development, usage, and monitoring process that can have a significant impact on trust. We argue that the calibration of trust in AI should go beyond explainable AI and address the entire process of clinical AI deployment. We illustrate our argument with case studies from practitioners implementing clinical AI, showing how trust can be affected at different stages of the deployment cycle.
Original language: English
Number of pages: 18
Publication status: Published - 15 Jun 2022
Event: 1st International Conference on Hybrid Human-Artificial Intelligence (HHAI2022) - Amsterdam, Netherlands
Duration: 13 Jun 2022 - 17 Jun 2022

Conference

Conference: 1st International Conference on Hybrid Human-Artificial Intelligence
Country/Territory: Netherlands
City: Amsterdam
Period: 13/06/22 - 17/06/22
