Accountability for interpretation errors made by artificially intelligent algorithms used in medical imaging for clinical practice: a scoping review

Abstract: Artificial intelligence will change radiology, and whilst we do not yet know whether that change will be for the better, we must create an environment that fosters innovation while keeping patient safety at the forefront. Despite rapid advances over the past decade, only recently have early algorithms begun finding their way onto radiologists' workstations. Yet we still lack an agreed, robust framework for the adoption of this new technology, especially when things go wrong. Accountability for medical artificial intelligence decisions is complex and shaped by multifactorial influences; it is impossible to assign blame to a single party, so the concept of shared accountability is required instead. All stakeholders involved in the decision-making process that produced an error should own their role in that outcome and ultimately share its burden.

Project type: Literature review
Imaging keywords: Radiology
Application / disease keywords: Artificial intelligence; Computer-aided detection; Computer-aided diagnosis; Machine learning; Medical artificial intelligence decisions
Supervisor(s): Dr Luciana D'Adderio
Programme: Neuroimaging for Research MSc
Year: 2021-22

This article was published on 2024-08-22.