
Campagner, A., Biganzoli, E., Balsano, C., Cereda, C., Cabitza, F. (2025). Modeling unknowns: A vision for uncertainty-aware machine learning in healthcare. INTERNATIONAL JOURNAL OF MEDICAL INFORMATICS, 203(November 2025) [10.1016/j.ijmedinf.2025.106014].

Modeling unknowns: A vision for uncertainty-aware machine learning in healthcare

Campagner A. (first author); Cabitza F.
2025

Abstract

The integration of machine learning (ML) into healthcare is accelerating, driven by the proliferation of biomedical data and the promise of data-driven clinical support. A key challenge in this context is managing the pervasive uncertainty inherent in medical reasoning and decision-making. Despite its recognized importance, uncertainty is often underrepresented in the design and evaluation of clinical AI systems. Here we report an editorial overview of a special issue dedicated to uncertainty modeling in medical AI, which gathers theoretical, methodological, and practical contributions addressing this critical gap. Across these works, authors reveal that fewer than 4% of studies address uncertainty explicitly, and propose alternative design principles—such as optimizing for clinical net benefit or embedding explainability with confidence estimates. Notable contributions include the RelAI system for real-time prediction reliability, empirical findings on how uncertainty communication shapes clinical interpretation, and benchmarks for out-of-distribution detection in tabular data. Furthermore, this issue highlights the use of causal reasoning and anomaly detection to enhance system robustness and accountability. Together, these studies argue that representing, communicating, and operationalizing uncertainty are essential not only for clinical safety but also for building trust in AI-driven care. This special issue thus repositions uncertainty from a limitation to a foundational asset in the responsible deployment of ML in healthcare.
Editorial, introduction, contribution to forum/debate
Machine learning; Medical artificial intelligence; Uncertainty
English
25 June 2025
2025
203
November 2025
106014
reserved
Files in this item:
Campagner-2025-Int J Med Informatics-VoR.pdf (archive administrators only; request a copy)
Attachment type: Publisher's Version (Version of Record, VoR)
License: All rights reserved
Size: 690.61 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/574823
Citations
  • Scopus: 1
  • Web of Science: 1