Interpretable Machine Learning in Healthcare

ICML 2021 Workshop | Friday, July 23, 2021 | Virtual Worldwide

Overview

The application of machine learning (ML) in healthcare is gaining momentum rapidly. However, the black-box nature of many existing ML approaches limits the interpretability and verifiability of clinical predictions. As these systems are pervasively introduced into a domain that demands a high level of safety and security, developing methodologies to explain their predictions becomes critical for enhancing the interpretability of medical intelligence. Such methodologies would make medical decisions more trustworthy and reliable for physicians, which could ultimately facilitate deployment. At the same time, it is also essential to develop more interpretable and transparent ML systems. For instance, by exploiting structured knowledge or prior clinical information, one can design models whose learned representations align more closely with clinical reasoning. Such designs may also help mitigate biases in the learning process or identify variables that are more relevant to medical decisions.

In this workshop, we aim to bring together researchers in ML, computer vision, healthcare, medicine, NLP, and clinical practice to discuss the challenges, definitions, formalisms, and evaluation protocols surrounding interpretable medical machine intelligence. We will also explore possible solutions, such as logic and symbolic reasoning over medical knowledge graphs, uncertainty quantification, and compositional models. We hope the workshop offers a fruitful step toward building autonomous clinical decision systems with a higher-level understanding of interpretability.

Important Dates

Submissions: June 17, 2021, AoE (extended from June 10, 2021, AoE)

Notifications: July 3, 2021

Workshop: July 23, 2021

Call for Submissions

We invite participation in the 1st Workshop on Interpretable Machine Learning in Healthcare (IMLH), to be held as part of the ICML 2021 conference.

Please see the Submissions page for more information about topics and submission instructions.

Contact

Please contact Yuyin Zhou (zhouyuyiner@gmail.com) or Xiaoxiao Li (xxlzju@gmail.com) if you have any questions.