Directive Explanations for Monitoring the Risk of Diabetes Onset: Introducing Directive Data-Centric Explanations and Combinations to Support What-If Explorations
Aditya Bhattacharya (aditya.bhattacharya@kuleuven.be, @adib0073)
Jeroen Ooge (jeroen.ooge@kuleuven.be, @JeroenOoge)
Gregor Stiglic (gregor.stiglic@um.si, @GStiglic)
Katrien Verbert (katrien.verbert@kuleuven.be, @katrien_v)
Explainable Decision Support Systems in Healthcare
ML-based Decision Support Systems
XAI Methods
Explainable Decision Support Systems
Healthcare Experts
Explainable Interface for monitoring the risk of diabetes onset for patients
Understand the rationale behind the predicted risk of diabetes onset
Monitoring the Risk of Type 2 Diabetes Onset
Visually Directive Explanation Dashboard
Visually Directive Explanation Dashboard: Feature Importance Explanation
Visually Directive Explanation Dashboard: Data-Centric Explanation
Visually Directive Explanation Dashboard: Example-based Explanation
Explainable AI Methods
Feature Importance Explanations (Model-Centric Explanations)
• Feature importance is a model-centric explanation method: it estimates which features in the model have the most influence on its output or prediction.
• Examples of feature importance methods include permutation importance, partial dependence plots, LIME-based feature importance, and Shapley value (SHAP) based feature importance (see the sketch below).
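To make this concrete, here is a minimal sketch of model-centric feature-importance explanations in Python. The dataset file, column names, and model choice are assumptions for illustration only, not the setup used in this work.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Hypothetical tabular dataset: risk factors plus a binary "onset" label.
df = pd.read_csv("diabetes_risk.csv")
X, y = df.drop(columns="onset"), df["onset"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Permutation importance: shuffle each feature and measure the drop in test score.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for name, score in sorted(zip(X.columns, result.importances_mean), key=lambda p: -p[1]):
    print(f"{name}: {score:.3f}")

# SHAP values (requires the shap package) give per-patient, per-feature
# contributions (per class for classifiers).
import shap
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)
```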
Data-Centric Explanations
• Data-centric explainability focuses on the data used to train the model rather than the model's internal workings: by analyzing the training data, we can gain insights into how the model makes its predictions and identify potential biases or errors.
• Examples of data-centric explanation approaches include summarizing datasets with common statistics such as mean, mode, and variance, visualizing data distributions to compare a patient's feature values against the remaining dataset, and observing changes in model predictions through what-if analysis to probe feature sensitivity (see the sketch below).
• Additionally, data-centric explanations raise awareness of data quality by surfacing issues such as data drift, skewed data, outliers, and correlated features that can impact the overall performance of ML models.
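As an illustration of these data-centric ideas, the sketch below summarizes a training dataset, situates one patient's value within the population distribution, and runs a simple outlier check. The file name, the bmi column, and the example patient value are assumptions for illustration only.

```python
import pandas as pd

df = pd.read_csv("diabetes_risk.csv")  # hypothetical training data
feature = "bmi"                        # hypothetical actionable feature

# Global overview: common summary statistics of the training data.
print(df[feature].agg(["mean", "median", "std", "min", "max"]))

# Local view: where does this patient sit relative to the rest of the dataset?
patient_value = 31.2                   # example value for one patient
percentile = (df[feature] < patient_value).mean() * 100
print(f"Patient's {feature} of {patient_value} is higher than {percentile:.0f}% of patients.")

# Simple data-quality check: flag outliers beyond three standard deviations.
z = (df[feature] - df[feature].mean()) / df[feature].std()
print(f"{int((z.abs() > 3).sum())} outlier values found for {feature}.")
```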
Counterfactual Explanations (Example-based Explanations)
• Counterfactual explanations are example-based methods that provide the minimal conditions required to obtain an alternative decision.
• Rather than explaining the inner workings of the model, counterfactuals can guide users toward obtaining their desired predictions (see the sketch below).
* Applied Machine Learning Explainability Techniques, A. Bhattacharya
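As a hedged illustration of the idea (not the method used in our dashboard), the sketch below searches for the smallest decrease in a single actionable feature that pushes the hypothetical model's predicted risk below a threshold; dedicated libraries such as DiCE implement more general counterfactual search.

```python
import pandas as pd

def counterfactual_for(model, patient: pd.DataFrame, feature: str,
                       threshold: float = 0.5, step: float = 0.1, max_steps: int = 200):
    """Smallest decrease of `feature` that brings the predicted risk below `threshold`."""
    for i in range(1, max_steps + 1):
        candidate = patient.copy()
        candidate[feature] = patient[feature] - i * step
        risk = model.predict_proba(candidate)[0, 1]
        if risk < threshold:
            return float(candidate[feature].iloc[0]), float(risk)
    return None, None  # no counterfactual found within the search range

# Usage (hypothetical patient row and feature name):
# new_bmi, new_risk = counterfactual_for(model, X_test.iloc[[0]], "bmi")
```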
Research Questions and User Study
Research Questions
RQ1. In what ways do patients and HCPs find our visually directive explanation dashboard useful for monitoring and evaluating the risk of diabetes onset?
RQ2. In what ways do HCPs and patients perceive data-centric, model-centric, and example-based visually directive explanations in terms of usefulness, understandability, and trustworthiness in the context of healthcare?
RQ3. In what ways do visually directive explanations facilitate patients and HCPs to take action for improving patient conditions?
Iterative User-Centric Design and Evaluation Process
Low-fidelity prototype: Figma click-through prototype; qualitative study through 1:1 interviews with 11 healthcare experts; thematic analysis for evaluation.
High-fidelity prototype: interactive web application prototype; mixed-methods study through online questionnaires with 45 healthcare experts and 51 diabetes patients; evaluation through descriptive statistics, a test of proportions, and analysis of participant-reported Likert-scale questions (a sketch of such a test follows below).
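The evaluation mentions a test of proportions. As a hedged illustration only, with made-up counts that are not the study's results, such a test could be run as follows, e.g. comparing the share of healthcare experts versus patients who rated an explanation as useful on a Likert item.

```python
from statsmodels.stats.proportion import proportions_ztest

agree = [38, 35]   # hypothetical counts of "useful" ratings per group (illustrative only)
totals = [45, 51]  # group sizes: 45 healthcare experts, 51 patients
stat, p_value = proportions_ztest(count=agree, nobs=totals)
print(f"z = {stat:.2f}, p = {p_value:.3f}")
```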
Key Takeaways
Combining XAI methods to address different dimensions of explainability
* Applied Machine Learning Explainability Techniques, A. Bhattacharya
Tailoring Directive Explanations for Healthcare Experts
o Increasing actionability through interactive what-if analysis (see the sketch after this list)
o Explanations through actionable features instead of non-actionable features
o Color-coded visual indicators
o Data-centric directive explanations
* These design implications are aligned with the recommendations from Wang et al. [2019], Designing Theory-Driven User-Centric Explainable AI
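To make the what-if idea concrete, the sketch below sweeps one actionable feature over a range of candidate values and records how the predicted risk responds; a front end can render the result as an interactive slider or risk curve. It reuses the hypothetical model and column names from the earlier sketches and is not the implementation of our dashboard.

```python
import numpy as np
import pandas as pd

def what_if(model, patient: pd.DataFrame, feature: str, values) -> pd.DataFrame:
    """Predicted risk for each candidate value of one actionable feature."""
    rows = []
    for v in values:
        candidate = patient.copy()
        candidate[feature] = v
        rows.append({feature: v, "predicted_risk": model.predict_proba(candidate)[0, 1]})
    return pd.DataFrame(rows)

# Usage (hypothetical): risk curve as BMI varies, e.g. to drive a what-if slider.
# curve = what_if(model, X_test.iloc[[0]], "bmi", np.arange(20, 40, 0.5))
```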
Summarizing the contribution of this research
1. Combining XAI methods to address different dimensions of explainability
2. Visually directive data-centric explanations that provide local explanations with a global overview
3. The design of a directive explanation dashboard that combines different explanation methods, which we further compared in terms of understandability, usefulness, actionability, and trustworthiness with healthcare experts and patients
4. Design implications for tailoring visually directive explanations for healthcare experts
Thank you for your attention!
Directive Explanations for Monitoring the Risk of Diabetes Onset: Introducing Directive Data-Centric Explanations and Combinations to Support What-If Explorations
Aditya Bhattacharya (aditya.bhattacharya@kuleuven.be, @adib0073)
Jeroen Ooge (jeroen.ooge@kuleuven.be, @JeroenOoge)
Gregor Stiglic (gregor.stiglic@um.si, @GStiglic)
Katrien Verbert (katrien.verbert@kuleuven.be, @katrien_v)
Editor's Notes

  1. Explainable artificial intelligence (XAI) is increasingly used in machine learning (ML) based decision-making systems in healthcare. Existing XAI methods such as LIME, SHAP, and saliency maps are predominantly designed for ML experts, and little research has compared the utility of these different explanation methods in guiding healthcare experts, who may not have technical ML knowledge, for patient care. Additionally, current XAI methods provide explanations through complex visualizations that are static and difficult for healthcare experts to understand. These gaps highlight the necessity of analyzing and comparing explanation methods with healthcare professionals (HCPs) such as nurses and physicians. (1 min)
  3. Our research focuses on providing an explainable interface for an ML-based system for monitoring the risk of diabetes onset, to be used by healthcare experts such as nurses and physicians. To understand the real needs of our users in detail, we first conducted an exploratory focus group discussion with 4 nurses. Method: we first showed them SHAP-based explanations of the model-predicted risk of diabetes onset, and then conducted a co-design session with our participants to understand the key components of the explainable interface. Results: we formulated the responses of our participants into user requirements. Additionally, our users conveyed that visualizations for SHAP-based explanations are complex and that they need simpler visualizations to communicate with patients. * Is it important to highlight the tasks? (2 slides, 1.5 mins)
  5. This research work presents our Visually Directive Explanation Dashboard, which we developed following an iterative user-centric design process to satisfy our user requirements. We included model-agnostic local explanation methods to meet our explanation goals: feature importance explanations (Important Risk Factors), data-centric explanations (VC1, VC2 and V5), and counterfactual explanations (recommendations to reduce risk). Another video – separate (30-45 secs). We further tailored the representation of these explanation methods. We mainly included interactive explanations that support what-if explorations instead of static representations: our users can alter a selected feature value to observe any change in the predicted risk. We also emphasized actionable health variables over non-actionable ones, as users can alter these actionable variables to obtain a favourable outcome. We categorized the actionable features into patient measures, which cover patient vitals such as blood sugar and BMI, and patient behaviours, which cover behavioural information captured through FINDRISC questionnaires. Our customizations also include providing information about the feasibility and impact of the counterfactuals presented as recommendations.
  9. We wanted to address the following research questions using our Visually Directive Explanation Dashboard. In general, we wanted to analyze and compare the understandability, usefulness, actionability, and trust of the different explanation methods included in our dashboard with HCPs, who are our primary users, and patients, who could be our potential users.
  10. We followed an iterative user-centric design process for the design and evaluation of our dashboard. We first designed a low-fidelity click-through prototype in Figma over multiple iterations; here you can see the final version of the low-fidelity prototype. We conducted a qualitative user study through 1:1 interviews with 11 healthcare experts and evaluated the interview data using thematic analysis. We also used the feedback to make design changes for our high-fidelity prototype; in particular, we improved the discoverability of our interactive visual explanation methods through tooltips and explicit visual indicators. Overall, the healthcare experts were positive about the utility of the dashboard and further suggested that patients could directly use it as a self-monitoring tool, so we included patients as participants in the next study. We then designed and developed our high-fidelity web application prototype and conducted a mixed-methods study with 45 healthcare experts and 51 patients through online questionnaires. We evaluated the gathered data through descriptive statistics, tests of proportions, and analysis of participant-reported Likert-scale questions and their justifications. Finally, we addressed our research questions and summarized our findings considering the collective feedback from the two user studies.
  12. We share design implications for tailoring the visual representation of directive explanations for healthcare experts, based on our observations and results. The modified design of this visual component (VC3) in our high-fidelity prototype enabled HCPs to perform interactive what-if analysis, i.e. to change feature values and observe the change in the overall prediction. Hence, we recommend interactive design elements that allow what-if analysis for representing directive explanations for HCPs; this also supports hypothesis generation. In our approach, we included only actionable variables in the visual components that support what-if interactions, which also aids the identification of coherent factors [57]. We anticipated that allowing the values of non-actionable variables to be altered could create confusion for HCPs, especially for counterfactual explanations. HCPs indicated that the color-coded representations of risk factors were very useful for getting quick insights. Hence, we recommend color-coded representations and visual indicators to highlight factors that can increase or decrease the predictor variable, which further facilitates the identification of coherent factors. HCPs also indicated that our representation of data-centric explainability through the patient summary was very informative: they could easily identify how good or bad the risk factors are for a specific patient, and the data-distribution charts gave them an overview of how other patients compare to that patient. Thus, our representation of data-centric explainability provides a local explanation with a global perspective. Furthermore, data-centric directive explanations support forward reasoning by providing access to source and situational data, and can easily be integrated with multiple explanation methods.
  16. This paper presents three primary research contributions: (1) visually directive data-centric explanations that provide local explanations of the predicted risk for individual patients together with a global overview of risk factors for the entire patient population; (2) the design of a directive explanation dashboard that combines visually represented data-centric, feature-importance, and counterfactual explanations, which we further compared in terms of understandability, usefulness, actionability, and trustworthiness with healthcare experts and patients; and (3) design implications for tailoring explanations for healthcare experts, based on observations from our user-centered design process and an elaborate user study.