How informative is your XAI? Assessing the quality of explanations through information power
A growing consensus emphasizes the efficacy of user-centered and personalized approaches within the field of explainable artificial intelligence (XAI). The proliferation of diverse explanation strategies in recent years promises to improve the interaction between humans and explainable agents. This...
Main Authors: Marco Matarese, Francesco Rea, Katharina J. Rohlfing, Alessandra Sciutti
Format: Article
Language: English
Published: Frontiers Media S.A., 2025-01-01
Series: Frontiers in Computer Science
Online Access: https://www.frontiersin.org/articles/10.3389/fcomp.2024.1412341/full
Similar Items
- The role of IoT and XAI convergence in the prediction, explanation, and decision of customer perceived value (CPV) in SMEs: a theoretical framework and research proposition perspective
  by: Kwabena Abrokwah-Larbi
  Published: (2025-01-01)
- SmartSkin-XAI: An Interpretable Deep Learning Approach for Enhanced Skin Cancer Diagnosis in Smart Healthcare
  by: Sultanul Arifeen Hamim, et al.
  Published: (2024-12-01)
- Explainable AI chatbots towards XAI ChatGPT: A review
  by: Attila Kovari
  Published: (2025-01-01)
- Description-based Post-hoc Explanation for Twitter List Recommendations
  by: Havva Alizadeh Noughabi, et al.
  Published: (2024-12-01)
- XAI-Enhanced Machine Learning for Obesity Risk Classification: A Stacking Approach With LIME Explanations
  by: Mohammad Azad, et al.
  Published: (2025-01-01)