Residential College | false |
Status | Published |
Enhancing explainability in medical image classification and analyzing osteonecrosis X-ray images using shadow learner system |
Wu, Yaoyang1; Fong, Simon1,4; Liu, Liansheng |
2025 | |
Source Publication | Applied Intelligence |
ISSN | 0924-669X |
Volume | 55 | Issue | 2 | Pages | 137 |
Abstract | Numerous applications have explored medical image classification using deep learning models. With the emergence of Explainable AI (XAI), researchers have begun to recognize its potential for validating the authenticity and correctness of results produced by black-box deep learning models. Meanwhile, current diagnostic approaches for osteonecrosis face significant challenges, including difficulty in early detection, subjectivity in image interpretation, and reliance on surgical interventions without a comprehensive diagnostic foundation. This paper presents a novel medical computer-aided diagnosis (CAD) system, the Shadow Learner System framework, which integrates a convolutional neural network (CNN) with an Explainable AI method. The system not only performs conventional computer-aided diagnosis functions but also uniquely exploits misclassified data samples to provide additional medically relevant information from the machine learning model's perspective, assisting doctors in their diagnostic process. The implementation of XAI techniques in our proposed system goes beyond merely validating CNN model results; it also enables the extraction of valuable information from medical images through an unconventional machine learning perspective. This paper enhances and extends the general structure and detailed design of the Shadow Learner System, making it more advantageous not only for human users but also for the deep learning model itself. A case study on femoral head osteonecrosis was conducted using the proposed system, which demonstrated improved accuracy and reliability in its prediction results. Experimental results interpreted with XAI methods are visualized to show that the proposed model produces reasonable predictions with justified confidence, confirming its effectiveness. |
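The record does not include any implementation details. As a rough illustration of the general idea described in the abstract (a CNN classifier whose misclassified X-ray samples are re-examined with an XAI method so clinicians can inspect what drove the prediction), the sketch below assumes PyTorch, a ResNet-18 backbone, and Grad-CAM as the explanation technique. These are illustrative assumptions only, not the authors' actual architecture or XAI method.

```python
# Minimal sketch (not the authors' code): a CNN classifier plus a hand-rolled
# Grad-CAM pass that can be applied to misclassified samples. Model, target
# layer, and the choice of Grad-CAM are all assumptions for illustration.
import torch
import torch.nn.functional as F
from torchvision import models

model = models.resnet18(num_classes=2)  # hypothetical binary osteonecrosis classifier
model.eval()

feature_maps, gradients = {}, {}

def fwd_hook(_, __, output):
    feature_maps["value"] = output.detach()

def bwd_hook(_, grad_in, grad_out):
    gradients["value"] = grad_out[0].detach()

# Hook the last convolutional block (an assumed choice of target layer).
model.layer4.register_forward_hook(fwd_hook)
model.layer4.register_full_backward_hook(bwd_hook)

def grad_cam(x, class_idx):
    """Return a coarse Grad-CAM heatmap for one image tensor x of shape (1, 3, H, W)."""
    logits = model(x)
    model.zero_grad()
    logits[0, class_idx].backward()
    weights = gradients["value"].mean(dim=(2, 3), keepdim=True)  # global-average-pooled gradients
    cam = F.relu((weights * feature_maps["value"]).sum(dim=1, keepdim=True))
    cam = F.interpolate(cam, size=x.shape[-2:], mode="bilinear", align_corners=False)
    return (cam / (cam.max() + 1e-8)).squeeze()  # normalised to [0, 1]
```

A plausible usage, in the spirit of the abstract: run the trained model over a validation set, keep the samples whose predicted label disagrees with the ground truth, and visualize `grad_cam()` on those images so a radiologist can see which regions the model attended to when it made the wrong call.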
Keyword | Computer-Aided Diagnosis (CAD) Deep Learning Explainable AI Medical Image Classification Neural Network Osteonecrosis |
DOI | 10.1007/s10489-024-05916-x |
Indexed By | SCIE |
Language | English |
WOS Research Area | Computer Science |
WOS Subject | Computer Science, Artificial Intelligence |
WOS ID | WOS:001376610800004 |
Publisher | SPRINGER, VAN GODEWIJCKSTRAAT 30, 3311 GZ DORDRECHT, NETHERLANDS |
Scopus ID | 2-s2.0-85211917607 |
Document Type | Journal article |
Collection | DEPARTMENT OF COMPUTER AND INFORMATION SCIENCE |
Corresponding Author | Fong, Simon |
Affiliation | 1.Department of Computer and Information Science, University of Macau, Taipa, SAR, Macao 2.Medical Imaging Department, The Sixth Affiliated Hospital of Jinan University, Guangdong, China 3.The First Affiliated Hospital of Guangzhou University of Chinese Medicine, Guangdong, China 4.Department of Artificial Intelligence, Chongqing Technology and Business University, Chongqing, China |
First Author Affiliation | University of Macau |
Corresponding Author Affiliation | University of Macau |
Recommended Citation GB/T 7714 | Wu, Yaoyang,Fong, Simon,Liu, Liansheng. Enhancing explainability in medical image classification and analyzing osteonecrosis X-ray images using shadow learner system[J]. Applied Intelligence, 2025, 55(2), 137. |
APA | Wu, Yaoyang., Fong, Simon., & Liu, Liansheng (2025). Enhancing explainability in medical image classification and analyzing osteonecrosis X-ray images using shadow learner system. Applied Intelligence, 55(2), 137. |
MLA | Wu, Yaoyang,et al."Enhancing explainability in medical image classification and analyzing osteonecrosis X-ray images using shadow learner system".Applied Intelligence 55.2(2025):137. |
Files in This Item: | There are no files associated with this item. |
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.