Principal Investigator
Name
Omoye Esangbedo
Degrees
BSc
Institution
University of Derby
Position Title
Student
Email
About this CDAS Project
Study
PLCO (Learn more about this study)
Project ID
PLCO-1843
Initial CDAS Request Approval
Mar 12, 2025
Title
Exploration of Explainable AI Models using Machine Learning for Prostate Cancer Diagnosis
Summary
Prostate cancer remains one of the most prevalent cancers among men, with early detection being crucial for improved patient outcomes. Traditional diagnostic methods, such as Prostate-Specific Antigen (PSA) tests and Digital Rectal Exams (DRE), have limitations, including false positives and subjective interpretation. Machine learning (ML) models offer a promising solution by automating diagnosis, improving accuracy, and reducing diagnostic errors. However, the adoption of AI in healthcare is hindered by the "black box" nature of complex models, raising concerns about trust, interpretability, and clinical integration.
This project aims to explore Explainable AI (XAI) techniques to enhance the transparency of ML-driven prostate cancer diagnosis. By leveraging datasets from prostate cancer screening and diagnostic procedures, the study will compare multiple ML models. Two XAI methods, SHAP and LIME, will be employed to interpret the models’ predictions, ensuring clinical relevance and trustworthiness.
The objectives of this study include:
Evaluating ML models for their accuracy in prostate cancer detection.
Applying XAI techniques to interpret model decisions.
Comparing model performance and interpretability to determine the most clinically useful approach.
Assessing the feasibility of AI integration into clinical workflows for enhanced decision-making.
Through this research, I aim to bridge the gap between AI advancements and real-world medical applications, fostering trust in AI-assisted prostate cancer diagnosis. The findings will provide insights into balancing accuracy with interpretability, making AI-driven healthcare more accessible and reliable.
Aims

Develop a Machine Learning Pipeline for Prostate Cancer Diagnosis
Utilize supervised ML models such as Random Forest, SVM, and Neural Networks.
Train models on prostate cancer screening and diagnostic datasets.
Apply Explainable AI (XAI) Techniques
Implement SHAP (SHapley Additive exPlanations) and LIME (Local Interpretable Model-agnostic Explanations) to interpret ML model predictions.
Compare XAI methods for their effectiveness in providing transparent explanations.
Evaluate Model Performance and Interpretability
Assess accuracy, sensitivity, and specificity of ML models.
Measure the clinical relevance and trustworthiness of XAI explanations.
Integrate Findings into a Clinically Relevant Framework
Analyse how AI-driven explanations align with clinical decision-making.
Provide recommendations for improving AI adoption in prostate cancer diagnosis.
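The pipeline in Aim 1 could be sketched as follows with scikit-learn. The synthetic data here is a stand-in: the actual PLCO screening features and any preprocessing are assumptions, not part of this sketch.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for screening/diagnostic features (e.g. PSA level, age, DRE result).
X, y = make_classification(n_samples=500, n_features=8, n_informative=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# The three supervised model families named in Aim 1; SVM and the neural
# network are scaled, since both are sensitive to feature magnitudes.
models = {
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "svm": make_pipeline(StandardScaler(), SVC(random_state=0)),
    "neural_net": make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
    ),
}

scores = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    scores[name] = model.score(X_test, y_test)  # held-out accuracy
    print(f"{name}: test accuracy = {scores[name]:.3f}")
```

On real screening data the same loop would be wrapped in cross-validation rather than a single split.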
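The core idea behind LIME (Aim 2) can be sketched without the library itself: perturb an instance, query the black-box model, weight perturbed samples by proximity, and fit a weighted linear surrogate whose coefficients act as local feature attributions. This is a minimal hand-rolled version for illustration only; the study itself would use the `shap` and `lime` packages, and all names below are illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import Ridge

X, y = make_classification(n_samples=300, n_features=6, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X, y)  # the "black box"

def lime_style_attributions(model, x, feature_std, n_samples=2000, kernel_width=1.0, seed=0):
    """Fit a weighted linear surrogate around instance x (LIME's core idea)."""
    rng = np.random.default_rng(seed)
    # Perturb the instance with Gaussian noise scaled to each feature's spread.
    perturbed = x + rng.normal(scale=feature_std, size=(n_samples, x.size))
    preds = model.predict_proba(perturbed)[:, 1]  # black-box probability of class 1
    # Weight samples by proximity to x (RBF kernel on scaled distance).
    dists = np.linalg.norm((perturbed - x) / feature_std, axis=1)
    weights = np.exp(-(dists ** 2) / kernel_width ** 2)
    surrogate = Ridge(alpha=1.0).fit(perturbed, preds, sample_weight=weights)
    return surrogate.coef_  # signed local attribution per feature

attributions = lime_style_attributions(model, X[0], X.std(axis=0))
print(attributions)  # one signed weight per feature
```

SHAP differs in that its attributions are Shapley values with consistency guarantees, but the interpretive output — a signed contribution per feature for one prediction — is comparable, which is what makes the two methods natural candidates for the comparison in Aim 2.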
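The evaluation metrics in Aim 3 follow directly from the confusion matrix. A small helper, with the toy labels here purely illustrative (1 = cancer, 0 = benign):

```python
import numpy as np
from sklearn.metrics import confusion_matrix

def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
    return tp / (tp + fn), tn / (tn + fp)

# Toy example: 4 true positives in the cohort, 3 detected; 6 negatives, 2 flagged.
y_true = np.array([1, 1, 1, 1, 0, 0, 0, 0, 0, 0])
y_pred = np.array([1, 1, 1, 0, 0, 0, 0, 0, 1, 1])
sens, spec = sensitivity_specificity(y_true, y_pred)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")  # sensitivity=0.75, specificity=0.67
```

In a screening context sensitivity (missed cancers) and specificity (false alarms leading to unnecessary biopsies) pull in opposite directions, which is why Aim 3 reports both alongside accuracy.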

Collaborators

I am working on this by myself for my final-year project.