
Principal Investigator
Name
Hristo Petkov
Degrees
BSc Computer Science
Institution
University of Strathclyde
Position Title
Ph.D. Student
About this CDAS Project
Study
NLST
Project ID
NLST-720
Initial CDAS Request Approval
Oct 17, 2020
Title
Explainable Artificial Intelligence in Radiotherapy: A CT imaging based neural network autonomous system for clinical decision support used in automatic radiation dose calculation for Lung Carcinoma (Lung Cancer)
Summary
The purpose of this research proposal is to allow the PhD student to conduct research on
lung cancer radiotherapy and Artificial Intelligence, with the goal of creating a trustworthy
autonomous system for treatment planning clinical decision support capable of performing
automatic radiation dose calculation for patients diagnosed with lung cancer. The system
will be able to provide justifications for the results it produces.
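
To make the idea concrete, the following is a minimal sketch of what a CT-based
dose-prediction network could look like, assuming a PyTorch-style 3D convolutional
network; the proposal does not specify an architecture, so all names, layer sizes, and
shapes here are illustrative assumptions.

    # Hypothetical sketch only: the proposal does not specify an architecture.
    # A small 3D CNN maps a CT volume to a voxel-wise radiation dose distribution.
    import torch
    import torch.nn as nn

    class DosePredictor(nn.Module):
        def __init__(self):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            )
            self.head = nn.Conv3d(32, 1, kernel_size=1)  # per-voxel dose estimate

        def forward(self, ct):
            return self.head(self.encoder(ct))

    model = DosePredictor()
    ct_volume = torch.randn(1, 1, 64, 64, 64)  # batch, channel, depth, height, width
    predicted_dose = model(ct_volume)          # same spatial shape as the input CT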

The intended result is a trusted autonomous clinical decision support system for treatment
planning that can explain the results it produces. Such a system will combine health
observations drawn from patient data with medical knowledge in order to improve the
decision-making of radiotherapists, with the aim of providing better care for patients with
lung cancer.

Although Artificial Intelligence could bring improvements to healthcare, the approach
introduces an entirely different set of problems. One of the major problems, which this
research aims to address, is the lack of explanation supporting the outcomes such systems
produce. Producing a result, even a correct one, without justification is a serious
problem, because doctors can neither trust nor verify the correctness of that result.

Explainable Artificial Intelligence, if implemented properly, could resolve the trust
issues between humans and Artificial Intelligence. An explainable-AI-based treatment
planning clinical decision support system for patients diagnosed with lung cancer would
provide the explanations needed to support the results it produces, ultimately increasing
trust between humans and Artificial Intelligence.
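
As an illustration of how such justifications might be produced, the sketch below applies
one widely used explanation technique, input-gradient saliency, to the hypothetical
DosePredictor sketched earlier; this is an assumption made for illustration, not the
project's stated method.

    # Hypothetical illustration: input-gradient saliency, reusing the
    # DosePredictor (`model`) from the earlier sketch. The gradient of the
    # predicted dose with respect to the CT highlights which voxels most
    # influenced the prediction, one possible form of justification.
    import torch

    ct = torch.randn(1, 1, 64, 64, 64, requires_grad=True)
    dose = model(ct)
    dose.sum().backward()     # d(total predicted dose) / d(CT voxels)
    saliency = ct.grad.abs()  # large values mark influential voxels
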
Aims

Aim 1: Create a trustworthy autonomous system for treatment planning clinical decision support capable of performing
automatic radiation dose calculation for patients diagnosed with lung cancer.

Aim 2: Provide a solution for the trust issues between humans and Artificial Intelligence using Explainable Artificial Intelligence.

Collaborators

Feng Dong, Ph.D., University of Strathclyde