Principal Investigator
Name: Thomas Lukasiewicz
Degrees: Ph.D.
Institution: University of Oxford
Position Title: Professor
Email:
About this CDAS Project
Study: PLCO
Project ID: PLCO-638
Initial CDAS Request Approval: Jun 18, 2020
Title: Verifying Explanatory Methods
Summary
Deep learning is a field of artificial intelligence that has achieved revolutionary success in many domains. However, the decision-making processes of deep learning models are currently not human-interpretable, so many explanatory methods are being designed to explain these models. The PLCO dataset has been used in prior work on explanatory methods (e.g., https://openreview.net/pdf?id=BJg_roAcK7). Our goal is to verify how faithfully these explanatory methods explain models applied to the medical domain. Access to the PLCO dataset would be essential to this goal.
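To make the aim concrete, one common way to test the faithfulness of an explanation is a deletion (perturbation) test: remove the features an explanatory method marks as most important and measure how quickly the model's prediction degrades. The sketch below is a minimal illustration of that idea, assuming a trained PyTorch classifier and a per-feature attribution map; it is not the project's method, and all names in it are illustrative placeholders rather than anything taken from the request or the PLCO documentation.

```python
# Minimal deletion-based faithfulness check for a saliency explanation.
# Assumptions (not from the original request): `model` is a trained PyTorch
# classifier, `x` is a single input tensor, and `attributions` is a NumPy
# array of per-feature importance scores produced by some explanatory method.
import numpy as np
import torch


def deletion_faithfulness(model, x, attributions, steps=10, baseline=0.0):
    """Mask the most-attributed features first and track the drop in the
    predicted-class probability; a faster drop (lower area under the curve)
    suggests the attributions are more faithful to the model."""
    model.eval()
    with torch.no_grad():
        p0 = torch.softmax(model(x.unsqueeze(0)), dim=1)
        target = p0.argmax(dim=1).item()

        # Rank features from most to least important according to the explanation.
        order = np.argsort(-attributions.flatten())
        probs = [p0[0, target].item()]

        x_masked = x.clone().flatten()
        chunk = max(1, len(order) // steps)
        for i in range(0, len(order), chunk):
            # Replace the next chunk of highly attributed features with the baseline.
            x_masked[order[i:i + chunk]] = baseline
            p = torch.softmax(model(x_masked.view_as(x).unsqueeze(0)), dim=1)
            probs.append(p[0, target].item())

    # Area under the probability-vs-fraction-deleted curve.
    return np.trapz(probs, dx=1.0 / (len(probs) - 1))
```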
Aims

* Verifying the correctness of explanatory methods in artificial intelligence

Collaborators

Eleonora Giunchiglia, University of Oxford
Jakob Foerster, University of Toronto
Thomas Lukasiewicz, University of Oxford
Phil Blunsom, University of Oxford