Title: Developing Deep Learning Methods for Biomedical Image Analysis
Student: Jun Bai
Major Advisor: Sheida Nabavi
Associate Advisors: Jinbo Bi, Clifford Yang, and Caiwen Ding
Date/Time: Wednesday, November 16th, 2022, 9:30 AM
Location: WebEx
Remote Access: https://uconn-cmr.webex.com/uconn-cmr/j.php?MTID=m2983e6ccf0efded84f176250bf37ff68
Meeting number: 2623 273 7360
Password: YMk2MfQyJ35
Join by phone: +1-415-655-0002 US Toll
Access code: 26232737360
Abstract:
Cancer and other diseases induce considerable anxiety in the general population, and accurate, early detection is key to diminishing disease burden. In recent decades, X-ray imaging technology has transformed the field of cancer and disease diagnostics and improved survival rates. However, the low prevalence of cancer and disease in the screening population and the complexity of X-ray images limit the performance of radiologists and increase the risk of false and missed diagnoses. The relatively recent advances in deep learning have been a revolutionary force in the interpretation of biomedical images and in imaging diagnostics. They also present a unique opportunity for the co-development of X-ray imaging and deep learning, which can improve both the validity of results and patient outcomes. However, numerous challenges remain in this co-development, such as the scarcity of data, the very high resolution of the images, the small size of tumors, and occult abnormalities. Hence, novel deep learning models need to be developed to address these challenges.
In this study, we explore how deep learning can be integrated into X-ray image analysis workflows using breast cancer images, including Digital Breast Tomosynthesis (DBT) and Full-Field Digital Mammograms (FFDM), for more accurate and earlier detection of cancer. We first reviewed the current state of research in AI-based mammogram interpretation and discussed some of the limitations of integrating it into clinical practice, as well as the opportunities it presents in this burgeoning field. We then developed a novel Siamese-based deep learning model (FFS) that identifies breast cancer from patients' current and prior FFDMs (2D) using an enhanced distance learning network. In addition, we represented DBT (3D) mammograms as graphs and employed a self-attention graph convolutional neural network model (MGCN) to learn the features of 3D mammograms effectively and efficiently and to identify breast cancer. Finally, we developed an unsupervised feature correlation learning model (UFCN) to localize breast cancer using patients' current and prior FFDMs.