Ph.D. Defense: Chao Shang
July 23 @ 2:00 pm - 3:00 pm EDT
Doctoral Dissertation Oral Defense
Title: End-to-End Structure-Aware Convolutional Networks on Graphs
Ph.D. Candidate: Chao Shang
Major Advisor: Dr. Jinbo Bi
Associate Advisors: Dr. Alexander Russell and Dr. Yufeng Wu
Review Committee Members: Dr. Derek Aguiar and Dr. Caiwen Ding
Date/Time: Thursday July 23, 2020 2:00 pm – 3:00 pm
Meeting link: https://uconn-cmr.webex.com/uconn-cmr/j.php?MTID=m5c05b422e31d7fbaf981862e156ae8f5
Join by Phone: +1-415-655-0002 US Toll
Access code: 120 206 1777
Convolutional Neural Networks (CNNs) are powerful tools for modeling data with a grid-like structure, such as images, video, and speech. However, a broad range of scientific problems generate data that naturally lie on irregular grids with non-Euclidean metrics, such as molecular graphs, knowledge graphs, and traffic networks. Generalizing CNNs to non-Euclidean structured data such as graphs is not straightforward: classical convolutions cannot be applied directly to graphs because graphs lack a global parameterization, a common coordinate system, and shift-invariance properties.
In this dissertation, we propose several structure-aware CNN models that compute graph convolutions efficiently over both small-scale and large-scale graphs. The proposed networks are trained end to end, with a stochastic gradient descent algorithm back-propagating through all network components, rather than with a stage-wise scheme in which the components are tuned separately.

(1) The first part of the dissertation focuses on large-scale knowledge graphs. Our approach learns the graph connectivity structure so that it can infer new edges in a knowledge graph and grow the input graph toward completeness. The model not only utilizes the node (entity) attributes and edge relations of a knowledge graph but also preserves the so-called translational property between entities and relations.

(2) The second part extends the convolution operation to small-scale, hydrogen-depleted molecular graphs. Unlike the first model, which learns from a single graph of massive size, this method learns a CNN from a massive collection of small graphs. We propose a CNN that assigns consistent edge attentions to the same type of edge appearing in different molecular graphs and predicts a molecule's properties with the resulting molecular-graph-based CNN. The model exploits the general consistency of bond energies and bond lengths across molecular graphs.

(3) The third part explores correlation and causation among multivariate time series in a graph where each node represents a time-series variable and an edge indicates whether one time series causes another. When the connectivity (edge) structure is unavailable or incomplete, we propose a graph neural network (GNN) that learns the hidden graph structure jointly with the predictive model, which autoregressively forecasts new events in the time series.
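To give a sense of the graph-convolution operation that underlies these models, here is a minimal NumPy sketch of one standard graph-convolution layer (the symmetrically normalized propagation rule H' = ReLU(D^{-1/2}(A+I)D^{-1/2} H W)). This is an illustrative baseline, not the dissertation's specific architectures; the toy graph, feature matrix, and weight values are made up for the example.

```python
import numpy as np

def gcn_layer(adj, feats, weight):
    """One standard graph-convolution layer:
    H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W).
    Each node's new features mix its own features with its neighbors'."""
    a_hat = adj + np.eye(adj.shape[0])            # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt      # symmetric normalization
    return np.maximum(a_norm @ feats @ weight, 0.0)  # ReLU

# Toy path graph 0-1-2 with 2-dim input features and 2 output dims
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
feats = np.eye(3, 2)            # placeholder node features
weight = np.ones((2, 2))        # placeholder learned weights
out = gcn_layer(adj, feats, weight)
print(out.shape)  # (3, 2): one feature row per node
```

In practice such layers are stacked and the weights are learned by back-propagation, which is what allows the end-to-end training described above rather than a stage-wise scheme.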
Extensive experiments demonstrate the advantages of the proposed techniques over the state of the art in knowledge graph completion, molecular quantitative structure-activity relationship prediction, and multivariate time series forecasting.